Quote
Tesla Reinvents Carmaking With Quiet Breakthrough
Posted by BeauHD on Thursday September 14, 2023 @06:30PM from the no-easy-feat dept.
An anonymous reader quotes a report from Reuters:

Quote
Tesla has combined a series of innovations to make a technological breakthrough that could transform the way it makes electric vehicles and help Elon Musk achieve his aim of halving production costs, five people familiar with the move said. The company pioneered the use of huge presses with 6,000 to 9,000 tons of clamping pressure to mold the front and rear structures of its Model Y in a "gigacasting" process that slashed production costs and left rivals scrambling to catch up. In a bid to extend its lead, Tesla is closing in on an innovation that would allow it to die cast nearly all the complex underbody of an EV in one piece, rather than about 400 parts in a conventional car, the people said.

The know-how is core to Tesla's "unboxed" manufacturing strategy unveiled by Chief Executive Musk in March, a linchpin of his plan to churn out tens of millions of cheaper EVs in the coming decade, and still make a profit, the sources said. While Tesla has said its unboxed model involves producing large sub-assemblies of a car at the same time and then snapping them together, the size and make-up of the modular blocks is still the subject of speculation. Two of the sources said Tesla's previously unreported new design and manufacturing techniques meant the company could develop a car from the ground up in 18 to 24 months, while most rivals currently take anywhere from three to four years.

The five people said a single large frame -- combining the front and rear sections with the middle underbody where the battery is housed -- could be used in Tesla's small EV, which it aims to launch with a price tag of $25,000 by the middle of the decade. Tesla was expected to make a decision on whether to die cast the platform in one piece as soon as this month, three of the sources said, though even if it does press ahead the end product could change during the design validation process. The breakthrough Tesla has made centers on how the giant molds for such a large part are designed and tested for mass production, and how casts can incorporate hollow subframes with internal ribs to cut weight and boost crashworthiness.

To overcome the obstacles associated with this manufacturing technique, Tesla is collaborating with firms that use 3D printing technology to create sand molds for casting, which is cost-effective and allows for rapid design iterations. The sand casting process significantly reduces design cycle times compared to traditional metal mold prototypes. Tesla also plans to use solid sand cores within the molds to create hollow subframes, addressing weight and crashworthiness concerns. However, there is still a decision to be made regarding the type of press to use for casting large body parts, with trade-offs between productivity and quality.

Regards.
Microsoft Needs So Much Power to Train AI That It's Considering Small Nuclear Reactors
Posted by EditorDavid on Saturday September 30, 2023 @11:34PM from the seeking-a-reaction dept.
An anonymous reader shares this report from Futurism:

Quote
Training large language models is an incredibly power-intensive process that has an immense carbon footprint. Keeping data centers running requires a ludicrous amount of electricity that could generate substantial amounts of greenhouse emissions -- depending, of course, on the energy's source. Now, The Verge reports, Microsoft is betting so big on AI that it's pushing forward with a plan to power them using nuclear reactors. Yes, you read that right; a recent job listing suggests the company is planning to grow its energy infrastructure with the use of small modular reactors (SMRs)...

But before Microsoft can start relying on nuclear power to train its AIs, it'll have plenty of other hurdles to overcome. For one, it'll have to source a working SMR design. Then, it'll have to figure out how to get its hands on the highly enriched uranium fuel that these small reactors typically require, as The Verge points out. Finally, it'll need to figure out a way to store all of that nuclear waste long term...

Other than nuclear fission, Microsoft is also investing in nuclear fusion, a far more ambitious endeavor, given the many decades of research that have yet to lead to a practical power system. Nevertheless, the company signed a power purchase agreement earlier this year with Helion, a fusion startup backed by OpenAI CEO Sam Altman, with the hopes of buying electricity from it as soon as 2028.
Antimatter Feels Gravity Just like Matter
September 27, 2023 • Physics 16, 167

The first direct observations of antihydrogen atoms falling in Earth's gravity show that they experience gravity in the same way as ordinary matter does.

Throw a ball into the air and the pull of Earth's gravity will bring it crashing back down. But what about a ball of antimatter? Will it fall in the same way, or does it somehow experience gravity differently? Physicists have been exploring such questions for nearly a century but, until now, there had been no direct experimental test of antimatter in free fall. Releasing the results of observations of free-falling antihydrogen atoms, the Antihydrogen Laser Physics Apparatus (ALPHA) Collaboration at CERN in Switzerland shows that the particles experience the same gravitational pull as ordinary matter as they accelerate toward Earth [1]. The collaboration says that the experiments are a landmark test of the weak equivalence principle, which states that all types of mass should react equivalently to the force of gravity.

"There's no theoretical reason to expect [antimatter] to do anything else but fall with a regular acceleration," says Holger Müller, a physicist at the University of California, Berkeley, who was not involved in the study. Still, he is pleased to see the expectation confirmed. "There is just no substitute for direct observation," he says.

Antihydrogen is composed of one antiproton and one positron -- the antiparticle of an electron -- making it the simplest neutrally charged antimatter atom. As such, it is an ideal system for probing gravity, as other forces can be ignored: if the researchers had instead used a charged particle, for example, electric forces would have come into play and -- because they are stronger -- would have overpowered gravity's pull. "Making a gravity measurement, you're just overwhelmed by a bunch of forces you can't control," says Will Bertsche, a physicist at the University of Manchester, UK, and a member of the ALPHA Collaboration. "You need some antimatter that is neutral."

At the center of the ALPHA-g setup used for the free-fall experiments sits a magnetic trap composed of a superconducting magnet, which generates a magnetic field in the radial direction, and two electromagnets, called mirror coils, which generate fields in the vertical direction. These three magnets are aligned such that they trap antihydrogen atoms at 0.5 K between the two mirror coils. Detectors designed to reconstruct particle trajectories surround the trap.

In order to subject the trapped antihydrogen atoms to gravity, the researchers weakened the magnetic fields holding them in place over a period of 20 seconds. As that happened, the antihydrogen atoms, which were bouncing around inside the trap, streamed out of the device, moving both up and down. The team then detected the antiparticles through the energy released when they annihilated with matter particles.

Because gravity is such a weak force, some of the antihydrogen atoms that initially move up when released should continue to rise, just like baseballs thrown into the air, which the collaboration sees in its results. But the researchers also find that more antihydrogen atoms exit from the bottom of the trap, the expected result if gravity's effect on antimatter and ordinary matter is the same. Statistical analysis of the observations puts the gravitational acceleration of the antimatter particles within one standard deviation of that of ordinary matter.

While previous experiments performed by others have explored antimatter's interaction with gravity, those tests were all indirect. "It was just thrilling to see that the predictions that we had made, in some cases a decade earlier, actually turned out to be true," says Joel Fajans, a physicist at the University of California, Berkeley, and a member of the ALPHA Collaboration. Müller agrees. "Finally, this [experiment] has happened," he says. "I'm just happy that this [result] exists."

Now that the collaboration has confirmed that the experiments work, the researchers plan to upgrade their setup before running further measurements. They also plan to develop computer simulations that better predict the behavior of antimatter atoms exposed to gravity. With both advances in place, the hope is that the ALPHA team will reach its ultimate goal -- obtaining the first precise measurement of the weight of an antimatter atom. Time will tell.

--Allison Gasparini
Allison Gasparini is a freelance science writer based in Santa Cruz, CA.

References
[1] E. K. Anderson et al., "Observation of the effect of gravity on the motion of antimatter," Nature 621, 716 (2023). http://dx.doi.org/10.1038/s41586-023-06527-1
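A rough scale comparison helps explain why the analysis has to be statistical. The sketch below is not from the article or from the ALPHA analysis: the ~0.25 m vertical extent of the trap is an assumed, illustrative figure, while the 0.5 K temperature and the (anti)hydrogen mass are as quoted above.

Code:
# Compare the gravitational energy difference across the trap with the thermal
# energy of the trapped anti-atoms. All figures are order-of-magnitude only.
KB = 1.380649e-23      # Boltzmann constant, J/K
M  = 1.6735e-27        # antihydrogen mass (approximately the hydrogen mass), kg
G  = 9.81              # gravitational acceleration, m/s^2
H  = 0.25              # assumed vertical extent of the magnetic trap, m
T  = 0.5               # trap temperature quoted in the article, K

gravity_energy = M * G * H       # potential-energy difference, bottom vs top of trap
thermal_energy = KB * T          # typical energy scale of the trapped atoms

print(f"m*g*h = {gravity_energy:.2e} J")
print(f"k_B*T = {thermal_energy:.2e} J")
print(f"ratio = {gravity_energy / thermal_energy:.1e}")

The ratio comes out around 6e-4: gravity shifts the escape energetics by well under a millikelvin's worth of energy, which is why the barriers are ramped down slowly and the up/down annihilation counts are analysed statistically rather than atom by atom.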
May 1, 2018
This is the original landscape-format version of the short movie Cosmic Eye, designed by astrophysicist Danail Obreschkow. The movie zooms through all well-known scales of the universe, from minuscule elementary particles out to the gigantic cosmic web. This project was inspired by a progression of increasingly accurate graphical representations of the scales of the universe, including the classical essay "Cosmic View" by Kees Boeke (1957), the short movie "Cosmic Zoom" by Eva Szasz (1968), and the legendary movie "Powers of Ten" by Charles and Ray Eames (1977). Cosmic Eye takes these historical visualisations to the state of the art using real photographs obtained with modern detectors, telescopes, and microscopes. Other views are renderings of modern computer models. Vector-based blending techniques are used to create a seamless zoom.

This 2018 version of Cosmic Eye contains improved graphics and minor technical corrections compared with the 2011 version, which was in portrait format.
Hydro Dams Are Struggling To Handle the World's Intensifying Weather
Posted by BeauHD on Saturday October 14, 2023 @03:00AM from the predicting-the-future dept.
Saqib Rahim reports via Wired:

Quote
It's been one of the wettest years in California since records began. From October 2022 to March 2023, the state was blasted by 31 atmospheric rivers -- colossal bands of water vapor that form above the Pacific and become firehoses when they reach the West Coast. What surprised climate scientists wasn't the number of storms, but their strength and rat-a-tat frequency. The downpours shocked a water system that had just experienced the driest three years in recorded state history, causing floods, mass evacuations, and at least 22 deaths.

Swinging between wet and dry extremes is typical for California, but last winter's rain, potentially intensified by climate change, was almost unmanageable. Add to that the arrival of El Nino, and more extreme weather looks likely for the state. This is going to make life very difficult for the dam operators tasked with capturing and controlling much of the state's water. Like most of the world's 58,700 large dams, those in California were built for yesterday's more stable climate patterns. But as climate change taxes the world's water systems -- affecting rainfall, snowmelt, and evaporation -- it's getting tough to predict how much water gets to a dam, and when. Dams are increasingly either water-starved, unable to maintain supplies of power and water for their communities, or overwhelmed and forced to release more water than desired -- risking flooding downstream.

But at one major dam in Northern California, operators have been demonstrating how to not just weather these erratic and intense storms, but capitalize on them. Management crews at New Bullards Bar, built in 1970, entered last winter armed with new forecasting tools that gave unprecedented insight into the size and strength of the coming storms -- allowing them to strategize how to handle the rain. First, they let the rains refill their reservoir, a typical move after a long drought. Then, as more storms formed at sea, they made the tough choice to release some of this precious hoard through their hydropower turbines, confident that more rain was coming. "I felt a little nervous at first," says John James, director of resource planning at Yuba Water Agency in Northern California. Fresh showers soon validated the move. New Bullards Bar ended winter with plumped water supplies, a 150 percent boost in power generation, and a clean safety record. The strategy offers a glimpse of how better forecasting can allow hydropower to adapt to the climate age.
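As a rough illustration of the forecast-informed operating strategy described above, the following sketch reduces the decision to its simplest form: keep every drop after a drought, unless the forecast inflow would push storage past the flood-control target, in which case pre-release through the turbines up to the outlet limit. This is a toy model under stated assumptions; the function, the buffer fraction, and the numbers are made up and are not Yuba Water Agency's actual operating rules.

Code:
def planned_release(storage, capacity, forecast_inflow, max_release,
                    flood_buffer=0.10):
    """Volume to release over the next period (all arguments in the same units).

    flood_buffer is the fraction of capacity kept empty as flood-control space.
    """
    target = capacity * (1.0 - flood_buffer)     # storage we aim not to exceed
    projected = storage + forecast_inflow        # storage if we held everything back
    if projected <= target:
        return 0.0                               # post-drought instinct: hold the water
    # Otherwise release just enough to stay on target, limited by turbine/outlet capacity.
    return min(projected - target, max_release)

# Example with made-up numbers: a nearly full reservoir facing a large forecast inflow.
print(planned_release(storage=900.0, capacity=1000.0,
                      forecast_inflow=250.0, max_release=180.0))   # -> 180.0

The point of better forecasts is precisely the confidence to take that second branch: releasing water ahead of a storm is only safe if you trust that the predicted inflow will actually arrive.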
AI Could Predict Heart Attack Risk Up To 10 Years in the Future, Finds Oxford Study
Posted by msmash on Tuesday November 14, 2023 @11:00AM from the breakthroughs dept.
AI could be used to predict if a person is at risk of having a heart attack up to 10 years in the future, a study has found. From a report:

Quote
The technology could save thousands of lives while improving treatment for almost half of patients, researchers at the University of Oxford said. The study, funded by the British Heart Foundation (BHF), looked at how AI might improve the accuracy of cardiac CT scans, which are used to detect blockages or narrowing in the arteries.

Prof Charalambos Antoniades, chair of cardiovascular medicine at the BHF and director of the acute multidisciplinary imaging and interventional centre at Oxford, said: "Our study found that some patients presenting in hospital with chest pain -- who are often reassured and sent back home -- are at high risk of having a heart attack in the next decade, even in the absence of any sign of disease in their heart arteries. Here we demonstrated that providing an accurate picture of risk to clinicians can alter, and potentially improve, the course of treatment for many heart patients."

About 350,000 people in the UK have a cardiac CT scan each year but, according to the BHF, many patients later die of heart attacks because the scans fail to pick up small narrowings. Researchers analysed the data of more than 40,000 patients undergoing routine cardiac CT scans at eight UK hospitals, with a median follow-up time of 2.7 years. The AI tool was tested on a further 3,393 patients over almost eight years and was able to accurately predict the risk of a heart attack. AI-generated risk scores were then presented to medics for 744 patients, with 45% having their treatment plans altered by medics as a result.
Quote
Google DeepMind's Weather AI Can Forecast Extreme Weather Faster and More Accurately
Posted by msmash on Tuesday November 14, 2023 @01:40PM from the breakthroughs dept.
In research published in Science today, Google DeepMind's model, GraphCast, was able to predict weather conditions up to 10 days in advance, more accurately and much faster than the current gold standard. From a report:

Quote
GraphCast outperformed the model from the European Centre for Medium-Range Weather Forecasts (ECMWF) in more than 90% of over 1,300 test areas. And on predictions for Earth's troposphere -- the lowest part of the atmosphere, where most weather happens -- GraphCast outperformed the ECMWF's model on more than 99% of weather variables, such as rain and air temperature. Crucially, GraphCast can also offer meteorologists accurate warnings, much earlier than standard models, of conditions such as extreme temperatures and the paths of cyclones. In September, GraphCast accurately predicted that Hurricane Lee would make landfall in Nova Scotia nine days in advance, says Remi Lam, a staff research scientist at Google DeepMind. Traditional weather forecasting models pinpointed the hurricane to Nova Scotia only six days in advance.

[...] Traditionally, meteorologists use massive computer simulations to make weather predictions. They are very energy intensive and time consuming to run, because the simulations take into account many physics-based equations and different weather variables such as temperature, precipitation, pressure, wind, humidity, and cloudiness, one by one. GraphCast uses machine learning to do these calculations in under a minute. Instead of using the physics-based equations, it bases its predictions on four decades of historical weather data. GraphCast uses graph neural networks, which map Earth's surface into more than a million grid points. At each grid point, the model predicts the temperature, wind speed and direction, and mean sea-level pressure, as well as other conditions like humidity. The neural network is then able to find patterns and draw conclusions about what will happen next for each of these data points.

Regards.
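For readers wondering what "graph neural networks over a million grid points" amounts to in practice, here is a toy sketch of the idea only. This is not GraphCast's actual architecture or code: the mesh, the variable names, and the weights are made up, and the weights are random rather than trained. Each grid point carries a handful of weather variables, exchanges messages with its neighbours, and is updated to give the state one step ahead, which is then fed back in to roll the forecast forward.

Code:
import numpy as np

rng = np.random.default_rng(0)

N_POINTS = 1_000        # grid points (GraphCast uses over a million)
N_VARS   = 5            # e.g. temperature, wind u, wind v, pressure, humidity
N_NEIGH  = 6            # neighbours per point on the toy mesh

state      = rng.normal(size=(N_POINTS, N_VARS))                   # current weather state
neighbours = rng.integers(0, N_POINTS, size=(N_POINTS, N_NEIGH))   # made-up mesh edges

W_msg = rng.normal(scale=0.1, size=(N_VARS, N_VARS))       # stand-in for learned message weights
W_upd = rng.normal(scale=0.1, size=(2 * N_VARS, N_VARS))   # stand-in for learned update weights

def step(state):
    """One message-passing step: aggregate neighbour features, then update each node."""
    messages = np.tanh(state @ W_msg)                # per-node outgoing message
    aggregated = messages[neighbours].mean(axis=1)   # mean message over each node's neighbours
    combined = np.concatenate([state, aggregated], axis=1)
    return state + combined @ W_upd                  # residual update = state one step ahead

# Roll the model forward autoregressively, the way a multi-day forecast is produced.
forecast = state
for _ in range(4):
    forecast = step(forecast)
print(forecast.shape)   # (1000, 5): same variables on the same grid, a few steps ahead

GraphCast itself learns its message and update functions from the four decades of historical weather data mentioned in the article and runs on a much richer multi-scale mesh, but the roll-forward structure is the same, which is what lets it produce a 10-day forecast in under a minute.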
These things worry me.

Let me see if I can explain myself. I'm glad we have better and more efficient ways of forecasting the weather.

The problem is that, as far as I understand (and if that's not the case, may those who know more correct me), the AI is not really intelligent, because if it were it could tell us how it arrived at the result. And my understanding is that it can't: the AI cannot explain to us why it is going to rain tomorrow. So neither the AI nor we know why things happen or how we arrived at that result.

I don't know if you see the problem I see.
Quote from: Saturio on November 15, 2023, 01:09:18 am
These things worry me. Let me see if I can explain myself. I'm glad we have better and more efficient ways of forecasting the weather. The problem is that, as far as I understand (and if that's not the case, may those who know more correct me), the AI is not really intelligent, because if it were it could tell us how it arrived at the result. And my understanding is that it can't: the AI cannot explain to us why it is going to rain tomorrow. So neither the AI nor we know why things happen or how we arrived at that result. I don't know if you see the problem I see.

That is the crux of the matter with the whole AI debate (although, in my opinion, calling models like GraphCast "AIs" does not exactly help focus the discussion).

Is it true that we do not know exactly how the neural network arrives at its prediction? Obviously.

Should we therefore rule out using this kind of model? I don't think so.

Regards.