Global sea levels are about eight inches higher today than they were in 1880, and they are expected to rise another two to seven feet during this century. At the same time, some 5 million people in the U.S. live in 2.6 million coastal homes situated less than 4 feet above high tide.

Do the math: Climate change is a problem, whatever its cause.

The problem? Actually making those complex calculations is an extremely challenging proposition. To understand the impact of climate change at the local level, you'll need more than back-of-the-napkin mathematics. You'll need big data technology.
Surging Seas is an interactive map and tool developed by the nonprofit Climate Central that shows in graphic detail the threats from sea-level rise and storm surges to all of the 3,000-plus coastal towns, cities, counties and states in the continental United States. With detail down to neighborhood scale—search for a specific location or zoom down as necessary—the tool matches areas with flooding risk timelines and provides links to fact sheets, data downloads, action plans, embeddable widgets, and other items.
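The core matching step behind a tool like this can be sketched in a few lines: flag the places whose elevation above local high tide falls below a projected water level. This is illustrative only; the neighborhood names and elevations are invented, and the real tool works from detailed elevation and tidal datasets.

```python
# Hypothetical sketch of the kind of query a tool like Surging Seas
# answers: which areas fall below a given water level (projected
# sea-level rise plus storm surge)? All figures here are invented.

def at_risk(places, projected_rise_ft, surge_ft=0.0):
    """Return the names of places whose elevation above local high tide
    is below the combined projected water level."""
    water_level = projected_rise_ft + surge_ft
    return [name for name, elev_ft in places if elev_ft < water_level]

coastal_places = [
    ("Neighborhood A", 3.5),   # feet above local high tide (invented)
    ("Neighborhood B", 6.0),
    ("Neighborhood C", 1.2),
]

# With 4 ft of combined rise and surge, two of the three areas flood.
print(at_risk(coastal_places, projected_rise_ft=2.0, surge_ft=2.0))
# -> ['Neighborhood A', 'Neighborhood C']
```

Attaching a year to each water level (e.g., a rise projection per decade) is what turns this membership test into the "flooding risk timelines" the tool displays.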
It's the kind of number-crunching that was all but impossible only a few years ago.

'Just as powerful, just as big'

"Our strategy is to tell people about their climate locally in ways they can understand, and the only way to do that is with big data analysis," said Richard Wiles, vice president for strategic communications and director of research with Climate Central. "Big data allows you to say simple, clear things."

There are actually two types of big data in use today to help understand and deal with climate change, Wiles said. The first is relatively recently collected data that is so voluminous and complex that it couldn't be effectively manipulated before, such as NASA images of heat over cities. This kind of data "literally was too big to handle not that long ago," he said, "but now you can handle it on a regular computer."
The second type of big data is older datasets that may be less than reliable. This data "was always kind of there," Wiles said, such as historic temperature trends in the United States. That kind of dataset is not overly complex, but it can be fraught with gaps and errors. "A guy in Oklahoma may have broken his thermometer back in 1936," Wiles said, meaning that there could be no measurements at all for two months of that year.

Address those issues, and existing data can be "just as powerful, just as big," Wiles said. "It makes it possible to make the story very local."
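A gap like those two broken-thermometer months can often be patched well enough for trend work with simple interpolation between the surrounding readings. This is a hedged sketch of one common approach, not Climate Central's actual cleaning method, and the readings are invented.

```python
# A minimal sketch of patching gaps in an old temperature record:
# runs of missing months (None) are filled by linear interpolation
# between their known neighbors. Values are invented for illustration.

def fill_gaps(monthly):
    """Linearly interpolate runs of None between known readings.
    Assumes the first and last entries are present."""
    filled = list(monthly)
    i = 0
    while i < len(filled):
        if filled[i] is None:
            j = i
            while j < len(filled) and filled[j] is None:
                j += 1                     # find the end of the gap
            lo, hi = filled[i - 1], filled[j]
            span = j - i + 1
            for k in range(i, j):          # evenly spaced fill values
                filled[k] = lo + (hi - lo) * (k - i + 1) / span
            i = j
        else:
            i += 1
    return filled

readings = [52.0, None, None, 58.0]  # two missing months (degrees F)
print(fill_gaps(readings))           # -> [52.0, 54.0, 56.0, 58.0]
```

Real cleaning pipelines also cross-check against neighboring stations, but the principle is the same: repair the record before computing trends from it.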
Climate Central imports data from historical government records to produce highly localized graphics for about 150 local TV weather forecasters across the U.S., illustrating climate change in each station's particular area. For example, "Junes in Toledo are getting hotter," Wiles said. "We use these data all the time to try to localize the climate change story so people can understand it."
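The statistic behind a graphic like "Junes in Toledo are getting hotter" is typically a simple trend: an ordinary least-squares slope over one month's historical means. A minimal sketch, with invented data points rather than actual Toledo records:

```python
# Least-squares trend of one month's mean temperatures, scaled to
# degrees per decade. The June means below are invented.

def trend_per_decade(years, temps_f):
    """Slope of the ordinary least-squares fit, in degrees F per decade."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(temps_f) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, temps_f))
    den = sum((x - mean_x) ** 2 for x in years)
    return 10 * num / den          # slope is per year; scale to decade

june_years = [1970, 1985, 2000, 2014]
june_means = [70.1, 70.9, 71.6, 72.4]   # invented June means (F)
print(round(trend_per_decade(june_years, june_means), 2))  # -> 0.52
```

Run per station area, a computation like this turns a national dataset into the local headline a TV forecaster can actually use.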
'One million hours of computation'

Though the Climate Central map is an effective tool for illustrating the problem of rising sea levels, big data technology is also helping researchers model, analyze, and predict the effects of climate change.
"Our goal is to turbo-charge the best science on massive data to create novel insights and drive action," said Rebecca Moore, engineering manager for Google Earth Engine. Google Earth Engine aims to bring together the world's satellite imagery—trillions of scientific measurements dating back almost 40 years—and make it available online along with tools for researchers.

Global deforestation, for example, "is a significant contributor to climate change, and until recently you could not find a detailed current map of the state of the world's forests anywhere," Moore said. That changed last November when Science magazine published the first high-resolution maps of global forest change from 2000 to 2012, powered by Google Earth Engine.
"We ran forest-mapping algorithms developed by Professor Matt Hansen of the University of Maryland on almost 700,000 Landsat satellite images—a total of 20 trillion pixels," she said. "It required more than one million hours of computation, but because we ran the analysis on 10,000 computers in parallel, Earth Engine was able to produce the results in a matter of days."

On a single computer, that analysis would have taken more than 15 years. Anyone in the world can view the resulting interactive global map on a PC or mobile device.
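Moore's numbers are easy to sanity-check. Spreading roughly one million CPU-hours across 10,000 machines leaves about 100 hours of wall-clock work each, which squares with "a matter of days"; run serially, the same load is on the order of a century of core-time, consistent with "more than 15 years" on a single computer with several cores.

```python
# Back-of-the-envelope check on the Earth Engine figures quoted above.

total_cpu_hours = 1_000_000
machines = 10_000

hours_per_machine = total_cpu_hours / machines       # 100.0 hours each
days_parallel = hours_per_machine / 24               # wall-clock days
core_years_serial = total_cpu_hours / 24 / 365       # one core, no parallelism

print(round(days_parallel, 1))      # -> 4.2  ("a matter of days")
print(round(core_years_serial, 1))  # -> 114.2 core-years of serial work
```

The point of the arithmetic: the analysis itself didn't get cheaper, it got wider, which is exactly what platforms like Earth Engine sell.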
'We have sensors everywhere'

Rapidly propelling such developments, meanwhile, is the fact that data is being collected today on a larger scale than ever before.

"Big data in climate first means that we have sensors everywhere: in space, looking down via remote sensing satellites, and on the ground," said Kirk Borne, a data scientist and professor at George Mason University. Those sensors are continually recording information about weather, land use, vegetation, oceans, ice cover, precipitation, drought, water quality, and many more variables, he said. They are also tracking correlations between datasets: biodiversity changes, invasive species, and at-risk species, for example.

Two large monitoring projects of this kind are NEON, the National Ecological Observatory Network, and OOI, the Ocean Observatories Initiative.
"All of these sensors also deliver a vast increase in the rate and the number of climate-related parameters that we are now measuring, monitoring, and tracking," Borne said. "These data give us increasingly deeper and broader coverage of climate change, both temporally and geospatially."

Climate change is one of the largest examples of scientific modeling and simulation, Borne said. Efforts are focused not on tomorrow's weather but on decades and centuries into the future.
"Huge climate simulations are now run daily, if not more frequently," he said. These simulations have increasingly higher horizontal spatial resolution—grid cells tens of kilometers across, versus hundreds of kilometers in older simulations; higher vertical resolution, referring to the number of atmospheric layers that can be modeled; and higher temporal resolution—zeroing in on minutes or hours as opposed to days or weeks, he added.
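Those resolution gains compound quickly, which is where the data volumes come from. A rough sketch of the horizontal dimension alone, using illustrative grid spacings and ignoring the extra vertical layers and time steps:

```python
# Rough sketch of why finer grids explode data volume: shrinking the
# grid spacing by 8x multiplies the horizontal cell count by 64x.
# Spacings are illustrative, matching "hundreds" vs. "tens" of km.

EARTH_SURFACE_KM2 = 510_000_000   # approximate surface area of Earth

def horizontal_cells(grid_spacing_km):
    """Approximate number of square cells covering Earth's surface."""
    return EARTH_SURFACE_KM2 // (grid_spacing_km ** 2)

old_grid = horizontal_cells(200)  # ~12,750 cells at 200 km spacing
new_grid = horizontal_cells(25)   # ~816,000 cells at 25 km spacing

print(new_grid // old_grid)       # -> 64x more cells per layer
```

Multiply that factor again by more atmospheric layers and far more time steps per simulated day, and petabyte-scale daily output stops being surprising.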
The output of each daily simulation amounts to petabytes of data and requires an assortment of tools for storing, processing, analyzing, visualizing, and mining.

'All models are wrong, but some are useful'

Interpreting climate change data may be the most challenging part.

"When working with big data, it is easy to create a model that explains the correlations that we discover in our data," Borne said. "But we need to remember that correlation does not imply causation, and so we need to apply systematic scientific methodology."
It's also important to heed the maxim that "all models are wrong, but some are useful," Borne said, quoting statistician George Box. "This is especially critical for numerical computer simulations, where there are so many assumptions and 'parameterizations of our ignorance.'"
"What fixes that problem—and also addresses Box's warning—is data assimilation," Borne said, referring to the process by which "we incorporate the latest and greatest observational data into the current model of a real system in order to correct, adjust, and validate. Big data play a vital and essential role in climate prediction science by providing corrective actions through ongoing data assimilation."
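In its simplest scalar form, the assimilation step Borne describes blends a model forecast with a new observation, weighting each by its uncertainty. This is a one-variable, Kalman-style toy, not any production assimilation scheme:

```python
# Toy scalar data assimilation: nudge a model forecast toward the
# latest observation, weighted by how uncertain each one is.

def assimilate(forecast, forecast_var, obs, obs_var):
    """Blend a forecast with an observation; return the updated state
    and its (reduced) variance. All quantities are scalars."""
    gain = forecast_var / (forecast_var + obs_var)  # trust in the obs
    updated = forecast + gain * (obs - forecast)
    updated_var = (1 - gain) * forecast_var         # uncertainty shrinks
    return updated, updated_var

# Model says 15.0 C but is uncertain; a sensor reads 14.2 C and is
# trusted more, so the updated state moves most of the way toward it.
state, var = assimilate(15.0, forecast_var=4.0, obs=14.2, obs_var=1.0)
print(round(state, 2), round(var, 2))   # -> 14.36 0.8
```

Repeating this update as each new observation arrives is the "ongoing" part: the model is continuously corrected rather than left to drift.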
'We are in a data revolution'

Earlier this year, the Obama administration launched Climate.data.gov with more than 100 curated, high-quality data sets, Web services, and tools that can be used by anyone to help prepare for the effects of climate change. At the same time, NASA invited citizens to help find solutions to the coastal flooding challenge at an April mass-collaboration event.

More recently, UN Global Pulse launched a Big Data Climate Challenge to crowdsource projects that use big data to address the economic dimensions of climate change.
"We've already received submissions from 20 countries in energy, smart cities, forestry and agriculture," said Miguel Luengo-Oroz, chief scientist for Global Pulse, which focuses on relief and development efforts around the world. "We also hope to see submissions from fields such as architecture, green data centers, risk management and material sciences."

Big data can allow for more efficient responses to emerging crises, distributed access to knowledge, and greater understanding of the effects personal and policy decisions have on the planet's climate, Luengo-Oroz added.

"But it's not the data that will save us," he said. "It's the analysis and usage of the data that can help us make better decisions for climate action. Just like with climate change, it is no longer a question of, 'is this happening?' We are in a data revolution."