• Report: Global Catastrophic Risks 2017

    published: 25 May 2017
  • How to manage global catastrophic risk

    published: 15 Feb 2017
  • Nick Bostrom - Introduction to the Global Catastrophes Risk Conference 2008

    published: 22 Feb 2015
  • Is Computer Superintelligence an Existential Risk?

    published: 08 May 2016
  • Global catastrophic risk - Video Learning - WizScience.com

    A "global catastrophic risk" is a hypothetical future event with the potential to seriously damage human well-being on a global scale. Some events could destroy or cripple modern civilization. Other, even more severe events could cause human extinction, any of which can be referred to as an "existential risk". Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, nuclear warfare, and pandemics. Researchers experience difficulty in studying human extinction directly, since humanity has never been destroyed before. While this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias. The philosopher Nick Bostrom classifies risks ac...

    published: 10 Sep 2015
  • Jamais Cascio - Global Catastrophic Risks

    published: 07 Apr 2016
  • David Pearce - Global Catastrophic & Existential Risk - Sleepwalking into the Abyss

    published: 05 Oct 2014
  • Security in an Age of Catastrophic Risk

    published: 05 May 2015
  • Global Catastrophic Risks

    published: 15 Apr 2017
  • Global catastrophic risk

    published: 28 Aug 2016
  • Global catastrophic risks

    published: 23 May 2017
  • 25 Catastrophic Disasters Scientists Believe Could Affect Our World

    published: 20 Mar 2015
  • Global catastrophic risk

    published: 06 Jan 2016
  • Wikipedia Global Catastrophic Risk

    published: 21 Mar 2017
  • Storm Force: Global Catastrophe (Full Documentary)

    published: 02 Aug 2017
  • Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens

    published: 06 Apr 2015
Report: Global Catastrophic Risks 2017

  • Duration: 8:01
  • Updated: 25 May 2017
  • views: 1962
'Whether it’s the spectre of nuclear conflict or our planet tipping into catastrophic climate change, the need for effective global cooperation has never been greater.' http://www.independent.co.uk/news/uk/home-news/climate-change-global-warming-nuclear-war-asteroid-pandemic-volcano-global-catastrophe-a7752171.html
The Global Challenges Foundation’s annual report “Global Catastrophic Risks 2017” is based on the latest scientific research. It contains contributions from leading experts and summarizes the current status of global efforts to manage catastrophic risks. The GCF has also commissioned an international survey in which 8,000 members of the general public in eight countries gave their views on global risks and how to handle them. https://www.globalchallenges.org/en
United Nations footage with the UN's Peter Thomson, President of the General Assembly: http://www.un.org/apps/news/story.asp?NewsID=56405
Leonardo DiCaprio at the United Nations Climate Summit 2014 (480p): https://www.youtube.com/watch?v=Nt3Tz3cpIpI
Nuclear test footage from Castle Bravo: https://en.wikipedia.org/wiki/Castle_Bravo
Additional video footage via http://pixabay.com
Sound effects by http://Soundmorph.com and http://EpicStockMedia.com
This video was made possible with support from the Environmental Coffee House: https://www.facebook.com/environmentalcoffeehouse
Support my future video productions: https://www.patreon.com/ClimateState
https://wn.com/Report_Global_Catastrophic_Risks_2017
How to manage global catastrophic risk

  • Duration: 1:25:43
  • Updated: 15 Feb 2017
  • views: 933
On February 15, Governance Studies at Brookings hosted an event to discuss the management of global catastrophic risk. For decades, international organizations such as the United Nations, the International Monetary Fund, and the World Bank have helped national, regional, and global leaders tackle these challenges. However, many believe that new approaches and fresh thinking are needed in the global governance arena. https://www.brookings.edu/events/how-to-manage-global-catastrophic-risk/ (transcript available)
Subscribe! http://www.youtube.com/subscription_center?add_user=BrookingsInstitution
Follow Brookings on social media!
Facebook: http://www.Facebook.com/Brookings
Twitter: http://www.twitter.com/BrookingsInst
Instagram: http://www.Instagram.com/brookingsinst
LinkedIn: http://www.linkedin.com/com/company/the-brookings-institution
https://wn.com/How_To_Manage_Global_Catastrophic_Risk
Nick Bostrom - Introduction to the Global Catastrophes Risk Conference 2008

  • Duration: 15:39
  • Updated: 22 Feb 2015
  • views: 427
Nick Bostrom provides an introduction to the Global Catastrophic Risks Conference and briefly addresses some of the key themes running through it. The talk is hard to find; a download exists here: http://podcasts.ox.ac.uk/introduction-global-catastrophes-risk-conference-2008
Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture
Science, Technology & the Future: http://scifuture.org
Humanity+: http://humanityplus.org
https://wn.com/Nick_Bostrom_Introduction_To_The_Global_Catastrophes_Risk_Conference_2008
Is Computer Superintelligence an Existential Risk?

  • Duration: 54:36
  • Updated: 08 May 2016
  • views: 556
This documentary explores the risk that Artificial Intelligence could pose an existential threat to humanity.
https://wn.com/Is_Computer_Superintelligence_An_Existential_Risk
Global catastrophic risk - Video Learning - WizScience.com

  • Duration: 2:59
  • Updated: 10 Sep 2015
  • views: 86
A "global catastrophic risk" is a hypothetical future event with the potential to seriously damage human well-being on a global scale. Some events could destroy or cripple modern civilization. Other, even more severe events could cause human extinction, any of which can be referred to as an "existential risk". Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, nuclear warfare, and pandemics. Researchers experience difficulty in studying human extinction directly, since humanity has never been destroyed before. While this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias. The philosopher Nick Bostrom classifies risks according to their scope and intensity. He considers risks that are at least "global" in scope and "endurable" in intensity to be global catastrophic risks. Those that are at least "trans-generational" in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant. Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate. He thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed. The most likely causes of this, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures. For example, if a single mind enhances its powers by merging with a computer, it could dominate human civilization, which could be bad. Bostrom believes that this scenario is most likely, followed by flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or current values. He thinks the most likely cause would be evolution changing moral preference, followed by extraterrestrial invasion. Wiz Science™ is "the" learning channel for children and all ages. SUBSCRIBE TODAY Disclaimer: This video is for your information only. The author or publisher does not guarantee the accuracy of the content presented in this video. USE AT YOUR OWN RISK. Background Music: "The Place Inside" by Silent Partner (royalty-free) from YouTube Audio Library. This video uses material/images from https://en.wikipedia.org/wiki/Global+catastrophic+risk, which is released under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . This video is licensed under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . To reuse/adapt the content in your own work, you must comply with the license terms. Wiz Science™ is "the" learning channel for children and all ages. SUBSCRIBE TODAY Disclaimer: This video is for your information only. The author or publisher does not guarantee the accuracy of the content presented in this video. USE AT YOUR OWN RISK. 
Background Music: "The Place Inside" by Silent Partner (royalty-free) from YouTube Audio Library. This video uses material/images from https://en.wikipedia.org/wiki/Global+catastrophic+risk, which is released under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . This video is licensed under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . To reuse/adapt the content in your own work, you must comply with the license terms.
https://wn.com/Global_Catastrophic_Risk_Video_Learning_Wizscience.Com
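
The scope/intensity scheme in the description above is essentially a two-dimensional lookup. The following is a minimal sketch of that classification, not code from the video or from Bostrom's own work; the ordered scales and the example calls are assumptions chosen purely for illustration.

```python
# Minimal sketch (illustrative only) of the scope/intensity classification
# described above. The ordered scales below are assumptions, not Bostrom's exact lists.

SCOPES = ["personal", "local", "global", "trans-generational"]
INTENSITIES = ["imperceptible", "endurable", "terminal"]

def classify(scope: str, intensity: str) -> str:
    """Label a risk by a Bostrom-style scope and intensity."""
    s = SCOPES.index(scope)
    i = INTENSITIES.index(intensity)
    if s >= SCOPES.index("trans-generational") and i >= INTENSITIES.index("terminal"):
        return "existential risk"
    if s >= SCOPES.index("global") and i >= INTENSITIES.index("endurable"):
        return "global catastrophic risk"
    return "lesser risk"

print(classify("global", "endurable"))             # global catastrophic risk
print(classify("trans-generational", "terminal"))  # existential risk
print(classify("local", "terminal"))               # lesser risk
```
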
Jamais Cascio - Global Catastrophic Risks

  • Duration: 5:23
  • Updated: 07 Apr 2016
  • views: 153
Jamais Cascio, who blogs at Open the Future, discusses Global Catastrophic Risks: http://www.scifuture.org/?p=3711
We are surrounded by catastrophic existential risks – you know, personally, societally, civilizationally. The intriguing thing about them is that the chance of any one of them happening is extremely slim. So: very low likelihood, very significant results. Yet there is a non-zero chance that at the end of this sentence a meteor will come down and strike me in the head. It didn’t happen, but it could – there is no physical reason why it couldn’t, and given enough time eventually something will happen. So we are dealing with existential risks – catastrophic, globally and civilizationally catastrophic risks – all the time, and it’s easy to ignore them. The problem – the dilemma – is when you have a slight uptick in the likelihood of a catastrophic risk. NASA has a risk scale for the likelihood of an asteroid impact and the damage that it would produce – we’ve never gotten above essentially a level 1 risk, out of 10, at least since they started taking measurements. There have been a couple of times when the likelihood of this happening got higher – I think the highest we ever got was a temporary 4 – but we're pretty sure that that level of risk isn’t going to be maintained. So we have these metrics for deciding that these risks are plausible – but how can we contextualize them? We do that by telling stories. We deal with catastrophic risks by creating mythologies – and "mythologies" here means making movies, writing novels, engaging in speculative conversations, and playing with toys. We craft the mythologies as a way of understanding how these catastrophic events could play out and, more importantly, how humans respond to catastrophe. So, recently there was a movie called “San Andreas” starring ‘The Rock’ – scientifically terrible – but ultimately it was a story about ‘how do humans respond to seeing each other in mortal terror, in mortal peril?’ and ‘how do we try to help each other?’ – and that, I think, becomes a really important ‘pedagogy of catastrophe’. It’s not about understanding the details of every possible doomsday scenario – it’s about understanding what our options are for helping each other afterwards, or for helping each other avoid the catastrophe. And I think all too often, especially in the world of science/sci-fi/foresight, there’s a kind of dismissal of those kinds of ‘soft narratives’: “they’re not scientifically accurate, therefore we can ignore them”. But I think we ignore them at our peril – because those are the stories that we, as a human society, viscerally respond to. We are driven by emotion, we are driven by empathy – and intelligence is a way of contextualizing why we feel things, what our relationship is to the things we have emotional connections with. Our intelligence allows us to continue to have and maintain persistent emotional connections.
Cascio gave the closing talk at GCR08, a Mountain View conference on Global Catastrophic Risks. Titled "Uncertainty, Complexity and Taking Action," the discussion focused on the challenges inherent in planning against future disasters emerging as the result of global-scale change: https://vimeo.com/2712394 http://ieet.org/index.php/IEET/eventinfo/ieet20081114/
Many thanks for watching!
- Support me via Patreon: https://www.patreon.com/scifuture
- Please Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture
- Science, Technology & the Future website: http://scifuture.org
https://wn.com/Jamais_Cascio_Global_Catastrophic_Risks
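
Cascio's point that a vanishingly unlikely event still becomes likely "given enough time" is just the arithmetic of repeated chances: if a catastrophe has a small annual probability p, the chance of at least one occurrence over n years is 1 - (1 - p)^n. A minimal sketch, using an entirely made-up annual probability:

```python
# Chance of at least one occurrence of a rare event over n years, assuming a
# constant annual probability and independent years (both simplifying assumptions).

def prob_at_least_one(annual_p: float, years: int) -> float:
    return 1.0 - (1.0 - annual_p) ** years

annual_p = 1e-4  # hypothetical 0.01% chance per year, purely illustrative
for horizon in (10, 100, 1_000, 10_000):
    print(f"{horizon:>6} years: {prob_at_least_one(annual_p, horizon):.2%}")
```

Over 10 years the risk stays negligible; over 10,000 years it climbs past 60%, which is the "given enough time" intuition made explicit.
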
David Pearce - Global Catastrophic & Existential Risk - Sleepwalking into the Abyss

  • Duration: 21:37
  • Updated: 05 Oct 2014
  • views: 815
Existential risk? I think the greatest underlying source of existential and global catastrophic risk lies in male human primates doing what evolution "designed" male human primates to do, namely wage war (cf. http://ieet.org/index.php/IEET/more/4576 ). Unfortunately, we now have thermonuclear weapons with which to do so.
1) Does the study of existential risk (ER) research diminish or enhance ER? One man's risk is another man's opportunity.
2) Is the existence of suffering itself a form of ER insofar as it increases the likelihood of intelligent agency pressing a global OFF button, cleanly or otherwise? If I focussed on ER research, phasing out suffering would be high on the To Do list.
AGI? Well, I'd argue it's a form of anthropomorphic projection on our part to ascribe intelligence or mind to digital computers. Believers in digital sentience, let alone digital (super)intelligence, need to explain Moravec's paradox (cf. http://en.wikipedia.org/wiki/Moravec's_paradox). For sure, digital computers can be used to model everything from the weather to the Big Bang to thermonuclear reactions. Yet why is, say, a bumble bee more successful in navigating its environment in open-field contexts than the most advanced artificial robot the Pentagon can build today? The success of biological lifeforms since the Cambrian Explosion has turned on the computational capacity of organic robots to solve the binding problem (http://tracker.preterhuman.net/texts/body_and_health/Neurology/Binding.pdf) and generate cross-modally matched, real-time simulations of the mind-independent world. On theoretical grounds, I predict digital computers will never be capable of generating unitary phenomenal minds, unitary selves or unitary virtual worlds. In short, classical digital computers are invincibly ignorant zombies (cf. http://ieet.org/index.php/IEET/more/pearce20120510). They can never "wake up" and explore the manifold varieties of sentience.
So why support initiatives to reduce existential and global catastrophic risk? Such advocacy might seem especially paradoxical if you're inclined to believe (as I am) that Hubble volumes where primordial information-bearing self-replicators arise more than once are vanishingly rare – and therefore cosmic rescue missions may be infeasible. Suffering sentience may exist in terrible abundance beyond our cosmological horizon and in googols of other Everett branches. But on current understanding, it's hard to see how rational agency can do anything about it.
The bad news? I fear we're sleepwalking towards the abyss. Some of the trillions of dollars' worth of weaponry we're stockpiling, designed to kill and maim rival humans, will be used in armed conflict between nation states. Tens of millions and possibly hundreds of millions of people may perish in thermonuclear war. Multiple possible flash-points exist. I don't know if global catastrophe can be averted. For evolutionary reasons, male humans are biologically primed for competition and violence. Perhaps the least sociologically implausible prevention measure would be a voluntary transfer of the monopoly on violence currently claimed by state actors to the United Nations. But I wouldn't count on any such transfer of power this side of Armageddon. http://www.hedweb.com/social-media/pre2014.html
Does the study of existential and global catastrophic risk increase or decrease the likelihood of catastrophe? The issue is complicated by the divergent senses that researchers attach to the term "existential risk": http://www.abolitionist.com/anti-natalism.html
An important strand of Bostrom’s research concerns the future of humanity and long-term outcomes. He introduced the concept of an existential risk, which he defines as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume "Global Catastrophic Risks", editors Bostrom and Cirkovic characterize the relation between existential risk and the broader class of global catastrophic risks, and link existential risk to observer selection effects[8] and the Fermi paradox.[9] In a 2013 paper in the journal Global Policy, Bostrom offers a taxonomy of existential risk and proposes a reconceptualization of sustainability in dynamic terms, as a developmental trajectory that minimizes existential risk. Bostrom has argued that, from a consequentialist perspective, even small reductions in the cumulative amount of existential risk that humanity will face are extremely valuable, to the point where the traditional utilitarian imperative (maximize expected utility) can be simplified to the Maxipok principle: maximize the probability of an OK outcome, where an OK outcome is any that avoids existential catastrophe.
Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture
Science, Technology & the Future: http://scifuture.org
Facebook group on Existential Risk: https://www.facebook.com/groups/ExistentialRisk
https://wn.com/David_Pearce_Global_Catastrophic_Existential_Risk_Sleepwalking_Into_The_Abyss
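
The Maxipok principle quoted above replaces full expected-utility maximization with a simpler rule: choose the action that maximizes the probability of an OK outcome. A toy comparison of the two decision rules, with invented actions, probabilities, and utilities, shows how they can disagree:

```python
# Toy comparison of expected-utility maximization vs. the Maxipok rule described
# above. Actions, probabilities, and utilities are invented for illustration.

actions = {
    # name: (P(OK outcome), utility if OK, utility if existential catastrophe)
    "cautious development": (0.99, 100.0, 0.0),
    "reckless development": (0.90, 150.0, 0.0),
}

def expected_utility(p_ok: float, u_ok: float, u_bad: float) -> float:
    return p_ok * u_ok + (1.0 - p_ok) * u_bad

eu_choice = max(actions, key=lambda a: expected_utility(*actions[a]))
maxipok_choice = max(actions, key=lambda a: actions[a][0])  # maximize P(OK)

print("Expected-utility choice:", eu_choice)       # reckless development (135 vs 99)
print("Maxipok choice:         ", maxipok_choice)  # cautious development (0.99 vs 0.90)
```

With these made-up numbers the naive expected-utility calculation favours the riskier action, while Maxipok favours the one that best avoids existential catastrophe, which is the simplification Bostrom argues for.
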
Security in an Age of Catastrophic Risk

  • Duration: 46:46
  • Updated: 05 May 2015
  • views: 5096
Bruce Schneier, Chief Technology Officer, Resilient Systems
In cyberspace and out, we're increasingly confronting extremely-low-probability, extremely-high-damage attacks. Protecting against these sorts of risks requires a new way of thinking about security, one that emphasizes agility and resilience while avoiding worst-case thinking.
https://wn.com/Security_In_An_Age_Of_Catastrophic_Risk
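
Schneier's "extremely-low-probability, extremely-high-damage" framing is, at bottom, a statement about expected loss (probability times damage) in the tail. A small sketch with made-up figures shows how a rare catastrophic attack can carry an expected annual loss comparable to a routine risk even though it almost never happens:

```python
# Expected annual loss = probability x damage, for a routine risk vs. a tail risk.
# All figures are invented for illustration.

risks = {
    "routine breach":      (0.30, 1e6),   # 30% chance per year, ~$1M damage
    "catastrophic attack": (1e-4, 5e9),   # 0.01% chance per year, ~$5B damage
}

for name, (probability, damage) in risks.items():
    print(f"{name:20s} expected annual loss: ${probability * damage:,.0f}")

# Output: ~$300,000 vs ~$500,000 -- comparable expected losses, even though the
# tail event almost never happens in any given year.
```
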
Global Catastrophic Risks

  • Duration: 0:37
  • Updated: 15 Apr 2017
  • views: 1
https://wn.com/Global_Catastrophic_Risks
Global catastrophic risk

  • Duration: 19:05
  • Updated: 28 Aug 2016
  • views: 12
A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk. Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war and pandemics.
---Image-Copyright-and-Permission---
About the author(s): The original uploader was Fredrik at English Wikipedia
License: Public domain
Author(s): Fredrik
---Image-Copyright-and-Permission---
This channel is dedicated to making Wikipedia, one of the biggest knowledge databases in the world, available to people with limited vision.
Article available under a Creative Commons license. Image source in video.
https://wn.com/Global_Catastrophic_Risk
Global catastrophic risks

  • Duration: 1:46
  • Updated: 23 May 2017
  • views: 5
https://wn.com/Global_Catastrophic_Risks
25 Catastrophic Disasters Scientists Believe Could Affect Our World

  • Duration: 8:44
  • Updated: 20 Mar 2015
  • views: 167776
The instinct to ensure the survival of mankind and civilization has kept us alive for thousands of years. However, during the past few decades the scientific community has become more aware than ever before of global catastrophic risks (GCR): the risk of events large enough to significantly harm or even destroy human civilization on a global scale. Here are 25 of the most catastrophic disasters that humanity might have to deal with in the near or distant future.
Follow us on:
Twitter: https://twitter.com/list25
Facebook: https://www.facebook.com/list25
Website: http://list25.com
Instagram: https://instagram.com/list25/
Pinterest: https://www.pinterest.com/list25/
Check out the physical list at http://list25.com/25-catastrophic-disasters-scientists-believe-could-affect-our-world/
Preview:
The era of black holes
End-time
A global oppressive government
Grey goo
Gamma-ray burst
The extinction risk of climate change
Coronal mass ejection
Big Rip
Big Crunch
Genetic pollution
and more...
Lists featured on this post:
25 Coolest NASA Discoveries That Changed Your Life - https://www.youtube.com/watch?v=yee7T2fJfQQ&index=23&list=PL3Ikn3SKdJHh3ydUUZpusjBVgt3PTPmeH
25 Humorous And Unconventional Units Of Measurement - https://www.youtube.com/watch?v=50mjhfAcO9o&list=PL3Ikn3SKdJHh3ydUUZpusjBVgt3PTPmeH&index=67
https://wn.com/25_Catastrophic_Disasters_Scientists_Believe_Could_Affect_Our_World
Global catastrophic risk

  • Duration: 17:20
  • Updated: 06 Jan 2016
  • views: 8
Global catastrophic risk
License: Creative Commons Attribution-Share Alike 3.0 (CC BY-SA 3.0)
License link: http://creativecommons.org/licenses/by-sa/3.0
Author: Wrev
Image source: https://en.wikipedia.org/wiki/File:X-risk-chart-en-01a.svg
☆ Video is targeted to blind users
Attribution: Article text available under CC-BY-SA; image source in video.
https://wn.com/Global_Catastrophic_Risk
Wikipedia Global Catastrophic Risk

  • Duration: 59:22
  • Updated: 21 Mar 2017
  • views: 0
https://wn.com/Wikipedia_Global_Catastrophic_Risk
Storm Force: Global Catastrophe (Full Documentary)

  • Duration: 52:12
  • Updated: 02 Aug 2017
  • views: 51
A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction or permanently and drastically curtail humanity's potential is known as an existential risk.
Potential global catastrophic risks include anthropogenic risks (technology risks, governance risks) and natural or external risks. Examples of technology risks are hostile artificial intelligence, biotechnology risks, or nanotechnology weapons. Insufficient global governance creates risks in the social and political domain (potentially leading to a global war with or without a nuclear holocaust, bioterrorism using genetically modified organisms, cyberterrorism destroying critical infrastructures like the electrical grid, or the failure to manage a natural pandemic) as well as problems and risks in the domain of earth system governance (with risks resulting from global warming, environmental degradation, famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture). Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying all electronic equipment, natural long-term climate change, or extraterrestrial life impacting life on Earth.
https://wn.com/Storm_Force_Global_Catastrophe_(Full_Documentary)
Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens

  • Duration: 18:10
  • Updated: 06 Apr 2015
  • views: 3423
What are existential risks? They are the risks that threaten the very survival of the human species, or those that could dramatically curtail its potential. There are many, from asteroid impacts to engineered pandemics to artificial intelligence (AI), and they are almost all understudied. AI risk is the least understood, but potentially the deadliest of all, as AIs could be extremely powerful agents with insufficiently safe motivations and goals. The problem is very difficult, philosophically and programmatically. If these obstacles are overcome, however, humanity can expect to look forward to a world of dramatic abundance of health, wealth, and happiness.
Stuart Armstrong’s research at the Future of Humanity Institute centres on formal decision theory, the risks and possibilities of Artificial Intelligence, the long-term potential for intelligent life, and anthropic (self-locating) probability. He is particularly interested in finding decision processes that give the “correct” answer under situations of anthropic ignorance and ignorance of one’s own utility function, ways of mapping humanity’s partially defined values onto an artificial entity, and the interaction between various existential risks. He aims to improve the understanding of the different types and natures of uncertainties surrounding human progress in the mid-to-far future.
This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
https://wn.com/Existential_Risks_And_Extreme_Opportunities_|_Stuart_Armstrong_|_Tedxathens