• Report: Global Catastrophic Risks 2017

    'Whether it’s the spectre of nuclear conflict or our planet tipping into catastrophic climate change, the need for effective global cooperation has never been greater' http://www.independent.co.uk/news/uk/home-news/climate-change-global-warming-nuclear-war-asteroid-pandemic-volcano-global-catastrophe-a7752171.html The Global Challenges Foundation’s annual report “Global Catastrophic Risks 2017” is based on the latest scientific research. It contains contributions from leading experts and summarizes the current status of global efforts to manage catastrophic risks. The GCF has also commissioned an international survey in which 8,000 members of the general public in eight countries gave their views on global risks and how to handle them. https://www.globalchallenges.org/en United Nations footage with the UN's Peter Thomson, President of the General Assembly http://www.un.org/apps/news/story.asp?NewsID=56405 Leonardo DiCaprio at the United Nations Climate Summit 2014 (480p) https://www.youtube.com/watch?v=Nt3Tz3cpIpI Nuclear test footage from Castle Bravo https://en.wikipedia.org/wiki/Castle_Bravo Additional video footage via http://pixabay.com Sound effects by http://Soundmorph.com and http://EpicStockMedia.com This video was made possible with support from the Environmental Coffee House https://www.facebook.com/environmentalcoffeehouse Support my future video productions https://www.patreon.com/ClimateState

    published: 25 May 2017
  • How to manage global catastrophic risk

    On February 15, Governance Studies at Brookings hosted an event to discuss the management of global catastrophic risk. For decades, international organizations such as the United Nations, the International Monetary Fund, and the World Bank have helped national, regional, and global leaders tackle these challenges. However, many believe that new approaches and fresh thinking are needed in the global governance arena. https://www.brookings.edu/events/how-to-manage-global-catastrophic-risk/ (transcript available) Subscribe! http://www.youtube.com/subscription_center?add_user=BrookingsInstitution Follow Brookings on social media! Facebook: http://www.Facebook.com/Brookings Twitter: http://www.twitter.com/BrookingsInst Instagram: http://www.Instagram.com/brookingsinst LinkedIn: http://www.linkedin.com/com/company/the-brookings-institution

    published: 15 Feb 2017
  • Top 10 Global Catastrophes That Could Happen Today

    If you can't understand what we are saying, firstly, sorry! Secondly, turn on the captions button at the right-hand side of the video. :D The world around us is ready to crumble at any minute, from the solar flares threatening our satellites to the rumbling earthquake set to rip apart the United States. In this special episode of AllTime 10s we bring you the Top 10 Global Catastrophes That Could Happen Tomorrow. Click to Subscribe: http://bit.ly/WTVC4x Check out the best of Alltime10s - https://www.youtube.com/playlist?list=PLec1lxRhYOzt2qqqnFBIpUm63wr5yhLF6 Where else to find All Time 10s... Facebook: http://ow.ly/3FNFR Twitter: http://ow.ly/3FNMk

    published: 27 Oct 2015
  • Nick Bostrom - Introduction to the Global Catastrophes Risk Conference 2008

    Nick Bostrom provides an introduction to the Global Catastrophic Risks Conference and briefly addresses some of the key themes running through it. The original is hard to find; a download exists here: http://podcasts.ox.ac.uk/introduction-global-catastrophes-risk-conference-2008 Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture Science, Technology & the Future: http://scifuture.org Humanity+: http://humanityplus.org

    published: 22 Feb 2015
  • Jamais Cascio - Global Catastrophic Risks

    Jamais Cascio, who blogs at Open the Future, discusses Global Catastrophic Risks: http://www.scifuture.org/?p=3711 We are surrounded by catastrophic existential risks – you know, personally, societally, civilizationally. The intriguing thing about them is that the chance of any one of them happening is extremely slim. So very low likelihood, very significant results. Yet there is a non-zero chance that at the end of this sentence a meteor will come down and strike me in the head. It didn’t happen, but it could – there is no physical reason why it wouldn’t, and given enough time eventually something will happen. So we are dealing with existential risks – catastrophic, globally and civilizationally catastrophic risks – all the time, and it’s easy to ignore them. The problem – the dilemma – is when you have a slight uptick in the likelihood of a catastrophic risk. NASA has a risk scale for the likelihood of an asteroid impact and the damage it would produce – we’ve never gotten above essentially a level 1 risk – out of 10 – at least since they started taking measurements. There have been a couple of times when the likelihood got higher – I think the highest we ever got was a temporary 4 – but we're pretty confident that this level of risk isn't going to be maintained. But we have these metrics for deciding – OK, these risks are plausible – how can we contextualize them? We do that by telling stories. We deal with catastrophic risks by creating mythologies – and mythologies here count as making movies, or writing novels, or engaging in speculative conversations, and playing with toys. We craft the mythologies as a way of understanding how these catastrophic events could play out and, more importantly, how humans respond to catastrophe.
    So, you know, recently there was a movie called “San Andreas” starring ‘The Rock’ – scientifically terrible – but ultimately it was a story about ‘how do humans respond to seeing each other in mortal terror? in mortal peril?’ and ‘how do we try to help each other?’ – and that, I think, becomes a really important ‘pedagogy of catastrophe’. It’s not about understanding the details of every possible doomsday scenario – it’s about understanding what our options are for helping each other afterwards, or for helping each other avoid the catastrophe. And I think all too often, especially in the world of science/sci-fi/foresight, there’s a kind of dismissal of those ‘soft narratives’: “they’re not scientifically accurate, therefore we can ignore them”. But I think we ignore them at our peril – because those are the stories that we as a human society viscerally respond to. We are driven by emotion, we are driven by empathy – and intelligence is a way of contextualizing why we feel things, what our relationship is to the things that we have emotional connections with. Our intelligence allows us to continue to have and maintain persistent emotional connections. Cascio gave the closing talk at GCR08, a Mountain View conference on Global Catastrophic Risks. Titled "Uncertainty, Complexity and Taking Action," the discussion focused on the challenges inherent in planning against future disasters emerging as the result of global-scale change: https://vimeo.com/2712394 http://ieet.org/index.php/IEET/eventinfo/ieet20081114/ Many thanks for watching! - Support me via Patreon: https://www.patreon.com/scifuture - Please Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture - Science, Technology & the Future website: http://scifuture.org

    published: 07 Apr 2016
  • #27 - Dr Tom Inglesby on careers and policies that reduce global catastrophic biological risks

    How about this for a movie idea: a main character has to prevent a new contagious strain of Ebola from spreading around the world. She’s the best of the best. So good, in fact, that her work on early detection systems contains the strain at its source. Ten minutes into the movie, we see the results of her work – nothing happens. Life goes on as usual. She continues to be amazingly competent, and nothing continues to go wrong. Fade to black. Roll credits. If your job is to prevent catastrophes, success is when nobody has to pay attention to you. But without regular disasters to remind authorities why they hired you in the first place, they can’t tell if you’re actually achieving anything. And when budgets come under pressure, you may find that success condemns you to the chopping block. Dr Tom Inglesby, Director of the Center for Health Security at the Johns Hopkins Bloomberg School of Public Health, worries this may be about to happen to the scientists working on the ‘Global Health Security Agenda’. In 2014, Ebola showed the world why we have to detect and contain new diseases before they spread, and that when it comes to contagious diseases the nations of the world sink or swim together. Fifty countries decided to work together to make sure all their health systems were up to the challenge. Back then, Congress provided 5 years’ funding to help some of the world’s poorest countries build the basic health security infrastructure necessary to control pathogens before they could reach the US. Links to learn more, job opportunities, and full transcript: https://80000hours.org/podcast/episodes/tom-inglesby-health-security/ But with Ebola fading from public memory and no recent tragedies to terrify us, Congress may not renew that funding and the project could fall apart. (Learn more about how you can help: http://www.nti.org/analysis/articles/protect-us-investments-global-health-security/ ) But there are positive signs as well: the center Inglesby leads recently received a $16 million grant from the Open Philanthropy Project to further its work preventing global catastrophes. It also runs the Emerging Leaders in Biosecurity Fellowship (http://www.centerforhealthsecurity.org/our-work/emergingbioleaders/) to train the next generation of biosecurity experts for the US government. And Inglesby regularly testifies to Congress on the threats we all face and how to address them. In this in-depth interview we try to provide concrete guidance for listeners who want to pursue a career in health security. Some of the topics we cover include: whether more people in medicine should work on security; the top jobs for people who want to improve health security, and how to work towards getting them; what people can do to protect funding for the Global Health Security Agenda; whether we should be more concerned about natural or human-caused pandemics, and which is more neglected; whether we should allocate more attention and resources to global catastrophic risk scenarios; why senior figures are reluctant to prioritize one project or area at the expense of another; what Tom thinks of the idea that in the medium term human-caused pandemics will pose a far greater risk than natural pandemics, and that we should therefore focus on specific counter-measures; whether the main risks and solutions are understood and it’s just a matter of implementation, or whether the principal task is to identify and understand them; how the current US government is performing in these areas; and which agencies are empowered to think about low-probability, high-magnitude events. The 80,000 Hours podcast is produced by Keiran Harris. Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type 80,000 Hours into your podcasting app.

    published: 18 Apr 2018
  • The Science of Global Catastrophe

    The NASA climatologist outlines how and when the accumulation of greenhouse gases will make Earth uninhabitable for our species—and why human life cannot be transferred to a different planet.

    published: 24 Apr 2012
  • Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens

    What are existential risks? They are the risks that threaten the very survival of the human species, or those that could dramatically curtail its potential. There are many, from asteroid impact, to engineered pandemic, to artificial intelligence (AI), and they are almost all understudied. AI risk is the least understood, but potentially the deadliest of all, as AIs could be extremely powerful agents with insufficiently safe motivations and goals. The problem is very difficult, philosophically and programmatically. If these obstacles are overcome, however, humanity can expect to look forward to a world of dramatic abundance of health, wealth, and happiness. Stuart Armstrong’s research at the Future of Humanity Institute centres on formal decision theory, the risks and possibilities of Artificial Intelligence, the long-term potential for intelligent life, and anthropic (self-locating) probability. He is particularly interested in finding decision processes that give the “correct” answer under situations of anthropic ignorance and ignorance of one’s own utility function, ways of mapping humanity’s partially defined values onto an artificial entity, and the interaction between various existential risks. He aims to improve the understanding of the different types and natures of uncertainties surrounding human progress in the mid-to-far future. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx

    published: 06 Apr 2015
  • Catastrophe - Episode 2 - Snowball Earth

    Subscribe to Naked Science - http://goo.gl/wpc2Q1 Every other Wednesday we present a new video, so join us to see the truth laid bare... This spectacular five-part documentary series, presented by Tony Robinson, investigates the history of natural disasters, from the planet's beginnings to the present, putting a new perspective on our existence and suggesting that we are the product of catastrophe. 99% of all the creatures that have ever lived no longer exist. They were wiped out in a series of global catastrophes. Each disaster changed the course of evolution on earth. Without them, neither mankind nor any of the life we see around us would be here today. For out of catastrophe comes rebirth. Evolution is a savage, imperfect and violent process. It's survive or perish. The earth's history of catastrophes has both moulded the planet and determined evolution, for each disaster led to another leap forward on the evolutionary trail, from single-celled bacteria to humankind itself. Episode 2 - Snowball Earth This programme delves into a world lying beneath a frozen surface. It is the greatest climate disaster ever to have hit Earth. 650 million years ago, a cataclysmic ice age sealed the entire planet beneath ice and snow, almost destroying life and turning the world into one huge snowball. Snowball Earth uncovers the story behind one of the most controversial theories in science today. To investigate, the programme travels the world to follow scientists scouring southern Australia, Nevada's Death Valley and Alaskan glaciers for tantalising clues as to how our planet ran away into this doomsday scenario. The results could improve understanding of evolution and the survival of life.

    published: 02 Apr 2014
  • Global catastrophic risk - Video Learning - WizScience.com

    A "global catastrophic risk" is a hypothetical future event with the potential to seriously damage human well-being on a global scale. Some events could destroy or cripple modern civilization. Other, even more severe events could cause human extinction; any of these can be referred to as an "existential risk". Potential global catastrophic risks include, but are not limited to, hostile artificial intelligence, nanotechnology weapons, nuclear warfare, and pandemics. Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before. While this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias. The philosopher Nick Bostrom classifies risks according to their scope and intensity. He considers risks that are at least "global" in scope and "endurable" in intensity to be global catastrophic risks. Those that are at least "trans-generational" in scope and "terminal" in intensity are classified as existential risks. While a global catastrophic risk may kill the vast majority of life on earth, humanity could still potentially recover. An existential risk, on the other hand, is one that either destroys humanity entirely or prevents any chance of civilization recovering. Bostrom considers existential risks to be far more significant. Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate. He thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed. The most likely causes of this, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures. For example, if a single mind enhanced its powers by merging with a computer, it could dominate human civilization, which could be bad. Bostrom believes this scenario is the most likely, followed by flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or of current values. He thinks the most likely cause would be evolution changing moral preferences, followed by extraterrestrial invasion. Wiz Science™ is "the" learning channel for children and all ages. SUBSCRIBE TODAY Disclaimer: This video is for your information only. The author or publisher does not guarantee the accuracy of the content presented in this video. USE AT YOUR OWN RISK. Background Music: "The Place Inside" by Silent Partner (royalty-free) from the YouTube Audio Library. This video uses material/images from https://en.wikipedia.org/wiki/Global+catastrophic+risk, which is released under the Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . This video is licensed under the Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . To reuse/adapt the content in your own work, you must comply with the license terms.

    published: 10 Sep 2015
  • Catastrophic Risk and Threats to the Global Commons (High-Res)

    published: 01 Apr 2012
  • David Pearce - Global Catastrophic & Existential Risk - Sleepwalking into the Abyss

    Existential risk? I think the greatest underlying source of existential and global catastrophic risk lies in male human primates doing what evolution "designed" male human primates to do, namely wage war. (cf. http://ieet.org/index.php/IEET/more/4576 ) Unfortunately, we now have thermonuclear weapons with which to do so. Does the study of ERR diminish or enhance ER? One man's risk is another man's opportunity. 2) Is the existence of suffering itself a form of ER, insofar as it increases the likelihood of intelligent agency pressing a global OFF button, cleanly or otherwise? If I focussed on ERR, phasing out suffering would be high on the To Do list. AGI? Well, I'd argue it's a form of anthropomorphic projection on our part to ascribe intelligence or mind to digital computers. Believers in digital sentie...

    published: 05 Oct 2014
  • Sam Harris on Artificial Intelligence & Existential Risks (Sept 12, 2017)

    Interview published September 12, 2017 Link to the full interview: https://after-on.com/episodes/006

    published: 13 Sep 2017
  • Max Tegmark: Effective altruism, existential risk & existential hope

    Saturday, Hall D, 11:10am - 11:40am

    published: 18 Jun 2017
  • [Wikipedia] Global catastrophic risk

    A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk. Potential global catastrophic risks include, but are not limited to, hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war, and pandemics. Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before. While this does not mean that it will not be in the future, it does make modelling existential risks difficult, due in part to survivorship bias. https://en.wikipedia.org/wiki/Global_catastrophic_risk

    published: 20 Feb 2017
  • Existential Risks in the Solar System - Professor Joseph Silk FRS

    There are strong reasons to believe that the survival of life on the Earth is under threat. Human activity is one example that we are able to control, at least in principle. We might irreversibly pollute, or destroy the planet with thermonuclear devices. Epidemics might become uncontrollable. An asteroid impact could devastate the Earth, although preventive measures might detect and monitor the orbits of potential killer asteroids. Longer term, the sun will evolve into a red giant and expand a hundredfold, out to the orbit of the earth. The earth will burn to a crisp, losing its atmosphere and oceans. By then, humanity, or whatever remains of it, should have found safer havens than the inner solar system. The transcript and downloadable versions of the lecture are available from the Gresham College website.

    published: 11 Apr 2018
  • Overpopulation - global catastrophe

    Overpopulation is a global problem that may result in bad consequences for our planet. This short video was one of my assignments from AP Human Geography class, when we studied the Theory of Malthus about overpopulation. All footage (except the Earth from space footage) was collected through my travelling trips to different countries, which helped me to shape the vision of this problem. Video & Montage by Samina M. Narration by Samina M.

    published: 04 Jun 2017
  • Existential Risk: Managing Extreme Technological Risk

    Of the 45 million centuries of the Earth’s history, this one is very special. It is the first century in which one species – ours – holds the future of the planet in its hands. Take a fast-paced tour through our work on biological risks, environmental risks, artificial intelligence risks and more with the Astronomer Royal, Lord Martin Rees; Bertrand Russell Professor of Philosophy, Huw Price; Executive Director of Cambridge University’s Centre for the Study of Existential Risk, Dr Sean O hEigeartaigh; and Academic Project Manager of the Centre, Dr Catherine Rhodes. This video (Stories of Impact) was created on behalf of the Templeton World Charity Foundation, Inc. The opinions expressed are those of the interviewees and do not necessarily reflect the views of Templeton World Charity Foundation, Inc.

    published: 21 Nov 2017
  • Global catastrophic risk

    A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk. Potential global catastrophic risks include, but are not limited to, hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war and pandemics. ---Image-Copyright-and-Permission--- About the author(s): The original uploader was Fredrik at English Wikipedia License: Public domain Author(s): Fredrik ---Image-Copyright-and-Permission--- This channel is dedicated to making Wikipedia, one of the biggest knowledge databases in the world, available to people with limited vision. Article available und...

    published: 28 Aug 2016
  • Global catastrophic risks

    Global catastrophic risk

    published: 23 May 2017
  • Global Catastrophic Risk Institute Top #7 Facts

    published: 03 Feb 2016
  • Classifying Global Catastrophic Risks | Shahar Avin | EAGxOxford 2016

    This talk is part of the EAGxOxford 2016 conference (www.eagxoxford.com) that took place at the Examination Schools in Oxford University. The recordings were produced by 'Speeding Films' and ‘The Hideout’.

    published: 15 Apr 2017
  • Global catastrophic risks and emerging technologies in Davos

    The Global Challenges Foundation visited the World Economic Forum in Davos, and organized a discussion on global catastrophic risks and emerging technologies. How can global governance be reformed to meet new challenges?

    published: 13 Apr 2018
  • The Doomsday Clock | Countdown to Global Catastrophe - Documentary Films

    The Doomsday Clock is a symbolic clock face, representing a countdown to possible global catastrophe (e.g. nuclear war or climate change). It has been maintained since 1947 by the members of the Science and Security Board of the Bulletin of the Atomic Scientists, who are in turn advised by the Governing Board and the Board of Sponsors, including 18 Nobel Laureates. The closer they set the Clock to midnight, the closer the scientists believe the world is to global disaster. Read More: https://en.wikipedia.org/wiki/Doomsday_Clock

    published: 28 Aug 2015
developed with YouTube
Report: Global Catastrophic Risks 2017
8:01

Report: Global Catastrophic Risks 2017

  • Order:
  • Duration: 8:01
  • Updated: 25 May 2017
  • views: 2454
videos
'Whether it’s the spectre of nuclear conflict or our planet tipping into catastrophic climate change, the need for effective global cooperation has never been greater' http://www.independent.co.uk/news/uk/home-news/climate-change-global-warming-nuclear-war-asteroid-pandemic-volcano-global-catastrophe-a7752171.html The Global Challenges Foundation’s annual report “Global Catastrophic Risks 2017” is based on the latest scientific research. It contains contributions from leading experts, and summarizes the current status of global efforts to manage catastrophic risks. The GCF has also commissioned an international survey, in which 8,000 members of the general public in eight countries have given their views on global risks and how to handle them. https://www.globalchallenges.org/en United Nations footage with UN's Peter Thomson, President of the General Assembly http://www.un.org/apps/news/story.asp?NewsID=56405 Leonardo DiCaprio at United Nations Climate Summit 2014 (480P) https://www.youtube.com/watch?v=Nt3Tz3cpIpI Nuclear test footage from Castle Bravo https://en.wikipedia.org/wiki/Castle_Bravo Additional video footage via http://pixabay.com Sound effects by http://Soundmorph.com and http://EpicStockMedia.com This video was possible with support from the Environmental Coffee House https://www.facebook.com/environmentalcoffeehouse Support my future video productions https://www.patreon.com/ClimateState
https://wn.com/Report_Global_Catastrophic_Risks_2017
How to manage global catastrophic risk
1:25:43

How to manage global catastrophic risk

  • Order:
  • Duration: 1:25:43
  • Updated: 15 Feb 2017
  • views: 1144
videos
On February 15, Governance Studies at Brookings hosted an event to discuss the management of global catastrophic risk. For decades, international organizations such as the United Nations, the International Monetary Fund, and the World Bank have helped national, regional, and global leaders tackle these challenges. However, many believe that new approaches and fresh thinking are needed in the global governance arena. https://www.brookings.edu/events/how-to-manage-global-catastrophic-risk/ (transcript available) Subscribe! http://www.youtube.com/subscription_center?add_user=BrookingsInstitution Follow Brookings on social media! Facebook: http://www.Facebook.com/Brookings Twitter: http://www.twitter.com/BrookingsInst Instagram: http://www.Instagram.com/brookingsinst LinkedIn: http://www.linkedin.com/com/company/the-brookings-institution
https://wn.com/How_To_Manage_Global_Catastrophic_Risk
Top 10 Global Catastrophes That Could Happen Today
13:19

Top 10 Global Catastrophes That Could Happen Today

  • Order:
  • Duration: 13:19
  • Updated: 27 Oct 2015
  • views: 785946
videos
If you cant understand what we are saying, firstly Sorry! Secondly turn on the Captions button right hand of the video.. :D The world around us is ready to crumble at any minute, from the solar flares threatening our satellites, to the rumbling earthquake set to rip apart the United States. In this special episode of AllTime 10s we bring you the Top 10 Global Catastrophes That Could Happen Tomorrow. Click to Subscribe.. http://bit.ly/WTVC4x Check out the best of Alltime10s - https://www.youtube.com/playlist?list=PLec1lxRhYOzt2qqqnFBIpUm63wr5yhLF6 Where else to find All Time 10s... Facebook: http://ow.ly/3FNFR Twitter: http://ow.ly/3FNMk
https://wn.com/Top_10_Global_Catastrophes_That_Could_Happen_Today
Nick Bostrom - Introduction to the Global Catastrophes Risk Conference 2008
15:39

Nick Bostrom - Introduction to the Global Catastrophes Risk Conference 2008

  • Order:
  • Duration: 15:39
  • Updated: 22 Feb 2015
  • views: 492
videos
Nick Bostrom provides an introduction to the Global Catastrophic Risks Conference and briefly addressing some of the key themes running through it. Hard to find, exists a download here: http://podcasts.ox.ac.uk/introduction-global-catastrophes-risk-conference-2008 Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture Science, Technology & the Future: http://scifuture.org Humanity+: http://humanityplus.org
https://wn.com/Nick_Bostrom_Introduction_To_The_Global_Catastrophes_Risk_Conference_2008
Jamais Cascio - Global Catastrophic Risks
5:23

Jamais Cascio - Global Catastrophic Risks

  • Order:
  • Duration: 5:23
  • Updated: 07 Apr 2016
  • views: 160
videos
Jamais Cascio, who blogs at Open the Future discusses Global Catastrophic Risks: http://www.scifuture.org/?p=3711 We are surrounded by catastrophic existential risks – you know, personally, societally, civilizationally. The intriguing thing about them is that the chance of any one of the happening is extremely slim. So very low likelihood, very significant results. Yet there is a non-zero chance that at the end of this sentence a meteor will come down and strike me in the head. It didn’t happen, but it could – there is no physical reason why it wouldn’t and given enough time eventually something will happen. So we face – We are dealing with existential risks, catastrophic, globally & civilizationally catastrophic risks all the time and it’s easy to ignore them. The problem – the dilemma – is when you have a slight uptake in the the likelihood of a catastrophic risk. NASA has a risk scale for the likelihood of an asteroid impact that would and the damage that it would produce – we’ve never gotten above essentially a level 1 risk – out of 10 – at least since they started taking measurements. There has been a couple of times when the likelihood of this happening has gotten up to – I think the highest we ever got was a temporary 4 – but were pretty likely that this level of risk isn’t going to be maintained. But we have these metrics of deciding – ok these risks are plausible – how can we contextualize them? So we do that with telling stories. We deal with catastrophic risks by creating mythologies – and what the mythologies do – mythologies here count as making movies or writing novels or engaging in speculative conversations.. and playing with toys. We craft the mythologies as a way of understanding how these catastrophic events could play out, and more importantly, how humans respond to catastrophe. 
So you know, recently there was a movie called “San Andreas” starring ‘The Rock’ – scientifically terrible – but ultimately it was a story about ‘how do humans respond to seeing each other in mortal terror? in mortal peril?’ and ‘how do we try to help each other?’ – and that I think becomes a really important ‘pedagogy of catastrophe’. It’s not about understanding the details of every possible dooms day scenario – it’s about understanding what our options are for helping each other afterwards, or helping each other avoid the catastrophe. And – I think all to often, especially the world of science/scifi/foresight, there’s kind of a dismissal of those kinds of ‘soft narratives’ “they’re not scientifically accurate therefore we can ignore them”. But I think we ignore them at our peril – because those are the stories that we viscerally, we as a human society viscerally respond to. We are driven by emotion, we are driven by empathy – and intelligence a way of contextualizing why we feel things – what our relationship is to things that we have emotional connections with. Allowing us to continue to have – our intelligence allows us to continue to have and maintain persistent emotional connections. Cascio gave the closing talk at GCR08, a Mountain View conference on Global Catastrophic Risks. Titled "Uncertainty, Complexity and Taking Action," the discussion focused on the challenges inherent in planning against future disasters emerging as the result of global-scale change: https://vimeo.com/2712394 http://ieet.org/index.php/IEET/eventinfo/ieet20081114/ Many thanks for watching! - Support me via Patreon: https://www.patreon.com/scifuture - Please Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture - Science, Technology & the Future website: http://scifuture.org
https://wn.com/Jamais_Cascio_Global_Catastrophic_Risks
#27 - Dr Tom Inglesby on careers and policies that reduce global catastrophic biological risks
2:17:01

#27 - Dr Tom Inglesby on careers and policies that reduce global catastrophic biological risks

  • Order:
  • Duration: 2:17:01
  • Updated: 18 Apr 2018
  • views: 70
videos
How about this for a movie idea: a main character has to prevent a new contagious strain of Ebola spreading around the world. She’s the best of the best. So good in fact, that her work on early detection systems contains the strain at its source. Ten minutes into the movie, we see the results of her work – nothing happens. Life goes on as usual. She continues to be amazingly competent, and nothing continues to go wrong. Fade to black. Roll credits. If your job is to prevent catastrophes, success is when nobody has to pay attention to you. But without regular disasters to remind authorities why they hired you in the first place, they can’t tell if you’re actually achieving anything. And when budgets come under pressure you may find that success condemns you to the chopping block. Dr Tom Inglesby, Director of the Center for Health Security at the Johns Hopkins Bloomberg School of Public Health, worries this may be about to happen to the scientists working on the ‘Global Health Security Agenda’. In 2014 Ebola showed the world why we have to detect and contain new diseases before they spread, and that when it comes to contagious diseases the nations of the world sink or swim together. Fifty countries decided to work together to make sure all their health systems were up to the challenge. Back then Congress provided 5 years’ funding to help some of the world’s poorest countries build the basic health security infrastructure necessary to control pathogens before they could reach the US. Links to learn more, job opportunities, and full transcript: https://80000hours.org/podcast/episodes/tom-inglesby-health-security/ But with Ebola fading from public memory and no recent tragedies to terrify us, Congress may not renew that funding and the project could fall apart. 
(Learn more about how you can help: http://www.nti.org/analysis/articles/protect-us-investments-global-health-security/ ) But there are positive signs as well - the center Inglesby leads recently received a $16 million grant from the Open Philanthropy Project to further their work preventing global catastrophes. It also runs the [Emerging Leaders in Biosecurity Fellowship](http://www.centerforhealthsecurity.org/our-work/emergingbioleaders/) to train the next generation of biosecurity experts for the US government. And Inglesby regularly testifies to Congress on the threats we all face and how to address them. In this in-depth interview we try to provide concrete guidance for listeners who want to to pursue a career in health security. Some of the topics we cover include: * Should more people in medicine work on security? * What are the top jobs for people who want to improve health security and how do they work towards getting them? * What people can do to protect funding for the Global Health Security Agenda. * Should we be more concerned about natural or human caused pandemics? Which is more neglected? * Should we be allocating more attention and resources to global catastrophic risk scenarios? * Why are senior figures reluctant to prioritize one project or area at the expense of another? * What does Tom think about the idea that in the medium term, human-caused pandemics will pose a far greater risk than natural pandemics, and so we should focus on specific counter-measures? * Are the main risks and solutions understood, and it’s just a matter of implementation? Or is the principal task to identify and understand them? * How is the current US government performing in these areas? * Which agencies are empowered to think about low probability high magnitude events? And more... 
*The 80,000 Hours podcast is produced by Keiran Harris.* **Get this episode by subscribing to our podcast on the world’s most pressing problems and how to solve them: type *80,000 Hours* into your podcasting app.**
https://wn.com/27_Dr_Tom_Inglesby_On_Careers_And_Policies_That_Reduce_Global_Catastrophic_Biological_Risks
The Science of Global Catastrophe
5:11

The Science of Global Catastrophe

  • Order:
  • Duration: 5:11
  • Updated: 24 Apr 2012
  • views: 32
videos
The NASA climatologist outlines how and when the accumulation of greenhouse gases will make Earth uninhabitable for our species—and why human life cannot be transferred to a different planet.
https://wn.com/The_Science_Of_Global_Catastrophe
Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens
18:10

Existential Risks and Extreme Opportunities | Stuart Armstrong | TEDxAthens

  • Order:
  • Duration: 18:10
  • Updated: 06 Apr 2015
  • views: 3989
videos
What are existential risks? They are the risks that threaten the very survival of the human species, or those that could dramatically curtail its potential. There are many, from asteroid impact, to engineered pandemic to artificial intelligence (AI), and they are almost all understudied. AI risk is the least understood, but potentially the deadliest of all, as AIs could be extremely powerful agents with insufficiently safe motivations and goals. The problem is very difficult, philosophically and programmatically. If there obstacles are overcome, however, humanity can expect to look forwards to a world of dramatic abundance of health, wealth, and happiness. Stuart Armstrong’s research at the Future of Humanity Institute centres on formal decision theory, the risks and possibilities of Artificial Intelligence, the long term potential for intelligent life, and anthropic (self-locating) probability. He is particularly interested in finding decision processes that give the “correct” answer under situations of anthropic ignorance and ignorance of one’s own utility function, ways of mapping humanity’s partially defined values onto an artificial entity, and the interaction between various existential risks. He aims to improve the understanding of the different types and natures of uncertainties surrounding human progress in the mid-to-far future. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at http://ted.com/tedx
https://wn.com/Existential_Risks_And_Extreme_Opportunities_|_Stuart_Armstrong_|_Tedxathens
Catastrophe - Episode 2 - Snowball Earth
48:03

Catastrophe - Episode 2 - Snowball Earth

  • Order:
  • Duration: 48:03
  • Updated: 02 Apr 2014
  • views: 413262
videos
Subscribe to Naked Science - http://goo.gl/wpc2Q1 Every other Wednesday we present a new video, so join us to see the truth laid bare... This spectacular five-part documentary series, presented by Tony Robinson, investigates the history of natural disasters, from the planet's beginnings to the present, putting a new perspective on our existence and suggesting that we are the product of catastrophe. 99% of all the creatures that have ever lived, no longer exist. They were wiped-out in a series of global catastrophes. Each disaster changed the course of evolution on earth. Without them mankind, nor any of the life we see around us, would be here today. For out of catastrophe comes rebirth. Evolution is a savage, imperfect and violent process. It's survive or perish. The earth's history of catastrophes has both moulded the planet and determined evolution. For each disaster led to another leap forward on the evolutionary trail form single celled bacteria to humankind itself. Episode 2 - Snowball Earth This programme delves into a world lying beneath a frozen surface. It is the greatest climate disaster ever to have hit Earth. 650 million years ago, a cataclysmic ice age sealed the entire planet beneath ice and snow, almost destroying life and turning the world into one huge snowball. Snowball Earth uncovers the story behind one of the most controversial theories in science today. To investigate, the programme travels the world to follow scientists scouring southern Australia, Nevada's Death Valley and Alaskan glaciers for tantalising clues as to how our planet ran away into this doomsday scenario. The results could improve understanding of evolution and survival of life.
https://wn.com/Catastrophe_Episode_2_Snowball_Earth
Global catastrophic risk - Video Learning - WizScience.com
2:59

Global catastrophic risk - Video Learning - WizScience.com

  • Order:
  • Duration: 2:59
  • Updated: 10 Sep 2015
  • views: 107
A "global catastrophic risk" is a hypothetical future event with the potential to seriously damage human well-being on a global scale. Some such events could destroy or cripple modern civilization; any event that could cause human extinction is referred to as an "existential risk". Potential global catastrophic risks include, but are not limited to, hostile artificial intelligence, nanotechnology weapons, nuclear warfare, and pandemics. Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before. While this does not mean it cannot happen in the future, it does make modelling existential risks difficult, due in part to survivorship bias.

The philosopher Nick Bostrom classifies risks according to their scope and intensity. He considers risks that are at least "global" in scope and "endurable" in intensity to be global catastrophic risks, while those that are at least "trans-generational" in scope and "terminal" in intensity are existential risks. A global catastrophic risk may kill the vast majority of life on Earth, yet humanity could still potentially recover; an existential risk, by contrast, either destroys humanity entirely or prevents any chance of civilization recovering. Bostrom therefore considers existential risks far more significant.

Bostrom identifies four types of existential risk. "Bangs" are sudden catastrophes, which may be accidental or deliberate; he thinks the most likely sources of bangs are malicious use of nanotechnology, nuclear war, and the possibility that the universe is a simulation that will end. "Crunches" are scenarios in which humanity survives but civilization is irreversibly destroyed; the most likely causes, he believes, are exhaustion of natural resources, a stable global government that prevents technological progress, or dysgenic pressures that lower average intelligence. "Shrieks" are undesirable futures: for example, a single mind that enhances its powers by merging with a computer could come to dominate human civilization. Bostrom believes this scenario is the most likely shriek, followed by a flawed superintelligence and a repressive totalitarian regime. "Whimpers" are the gradual decline of human civilization or of current values; he thinks the most likely cause would be evolution changing moral preferences, followed by extraterrestrial invasion.

Wiz Science™ is "the" learning channel for children and all ages. SUBSCRIBE TODAY Disclaimer: This video is for your information only. The author or publisher does not guarantee the accuracy of the content presented in this video. USE AT YOUR OWN RISK. Background Music: "The Place Inside" by Silent Partner (royalty-free) from YouTube Audio Library. This video uses material/images from https://en.wikipedia.org/wiki/Global+catastrophic+risk, which is released under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . This video is licensed under Creative Commons Attribution-Share-Alike License 3.0 http://creativecommons.org/licenses/by-sa/3.0/ . To reuse/adapt the content in your own work, you must comply with the license terms.
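The scope/intensity scheme summarized in this description can be sketched as a small classifier. This is an illustrative sketch only: the category labels come from Bostrom's taxonomy as described above, but the function, the ordering of the categories, and the threshold logic are assumptions of this example, not Bostrom's own formalism.

```python
# Toy classifier for Bostrom's scope/intensity scheme (illustrative only).
# Categories are ordered from least to most severe, so "at least global"
# can be checked by comparing positions in the list.

SCOPES = ["personal", "local", "global", "trans-generational"]
INTENSITIES = ["imperceptible", "endurable", "terminal"]

def classify(scope: str, intensity: str) -> str:
    """Label a risk per the scope/intensity scheme described above."""
    s = SCOPES.index(scope)
    i = INTENSITIES.index(intensity)
    # At least "trans-generational" in scope and "terminal" in intensity:
    if s >= SCOPES.index("trans-generational") and i >= INTENSITIES.index("terminal"):
        return "existential risk"
    # At least "global" in scope and "endurable" in intensity:
    if s >= SCOPES.index("global") and i >= INTENSITIES.index("endurable"):
        return "global catastrophic risk"
    return "other risk"

print(classify("global", "endurable"))             # global catastrophic risk
print(classify("trans-generational", "terminal"))  # existential risk
```

Note that under this reading every existential risk would also clear the global-catastrophic bar, which is why the more severe label is checked first.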
https://wn.com/Global_Catastrophic_Risk_Video_Learning_Wizscience.Com
Catastrophic Risk and Threats to the Global Commons (High-Res)
1:34:09

Catastrophic Risk and Threats to the Global Commons (High-Res)

  • Order:
  • Duration: 1:34:09
  • Updated: 01 Apr 2012
  • views: 647
https://wn.com/Catastrophic_Risk_And_Threats_To_The_Global_Commons_(High_Res)
David Pearce - Global Catastrophic & Existential Risk - Sleepwalking into the Abyss
21:37

David Pearce - Global Catastrophic & Existential Risk - Sleepwalking into the Abyss

  • Order:
  • Duration: 21:37
  • Updated: 05 Oct 2014
  • views: 850
Existential risk? I think the greatest underlying source of existential and global catastrophic risk lies in male human primates doing what evolution "designed" male human primates to do, namely wage war. (cf. http://ieet.org/index.php/IEET/more/4576 ) Unfortunately, we now have thermonuclear weapons with which to do so.

Does the study of existential risk reduction (ERR) diminish or enhance existential risk (ER)? One man's risk is another man's opportunity. Is the existence of suffering itself a form of ER, insofar as it increases the likelihood of intelligent agency pressing a global OFF button, cleanly or otherwise? If I focussed on ERR, phasing out suffering would be high on the To Do list.

AGI? Well, I'd argue it's a form of anthropomorphic projection on our part to ascribe intelligence or mind to digital computers. Believers in digital sentience, let alone digital (super)intelligence, need to explain Moravec's paradox. (cf. http://en.wikipedia.org/wiki/Moravec's_paradox) For sure, digital computers can be used to model everything from the weather to the Big Bang to thermonuclear reactions. Yet why is, say, a bumblebee more successful at navigating its environment in open-field contexts than the most advanced artificial robot the Pentagon can build today? The success of biological lifeforms since the Cambrian Explosion has turned on the computational capacity of organic robots to solve the binding problem (http://tracker.preterhuman.net/texts/body_and_health/Neurology/Binding.pdf) and generate cross-modally matched, real-time simulations of the mind-independent world. On theoretical grounds, I predict digital computers will never be capable of generating unitary phenomenal minds, unitary selves or unitary virtual worlds. In short, classical digital computers are invincibly ignorant zombies. (cf. http://ieet.org/index.php/IEET/more/pearce20120510) They can never "wake up" and explore the manifold varieties of sentience.

So why support initiatives to reduce existential and global catastrophic risk? Such advocacy might seem especially paradoxical if you're inclined to believe (as I am) that Hubble volumes where primordial information-bearing self-replicators arise more than once are vanishingly rare - and therefore cosmic rescue missions may be infeasible. Suffering sentience may exist in terrible abundance beyond our cosmological horizon and in googols of other Everett branches. But on current understanding, it's hard to see how rational agency can do anything about it.

The bad news? I fear we're sleepwalking towards the abyss. Some of the trillions of dollars of weaponry we're stockpiling, designed to kill and maim rival humans, will be used in armed conflict between nation states. Tens of millions and possibly hundreds of millions of people may perish in thermonuclear war. Multiple possible flash-points exist. I don't know if global catastrophe can be averted. For evolutionary reasons, male humans are biologically primed for competition and violence. Perhaps the least sociologically implausible prevention measure would be a voluntary transfer of the monopoly on violence currently claimed by state actors to the United Nations. But I wouldn't count on any such transfer of power this side of Armageddon. http://www.hedweb.com/social-media/pre2014.html

Does the study of existential and global catastrophic risk increase or decrease the likelihood of catastrophe? The issue is complicated by the divergent senses that researchers attach to the term "existential risk": http://www.abolitionist.com/anti-natalism.html

An important strand of Bostrom's research concerns the future of humanity and long-term outcomes. He introduced the concept of an existential risk, which he defines as one in which an "adverse outcome would either annihilate Earth-originating intelligent life or permanently and drastically curtail its potential." In the 2008 volume "Global Catastrophic Risks", editors Bostrom and Cirkovic characterize the relation between existential risk and the broader class of global catastrophic risks, and link existential risk to observer selection effects[8] and the Fermi paradox.[9] In a 2013 paper in the journal Global Policy, Bostrom offers a taxonomy of existential risk and proposes a reconceptualization of sustainability in dynamic terms, as a developmental trajectory that minimizes existential risk. Bostrom has argued that, from a consequentialist perspective, even small reductions in the cumulative amount of existential risk that humanity will face are extremely valuable, to the point where the traditional utilitarian imperative, to maximize expected utility, can be simplified to the Maxipok principle: maximize the probability of an "OK" outcome, where an OK outcome is any that avoids existential catastrophe.

Subscribe to this Channel: http://youtube.com/subscription_center?add_user=TheRationalFuture Science, Technology & the Future: http://scifuture.org Facebook group on Existential Risk: https://www.facebook.com/groups/ExistentialRisk
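The Maxipok simplification described above can be sketched numerically. This is a toy illustration of the reasoning, under the stated assumption that every non-catastrophic ("OK") outcome carries roughly the same astronomical utility and existential catastrophe carries none; the action names and probabilities below are invented for illustration.

```python
# If every OK outcome has roughly the same very large utility U and
# catastrophe has ~0 utility, expected utility is just U * P(OK), so
# ranking actions by expected utility equals ranking them by P(OK).

U_OK = 1e30          # stand-in for the astronomical value of an OK future
U_CATASTROPHE = 0.0  # existential catastrophe forecloses all of that value

def expected_utility(p_ok: float) -> float:
    return p_ok * U_OK + (1 - p_ok) * U_CATASTROPHE

# Hypothetical P(OK) for two candidate actions:
actions = {"action A": 0.90, "action B": 0.95}

best_by_eu = max(actions, key=lambda a: expected_utility(actions[a]))
best_by_maxipok = max(actions, key=lambda a: actions[a])
assert best_by_eu == best_by_maxipok  # the two decision rules agree
```

The equivalence breaks down as soon as OK outcomes differ substantially in value, which is why Maxipok is offered as a rule of thumb rather than an exact restatement of expected-utility maximization.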
https://wn.com/David_Pearce_Global_Catastrophic_Existential_Risk_Sleepwalking_Into_The_Abyss
Sam Harris on Artificial Intelligence & Existential Risks (Sept 12, 2017)
14:22

Sam Harris on Artificial Intelligence & Existential Risks (Sept 12, 2017)

  • Order:
  • Duration: 14:22
  • Updated: 13 Sep 2017
  • views: 775
Interview published September 12, 2017 Link to the full interview: https://after-on.com/episodes/006
https://wn.com/Sam_Harris_On_Artificial_Intelligence_Existential_Risks_(Sept_12,_2017)
Max Tegmark: Effective altruism, existential risk & existential hope
35:11

Max Tegmark: Effective altruism, existential risk & existential hope

  • Order:
  • Duration: 35:11
  • Updated: 18 Jun 2017
  • views: 3001
Saturday, Hall D, 11:10am - 11:40am
https://wn.com/Max_Tegmark_Effective_Altruism,_Existential_Risk_Existential_Hope
[Wikipedia] Global catastrophic risk
59:22

[Wikipedia] Global catastrophic risk

  • Order:
  • Duration: 59:22
  • Updated: 20 Feb 2017
  • views: 20
A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk. Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war, and pandemics. Researchers have difficulty studying human extinction directly, since humanity has never been destroyed before. While this does not mean it will not happen in the future, it does make modelling existential risks difficult, due in part to survivorship bias. https://en.wikipedia.org/wiki/Global_catastrophic_risk Please support this channel and help me upload more videos. Become one of my Patreons at https://www.patreon.com/user?u=3823907
https://wn.com/Wikipedia_Global_Catastrophic_Risk
Existential Risks in the Solar System - Professor Joseph Silk FRS
54:47

Existential Risks in the Solar System - Professor Joseph Silk FRS

  • Order:
  • Duration: 54:47
  • Updated: 11 Apr 2018
  • views: 1996
There are strong reasons to believe that the survival of life on Earth is under threat. Human activity is one example that we are able to control, at least in principle. We might irreversibly pollute the planet, or destroy it with thermonuclear devices. Epidemics might become uncontrollable. Asteroid impact could devastate the Earth, although preventive measures might detect and monitor the orbits of potential killer asteroids. Longer term, the Sun will evolve into a red giant and expand a hundredfold, out to the orbit of the Earth. The Earth will burn to a crisp, losing its atmosphere and oceans. By then, humanity, or whatever remains of it, should have found safer havens than the inner solar system. The transcript and downloadable versions of the lecture are available from the Gresham College website: https://www.gresham.ac.uk/lectures-and-events/existential-risks-in-the-solar-system Gresham College has been giving free public lectures since 1597. This tradition continues today, with all five or so of our weekly public lectures made available for free download from our website. There are currently over 2,000 lectures free to access or download from the website. Website: http://www.gresham.ac.uk Twitter: http://twitter.com/GreshamCollege Facebook: https://www.facebook.com/greshamcollege Instagram: http://www.instagram.com/greshamcollege
https://wn.com/Existential_Risks_In_The_Solar_System_Professor_Joseph_Silk_Frs
Overpopulation - global catastrophe
1:35

Overpopulation - global catastrophe

  • Order:
  • Duration: 1:35
  • Updated: 04 Jun 2017
  • views: 346
Overpopulation is a global problem that may have serious consequences for our planet. This short video was one of my assignments from AP Human Geography class, when we studied Malthus's theory of overpopulation. All footage (except the Earth-from-space footage) was collected during my trips to different countries, which helped me shape my view of this problem. Video & Montage by Samina M. Narration by Samina M.
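The Malthusian argument the video draws on (population growing geometrically while food supply grows only arithmetically, so population eventually outruns food) can be illustrated with a toy calculation; every number below is invented for illustration.

```python
# Toy Malthusian model: population is multiplied each generation
# (geometric growth), while food capacity gains a fixed increment
# each generation (arithmetic growth).

population = 100.0      # initial population
food_capacity = 200.0   # people the food supply can currently feed
growth_factor = 1.5     # geometric population growth per generation
food_increment = 20.0   # arithmetic food-supply growth per generation

generation = 0
while population <= food_capacity:
    population *= growth_factor
    food_capacity += food_increment
    generation += 1

print(f"Population exceeds food supply at generation {generation}")
```

Whatever the starting numbers, any growth factor above 1 eventually overtakes a fixed per-generation increment, which is the core of Malthus's claim.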
https://wn.com/Overpopulation_Global_Catastrophe
Existential Risk: Managing Extreme Technological Risk
13:02

Existential Risk: Managing Extreme Technological Risk

  • Order:
  • Duration: 13:02
  • Updated: 21 Nov 2017
  • views: 1349
Of the 45 million centuries of the Earth's history, this one is very special: it is the first century in which one species – us – holds the future of the planet in its hands. Take a fast-paced tour through our work on biological risks, environmental risks, artificial intelligence risks and more with the Astronomer Royal, Lord Martin Rees; Bertrand Russell Professor of Philosophy, Huw Price; Executive Director of Cambridge University's Centre for the Study of Existential Risk, Dr Sean O hEigeartaigh; and Academic Project Manager of the Centre, Dr Catherine Rhodes. This video (Stories of Impact) was created on behalf of the Templeton World Charity Foundation, Inc. The opinions expressed are those of the interviewees and do not necessarily reflect the views of Templeton World Charity Foundation.
https://wn.com/Existential_Risk_Managing_Extreme_Technological_Risk
Global catastrophic risk
19:05

Global catastrophic risk

  • Order:
  • Duration: 19:05
  • Updated: 28 Aug 2016
  • views: 17
A global catastrophic risk is a hypothetical future event that has the potential to damage human well-being on a global scale. Some events could cripple or destroy modern civilization. Any event that could cause human extinction is also known as an existential risk. Potential global catastrophic risks include but are not limited to hostile artificial intelligence, nanotechnology weapons, climate change, nuclear warfare, total war, and pandemics. ---Image-Copyright-and-Permission--- About the author(s): The original uploader was Fredrik at English Wikipedia License: Public domain Author(s): Fredrik ---Image-Copyright-and-Permission--- This channel is dedicated to making Wikipedia, one of the biggest knowledge databases in the world, available to people with limited vision. Article available under a Creative Commons license. Image source in video.
https://wn.com/Global_Catastrophic_Risk
Global catastrophic risks
1:46

Global catastrophic risks

  • Order:
  • Duration: 1:46
  • Updated: 23 May 2017
  • views: 6
https://wn.com/Global_Catastrophic_Risks
Global Catastrophic Risk Institute Top #7 Facts
0:53

Global Catastrophic Risk Institute Top #7 Facts

  • Order:
  • Duration: 0:53
  • Updated: 03 Feb 2016
  • views: 7
https://wn.com/Global_Catastrophic_Risk_Institute_Top_7_Facts
Classifying Global Catastrophic Risks | Shahar Avin | EAGxOxford 2016
25:15

Classifying Global Catastrophic Risks | Shahar Avin | EAGxOxford 2016

  • Order:
  • Duration: 25:15
  • Updated: 15 Apr 2017
  • views: 149
This talk is part of the EAGxOxford 2016 conference (www.eagxoxford.com) that took place at the Examination Schools in Oxford University. The recordings were produced by 'Speeding Films' and ‘The Hideout’.
https://wn.com/Classifying_Global_Catastrophic_Risks_|_Shahar_Avin_|_Eagxoxford_2016
Global catastrophic risks and emerging technologies in Davos
1:04

Global catastrophic risks and emerging technologies in Davos

  • Order:
  • Duration: 1:04
  • Updated: 13 Apr 2018
  • views: 50
The Global Challenges Foundation visited the World Economic Forum in Davos, and organized a discussion on global catastrophic risks and emerging technologies. How can global governance be reformed to meet new challenges?
https://wn.com/Global_Catastrophic_Risks_And_Emerging_Technologies_In_Davos
The Doomsday Clock | Countdown to Global Catastrophe - Documentary Films
43:51

The Doomsday Clock | Countdown to Global Catastrophe - Documentary Films

  • Order:
  • Duration: 43:51
  • Updated: 28 Aug 2015
  • views: 4912
The Doomsday Clock is a symbolic clock face, representing a countdown to possible global catastrophe (e.g. nuclear war or climate change). It has been maintained since 1947 by the members of the Science and Security Board of the Bulletin of the Atomic Scientists, who are in turn advised by the Governing Board and the Board of Sponsors, including 18 Nobel Laureates. The closer they set the Clock to midnight, the closer the scientists believe the world is to global disaster. Read More: https://en.wikipedia.org/wiki/Doomsday_Clock
https://wn.com/The_Doomsday_Clock_|_Countdown_To_Global_Catastrophe_Documentary_Films