<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
		>
<channel>
	<title>Comments on: Positive feedback&#8230;</title>
	<atom:link href="http://habitablezone.com/2017/05/10/positive-feedback-2/feed/" rel="self" type="application/rss+xml" />
	<link>https://habitablezone.com/2017/05/10/positive-feedback-2/</link>
	<description></description>
	<lastBuildDate>Sun, 05 Apr 2026 21:05:37 -0700</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>http://wordpress.org/?v=3.3.1</generator>
	<item>
		<title>By: ER</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39152</link>
		<dc:creator>ER</dc:creator>
		<pubDate>Fri, 12 May 2017 12:27:10 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39152</guid>
	<description>In fact, in my one numerical analysis course, Fortran programming was a prerequisite.  We were expected to code up our exercises.  Fundamentally, all computer computations involve substituting calculus with more conventional algebra-type calculations. Computers cannot do differentiation or integration; all they can do is substitute simpler iterative operations to arrive at a numerical result.

All numerical analysis solutions involve simplifications of the original calculus so it is not necessary to formally carry out the integral or derivative.
It&#039;s like linear interpolation when using trig or log tables.  Sometimes the approximation works, but sometimes it doesn&#039;t. You actually have to try it out with an example where the answer is already known.

Before digital computers, numerical analysis was a technique used to substitute very complex mathematical operations with simpler ones that could be done in a reasonable amount of time, or assigned to students who actually did the number crunching. These guys, by the way, were also called &quot;computers&quot;, and followed step-by-step instructions printed on paper forms called &quot;programs&quot;.

The major specialty of astronomical interest where I went to school was positional astronomy, or astrometry.  It&#039;s dull, repetitive work, but it is the fundamental sky survey work on which everything else, including the more glamorous astrophysics and cosmology, relies.  The actual work involved carefully measuring stellar positions with powerful microscopes off photographic plates, and subjecting the results to multiple corrections and adjustments to remove instrumental errors and variations introduced by precession, aberration and other geodynamical and orbital movements.  The statistics and math were well understood, and not all that complex, but they were too tedious and involved to be employed on thousands of stars that might be on even one photographic plate.  It was so labor-intensive that astrometrists relied on approximations to these equations (numerical analysis) to speed things up.  Their skill and reputations were based on finding clever ways to do this.

When computers came along, these numerical approximations became unnecessary.  You could just drop the entire equations into the Fortran code and calculate them out--brute force.  And suddenly, these old-time PhDs were forced to rely on undergraduates to code up their calculations rather than merely toil away as human computers.  Their clever numerical approximations and shortcuts, and the proofs needed to guarantee they were equivalent to the full formal calculations, were no longer necessary. Anybody with computer skills and a knowledge of basic algebra could automate the calculations.

I should have learned my lesson then, but the young are arrogant and cocky.  The day would come when the skills painfully honed over a lifetime would be automated, and suddenly everything you know becomes, if not obsolete, then superfluous.</description>
		<content:encoded><![CDATA[<p>In fact, in my one numerical analysis course, Fortran programming was a prerequisite.  We were expected to code up our exercises.  Fundamentally, all computer computations involve substituting calculus with more conventional algebra-type calculations. Computers cannot do differentiation or integration; all they can do is substitute simpler iterative operations to arrive at a numerical result.</p>
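A minimal sketch of that substitution, assuming Python in place of Fortran (the function and step size are illustrative): a central-difference quotient replaces symbolic differentiation with plain arithmetic.

```python
import math

def derivative(f, x, h=1e-6):
    """Central-difference quotient: pure arithmetic, no symbolic calculus."""
    return (f(x + h) - f(x - h)) / (2 * h)

# d/dx sin(x) at 0 is cos(0) = 1; the difference quotient lands very close.
print(abs(derivative(math.sin, 0.0) - 1.0) < 1e-9)  # True
```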
<p>All numerical analysis solutions involve simplifications of the original calculus so it is not necessary to formally carry out the integral or derivative.<br />
It&#8217;s like linear interpolation when using trig or log tables.  Sometimes the approximation works, but sometimes it doesn&#8217;t. You actually have to try it out with an example where the answer is already known.</p>
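A toy version of that table lookup, assuming Python and a made-up two-entry log table, checked against a case where the answer is already known:

```python
import math

# A two-entry "log table", like the printed tables once used by hand.
table = {2.0: 0.30103, 3.0: 0.47712}

def interp_log10(x, x0=2.0, x1=3.0):
    """Linear interpolation between adjacent table entries."""
    t = (x - x0) / (x1 - x0)
    return table[x0] + t * (table[x1] - table[x0])

# log10(2.5) = 0.39794...; the straight-line estimate, 0.389075, is off
# by roughly 0.009 -- good enough for some jobs, not for others.
error = abs(interp_log10(2.5) - math.log10(2.5))
```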
<p>Before digital computers, numerical analysis was a technique used to substitute very complex mathematical operations with simpler ones that could be done in a reasonable amount of time, or assigned to students who actually did the number crunching. These guys, by the way, were also called &#8220;computers&#8221;, and followed step-by-step instructions printed on paper forms called &#8220;programs&#8221;.</p>
<p>The major specialty of astronomical interest where I went to school was positional astronomy, or astrometry.  It&#8217;s dull, repetitive work, but it is the fundamental sky survey work on which everything else, including the more glamorous astrophysics and cosmology, relies.  The actual work involved carefully measuring stellar positions with powerful microscopes off photographic plates, and subjecting the results to multiple corrections and adjustments to remove instrumental errors and variations introduced by precession, aberration and other geodynamical and orbital movements.  The statistics and math were well understood, and not all that complex, but they were too tedious and involved to be employed on thousands of stars that might be on even one photographic plate.  It was so labor-intensive that astrometrists relied on approximations to these equations (numerical analysis) to speed things up.  Their skill and reputations were based on finding clever ways to do this.</p>
<p>When computers came along, these numerical approximations became unnecessary.  You could just drop the entire equations into the Fortran code and calculate them out&#8211;brute force.  And suddenly, these old-time PhDs were forced to rely on undergraduates to code up their calculations rather than merely toil away as human computers.  Their clever numerical approximations and shortcuts, and the proofs needed to guarantee they were equivalent to the full formal calculations, were no longer necessary. Anybody with computer skills and a knowledge of basic algebra could automate the calculations.</p>
<p>I should have learned my lesson then, but the young are arrogant and cocky.  The day would come when the skills painfully honed over a lifetime would be automated, and suddenly everything you know becomes, if not obsolete, then superfluous.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: mcfly</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39146</link>
		<dc:creator>mcfly</dc:creator>
		<pubDate>Fri, 12 May 2017 03:01:32 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39146</guid>
		<description>Thank you for that.

You know, something that has helped keep me interested in math and stats over the years is the knowledge that I *don&#039;t* have to use electronics to get the answer. In every math class I&#039;ve ever taken, calculators were verboten. If you have a pencil, some paper and your brain, you got it covered. I never left the mainstream courses, though--calculus, linear algebra, discrete math. Nothing exotic. I have no experience with some of the other fields you mention.

Stats, of course, was a different beast. We were *encouraged* to use a TI stats calculator. All I ever resorted to was a very basic calculator (otherwise, it can be time-consuming and error-prone to, say, add up 30 floating-point numbers or calculate a division to 10 decimal places), and I am extremely proud that I still got all A&#039;s in my stats classes. Yeah, finals were nerve-wracking; I used up every single second, while others who&#039;d put their TI in the driver&#039;s seat were getting a start on their post-final bender while I was still pounding out numbers under the watchful glare of invigilators who just wanted me out of there so they could get a start on their post-final bender.

Hmmm...is it possible to digress when you never had a point to start with?</description>
		<content:encoded><![CDATA[<p>Thank you for that.</p>
<p>You know, something that has helped keep me interested in math and stats over the years is the knowledge that I *don&#8217;t* have to use electronics to get the answer. In every math class I&#8217;ve ever taken, calculators were verboten. If you have a pencil, some paper and your brain, you got it covered. I never left the mainstream courses, though&#8211;calculus, linear algebra, discrete math. Nothing exotic. I have no experience with some of the other fields you mention.</p>
<p>Stats, of course, was a different beast. We were *encouraged* to use a TI stats calculator. All I ever resorted to was a very basic calculator (otherwise, it can be time-consuming and error-prone to, say, add up 30 floating-point numbers or calculate a division to 10 decimal places), and I am extremely proud that I still got all A&#8217;s in my stats classes. Yeah, finals were nerve-wracking; I used up every single second, while others who&#8217;d put their TI in the driver&#8217;s seat were getting a start on their post-final bender while I was still pounding out numbers under the watchful glare of invigilators who just wanted me out of there so they could get a start on their post-final bender.</p>
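A small sketch of why that hand-summing is error-prone, assuming Python (`math.fsum` is the compensated summation in the standard library):

```python
import math

# Ten copies of 0.1 summed naively drift off 1.0, because 0.1 is not exactly
# representable in binary; math.fsum compensates for the rounding error.
values = [0.1] * 10
print(sum(values) == 1.0)        # False
print(math.fsum(values) == 1.0)  # True
```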
<p>Hmmm&#8230;is it possible to digress when you never had a point to start with?</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: ER</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39143</link>
		<dc:creator>ER</dc:creator>
		<pubDate>Fri, 12 May 2017 01:54:03 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39143</guid>
		<description>Analysis (analytical geometry, calculus and differential equations) describes changing quantities, like fluid flow, dynamics, and kinematics.  But statistics is a different bird altogether. It explores the fuzziness of nature. It allows you to make meaningful statements even when you can&#039;t come up with exact solutions.

It allows you to make excruciatingly precise determinations of totally random phenomena.  You can determine accurately how many people will die in auto accidents over a long holiday weekend without having a clue as to the causes of those accidents, or who will be involved.  One standard deviation, two?  What does that actually mean?  A system of differential equations can describe complex physical assemblies with many moving parts, but statistics can give you some idea of how many things have to happen in what order before you can reasonably believe something else will happen.  That&#039;s a different kind of question altogether.  Yes, it appears to be a discipline really suited to modeling mental decisions.  The real world responds to analysis, but your decisions and choices are best based on statistics.

There is another branch of mathematics, much neglected now, called numerical analysis.  It is used to model real equations which have no easily determined solutions with other, simpler equations which give you approximations good enough for engineering work.  It is no longer used too much, because now we have computers that can be used to brute-force any calculation.  But at one time, it was often critical to come up with easily-computed algorithms which simulated the formal equations of real and complex analysis and converged to acceptable solutions close enough to those given by the &quot;real&quot; equations.  It was actually a branch of experimental mathematics, because you didn&#039;t always know if, or under what circumstances, a numerical solution would be as good as a formal one. You had to actually try it and find out!

Sometimes the numerical analysis algorithm would quickly converge, after just a few iterations, to the formal values.  At other times it would approach them asymptotically, or oscillate around them.  Other times, the algorithm would approach the correct value, but eventually start to veer away.

For example, you can solve a definite integral by integrating the function and solving for a specific interval.  But some functions can&#039;t be integrated, or maybe you just don&#039;t know how, and you have to divide the area under the curve into little rectangular strips and add up their areas.
In my time, sometimes we even plotted the integrals out on graph paper, cut out the curves with scissors and actually weighed them on a Mettler balance to get the area under the curve (the definite integral).

The ancient Egyptians could not calculate pi, or use the Pythagorean theorem, but they had developed numerical and graphical solutions to get decent enough results to build their pyramids.

It&#039;s like you&#039;re exploring the mental space between the mathematics and the engineering.  Today, the computer can be instructed to do the calculations, and you can try it in multiple ways so you can determine how the approximations converge, or if they do at all.  But I remember when you could take courses in it, for those times when you just didn&#039;t have a good mathematical model of what you were studying.

Sorry.  Sometimes I just ramble aimlessly.</description>
		<content:encoded><![CDATA[<p>Analysis (analytical geometry, calculus and differential equations) describes changing quantities, like fluid flow, dynamics, and kinematics.  But statistics is a different bird altogether. It explores the fuzziness of nature. It allows you to make meaningful statements even when you can&#8217;t come up with exact solutions.</p>
<p>It allows you to make excruciatingly precise determinations of totally random phenomena.  You can determine accurately how many people will die in auto accidents over a long holiday weekend without having a clue as to the causes of those accidents, or who will be involved.  One standard deviation, two?  What does that actually mean?  A system of differential equations can describe complex physical assemblies with many moving parts, but statistics can give you some idea of how many things have to happen in what order before you can reasonably believe something else will happen.  That&#8217;s a different kind of question altogether.  Yes, it appears to be a discipline really suited to modeling mental decisions.  The real world responds to analysis, but your decisions and choices are best based on statistics.</p>
<p>There is another branch of mathematics, much neglected now, called numerical analysis.  It is used to model real equations which have no easily determined solutions with other, simpler equations which give you approximations good enough for engineering work.  It is no longer used too much, because now we have computers that can be used to brute-force any calculation.  But at one time, it was often critical to come up with easily-computed algorithms which simulated the formal equations of real and complex analysis and converged to acceptable solutions close enough to those given by the &#8220;real&#8221; equations.  It was actually a branch of experimental mathematics, because you didn&#8217;t always know if, or under what circumstances, a numerical solution would be as good as a formal one. You had to actually try it and find out!</p>
<p>Sometimes the numerical analysis algorithm would quickly converge, after just a few iterations, to the formal values.  At other times it would approach them asymptotically, or oscillate around them.  Other times, the algorithm would approach the correct value, but eventually start to veer away.</p>
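The quickly-converging case can be sketched with Heron's square-root iteration (an illustrative choice, assuming Python; the starting point and step count are arbitrary):

```python
def heron_sqrt(a, x0=1.0, steps=6):
    """Heron's iteration x -> (x + a/x) / 2, which converges to sqrt(a)."""
    x = x0
    history = []
    for _ in range(steps):
        x = 0.5 * (x + a / x)
        history.append(x)
    return history

# Each iteration roughly doubles the number of correct digits.
hist = heron_sqrt(2.0)
print(abs(hist[-1] - 2.0 ** 0.5) < 1e-12)  # True
```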
<p>For example, you can solve a definite integral by integrating the function and solving for a specific interval.  But some functions can&#8217;t be integrated, or maybe you just don&#8217;t know how, and you have to divide the area under the curve into little rectangular strips and add up their areas.<br />
In my time, sometimes we even plotted the integrals out on graph paper, cut out the curves with scissors and actually weighed them on a Mettler balance to get the area under the curve (the definite integral).</p>
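A minimal sketch of the rectangular-strips approach, assuming Python and an example integrand whose answer is known:

```python
def riemann(f, a, b, n=10000):
    """Sum the areas of n thin rectangular strips (midpoint rule)."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

# The integral of x^2 over [0, 1] is exactly 1/3; the strips get very close,
# with no integration performed anywhere.
print(abs(riemann(lambda x: x * x, 0.0, 1.0) - 1 / 3) < 1e-6)  # True
```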
<p>The ancient Egyptians could not calculate pi, or use the Pythagorean theorem, but they had developed numerical and graphical solutions to get decent enough results to build their pyramids.</p>
<p>It&#8217;s like you&#8217;re exploring the mental space between the mathematics and the engineering.  Today, the computer can be instructed to do the calculations, and you can try it in multiple ways so you can determine how the approximations converge, or if they do at all.  But I remember when you could take courses in it, for those times when you just didn&#8217;t have a good mathematical model of what you were studying.</p>
<p>Sorry.  Sometimes I just ramble aimlessly.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: mcfly</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39142</link>
		<dc:creator>mcfly</dc:creator>
		<pubDate>Fri, 12 May 2017 01:06:32 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39142</guid>
		<description>I think of the p-value as a point of intersection between statistics and ethics. As you know, a 95% confidence interval is by far the most commonly used, but it&#039;s still somewhat arbitrary. Whenever I do hypothesis testing (which, granted, isn&#039;t all that often anymore), I explore different intervals if only to glimpse the &quot;lay of the land&quot; (which might not need quotes if you&#039;re dealing with geostatistics!).

Anywho, I&#039;ve long thought that an explanation of why a certain p-value was chosen, and how the hypothesis fared with different values, would be a simple and transparent way to help explain one&#039;s determinations.

Just a non-specialist&#039;s opinion.</description>
		<content:encoded><![CDATA[<p>I think of the p-value as a point of intersection between statistics and ethics. As you know, a 95% confidence interval is by far the most commonly used, but it&#8217;s still somewhat arbitrary. Whenever I do hypothesis testing (which, granted, isn&#8217;t all that often anymore), I explore different intervals if only to glimpse the &#8220;lay of the land&#8221; (which might not need quotes if you&#8217;re dealing with geostatistics!).</p>
<p>Anywho, I&#8217;ve long thought that an explanation of why a certain p-value was chosen, and how the hypothesis fared with different values, would be a simple and transparent way to help explain one&#8217;s determinations.</p>
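A sketch of that comparison, assuming Python; the sample data are invented for the example, and the z-scores are the usual standard normal quantiles:

```python
from statistics import mean, stdev

def normal_ci(sample, z):
    """Normal-approximation confidence interval for the sample mean."""
    m = mean(sample)
    half = z * stdev(sample) / len(sample) ** 0.5
    return (m - half, m + half)

sample = [4.8, 5.1, 4.9, 5.3, 5.0, 4.7, 5.2, 5.0]
# Standard normal quantiles for three common levels; the level itself is
# the arbitrary choice under discussion. Wider level, wider interval.
for level, z in [("90%", 1.645), ("95%", 1.960), ("99%", 2.576)]:
    lo, hi = normal_ci(sample, z)
    print(level, round(lo, 3), round(hi, 3))
```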
<p>Just a non-specialist&#8217;s opinion.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RL</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39140</link>
		<dc:creator>RL</dc:creator>
		<pubDate>Fri, 12 May 2017 00:17:57 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39140</guid>
		<description>fixed missing numbers (nt)</description>
		<content:encoded><![CDATA[<p>fixed missing numbers (nt)</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: hank</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39136</link>
		<dc:creator>hank</dc:creator>
		<pubDate>Fri, 12 May 2017 00:05:52 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39136</guid>
		<description>If they are warmed sufficiently, they can release CH4 (a very powerful greenhouse gas) into the environment.

You may recall that clathrates were a key factor in the failure of the BP Deepwater Horizon platform.</description>
		<content:encoded><![CDATA[<p>If they are warmed sufficiently, they can release CH4 (a very powerful greenhouse gas) into the environment.</p>
<p>You may recall that clathrates were a key factor in the failure of the BP Deepwater Horizon platform.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: hank</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39134</link>
		<dc:creator>hank</dc:creator>
		<pubDate>Fri, 12 May 2017 00:01:04 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39134</guid>
		<description>8)</description>
		<content:encoded><![CDATA[<p> <img src='https://habitablezone.com/wp-includes/images/smilies/icon_cool.gif' alt='8)' class='wp-smiley' /> </p>
]]></content:encoded>
	</item>
	<item>
		<title>By: RL</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39132</link>
		<dc:creator>RL</dc:creator>
		<pubDate>Thu, 11 May 2017 23:46:52 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39132</guid>
		<description>&lt;a href=&quot;http://www.nature.com/nclimate/journal/v7/n5/full/nclimate3262.html&quot; target=&quot;_blank&quot; rel=&quot;nofollow&quot;&gt;http://www.nature.com/nclimate/journal/v7/n5/full/nclimate3262.html&lt;/a&gt;


&lt;blockquote&gt;Here we show that in each of the models, their present-day spatial distribution of permafrost and air temperature can be used to infer the sensitivity of permafrost to future global warming. Using the same approach for the observed permafrost distribution and air temperature, we estimate a sensitivity of permafrost area loss to global mean warming at stabilization of 4 (+1/-1.1) million km² per °C (1σ confidence), which is around 20% higher than previous studies. Our method facilitates an assessment for COP21 climate change targets: if the climate is stabilized at 2 °C above pre-industrial levels, we estimate that the permafrost area would eventually be reduced by over 40%. Stabilizing at 1.5 °C rather than 2 °C would save approximately 2 million km² of permafrost.&lt;/blockquote&gt;



&lt;a href=&quot;http://www.colorado.edu/today/2015/10/26/study-shows-thawing-permafrost-quickly-turns-co2-climate-concern&quot; target=&quot;_blank&quot; rel=&quot;nofollow&quot;&gt;http://www.colorado.edu/today/2015/10/26/study-shows-thawing-permafrost-quickly-turns-co2-climate-concern&lt;/a&gt;
	

&lt;blockquote&gt;Researchers from the U.S. Geological Survey and key academic partners including the University of Colorado Boulder have quantified how rapidly ancient permafrost decomposes upon thawing and how much carbon dioxide is produced in the process.

Huge stores of organic carbon in permafrost soils -- frozen for hundreds to tens of thousands of years across high northern latitudes worldwide -- are currently isolated from the modern day carbon cycle. However, if thawed by changing climate conditions, wildfire, or other disturbances, this massive carbon reservoir could decompose and be emitted as the greenhouse gases carbon dioxide and methane, or be carried as dissolved organic carbon to streams and rivers.

“Many scientists worldwide are now investigating the complicated potential end results of thawing permafrost,” said Rob Striegl, USGS scientist and study co-author. “There are critical questions to consider, such as: How much of the stored permafrost carbon might thaw in a future climate? Where will it go? And, what are the consequences for our climate and our aquatic ecosystems?”

At a newly excavated tunnel operated by the U.S. Army Corps of Engineers near Fairbanks, Alaska, a research team from USGS, CU-Boulder and Florida State University set out to determine how rapidly the dissolved organic carbon from ancient (about 35,000 years old) “yedoma” soils decomposes upon soil thaw and how much carbon dioxide is produced.

Yedoma is a distinct type of permafrost soil found across Alaska and Siberia that accounts for a significant portion of the permafrost soil carbon pool. These soils were deposited as wind-blown silts in the late Pleistocene age and froze soon after they were formed.

“It had previously been assumed that permafrost soil carbon this old was already degraded and not susceptible to rapid decomposition upon thaw,” said Kim Wickland, the USGS scientist who led the team.

The researchers found that more than half of the dissolved organic carbon in yedoma permafrost was decomposed within one week after thawing. About 50 percent of that carbon was converted to carbon dioxide, while the rest likely became microbial biomass.&lt;/blockquote&gt;

</description>
		<content:encoded><![CDATA[<p><a href="http://www.nature.com/nclimate/journal/v7/n5/full/nclimate3262.html" target="_blank" rel="nofollow">http://www.nature.com/nclimate/journal/v7/n5/full/nclimate3262.html</a></p>
<blockquote><p>Here we show that in each of the models, their present-day spatial distribution of permafrost and air temperature can be used to infer the sensitivity of permafrost to future global warming. Using the same approach for the observed permafrost distribution and air temperature, we estimate a sensitivity of permafrost area loss to global mean warming at stabilization of 4 (+1/-1.1) million km² per °C (1σ confidence), which is around 20% higher than previous studies. Our method facilitates an assessment for COP21 climate change targets: if the climate is stabilized at 2 °C above pre-industrial levels, we estimate that the permafrost area would eventually be reduced by over 40%. Stabilizing at 1.5 °C rather than 2 °C would save approximately 2 million km² of permafrost.</p></blockquote>
<p><a href="http://www.colorado.edu/today/2015/10/26/study-shows-thawing-permafrost-quickly-turns-co2-climate-concern" target="_blank" rel="nofollow">http://www.colorado.edu/today/2015/10/26/study-shows-thawing-permafrost-quickly-turns-co2-climate-concern</a></p>
<blockquote><p>Researchers from the U.S. Geological Survey and key academic partners including the University of Colorado Boulder have quantified how rapidly ancient permafrost decomposes upon thawing and how much carbon dioxide is produced in the process.</p>
<p>Huge stores of organic carbon in permafrost soils &#8212; frozen for hundreds to tens of thousands of years across high northern latitudes worldwide &#8212; are currently isolated from the modern day carbon cycle. However, if thawed by changing climate conditions, wildfire, or other disturbances, this massive carbon reservoir could decompose and be emitted as the greenhouse gases carbon dioxide and methane, or be carried as dissolved organic carbon to streams and rivers.</p>
<p>“Many scientists worldwide are now investigating the complicated potential end results of thawing permafrost,” said Rob Striegl, USGS scientist and study co-author. “There are critical questions to consider, such as: How much of the stored permafrost carbon might thaw in a future climate? Where will it go? And, what are the consequences for our climate and our aquatic ecosystems?”</p>
<p>At a newly excavated tunnel operated by the U.S. Army Corps of Engineers near Fairbanks, Alaska, a research team from USGS, CU-Boulder and Florida State University set out to determine how rapidly the dissolved organic carbon from ancient (about 35,000 years old) “yedoma” soils decomposes upon soil thaw and how much carbon dioxide is produced.</p>
<p>Yedoma is a distinct type of permafrost soil found across Alaska and Siberia that accounts for a significant portion of the permafrost soil carbon pool. These soils were deposited as wind-blown silts in the late Pleistocene age and froze soon after they were formed.</p>
<p>“It had previously been assumed that permafrost soil carbon this old was already degraded and not susceptible to rapid decomposition upon thaw,” said Kim Wickland, the USGS scientist who led the team.</p>
<p>The researchers found that more than half of the dissolved organic carbon in yedoma permafrost was decomposed within one week after thawing. About 50 percent of that carbon was converted to carbon dioxide, while the rest likely became microbial biomass.</p></blockquote>
]]></content:encoded>
	</item>
	<item>
		<title>By: mcfly</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39130</link>
		<dc:creator>mcfly</dc:creator>
		<pubDate>Thu, 11 May 2017 23:32:26 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39130</guid>
		<description>It looks like they used a standard 95% confidence interval throughout, but with the ongoing controversy surrounding the p-value I would have liked to see info summarizing results from using different values. I&#039;m not a statistician, but imho those intervals should be explored and disclosed, specifically regarding the effect on hypothesis testing. If that means the Supplementary Material needs a supplement, so be it.

I&#039;m not suggesting for a second that there&#039;s anything amiss with the work as it is, of course...just opining that some additional info could shine a much wider light on their analysis. If I missed anything in the documents that renders my point...well, pointless...please let me know.</description>
		<content:encoded><![CDATA[<p>It looks like they used a standard 95% confidence interval throughout, but with the ongoing controversy surrounding the p-value I would have liked to see info summarizing results from using different values. I&#8217;m not a statistician, but imho those intervals should be explored and disclosed, specifically regarding the effect on hypothesis testing. If that means the Supplementary Material needs a supplement, so be it.</p>
<p>I&#8217;m not suggesting for a second that there&#8217;s anything amiss with the work as it is, of course&#8230;just opining that some additional info could shine a much wider light on their analysis. If I missed anything in the documents that renders my point&#8230;well, pointless&#8230;please let me know.</p>
]]></content:encoded>
	</item>
	<item>
		<title>By: TB</title>
		<link>https://habitablezone.com/2017/05/10/positive-feedback-2/#comment-39115</link>
		<dc:creator>TB</dc:creator>
		<pubDate>Thu, 11 May 2017 18:10:47 +0000</pubDate>
		<guid isPermaLink="false">https://www.habitablezone.com/?p=63896#comment-39115</guid>
		<description>&lt;p&gt;Original sources:&lt;/p&gt;

The earlier USGS study, summarized on a poster here: 
&lt;a href=&quot;https://csc.alaska.edu/sites/default/files/Waldrop_NACP_AGU2012.pdf&quot; rel=&quot;nofollow&quot;&gt;https://csc.alaska.edu/sites/default/files/Waldrop_NACP_AGU2012.pdf&lt;/a&gt;
 
The original paper for the new study is online here: 
&lt;a href=&quot;http://www.pnas.org/content/early/2017/05/02/1618567114&quot; rel=&quot;nofollow&quot;&gt;http://www.pnas.org/content/early/2017/05/02/1618567114&lt;/a&gt;
 
And the detailed appendix is here: 
&lt;a href=&quot;http://www.pnas.org/content/suppl/2017/05/03/1618567114.DCSupplemental/pnas.1618567114.sapp.pdf&quot; rel=&quot;nofollow&quot;&gt;http://www.pnas.org/content/suppl/2017/05/03/1618567114.DCSupplemental/pnas.1618567114.sapp.pdf&lt;/a&gt;</description>
		<content:encoded><![CDATA[<p>Original sources:</p>
<p>The earlier USGS study, summarized on a poster here:<br />
<a href="https://csc.alaska.edu/sites/default/files/Waldrop_NACP_AGU2012.pdf" rel="nofollow">https://csc.alaska.edu/sites/default/files/Waldrop_NACP_AGU2012.pdf</a></p>
<p>The original paper for the new study is online here:<br />
<a href="http://www.pnas.org/content/early/2017/05/02/1618567114" rel="nofollow">http://www.pnas.org/content/early/2017/05/02/1618567114</a></p>
<p>And the detailed appendix is here:<br />
<a href="http://www.pnas.org/content/suppl/2017/05/03/1618567114.DCSupplemental/pnas.1618567114.sapp.pdf" rel="nofollow">http://www.pnas.org/content/suppl/2017/05/03/1618567114.DCSupplemental/pnas.1618567114.sapp.pdf</a></p>
]]></content:encoded>
	</item>
</channel>
</rss>
