<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	>
<channel>
	<title>Comments for Amethyst Informatics</title>
	<atom:link href="http://amethystinformatics.co.uk/blog/comments/feed/" rel="self" type="application/rss+xml" />
	<link>http://amethystinformatics.co.uk/blog</link>
	<description>Crystallising your data</description>
	<lastBuildDate>Wed, 02 Dec 2015 07:18:00 +0000</lastBuildDate>
	<sy:updatePeriod>hourly</sy:updatePeriod>
	<sy:updateFrequency>1</sy:updateFrequency>
	<generator>https://wordpress.org/?v=4.2.38</generator>
	<item>
		<title>Comment on Big Data Said Business School Oxford by Carlota Chesser</title>
		<link>http://amethystinformatics.co.uk/blog/2014/01/04/big-data-said-business-school-oxford/#comment-1782</link>
		<dc:creator><![CDATA[Carlota Chesser]]></dc:creator>
		<pubDate>Wed, 02 Dec 2015 07:18:00 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=51#comment-1782</guid>
		<description><![CDATA[I truly appreciate this post. I have been looking everywhere for this! Thank goodness I found it on Bing. You have made my day! Thx again!]]></description>
		<content:encoded><![CDATA[<p>I truly appreciate this post. I have been looking everywhere for this! Thank goodness I found it on Bing. You have made my day! Thx again!</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on When Less is More by molmod</title>
		<link>http://amethystinformatics.co.uk/blog/2015/06/30/when-less-is-more/#comment-127</link>
		<dc:creator><![CDATA[molmod]]></dc:creator>
		<pubDate>Wed, 15 Jul 2015 13:28:31 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=193#comment-127</guid>
		<description><![CDATA[That&#039;s definitely the case: an interpretable model can be much more useful.  Linear regression models certainly win in that sense over NNs.  However, sometimes the descriptors that go into such a model have an obscure relationship with molecular structure and give no real insight into how to optimise the activity of interest.  I don&#039;t think I&#039;d dare ask a chemist to reduce his Balaban index, for example.

I love the images generated by the Google NNs in your link.  Particularly the one that sees animals in the pictures of clouds.]]></description>
		<content:encoded><![CDATA[<p>That&#8217;s definitely the case: an interpretable model can be much more useful.  Linear regression models certainly win in that sense over NNs.  However, sometimes the descriptors that go into such a model have an obscure relationship with molecular structure and give no real insight into how to optimise the activity of interest.  I don&#8217;t think I&#8217;d dare ask a chemist to reduce his Balaban index, for example.</p>
<p>I love the images generated by the Google NNs in your link.  Particularly the one that sees animals in the pictures of clouds.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on When Less is More by Amethyst</title>
		<link>http://amethystinformatics.co.uk/blog/2015/06/30/when-less-is-more/#comment-118</link>
		<dc:creator><![CDATA[Amethyst]]></dc:creator>
		<pubDate>Wed, 08 Jul 2015 16:56:40 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=193#comment-118</guid>
		<description><![CDATA[Hey Molmod, I like your &quot;less is more&quot; neural net thoughts.  Do you think that the interpretability of a model might also be a factor in the choice of method used?  I have found that some chemistry clients favour models that give them a meaningful reason as to why a compound is being predicted as active.

I have just stumbled across this &lt;a target=&quot;_blank&quot; href=&quot;http://googleresearch.blogspot.co.uk/2015/06/inceptionism-going-deeper-into-neural.html&quot; rel=&quot;nofollow&quot;&gt;interesting blog article&lt;/a&gt; published last month about neural networks in image recognition.  It has some really eye-catching images of what you get when you take a peek inside a neural network.  For example, take a look at the magical animals that are found in the sky.]]></description>
		<content:encoded><![CDATA[<p>Hey Molmod, I like your &#8220;less is more&#8221; neural net thoughts.  Do you think that the interpretability of a model might also be a factor in the choice of method used?  I have found that some chemistry clients favour models that give them a meaningful reason as to why a compound is being predicted as active.</p>
<p>I have just stumbled across this <a target="_blank" href="http://googleresearch.blogspot.co.uk/2015/06/inceptionism-going-deeper-into-neural.html" rel="nofollow">interesting blog article</a> published last month about neural networks in image recognition.  It has some really eye-catching images of what you get when you take a peek inside a neural network.  For example, take a look at the magical animals that are found in the sky.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on When Less is More by molmod</title>
		<link>http://amethystinformatics.co.uk/blog/2015/06/30/when-less-is-more/#comment-114</link>
		<dc:creator><![CDATA[molmod]]></dc:creator>
		<pubDate>Sun, 05 Jul 2015 13:35:37 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=193#comment-114</guid>
		<description><![CDATA[Hi there Amethyst.  I remember this well from when I was playing around with neural nets a lot.  Add too many neurons and it was great at telling you what you already knew, but the predictions were hopeless.  The other interesting thing was that you could train them too much.  If you curtailed the training, the predictions were usually better than if I let the training run on longer, whereas the error in the training set just got smaller and smaller.  Definitely less is more there...
Neural nets seem to have fallen out of fashion these days (at least in chemistry applications) although there have been one or two more recent papers that may have got around some of these issues.  I wait to see if they rise again in popularity to the levels we saw in the &#039;90s.]]></description>
		<content:encoded><![CDATA[<p>Hi there Amethyst.  I remember this well from when I was playing around with neural nets a lot.  Add too many neurons and it was great at telling you what you already knew, but the predictions were hopeless.  The other interesting thing was that you could train them too much.  If you curtailed the training, the predictions were usually better than if I let the training run on longer, whereas the error in the training set just got smaller and smaller.  Definitely less is more there&#8230;<br />
Neural nets seem to have fallen out of fashion these days (at least in chemistry applications) although there have been one or two more recent papers that may have got around some of these issues.  I wait to see if they rise again in popularity to the levels we saw in the &#8217;90s.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on Stolen Identity of an Informatician by molmod</title>
		<link>http://amethystinformatics.co.uk/blog/2015/05/30/stolen-identity-of-an-informatician/#comment-109</link>
		<dc:creator><![CDATA[molmod]]></dc:creator>
		<pubDate>Fri, 26 Jun 2015 20:58:09 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=169#comment-109</guid>
		<description><![CDATA[It looks like lots of people are getting big data for Christmas. Personally, I would prefer the choccy you mentioned.]]></description>
		<content:encoded><![CDATA[<p>It looks like lots of people are getting big data for Christmas. Personally, I would prefer the choccy you mentioned.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on Stolen Identity of an Informatician by Amethyst</title>
		<link>http://amethystinformatics.co.uk/blog/2015/05/30/stolen-identity-of-an-informatician/#comment-108</link>
		<dc:creator><![CDATA[Amethyst]]></dc:creator>
		<pubDate>Fri, 26 Jun 2015 15:53:00 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=169#comment-108</guid>
		<description><![CDATA[Hi Molmod!  Great spot.  Thankfully I can confirm that the regular dips occur annually on Christmas week.

At least it proves that an Informatician isn&#039;t just for Christmas (phew), so please remember to feed us coffee and chocolate all year round... 

I did ponder plotting the running average to smooth out fluctuations and highlight the overall trends.  However, since the weekly figures painted a clear picture, I decided against this.

Below I added in the phrase &quot;Christmas present&quot; to show when we are doing our online festive shopping in relation to the dips.

&lt;img src=&quot;http://amethystinformatics.co.uk/blog/wp-content/uploads/2015/05/ChristmasShopping1.png&quot; alt=&quot;Trendy geek with festive online shopping&quot; /&gt;]]></description>
		<content:encoded><![CDATA[<p>Hi Molmod!  Great spot.  Thankfully I can confirm that the regular dips occur annually on Christmas week.</p>
<p>At least it proves that an Informatician isn&#8217;t just for Christmas (phew), so please remember to feed us coffee and chocolate all year round&#8230; </p>
<p>I did ponder plotting the running average to smooth out fluctuations and highlight the overall trends.  However, since the weekly figures painted a clear picture, I decided against this.</p>
<p>Below I added in the phrase &#8220;Christmas present&#8221; to show when we are doing our online festive shopping in relation to the dips.</p>
<p><img src="http://amethystinformatics.co.uk/blog/wp-content/uploads/2015/05/ChristmasShopping1.png" alt="Trendy geek with festive online shopping" /></p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on Stolen Identity of an Informatician by molmod</title>
		<link>http://amethystinformatics.co.uk/blog/2015/05/30/stolen-identity-of-an-informatician/#comment-107</link>
		<dc:creator><![CDATA[molmod]]></dc:creator>
		<pubDate>Thu, 25 Jun 2015 09:06:32 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=169#comment-107</guid>
		<description><![CDATA[Hi there Amethyst.  Have you noticed the regular dips in the search frequency in all three graphs?  Perhaps there&#039;s an annual cull of cheminformatics folk.  You better watch out...
Or perhaps it&#039;s just Christmas time and everyone&#039;s busy opening their presents.]]></description>
		<content:encoded><![CDATA[<p>Hi there Amethyst.  Have you noticed the regular dips in the search frequency in all three graphs?  Perhaps there&#8217;s an annual cull of cheminformatics folk.  You better watch out&#8230;<br />
Or perhaps it&#8217;s just Christmas time and everyone&#8217;s busy opening their presents.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on Stolen Identity of an Informatician by alphag</title>
		<link>http://amethystinformatics.co.uk/blog/2015/05/30/stolen-identity-of-an-informatician/#comment-78</link>
		<dc:creator><![CDATA[alphag]]></dc:creator>
		<pubDate>Mon, 01 Jun 2015 06:13:03 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=169#comment-78</guid>
		<description><![CDATA[Labels come and labels go... Designer or otherwise. Data will always remain data; how one uses it will always change with time. Of more concern is the continual dumbing down of ideas and concepts to &#039;on-board&#039; the &quot;trendy brigade&quot;. Dinosaurs were trendy once! (and may be again, if patience is a virtue)]]></description>
		<content:encoded><![CDATA[<p>Labels come and labels go&#8230; Designer or otherwise. Data will always remain data; how one uses it will always change with time. Of more concern is the continual dumbing down of ideas and concepts to &#8216;on-board&#8217; the &#8220;trendy brigade&#8221;. Dinosaurs were trendy once! (and may be again, if patience is a virtue)</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on A Chemist&#8217;s Centenary Celebration of Pi by alphag</title>
		<link>http://amethystinformatics.co.uk/blog/2015/03/14/a-chemists-centenary-celebration-of-pi/#comment-35</link>
		<dc:creator><![CDATA[alphag]]></dc:creator>
		<pubDate>Mon, 16 Mar 2015 08:18:56 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=99#comment-35</guid>
		<description><![CDATA[My daily dose(s) of pi = the xanthine-derived systems, which include caffeine and theobromine (chocolate)!  Don&#039;t leave home without it.]]></description>
		<content:encoded><![CDATA[<p>My daily dose(s) of pi = the xanthine-derived systems, which include caffeine and theobromine (chocolate)!  Don&#8217;t leave home without it.</p>
]]></content:encoded>
	</item>
	<item>
		<title>Comment on A Chemist&#8217;s Centenary Celebration of Pi by Amethyst</title>
		<link>http://amethystinformatics.co.uk/blog/2015/03/14/a-chemists-centenary-celebration-of-pi/#comment-34</link>
		<dc:creator><![CDATA[Amethyst]]></dc:creator>
		<pubDate>Sat, 14 Mar 2015 14:55:14 +0000</pubDate>
		<guid isPermaLink="false">http://amethystinformatics.co.uk/blog/?p=99#comment-34</guid>
		<description><![CDATA[Thanks for the fascinating example of a pi-system.  It is amazing how many retinal shape changes must be happening with the cyclical repeating process of trans going back to cis.  I notice from your wiki link that photoreceptor cells are shaped as rods and cones, bringing us geometrically back to pi.

I am glad you like the picture. It was created by &lt;a href=&quot;http://www.linkedin.com/in/yvonnemacken&quot; target=&quot;_blank&quot; rel=&quot;nofollow&quot;&gt;Design Fusion&lt;/a&gt;.]]></description>
		<content:encoded><![CDATA[<p>Thanks for the fascinating example of a pi-system.  It is amazing how many retinal shape changes must be happening with the cyclical repeating process of trans going back to cis.  I notice from your wiki link that photoreceptor cells are shaped as rods and cones, bringing us geometrically back to pi.</p>
<p>I am glad you like the picture. It was created by <a href="http://www.linkedin.com/in/yvonnemacken" target="_blank" rel="nofollow">Design Fusion</a>.</p>
]]></content:encoded>
	</item>
</channel>
</rss>
