{"id":232,"date":"2021-03-12T14:26:48","date_gmt":"2021-03-12T14:26:48","guid":{"rendered":"https:\/\/science-chatter.blogs.uni-hamburg.de\/?p=232"},"modified":"2022-08-23T09:48:08","modified_gmt":"2022-08-23T09:48:08","slug":"the-phase-problem-of-my-life-as-a-phd-student","status":"publish","type":"post","link":"https:\/\/science-chatter.blogs.uni-hamburg.de\/?p=232","title":{"rendered":"The &#8220;phase&#8221; problem of my life as a PhD student"},"content":{"rendered":"\n<p>There is a famous legend that everyone hears somewhere &#8211; <\/p>\n\n\n\n<p>Isaac Newton was sitting in his garden one morning and watched an apple fall from the tree. He thought to himself &#8211; <em>Why do apples always fall down and not up? Is it because the earth is attracting the apple towards itself or vice versa?<\/em><\/p>\n\n\n\n<!--more-->\n\n\n\n<p>The story goes that this series of events eventually led him to write down a mathematical equation for the force of gravity on an object of non-zero mass and study the motion of planets around the sun.<\/p>\n\n\n\n<p class=\"has-medium-font-size\"><strong>To learn anything about an object, you have to \u201cwatch\u201d it- or rather- \u201cobserve\u201d it. <\/strong><\/p>\n\n\n\n<p>But how do we \u201cobserve\u201d things which are too small for our naked eyes to see?<\/p>\n\n\n\n<p>In our Biology class in school, all of us are taught about the \u201cMicroscope\u201d which is used to look at cells that make up our bodies. White (optical) light from the sun is incident on our \u201cobject\u201d \u2013 i.e. the cells on the slide of the microscope. The microscope has an arrangement of mirrors and lenses with different focal lengths carefully positioned so that they produce a \u201cmagnified image\u201d of the cells for our eyes to see.<br>Similarly,  we have several \u201cMicroscopy\u201d techniques to look at individual atoms of a material and how they behave in the presence or absence of internal or external forces.  
In my case, it is called <a href=\"https:\/\/link.springer.com\/content\/pdf\/10.1023\/A:1012607212694.pdf\">Nuclear Resonant Scattering (NRS)<\/a>. The setup does not look like a desktop microscope in a Biology lab that works on optical light. This is because to observe something small in size, <a href=\"https:\/\/physics.stackexchange.com\/questions\/511003\/why-do-we-need-short-wavelength-to-measure-the-position-of-a-particle-accurately\">you need to illuminate it with light waves which have a small enough wavelength (or equivalently, a high enough energy)<\/a>. Atoms are smaller than cells, so to observe them we need ultra-energetic light waves with wavelengths smaller than those of &#8220;normal&#8221; light. For our NRS technique, we need special X-ray light that is emitted by ultra-fast electrons looping around a 2.3-kilometer-long ring at the <a href=\"https:\/\/www.desy.de\/research\/facilities__projects\/petra_iii\/index_eng.html\">DESY synchrotron facility<\/a> in Hamburg.<\/p>\n\n\n\n<p><br>The current focus of my project is to tackle the \u201cphase problem\u201d in this NRS technique, in the hope of making it as useful and versatile as possible for studying materials.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-red-color\">The phase problem<\/mark><\/h2>\n\n\n\n<p>In the language of image processing, it is a problem of object-field reconstruction using the image-field intensity. Imagine: you take a picture of an object, but your \u201ccamera\u201d is faulty. The camera only captures the brightness or \u201cintensity profile\u201d of the image and not its features or \u201cphase profile\u201d. 
How can you retrieve the complete information about the object using such a camera?<\/p>\n\n\n\n<figure class=\"wp-block-image size-large is-style-default\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"565\" src=\"https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-1024x565.png\" alt=\"\" class=\"wp-image-614\" srcset=\"https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-1024x565.png 1024w, https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-300x166.png 300w, https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-768x424.png 768w, https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-1536x848.png 1536w, https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/Phaseproblem-1-2048x1130.png 2048w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><figcaption>Phase problem in the faulty camera.<\/figcaption><\/figure>\n\n\n\n<p>In our case, NRS is the &#8220;faulty&#8221; camera and we want to retrieve the phase profile from its spectral image data in order to reconstruct the object. Our object is, for example, a film of ferromagnetic material &#8211; containing special Iron (<a href=\"http:\/\/web.mit.edu\/dvp\/www\/Work\/8.14\/dvp-mossbauer-paper.pdf\">Fe<sup>57<\/sup><\/a>) nuclei, which are capable of exhibiting <a href=\"https:\/\/link.springer.com\/content\/pdf\/10.1007%2F978-3-540-44699-6_4.pdf\">nuclear resonance<\/a>. 
But a difference from the camera analogy is that our &#8220;image&#8221; in this case is not two-dimensional but rather a one-dimensional dataset.<\/p>\n\n\n\n<p>For the more Mathematics-oriented folks out there, the phase problem in NRS can be theoretically framed as a non-linear <a href=\"https:\/\/en.wikipedia.org\/wiki\/Inverse_problem\">inverse problem<\/a>.<\/p>\n\n\n\n<p>Let us assume that our film with resonant nuclei is just a homogeneous medium with some refractive index <strong>n(\u03c9)<\/strong> and thickness <strong>D<\/strong>. Radiation that illuminates and passes through the sample has an initial field <strong>E<sub>0<\/sub>(\u03c9)<\/strong> with wave vector <strong>k<sub>0<\/sub> = \u03c9\/c<\/strong>, and is scattered by the object.<\/p>\n\n\n\n<p>The object field in energy <strong>(\u03c9)<\/strong> space is thus a complex function<\/p>\n\n\n\n<p><strong>E(\u03c9) = |E<sub>0<\/sub>(\u03c9)|e<sup>ik<sub>0<\/sub>Dn(\u03c9)<\/sup> = |E(\u03c9)|e<sup>i\u03b7(\u03c9)<\/sup><\/strong><\/p>\n\n\n\n<p>with an amplitude part <strong>|E(\u03c9)|<\/strong> and a phase part <strong>\u03b7(\u03c9)<\/strong>.<\/p>\n\n\n\n<p>Let <strong>E(t)<\/strong> be its &#8216;image&#8217; in time <strong>(t)<\/strong> space, which is related to our object field via a <a href=\"https:\/\/homepages.inf.ed.ac.uk\/rbf\/HIPR2\/fourier.htm\">Fourier transform<\/a>:<br><strong>E(t) = F<sup>\u22121<\/sup>{E(\u03c9)}<\/strong>, or equivalently, <strong>E(\u03c9) = F{E(t)}<\/strong>.<\/p>\n\n\n\n<p>In general, the image field is also complex and has an amplitude part <strong>|E(t)|<\/strong> and a<br>phase part <strong>\u03c6(t)<\/strong>: <strong>E(t) = |E(t)|e<sup>i\u03c6(t)<\/sup><\/strong><\/p>\n\n\n\n<p>The time-based detector in our NRS experiment setup can only measure the intensity spectrum in time, i.e. 
<strong>I(t)<\/strong>, where <strong>I(t) = |E(t)|<sup>2<\/sup><\/strong>.<\/p>\n\n\n\n<figure class=\"wp-block-image size-large\"><img loading=\"lazy\" decoding=\"async\" width=\"657\" height=\"127\" src=\"https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/eq2.png\" alt=\"The following text is outlined in a red and black box - &quot;How can we extract the complex object field from the measurement of its image Intensity?&quot;\" class=\"wp-image-358\" srcset=\"https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/eq2.png 657w, https:\/\/science-chatter.blogs.uni-hamburg.de\/wp-content\/uploads\/2021\/03\/eq2-300x58.png 300w\" sizes=\"auto, (max-width: 657px) 100vw, 657px\" \/><figcaption>Note: The problem of extracting E(\u03c9) is equivalent to extracting the<br>phase \u03c6(t) or \u03b7(\u03c9).<\/figcaption><\/figure>\n\n\n\n<h2 class=\"wp-block-heading\"><mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-vivid-red-color\">Some problems have solutions&#8230; and this one?<\/mark><\/h2>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\"><p><strong>There are no solutions. There are only trade-offs.<\/strong><\/p><cite><strong>&nbsp;<\/strong><a href=\"https:\/\/best-quotations.com\/authquotes.php?auth=2172\">Thomas Sowell, &nbsp;b. 1930<\/a><\/cite><\/blockquote>\n\n\n\n<p><a href=\"https:\/\/en.wikipedia.org\/wiki\/Phase_retrieval\">Algorithms to <em>approximately<\/em> retrieve the phase<\/a> are important not only to NRS, but to all types of X-ray spectroscopy. 
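The reason such algorithms are needed at all is that the measured intensity discards the phase. A tiny NumPy sketch (with a made-up amplitude and phases standing in for the real fields) shows two image fields that share the same measurable I(t) yet Fourier-transform to completely different object fields E(&omega;):

```python
import numpy as np

N = 8
t = np.arange(N)
amp = np.exp(-0.2 * t)            # a shared, made-up amplitude |E(t)|

# Two image fields with identical intensity I(t) = |E(t)|^2,
# differing only in the unmeasured phase phi(t):
E1 = amp * np.exp(1j * 0.0 * t)   # flat phase
E2 = amp * np.exp(1j * 0.7 * t)   # linear phase ramp

same_intensity = np.allclose(np.abs(E1) ** 2, np.abs(E2) ** 2)
same_object = np.allclose(np.fft.fft(E1), np.fft.fft(E2))   # E(w) = F{E(t)}

print(same_intensity, same_object)   # True False: same image, different objects
```

The detector cannot distinguish E1 from E2, even though they correspond to different objects, which is exactly the phase problem in one picture.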
This is mostly because, unlike in white-light optics experiments, implementing <a href=\"https:\/\/en.wikipedia.org\/wiki\/Interferometric_microscopy\">beam-splitting interferometers<\/a> for the short X-ray wavelengths is difficult, and one has to take extreme care of the thermal and mechanical stability of the instruments.<\/p>\n\n\n\n<p>However, no matter what numerical, analytical, or machine-learning algorithm we may use, we cannot magically approximate <strong>E(\u03c9)<\/strong> from the <strong>I(t)<\/strong> measurement data alone.<br>It is not sufficient in itself!<br>All phase-retrieval algorithms demand that we have at least \u2018some\u2019 information about our object field in order to extract it from just the amplitude of the image field. Sometimes this means designing experiments with another detector to <a href=\"https:\/\/en.wikipedia.org\/wiki\/Gerchberg%E2%80%93Saxton_algorithm\">take an extra intensity measurement in the object field space, <strong>I(\u03c9) = |E(\u03c9)|<sup>2<\/sup><\/strong><\/a>. Most modern approaches, however, find good-enough workarounds with just the image intensity measurement data and some internal or external constraints on <strong>E(\u03c9)<\/strong>. The retrieved phase is only as accurate as these constraints are strong relative to the complexity of the object function. For example, in some image reconstruction problems, the knowledge that your object is real and <a href=\"https:\/\/mathworld.wolfram.com\/CompactSupport.html#:~:text=A%20function%20has%20compact%20support,function%20does%20have%20compact%20support.\">compactly supported<\/a> &#8211; i.e. it vanishes outside its physical dimensions &#8211; is enough. 
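To give a flavour of how such algorithms work, here is a minimal sketch of the Gerchberg&#8211;Saxton idea linked above. It assumes, hypothetically, that both intensities I(t) and I(&omega;) have been measured for a synthetic test field (E_obj below is made-up toy data, not a real nuclear spectrum, and this is my illustration rather than actual NRS analysis code):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 256
x = np.linspace(-3, 3, N)

# Synthetic "object field" E(w): smooth amplitude, nontrivial phase.
E_obj = np.exp(-x**2) * np.exp(1j * 1.5 * np.sin(3 * x))

# The two measurable amplitudes, i.e. square roots of the two intensities:
amp_w = np.abs(E_obj)                # sqrt(I(w)), object space
amp_t = np.abs(np.fft.ifft(E_obj))   # sqrt(I(t)), image space

def mismatch(E_w):
    """Relative error between predicted and measured |E(t)|."""
    return np.linalg.norm(np.abs(np.fft.ifft(E_w)) - amp_t) / np.linalg.norm(amp_t)

# Start from the measured object amplitude with a random phase guess.
E_w = amp_w * np.exp(1j * rng.uniform(0, 2 * np.pi, N))
err0 = mismatch(E_w)

# Gerchberg-Saxton: hop between the two spaces via the Fourier transform,
# each time keeping the current phase estimate but resetting the
# amplitude to the measured one.
for _ in range(200):
    E_t = np.fft.ifft(E_w)
    E_t = amp_t * np.exp(1j * np.angle(E_t))        # impose image amplitude
    E_w = np.fft.fft(E_t)
    E_w = amp_w * np.exp(1j * np.angle(E_w))        # impose object amplitude

err = mismatch(E_w)
print(err0, err)   # the mismatch shrinks as the phase estimate is refined
```

In the real experiment only I(t) is available, and the constraints come from physical models of the nuclear response instead of a second detector, so the actual reconstruction is considerably more involved than this toy loop.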
<\/p>\n\n\n\n<p>In recent years, <a href=\"https:\/\/www.youtube.com\/watch?v=yVzk7bbQOA8\">holography<\/a>-inspired phase-retrieval techniques like <a href=\"https:\/\/arxiv.org\/ftp\/arxiv\/papers\/1606\/1606.09622.pdf\">Ptychography<\/a> <a href=\"https:\/\/www.desy.de\/news\/news_search\/index_eng.html?openDirectAnchor=2010&amp;two_columns=0&amp;printversion=1\">have taken over the field of X-ray diffraction<\/a>. In the camera analogy, this means you take several pictures of your object at different, overlapping angles with your faulty camera. If you have enough such images, the redundant data in them makes it possible to chalk out an approximate phase profile of your object.<\/p>\n\n\n\n<p>Spoiler alert! This is where I come in with my project. My hope is to extend this concept to NRS and see what comes out of it. Extracting the phase profile from the one-dimensional NRS data might eventually prove easier or harder than expected. Let&#8217;s see if in my case I eventually get to say &#8211; <em>At last! Ptychography saves the day!<\/em> However, between planning the experiment and designing the Ptychography engine for it, there is still a long way to go.<\/p>\n\n\n\n<p>Till then, I have 99 problems and looking for &#8220;phase&#8221; is all of them.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>There is a famous legend that everyone hears somewhere &#8211; Isaac Newton was sitting in his garden one morning and watched an apple fall from the tree. 
He thought to himself &#8211; Why do apples&#46;&#46;&#46;<\/p>\n","protected":false},"author":13,"featured_media":497,"comment_status":"open","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[65,17],"tags":[10,14,11,12,15,13,19,18],"class_list":["post-232","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-blog","category-photonscience","tag-inverse-problems","tag-mathematical-imaging","tag-nuclear-resonance-scattering","tag-phase-retrieval","tag-phd-life","tag-ptychography","tag-spectroscopy","tag-synchrotron-radiation"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/posts\/232","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=232"}],"version-history":[{"count":50,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/posts\/232\/revisions"}],"predecessor-version":[{"id":1646,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/posts\/232\/revisions\/1646"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=\/wp\/v2\/media\/497"}],"wp:attachment":[{"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=232"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=%2Fwp%2Fv2%
2Fcategories&post=232"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/science-chatter.blogs.uni-hamburg.de\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=232"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}