
Posts

Invention of the achromat: a bitter fight over priority

The 1750s were an exciting time on the technology scene of England. Marine navigation of the empire demanded better timekeeping and ever more precise celestial observations. The educated public was keenly interested in astronomy, partly from beliefs (still strong in our time) that celestial bodies determine our fate, partly from a genuine desire to understand the Universe. Fortunes were made in manufacturing watches, navigation instruments, and telescopes. Although refractive (lens-based) telescopes were already well known at that time, they suffered from chromatic aberration caused by light dispersion in glass: light of different wavelengths (colors) propagates at different speeds and thus gets focused at different distances from the lens, which creates colored fringes in the image. It was widely believed, owing to Sir Isaac Newton's work, that chromatic aberration could not be corrected with lenses. However, some scientists kept pursuing this lead, including Leonhard Euler, who
Recent posts

Cleaning optics: what your mother (probably) didn't tell you about the soap

In everyday life and in the lab, there is often a mundane but important problem: cleaning optics. It can be your glasses, microscope objectives, AR-coated lenses, or microscopy coverslips. There are many comprehensive online resources on the topic, such as the Newport tutorial and the Photonics review. In practice, I found that achieving a clean surface on glasses, lenses, and coverslips is really hard when using the recommended organic solvents. I tried chemically pure 99.98% ethanol, isopropanol, and acetone, with and without sonication, and gentle wiping with special lens tissue pads from Thorlabs as well as generic lens tissue. Almost always there is some residual dirt, smudges, or dust. Out of desperation, I tried ordinary soap from the lab dispenser: applied, gently wiped, rinsed with distilled water, dried with optical tissue. THIS THING DOES MAGIC. The cheapest and most effective way of cleaning optics ever. This became my only way of cleaning glasses, lenses with AR coatings, and coverslips. Hav

Shack-Hartmann sensor resolution - how much is good?

If you are new to adaptive optics (AO) like me, the selection of the right hardware can be daunting. Starting with the wavefront sensor: they range in price, resolution, and many options which are not obvious. By practical trial and error I learned something about resolution which wasn't obvious to me a year ago. The Shack-Hartmann wavefront sensor (WFS) is essentially a camera with a lenslet array instead of an objective. There are sensors with 15x15 lenslets, 30x30, and higher. Naively, you might think "the more the better" - we are digital-age kids used to getting high resolution for cheap. However, there is a catch. A high-resolution sensor with, say, 30x30 lenslets divides your photon count by 900 per spot. Roughly speaking, when you image a fluorescent bead (or another point source) with a camera with a "normal" lens (not a lenslet array), and your peak intensity is 2000, this makes a very nice, high-SNR bead image. However, if you switch to the Fourier (pupil) plane and image the wave
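As a rough illustration of this photon-budget argument, here is a back-of-the-envelope sketch in Python (the photon number is hypothetical, chosen only to show how the per-spot signal scales with lenslet count):

import numpy as np

# Hypothetical total photon count collected from a point source in one frame
total_photons = 1.8e5

for n in (15, 30):                      # lenslets per side
    per_spot = total_photons / n**2     # photons landing in each lenslet spot
    snr = np.sqrt(per_spot)             # shot-noise-limited SNR ~ sqrt(N)
    print(f"{n}x{n} lenslets: ~{per_spot:.0f} photons/spot, SNR ~ {snr:.0f}")

Going from 15x15 to 30x30 lenslets cuts the photons per spot by a factor of 4 and the shot-noise-limited SNR by a factor of 2, before any read noise is even considered.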

Control of sCMOS cameras with Python

Running an sCMOS camera using software with a GUI is nice, but at some point I want to control the camera from a Jupyter notebook, so that I can acquire images and analyse them on the same page of code. Unfortunately, many camera manufacturers don't provide a Python API (do any of them?). So the hard way would be to ferret out the camera's drivers (.dll files), figure out the names and arguments of the functions, and write a home-made Python wrapper for the DLL file using ctypes, like I did for the Thorlabs wavefront sensor. This is a way of tears and pain. Luckily, there is a silk road to camera control. The API for many cameras and other instruments is already implemented in MicroManager, which also has a Python wrapper, MMCorePy, around it! So, after a quick and easy installation of MMCorePy, the Python code becomes simple and clean, with all the heavy lifting done by the MicroManager API running under the hood:
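The original snippet is cut off in this excerpt; below is a minimal sketch of what such MMCorePy code looks like, using Micro-Manager's demo camera adapter as a stand-in (swap in your own camera's adapter and device names):

import MMCorePy  # ships with Micro-Manager; add its folder to the Python path

mmc = MMCorePy.CMMCore()
# Demo camera for illustration; replace with your sCMOS adapter/device names
mmc.loadDevice('Camera', 'DemoCamera', 'DCam')
mmc.initializeAllDevices()
mmc.setCameraDevice('Camera')

mmc.setExposure(50)      # exposure time in ms
mmc.snapImage()          # acquire one frame
img = mmc.getImage()     # numpy array, ready for analysis in the same notebook
print(img.shape, img.mean())

mmc.reset()              # unload devices and release the camera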

Programming NI DAQmx board in Python: easier than you think!

For my DIY microscope I had a task: generate a train of digital pulses that simulates the camera trigger, so that other devices (galvo and laser) are synced. I wanted to do it in Python, so that it seamlessly integrates into my data acquisition and analysis Jupyter notebook. After a quick search I found the PyDAQmx library, which seemed mature and had good examples to begin with. Installation was smooth: download, unzip, open the Anaconda prompt, python setup.py install. After only 30 min of fiddling, I was able to solve my problem in just a few lines of code: Holy crap, it just works, out of the box. The oscilloscope shows nice digital pulses every 100 ms, each 1 ms long. The code is much shorter and cleaner than it would be in C, C#, or LabVIEW. PyDAQmx appears to be a full-power wrapper around the native NI DAQmx drivers (yes, they need to be installed), so presumably it can do all that can be done in C or even LabVIEW (this statement needs to be tested). One can use PyDAQmx to control ga
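The actual snippet did not survive in this excerpt; the sketch below shows one way to produce such a pulse train with PyDAQmx, assuming a counter output on "Dev1/ctr0" (the device and counter names here are hypothetical, check yours in NI MAX):

import time
from PyDAQmx import *   # Task class plus the DAQmx_Val_* constants

task = Task()
# 100 ms period: 99 ms low + 1 ms high, starting from the low (idle) state
task.CreateCOPulseChanTime("Dev1/ctr0", "", DAQmx_Val_Seconds, DAQmx_Val_Low,
                           0.0,     # initial delay, s
                           0.099,   # low time, s
                           0.001)   # high time, s (the 1 ms trigger pulse)
task.CfgImplicitTiming(DAQmx_Val_ContSamps, 1000)  # run continuously
task.StartTask()
time.sleep(5)            # emit pulses for 5 s while the other devices listen
task.StopTask()
task.ClearTask()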

Programming of DIY microscopes: MicroManager vs LabVIEW

In the flourishing field of DIY light microscopy, the choice of programming language to control the microscope is critically important. Modern microscopes are becoming increasingly intelligent. They orchestrate multiple devices (lasers, cameras, shutters, Pockels cells) with ever-increasing temporal precision, collect data semi-automatically following user-defined scenarios, and adjust focus and illumination to follow the motion (or development) of a living organism. So, the programming language must seamlessly communicate with hardware, allow devices to be easily added or removed, have rich libraries for device drivers and image processing, and allow coding of good-looking and smooth GUIs for end users. This is a long list of requirements! So, what are the options for DIY microscope programming? There are currently two large schools of microscope programming: LabViewers and MicroManagers. (Update: MATLAB for microscope control also has a strong community, comparable to la

3D modeling in a lab

About once a week I am asked by my colleagues which 3D modeling software I am using - usually when I am staring at a new part being 3D printed. I am using Autodesk Inventor for a few reasons:
- it is professional software for engineers and has a huge community around it
- it provides a free academic license
- there are thousands of YouTube videos with detailed tutorials by enthusiasts
- it is easy to learn at a basic level, but there is always a lot of room for growth
In a lab, there are two main workflows where Inventor is necessary: 3D modeling of complex assemblies (like a custom-built microscope) and 3D printing. There are many YouTube tutorials for beginners, so here I only review some of the things that Inventor can do, without any specific instructions.

3D modeling of parts and assemblies

Before building a new microscope, you can create its virtual model and check dimensions, required adapters, and whether things will fit together. Luckily, Thorlabs has 3D model of nearly all it