Thursday 08/21/14

10:00 am – 10:00 pm

Multicolor analysis


  • This code is still not very robust: a few mismatched outliers can skew the warp map and degrade its quality.
  • It also struggles when beads in a single field are very dense and when the quadview is badly misaligned.
  • CS always realigns the quadview before using the microscope, and normally it is then aligned to within a pixel, even if it was off by tens of pixels before. This time it was already aligned within a pixel, yet when I got back to the scope it was evidently off by tens of pixels! My code should be able to handle this just fine, with some substantial reduction to the usable field of view (a 30-pixel misalignment means 60 of 256 pixels lost).
  • Proposed fix: let's do two warps and throw out all the outliers before the second warp. There are very clearly two populations: one that has matched quite well, and one that is off by anything from tens of nm to thousands of nm.
  • Or maybe better, just rematch references and data after the first warp with a more aggressive cutoff; I believe this was the original approach anyway.
  • Also, 3D correction via 3D bead fitting is out, so let's remove it from the ChromeWarps code. This is better corrected by adjusting the imaging-plane offset between the movies, so all the data is already maximally in focus and the zero points of the z-fits are already aligned to match. Differences in the zero-point z-positions can be measured with z-calibration files.
    • Currently I determine this by eye to be about an 80 nm offset, which could be off by +/- 40 nm.
    • I should probably try this with the same set of beads and fit both movies.
    • The z-calibration code should also return the offset from the starting point; I'm not sure whether I save this as an output right now or if the graphs get re-centered.
  • Final solution:
    • The MatchFeducials autocorrelation approach with upsampled images can sometimes run into trouble (not at all clear why; possibly big shifts move outside the default region size for corrMmini). We totally don't need sub-pixel registration for this initial step, which just gets the x,y shift, so I changed the default to 1 pixel (no up- or down-sampling).
    • There was also a bug in finding unique nearest-neighbor matches between molecule sets, which had been hacked around with a final distance cutoff to reject the occasional weird errors. This has now been fixed.
    • If using autocorrelation, the cutoff distances should be unnecessary. As long as enough nearest neighbors are unique, everything is fine; it doesn't matter how many total neighbors are inside the search window.
    • Only parameters.AffineRadius has any function now; parameters.matchRadius is not used.
  • updated solution pushed to release (ZhuangLab)
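The unique-matching fix and the two-pass idea above (match, reject outliers, rematch) can be sketched roughly like this. This is a pure-Python toy, not the actual ChromeWarps/MatchFeducials code; the point is that requiring mutual nearest neighbors makes matches unique without an ad hoc final distance hack:

```python
import math

def mutual_nearest_neighbors(ref, dat, radius):
    """Return index pairs (i, j) where ref[i] and dat[j] are each other's
    nearest neighbor within `radius`. Uniqueness comes from mutuality,
    not from a final distance cutoff."""
    def nearest(p, pts):
        best, best_d = None, float("inf")
        for k, q in enumerate(pts):
            d = math.dist(p, q)
            if d < best_d:
                best, best_d = k, d
        return best, best_d

    pairs = []
    for i, p in enumerate(ref):
        j, d = nearest(p, dat)
        if j is None or d > radius:
            continue  # nothing close enough -> drop (first outlier pass)
        i_back, _ = nearest(dat[j], ref)
        if i_back == i:  # mutual match -> keep for the second warp
            pairs.append((i, j))
    return pairs

# toy example: dat is ref shifted by 0.5 px in x, plus one stray outlier
ref = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dat = [(0.5, 0.0), (10.5, 0.0), (0.5, 10.0), (50.0, 50.0)]
print(mutual_nearest_neighbors(ref, dat, radius=2.0))
# -> [(0, 0), (1, 1), (2, 2)]; the stray bead at (50, 50) is never matched
```

In the real two-pass scheme, the pairs surviving the first warp would be rematched with a tighter radius before fitting the second warp.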

Data analysis

Chromatin simulations

  • Goals:
    • initialize simulations with unknotted polymers
    • validate that unknotted polymers are unknotted
    • get openMM running on Odyssey
  • Python on Odyssey

    • see new documentation on Anaconda
    • to switch to my python environment where I have installed the dependent packages for polymerutils:

    module load python
    source activate PolymerEnv

Now launch python. Then add openmmPolymer to the path and import polymerutils:

    import sys
    sys.path.append("/n/home05/boettiger/OpenMM/openmmPolymer")
    import polymerutils
    import knotAnalysis
    data = polymerutils.load("/n/home05/boettiger/OpenMM/Data/myPoly0082")
    print data
    num = knotAnalysis.analyzeKnot(data)
    print num

Local coding:

  • The output of polymerutils.grow_rw cannot be inserted straight into the simulation.
  • I hacked around this by converting to a list of lists and multiplying that by unit.nanometer (some weird class of its own from simtk). This object can successfully be passed to and read back in with Simulation.load (after which it counts members and populates Simulation.N). So after loading this object as an initial polymer structure, the simulation runs without errors.
  • This approach is implemented in
  • I tested the initial structure and one of the later timesteps from this simulation with 82 monomers. Both return num = 1 under knotAnalysis.analyzeKnot run on Odyssey (requires the Linux platform). I was expecting 0 for unknotted, but apparently this code returns 1.
    • This could use some further validation; visual inspection at least makes it look believably unknotted.
    • This is at least much better than my previous 200-node random-walk polymer, which returned a non-integer value larger than the number of nodes and had what appeared to be near-certain knots by visual inspection / rotation of the 3D polymer.
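The list-of-lists conversion hack can be sketched as below. The simtk lines are left as comments since they require OpenMM to be installed, and the exact shape of polymerutils.grow_rw's output is assumed here to be an N x 3 set of coordinates:

```python
def to_list_of_lists(coords):
    """Convert N x 3 coordinates (e.g. grow_rw output) into a plain list
    of [x, y, z] float lists, suitable for multiplying by a simtk unit."""
    return [list(map(float, p)) for p in coords]

coords = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (1, 1, 1)]
positions = to_list_of_lists(coords)
print(positions[0])  # -> [0.0, 0.0, 0.0]

# In the actual pipeline (requires simtk / OpenMM):
#   from simtk import unit
#   positions_nm = positions * unit.nanometer  # Quantity in nanometers
#   sim.load(positions_nm)  # counts members and populates Simulation.N
```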

More Odyssey coding

  • running code failed at import simtk.unit as unit
  • attempted to pip install simtk in conda. Failed.
  • running on GPU nodes should look something like this: srun -p resonance --pty --mem 500 /bin/bash
    • I've replaced 'interact' in the example (one of the 'partitions' on Odyssey) with 'resonance', which I've been told is a GPU partition.

Some results

  • It seems the unknotted, small-box, high-compression starts begin with a decent appearance of fractal folding, but even with GrosbergRepulsiveForce and no truncation, we appear to see crossings later on, both by visual inspection and by running knotAnalysis. Early stages show pronounced color segregation; later stages show more mixing and high (troublingly non-integer) knot numbers.

[Snapshots: UnknottedStart, UnknottedStart2, Unknotted2000merConstrained, Reknotted2000merConstrained, FractalFoldingQ, FractalFolding2Q]

Clearly not simulating for long enough yet to estimate contact frequency (could do more reinitializations / rounds).


Posted in Summaries | Comments Off

Wednesday 08/20/14

9:30 am – 9:15 pm


  • working on / revising BW proposal
  • two more rounds of comments from Carl
  • comments from Jeff.

STORM Analysis

Serious problems introduced by microscope A

  • recent bead data sets collected on the new version of the software are NOT renormalized to maintain full dynamic range and constant contrast! Need to fix this in
  • Dave destroyed my conv_bk backup movies and did crazy unexpected stuff with power files and laser levels (blasted my sample with UV). This unanticipated behavior is highly destructive to my sample and data quality.
  • The unexpected behavior in Dave is due to the entirely different way Dave now parses shutter files. This needs to be fixed before imaging more.
  • Running chromatic bead analysis. The quadview is horribly misaligned: it requires a match radius of greater than 30 pixels to match beads.


  • interesting article on transcription vs translation as determinant of protein levels. Also a good example of reusing public data sets.

Cover image


(Good article by Thomas, Hernan and Shawn as well).

Posted in Summaries | Comments Off

Protected: project 2: comparison data

This content is password protected. To view it please enter your password below:

Posted in Genomics | Comments Off

Tuesday 08/19/14

10:00 am – 11:00 pm


  • final comments back and forth on manuscript revisions and figures for Ph project
  • Ph paper sent to supporting middle authors for comments and revisions
  • working on BW proposal
  • got comments back from Carl on BW proposal.
  • Rewriting proposal from scratch, trying to improve clarity, focus, and impact.
  • completed first pass through new draft. Needs read-through and conclusion.
Posted in Summaries | Comments Off

Monday 08/18/14

10:00 am – 10:00 pm


  • revising manuscript and supplemental material for Ph Project
  • working on BW application


  • imaging L4E17. 2:00pm-10:00pm
  • imaging L4E17to19 (10:00pm – 10:00am)
Posted in Summaries | Comments Off

Sunday 08/17/14

10:15 am – 11:15 pm


  • working on essay for Burroughs Wellcome application.

Chromatin Imaging

  • Testing IR shutter for 750 imaging
    • wrote a new communications/connections file for the Toptica laser, saved to the desktop for quick use (the previous file copied from the old storm2 computer doesn't work; the com port changed).
  • taking bead data
  • setting up to image new samples: L4E02-647 + E01-750
    • 647 worked much better this time — bright clear spots obvious in every cell.
    • 750 spots look pretty dim
    • originally made with pure BME buffer, spots were dim. Remade with MEA buffer, maybe a touch brighter but not strongly so. Remade with brand new MEA buffer, still not a dramatic change to 750 brightness.
    • The original region selected went outside the buffer reservoir, over the edge of the PDMS. Spots are equally bright here (they were still bathed in buffer) but switching is poor (no buffer turn-over).


  • working to get KnotAnalysis running on Odyssey (requires linux)
  • see emails from today’s date from RC computing with directions for running anaconda python and installing local python packages
  • Adding my directories to systems path once in python: import sys, then sys.path.append("/n/home05/boettiger/OpenMM/openmmPolymer")
  • KnotAnalysis seems to execute fine, though the number doesn't quite make sense (200 monomers "simplified" down to 32, but with a knot number (crossing number) of 427.3 or something like that). Why isn't this an integer, and how can it be substantially larger than the number of simplified monomers (or total monomers)?
Posted in Summaries | Comments Off

Saturday 08/16/14

5:30 pm – 12:10 am

Data analysis

  • Analyzing recent multi-color chromatin data with ChromatinCropper.fig
  • not dramatically different overlap/exclusion between yellow-yellow and yellow-black
  • yellow-black is however a ‘closer’ boundary (at least in the black-to-yellow direction) since the black domain is more compact
  • should also do some modeling on this to better inform intuition.

Some snapshots from today’s analysis




  • Interesting review article by Pirrotta on new papers arguing PRC1 then PRC2

Maintaining the living things

  • passage cells
  • flip fly stocks
Posted in Fly Work, Summaries | Comments Off

Friday 08/15/14

10:00 am – 11:30 pm


  • imaging failed last night
    • should have been more alarmed when the test image didn't save
    • parameters defaulted to the .647 format, which didn't exist, resulting in a save error
    • computer auto-rebooted last night anyway
  • Updating software
    • updated imagewriters, branched off the new/stable storm2. Now have .647 and .750 export options again.
    • tested update, pushed to my storm-control git branch as storm2_140815 branch
    • could not reproduce the jumpy behavior of Steve for Hazen this morning; hopefully it's done.
    • updated Windows on the new computer to not allow automatic restart while users are logged in.
    • Dave issue not fixed (not so bad if the first image is conventional). HB attributes this bug to JM and promises yet another version of the Dave software is coming soon anyway (hopefully that one is better?).
  • setting up STORM imaging of chromatin-L4E1 (F10, first 100 kb)
    • spots much brighter than sample 3, which was supposed to have E1-E3 (the first 300 kb of F10).
    • Not really as bright as I'd expect, though, for a 100 kb black region. Still not so sure the plate-based method gives as good probes.
    • we’ll see how things turn out in the proper analysis.

ChromatinCropper update

  • relabel 'channel1' and 'channel2': the code is already hardcoded to grab based on '647' and '750', so this 'channel 1' stuff is just confusing.

Ph Project

  • checking images
  • checking IQR values
  • clarify which panels are which in Fig S1 (wouldn't have been an issue if we had labeled the figures).

project 2

  • validate and order primers
  • seqs to Jeff to make secondaries for screen
Posted in Summaries | Comments Off

Thursday 08/14/14

11:00 am – 11:15 pm

Project 2


  • STORM2 computer swapped to run Hamamatsu camera
  • Hal, Steve, and Dave (the control software) all changed dramatically
  • the previously optimized storm2 branch (NewDaxWriter) fetched from the other computer won't run
  • tried merging — still won’t run, can’t record movies
  • tried loading fluidics branch, also won’t run / save files
  • recording old-fashioned style
  • chrL4 E1 looks decent. Taking O/N STORM data.

PH project

  • some revision correspondence
  • offered equal contribution status.
  • got new cells from Ajaz. Still need secondary antibodies (and blocking).
Posted in Summaries | Comments Off

Protected: project 2: updates 08/14/14

This content is password protected. To view it please enter your password below:

Posted in Genomics | Comments Off