John Hoffer

hof@alumni.harvard.edu ∙ LinkedIn

0.1 Collaboration
0.2 Documentation
0.3 Interactive Editing
0.4 Rendering

  • Manual in 3ds Max
  • Automated in Blender (source code and configuration)

1 Education

1.1 Harvard College
  • BA: 2016 — GPA: 3.51
  • Concentration: Neurobiology
  • Secondary: Computer Science

2 Skills

2.1 Programming

Use on daily basis:

  • GLSL ∙ Python ∙ Slurm
  • JavaScript ( D3 ∙ Node ∙ TypeScript )

Use in past projects:

  • C++ ∙ PHP ∙ SQL ∙ MATLAB ∙ LaTeX

Learning:

  • Lua ∙ Haskell ∙ Wolfram
  • CUDA ∙ .NET ∙ C#

Daily Workflow:

  • Bash ∙ Tmux ∙ Vim ∙ RegEx

2.2 Design

Current Projects:

  • Blender (Python API) ∙ X3D ∙ CSS

Frequent Usage:

  • 3ds Max ∙ Inkscape ∙ Gimp

3 Coursework

3.1 Computer Science
  • Rendering and Image Processing
  • Dynamic & Stochastic Processes
  • Computer Graphics
  • Visualization
3.2 Life Sciences
  • Computational Neuroscience
  • Principles of Neuroengineering
  • Computational Cognitive Neuro.
  • Cellular Basis of Neural Function
  • Drug Discovery and Development

4 Experience

HARVARD SEAS — Fellow

February 2016 — present — Visual Computing Group, Cambridge, MA

  • Built a pipeline to render ray-tracings from CNN image reconstructions
  • Wrote a web server to handle terabytes of image data efficiently in real-time
  • Contributed to 5 open source projects
  • Developed several UIs to give the research community real-time collaborative access to neural reconstructions
  • Negotiated deliverable APIs for a multi-million dollar grant

WYSS INSTITUTE AT HARVARD — Microfabrication Intern

February—August 2015 — Human Organs-on-Chips, Boston, MA

  • Designed components for development of novel microfluidic cell culture assays
  • Developed and tested improved microscale fabrication procedures

MASSACHUSETTS GENERAL HOSPITAL — Research Intern

June—August 2013 — Psychiatric Genetics Unit, Boston, MA

  • Prepared DNA to correlate cognitive traits with single DNA base pairs
  • Identified possible genes for future study through a literature review

5 Key Open Source Projects

BUTTERFLY IMAGE SERVER January 2018 — Harvard VCG — GitHub Link

  • Primary developer of terabyte-scale image server in daily use for automated and interactive evaluation of CNN image reconstructions.

OPENSEADRAGON GL January 2017 — OpenSeadragon — GitHub Link

  • Enabled real-time parallel image processing on large-scale images in browser.

6 Journal Publications

SCALABLE INTERACTIVE VISUALIZATION FOR CONNECTOMICS August 2017 — Informatics — PDF Link

  • Designed and analyzed experiments on data transfer from network file systems
  • Documented the design and implementation of our servers and interfaces

Neural Data Access

Channel Metadata

  • ARIADNE-NDA Layer
    /api/anatomy/channel_metadata?experiment=root&sample=test&dataset=iarpa2016_12&channel=img
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/channel_metadata?experiment=root&sample=test&dataset=iarpa2016_12&channel=img
    
  • Response
    {
    "path": "/n/coxfs01/thejohnhoffer/2016/iarpa2016_12/img",
    "data-type": "uint8",
    "dimensions": {
        "y": 2000,
        "x": 2000,
        "z": 1774
    },
    "name": "img"
    }
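
The metadata call above can be scripted. A minimal Python sketch, assuming the live host is reachable over HTTPS; the helper name and the use of `requests` are mine, not part of the API:

```python
# Build the documented /api/channel_metadata query for the live server.
from urllib.parse import urlencode

HOST = "https://butterfly.rc.fas.harvard.edu"

def channel_metadata_url(experiment, sample, dataset, channel):
    """Return the channel_metadata URL in the documented parameter order."""
    query = urlencode({
        "experiment": experiment,
        "sample": sample,
        "dataset": dataset,
        "channel": channel,
    })
    return "%s/api/channel_metadata?%s" % (HOST, query)

# With network access, the JSON response decodes directly:
# import requests
# meta = requests.get(channel_metadata_url("root", "test", "iarpa2016_12", "img")).json()
# meta["dimensions"] then holds the y, x, z extents shown in the response above.
```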
    

Data

  • ARIADNE-NDA Layer
    /api/anatomy/data?experiment=root&sample=test&dataset=iarpa2016_12&x=0&y=0&z=0&width=512&height=512&channel=img
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/data?experiment=root&sample=test&dataset=iarpa2016_12&x=0&y=0&z=0&width=512&height=512&channel=img
    
  • Response Examples
    &channel=img → Raw 8-bit PNG (live)
    &channel=ids → human-viewable PNG (live)
    &channel=ids&format=tif → 32-bit grayscale TIFF (live)
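
A tile request follows the same pattern with a bounding box. The sketch below assumes `width` and `height` span x and y at slice `z`, as in the example URLs; the helper name is mine:

```python
# Build the documented /api/data query for one 2D tile.
from urllib.parse import urlencode

HOST = "https://butterfly.rc.fas.harvard.edu"

def data_url(experiment, sample, dataset, channel,
             x, y, z, width, height, fmt=None):
    """Return the /api/data URL; fmt="tif" selects the 32-bit grayscale TIFF."""
    params = {
        "experiment": experiment, "sample": sample, "dataset": dataset,
        "x": x, "y": y, "z": z, "width": width, "height": height,
        "channel": channel,
    }
    if fmt is not None:
        params["format"] = fmt
    return "%s/api/data?%s" % (HOST, urlencode(params))

# Decoding the default PNG into an array (needs network access):
# import io, requests
# import numpy as np
# from PIL import Image
# png = requests.get(data_url("root", "test", "iarpa2016_12", "img",
#                             0, 0, 0, 512, 512)).content
# tile = np.asarray(Image.open(io.BytesIO(png)))  # (512, 512) uint8 for "img"
```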

Channels

  • ARIADNE-NDA Layer
    /api/anatomy/channels?experiment=root&sample=test&dataset=iarpa2016_12
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/channels?experiment=root&sample=test&dataset=iarpa2016_12
    
  • Response
    ["img", "ids"]
    

Datasets

  • ARIADNE-NDA Layer
    /api/anatomy/datasets?experiment=root&sample=test
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/datasets?experiment=root&sample=test
    
  • Response
    ["iarpa2016_12", "gt-2x2x2-seg", "gt-3x6x6", "gt-4x6x6", "gt-3x4x4-synapse"]
    

Samples

  • ARIADNE-NDA Layer
    /api/anatomy/samples?experiment=root
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/samples?experiment=root
    
  • Response
    ["test"]
    

Experiments

  • ARIADNE-NDA Layer
    /api/anatomy/experiments
    
  • butterfly.rc.fas.harvard.edu (live url)
    /api/experiments
    
  • Response
    ["root"]
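
The listing endpoints nest (experiments → samples → datasets → channels), so the whole resource tree can be walked top-down. A sketch with the transport abstracted behind a callable so the logic runs without a live server; `walk_hierarchy` and `fetch` are names of mine, not part of the API:

```python
# Walk the full Butterfly resource hierarchy using the listing endpoints.
# `fetch` is any callable taking an /api endpoint name plus query parameters
# and returning the decoded JSON list for that endpoint.

def walk_hierarchy(fetch):
    """Yield every (experiment, sample, dataset, channel) tuple the server exposes."""
    for experiment in fetch("experiments"):
        for sample in fetch("samples", experiment=experiment):
            for dataset in fetch("datasets", experiment=experiment, sample=sample):
                for channel in fetch("channels", experiment=experiment,
                                     sample=sample, dataset=dataset):
                    yield experiment, sample, dataset, channel

# Against the live server, `fetch` could be:
# import requests
# def fetch(endpoint, **params):
#     url = "https://butterfly.rc.fas.harvard.edu/api/" + endpoint
#     return requests.get(url, params=params).json()
```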
    

Unit test #0

  • Start with this on Butterfly
    {
    neuron1: [0,     0,   0,   0,   5,   5,   5,   5],
    neuron2: [1,     2,   3,   4,   6,   7,   8,   0],
    x:       [0,     0,   0,   0, 512, 512, 512, 512],
    y:       [256, 512,1024,2048, 256, 512,1024,2048],
    z:       [256,   0,2048, 512, 256,   0,2048, 512]
    }
    
  • Start with query
    {
    x: 0,
    y: 0,
    z: 0,
    width: 1040,
    height: 1040,
    depth: 1040
    }
    
  • Request 0
    /api/anatomy/entity_feature?feature=synapse_ids&x=0&y=0&z=0&width=1040&height=1040&depth=1040&experiment=root&sample=test&dataset=iarpa2016_12&channel=synapse-neuroglancer
    
  • Response 0
    ['s0','s1','s4','s5']
    
  • Request 1 Example for id=s5
    /api/anatomy/entity_feature?feature=synapse_parent&id=s5&x=0&y=0&z=0&width=1040&height=1040&depth=1040&experiment=root&sample=test&dataset=iarpa2016_12&channel=synapse-neuroglancer
    
  • Response 1 Example
    {
    "synapse_id": "s5",
    "synapse_parent_pre": "n5",
    "synapse_parent_post": "n7"
    }
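
Both responses above follow from a bounding-box filter over the starting table, assuming `width`, `height`, and `depth` extend the query box along x, y, and z respectively. A sketch of that logic (function names are mine; the server's implementation may differ):

```python
# Starting table from Unit test #0: column i describes synapse "s{i}",
# linking pre-synaptic neuron1[i] to post-synaptic neuron2[i] at (x, y, z).
TABLE = {
    "neuron1": [0, 0, 0, 0, 5, 5, 5, 5],
    "neuron2": [1, 2, 3, 4, 6, 7, 8, 0],
    "x": [0, 0, 0, 0, 512, 512, 512, 512],
    "y": [256, 512, 1024, 2048, 256, 512, 1024, 2048],
    "z": [256, 0, 2048, 512, 256, 0, 2048, 512],
}

def synapse_ids(table, x, y, z, width, height, depth):
    """Return ids of synapses whose position lies inside the query box."""
    hits = []
    for i in range(len(table["x"])):
        if (x <= table["x"][i] < x + width and
                y <= table["y"][i] < y + height and
                z <= table["z"][i] < z + depth):
            hits.append("s%d" % i)
    return hits

def synapse_parent(table, synapse_id):
    """Return the pre- and post-synaptic neuron ids for one synapse."""
    i = int(synapse_id[1:])
    return {
        "synapse_id": synapse_id,
        "synapse_parent_pre": "n%d" % table["neuron1"][i],
        "synapse_parent_post": "n%d" % table["neuron2"][i],
    }

print(synapse_ids(TABLE, 0, 0, 0, 1040, 1040, 1040))  # ['s0', 's1', 's4', 's5']
print(synapse_parent(TABLE, "s5"))
```

Note that s2, s3, s6, and s7 fall outside the 1040-unit box (each has a y or z coordinate of 2048), reproducing Response 0 exactly.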