Zack Shackleton

Software Engineer | Naples, FL









Computer Science B.S., University of Central Florida

Biology B.A., The University of Florida

Welcome

My obsession with technology started from a young age and was fueled by tools like GeoCities and platforms like TechTV. I would spend hours a week customizing my computer or personal website to match this week’s new idea. I even got noticed by Gizmodo and Lifehacker for a few custom setups I created with GeekTool.

I'm now lucky enough to work in a field that I love. I strive to create clean, fun, and friendly user experiences. Feel free to change this website's appearance with the panel to your left, or refresh this tab.

Below I have highlighted some of my recent work.

Realtor.com

Home search is better together

July 2025 - Present
Senior Staff Engineer

Strategy

Realtor.com has acquired Zenlist. At this time I am unable to disclose further information.

Zenlist

Home search is better together

July 2017 - Acquired July 2025
Director of Web
Lead Front End Engineer

Strategy

Unmatched agent-to-client home search that enables efficient collaboration by putting everyone in the same space. We focus on building tools to make the home buying process easier for Agents and their clients.

Read more

Truepad

The best valued homes in Chicago

Aug 2014 - June 2017
Lead Front End Engineer
Front End Engineer
Tester

Strategy

Crowdsource local real estate agents' knowledge of properties to find the best-valued homes in an area.

Read more

Thank you.

zshackleton@icloud.com

This website was a fun side project to test out different methods. I chose to build this static page with no external libraries, using plain HTML, CSS, and JavaScript. Rather than rely on a framework that would require a build step, I utilized web components.

Zenlist

Home search is better together

zenlist.com
July 2017 - Acquired July 2025
Director of Web

Zenlist’s objective is to deliver an unparalleled home-buying experience for Agents, Lenders, and their clients by ensuring transparency for all parties involved. Clients gain access to the comprehensive MLS inventory and all data accessible to the Agent. To facilitate communication and transparency, we developed MLS-specific search and listing detail pages, customizable saved searches that populated feeds for both Agents and Clients, tour scheduling functionality, chat, and a “merge layer” that consolidates multiple markets across the nation while adhering closely to the RESO standard. Additionally, we provided Agents with enhanced tools to create a public-facing branded bio page and custom-branded public listing pages.

TLDR: While at Zenlist I was able to:

  • Architect and build, from scratch, a web application with over a 95% retention rate and more than a million listing views a month.
  • Be deeply embedded in the product development process, creating new features to drive adoption and increase sales.
  • Lead user interviews to identify the best experiences and flows for our end users.
  • Create a Figma style guide with components and variables, reducing the time to produce Figma comps by 10x.
  • Continually experiment with the latest front-end technologies to deliver the best product and user experience.

Some definitions of words and acronyms for readers outside the real estate world:

  • Client: A regular homebuyer looking to purchase property
  • MLS (Multiple Listing Service): A database used by real estate professionals to share information about properties for sale
  • Agent: A licensed real estate professional who helps clients buy or sell properties
  • Lender: A financial institution or professional who provides mortgage loans to homebuyers

Architecture

Over my seven-plus years at Zenlist, the codebase went through a couple of major iterations. As the world of front-end JavaScript shifted, we moved from NextJS (pages router), to a react-router-based application, and finally to a monorepo with multiple NextJS (app router) applications. As a company we constantly strived to use the newest and best of what was available. Below I have highlighted some of the more interesting parts of the development of the Zenlist product.

Code Philosophy & Standards

I pushed to instill small repeatable patterns within the codebase and enforced them with a combination of prettier and eslint rules. Small things, such as sorting of imports, can greatly increase the speed at which a developer reads a file they may have never touched before. We favored descriptive naming (ClientList, ClientListItem, ClientListItemActions) over brevity. We leveraged TypeScript and the safety it provides to keep developers moving rather than guessing which parameters to pass to an unfamiliar function. To aid in this type safety, I created multiple code-generation scripts to generate types for all our modals, analytics, and more. A codebase should encourage engineers to be creative in how they accomplish tasks, while giving them guardrails to prevent major design or UX mismatches between features.
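
As an illustration of that kind of governance (a minimal sketch, not the actual Zenlist config), a flat ESLint config can enforce alphabetized members inside each import statement:

```typescript
// Illustrative eslint.config.ts — a sketch, not the real setup.
// The core `sort-imports` rule alphabetizes named members within an
// import; ordering of whole declarations is left to other tooling.
export default [
  {
    files: ["**/*.ts", "**/*.tsx"],
    rules: {
      "sort-imports": ["error", { ignoreDeclarationSort: true }],
    },
  },
];
```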

Design System

Figma

I created a full Figma project for the Zenlist application. While building it out I utilized Figma’s variables, components, and variants to create a 1:1 style guide matching the components in the codebase. In both Figma and our codebase we supported 4 separate color themes to give the user a more customizable experience.

Tying code to Figma designs

I used vanilla-extract and its Sprinkles package within the application to provide the necessary governance over CSS rules like colors, spacing, and fonts. We also used styles and recipes to better organize our CSS. Recipes were crucial for separation of concerns because all styling logic, and how the resulting classes were applied, was driven from vanilla-extract. Again coming back to TypeScript: having your CSS variables and rules autocomplete in a large codebase takes a great deal of stress off the developer about the exact naming of a style variable.
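
The recipe idea can be sketched without the library itself. This is a simplified, framework-free illustration of the pattern (not the real vanilla-extract API): all styling logic lives in one typed definition, and components only request a variant combination.

```typescript
// Simplified recipe pattern: one typed definition maps variant
// choices to class names; callers never assemble class strings.
type VariantGroups = Record<string, Record<string, string>>;

type RecipeConfig<V extends VariantGroups> = {
  base: string;
  variants: V;
  defaultVariants?: { [K in keyof V]?: keyof V[K] };
};

function recipe<V extends VariantGroups>(config: RecipeConfig<V>) {
  return (options?: { [K in keyof V]?: keyof V[K] }): string => {
    const classes = [config.base];
    for (const group of Object.keys(config.variants)) {
      const choice =
        options?.[group as keyof V] ?? config.defaultVariants?.[group as keyof V];
      if (choice !== undefined) {
        classes.push((config.variants as VariantGroups)[group][choice as string]);
      }
    }
    return classes.join(" ");
  };
}

// Hypothetical button recipe — class names are placeholders; in the
// real codebase they come from vanilla-extract's build step.
const button = recipe({
  base: "btn",
  variants: {
    tone: { primary: "btn-primary", danger: "btn-danger" },
    size: { sm: "btn-sm", lg: "btn-lg" },
  },
  defaultVariants: { tone: "primary" },
});
```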

Icon Management

In a real estate application it is quite common to have a list of client groups, a list of listings, navigation, and many other elements that contain icons. To avoid the markup of ~40-50 SVGs cluttering the HTML, I built a script that compiles any files found in our SVG directory into one sprite sheet. The script optimizes the paths and removes any unneeded styles or SVG attributes to make each icon ready to consume in the application. Finally, the script generates a TypeScript union type of all the icon IDs. We loaded this sprite sheet at the global level and had a React component render any icon out of the sprite sheet based on its ID. This reduced the overall amount of HTML markup we sent over the wire and made adding and removing icons easier to manage thanks to the TypeScript tie-in.
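
The core of that build step can be sketched as a pure function (simplified; the real script also read the SVG directory from disk and ran path optimization):

```typescript
// Sketch of the sprite-sheet build step. Input: icon id -> inner SVG
// markup. Output: one hidden <svg> sprite plus a generated union type.
function buildSprite(icons: Record<string, string>): {
  sprite: string;
  typeDef: string;
} {
  const ids = Object.keys(icons).sort();
  const symbols = ids
    .map((id) => `<symbol id="icon-${id}" viewBox="0 0 24 24">${icons[id]}</symbol>`)
    .join("");
  return {
    // Loaded once at the global level; icons are referenced by id.
    sprite: `<svg xmlns="http://www.w3.org/2000/svg" style="display:none">${symbols}</svg>`,
    // Generated type so an <Icon id="..."> component is type-checked.
    typeDef: `export type IconId = ${ids.map((id) => `"${id}"`).join(" | ")};`,
  };
}
```

A React component can then render any icon with `<svg><use href={"#icon-" + id} /></svg>`.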

Maps

Agents and their clients often used our map views to find their perfect property. We heard specifically from our users that they wanted to see as many listings as possible at a given zoom level. We also needed to let users add predefined and custom-drawn shapes, and interact with those shapes much like a selection in Photoshop: applying union, intersect, or exclude actions to arrive at their perfect bounding area. For this shape work I utilized many of Google’s JS APIs (Autocomplete, Places, Geocoder, and Directions).
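
The union/intersect/exclude behavior can be modeled as boolean combinators over point-membership tests — a simplified sketch of the idea, not the production geometry code:

```typescript
// Each drawn shape becomes a point-membership test; combining shapes
// is just combining those tests with boolean logic.
type Point = { lat: number; lng: number };
type Shape = (p: Point) => boolean;

const union = (a: Shape, b: Shape): Shape => (p) => a(p) || b(p);
const intersect = (a: Shape, b: Shape): Shape => (p) => a(p) && b(p);
const exclude = (a: Shape, b: Shape): Shape => (p) => a(p) && !b(p);

// Hypothetical circular shape; real shapes also included custom
// polygons and predefined boundaries.
const circle = (center: Point, radiusDeg: number): Shape => (p) =>
  Math.hypot(p.lat - center.lat, p.lng - center.lng) <= radiusDeg;
```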

Rasterized Maps

Our original map setup used rasterized Google Maps with custom canvas elements absolutely positioned on top of the map. Each canvas tile then had many markers drawn inside it. This allowed us to meet the goal of thousands of markers without a degradation in performance. When a user hovered over a pin, we would redraw that marker on top of the other elements in the canvas. When a marker was selected, we would render a React component on top to show more information.

Vector Maps

As time moved on, so did technology and common computer capabilities. Once Google released their vector maps, I jumped at the opportunity to have progressive maps with a better loading and interaction experience. In my spare time I had been playing around with Blender and 3D models, so I decided to rebuild our map experience with Deck.GL, creating separate layers and shaders for our different markers. By converting our markers to Deck.GL we were able to render 3D perspective maps with markers correctly positioned on top of their buildings. We kept our custom canvas-based markers as a fallback for users who could not take advantage of WebGL.

GraphQL

Trivium + Apollo

While working with our tech stack I found most of our cluttered code was around GraphQL queries and Apollo. To fix this, I created a tool that would introspect the backend schema and generate type-safe hooks for each query and mutation using apollo-hooks. Each hook would take the fragment of data required as a parameter and inject it into the query or mutation. This automation prevented developer errors and made managing Apollo's optimistic updates much easier.
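
The fragment-injection step can be sketched like this (names are hypothetical, and the real tool emitted typed hooks rather than raw strings):

```typescript
// Sketch: the caller supplies only the fragment of data it needs,
// and the generator splices it into a complete query document.
type Fragment = { name: string; on: string; body: string };

function buildQuery(operation: string, root: string, fragment: Fragment): string {
  return [
    `query ${operation} { ${root} { ...${fragment.name} } }`,
    `fragment ${fragment.name} on ${fragment.on} { ${fragment.body} }`,
  ].join("\n");
}

// Hypothetical fragment for a listing card component.
const listingCard: Fragment = {
  name: "ListingCard",
  on: "Listing",
  body: "id address price",
};
```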

Graphql-code-gen + URQL

URQL started making waves in the GraphQL world, and its vision of a lightweight GraphQL client aligned perfectly with Zenlist’s goals. At the time, Apollo was a bit restrictive in how we wanted to manage our queries, mutations, and cache. We dove in head first, moved our GraphQL client over to URQL, and utilized graphql-code-gen to generate our types.

Gql.tada + URQL

Graphql-code-gen was helpful at the time, but when gql.tada came out, its ability to tie into the TypeScript language server was life-changing for a developer. I no longer needed to regenerate all my types because I changed one fragment in the codebase. Gql.tada instantly lets the developer use newly added fields, warns on use of deprecated fields, warns about fields that are queried but not used in the component, and even catches the dreaded fat-fingered typo. Along with the move to gql.tada, we implemented fragment co-location and fragment masking to compose queries for only the data rendered on screen. Yes, fragment co-location reduced the data we were requesting, but more importantly it made future product changes drastically easier to manage, since a developer only had to change the component where the data was displayed or altered.

Dynamic Pages

One of the most loved capabilities of Zenlist was our ability to support ALL local MLS search filters, display ALL information as it is defined in the MLS on our listing detail pages, and support ALL fields available in the local MLS listing input. We accomplished this with a close tie-in between our backend and frontend teams, enabling new MLSs in a fraction of the time.

Listing Detail Pages

Zenlist's listing detail pages displayed MLS-specific information. For example, a home in Florida would have a field referencing “Lanais,” whereas a home in Rhode Island would use very different terms like “Porch” or “Screened Patio.” We also offered different view types for our different user groups: agents got an easily printable, data-heavy view, while clients' main focus was on images, with data available to the right.

Listing Input

One of my favorite problems in listing input was error and status management. Along with a banner letting the user know how many fields had errors or needed to be completed, I optimized the UI to make it easy to see which inputs were succeeding and which were failing. On top of the indicators on each field, I created a custom scrollbar that indicated where the successful and failing fields were located in the list.
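
The scrollbar indicators reduce to simple math: map each field's index in the form to a fractional position along the track, tagged with its status. A minimal sketch, with hypothetical names:

```typescript
// Compute where each field's status marker sits on the custom
// scrollbar track, as a 0..1 fraction of the track height.
type FieldStatus = "ok" | "error";

function scrollbarMarkers(
  statuses: FieldStatus[],
): { position: number; status: FieldStatus }[] {
  const n = statuses.length;
  return statuses.map((status, i) => ({
    // Evenly spread field i along the track; a single field centers.
    position: n === 1 ? 0.5 : i / (n - 1),
    status,
  }));
}
```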

Adding a listing to the MLS requires a lot of different information, but one of the most important pieces is the property images. I created an image uploader and editor from the native canvas element. With it, users could crop, rotate, and tilt each photo, and adjust its contrast, brightness, and saturation. This kept Zenlist Agents inside the application, as it was often easier for them to bulk upload their images and adjust them in-app rather than edit each image independently and then upload.
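
One editor operation, brightness, can be sketched as pure pixel math over the same `Uint8ClampedArray` shape that canvas `getImageData` returns (a simplified illustration; contrast and saturation follow the same per-pixel pattern):

```typescript
// Adjust brightness on raw RGBA pixel data. The clamped array
// automatically bounds each channel to 0..255 on assignment.
function adjustBrightness(
  pixels: Uint8ClampedArray,
  delta: number,
): Uint8ClampedArray {
  const out = new Uint8ClampedArray(pixels.length);
  for (let i = 0; i < pixels.length; i += 4) {
    out[i] = pixels[i] + delta; // R
    out[i + 1] = pixels[i + 1] + delta; // G
    out[i + 2] = pixels[i + 2] + delta; // B
    out[i + 3] = pixels[i + 3]; // alpha unchanged
  }
  return out;
}
```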

As the user entered data and uploaded photos, Zenlist would show a mobile preview of the listing detail page. This instant feedback kept the agent informed on how their listing would appear to others.

Chat

Zenlist built an internal chat application used by all of our user groups to communicate. It ran on our GraphQL server and web sockets triggered by changes in our database. I created a utility to watch for specific web socket messages and refetch a particular query in response. This meant the frontend truly only queried for the data it needed, we weren’t sending extra data in our web socket messages, and our URQL cache was updated accordingly. The chat application included photo sharing, dynamic links, listing sharing, reactions to messages, and threads.
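
The refetch utility can be sketched as a small topic-to-callback registry (names are hypothetical): socket messages carry only a topic, and the client maps topics to query refetchers, so payloads stay tiny and the cache is refreshed through normal queries.

```typescript
// Minimal sketch: register refetch callbacks per message topic, and
// invoke them when a matching web socket message arrives.
type Refetch = () => void;

function createRefetchWatcher() {
  const watchers = new Map<string, Set<Refetch>>();
  return {
    // Components register the query they want refreshed for a topic.
    watch(topic: string, refetch: Refetch) {
      if (!watchers.has(topic)) watchers.set(topic, new Set());
      watchers.get(topic)!.add(refetch);
    },
    // Called for every incoming web socket message.
    onMessage(message: { topic: string }) {
      watchers.get(message.topic)?.forEach((refetch) => refetch());
    },
  };
}
```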

Truepad

The best valued homes in Chicago

Aug 2015 - June 2017
Lead Front End Engineer
Front End Engineer
Tester

Problem

Information available regarding homes for sale and real estate agents online is limited, biased, and undifferentiated. It is not easy for a homebuyer to determine online which homes and agents are truly the best.

Solution

Utilize expertise from professional real estate agents to identify the best valued homes.

Homebuyers

Homebuyers save time searching through listings because homes identified as Truepads sell more quickly than the average home. Homebuyers could also easily find an agent who had proven their market expertise.

Real Estate Agents

The more accurate an agent's insights were, the more exposure they gained to leads. On top of a full listing search, agents were also provided with tools to manage their clients, keep track of the latest homes on the market, and gain exposure with customized ads.

TrueAssist

Agents were able to connect their Facebook accounts to Truepad to gain more exposure. Personalized Facebook ads were generated and posted based on an Agent's activity on Truepad.

Tech stuff

Our userbase accessed the platform from a wide variety of devices, requiring a lightweight, responsive application. To achieve this we built a server-side-rendered React application. Using Webpack, we chunked the application so each user only loaded the code appropriate to them; chunks were generated based on the route, screen size, and user type. This allowed us to deliver the most functionality at the smallest bundle size. Redux was used for app-level state management, organized using the Ducks method. We used Jenkins to set up continuous deployment based on pull requests.
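
A minimal "duck" — action types, action creators, and reducer co-located in one module — looks roughly like this (the saved-search slice and its names are illustrative):

```typescript
// Ducks pattern: everything for one slice of state lives in one file.
const ADD = "savedSearches/ADD";
const REMOVE = "savedSearches/REMOVE";

type SavedSearch = { id: string; query: string };
type State = SavedSearch[];
type Action =
  | { type: typeof ADD; payload: SavedSearch }
  | { type: typeof REMOVE; payload: { id: string } };

// Action creators, exported alongside the reducer in a real duck.
const addSearch = (payload: SavedSearch): Action => ({ type: ADD, payload });
const removeSearch = (id: string): Action => ({
  type: REMOVE,
  payload: { id },
});

// The slice reducer: pure function of (state, action) -> state.
function reducer(state: State = [], action: Action): State {
  switch (action.type) {
    case ADD:
      return [...state, action.payload];
    case REMOVE:
      return state.filter((s) => s.id !== action.payload.id);
    default:
      return state;
  }
}
```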