Let us manage your data labeling project

Through our partnership with CloudFactory, we’ve supervised data labeling services for a wide range of clients, including earth observation specialists, Fortune 500 enterprises, and international development organizations. Whether it’s semantic segmentation, change detection, or classification, our team can get it done for you.

Let’s chat
Illustration of a clipboard with list items being checked off in front of floating clouds.

Services
Managing the annotation process for you

Our labeling lead removes the hassle of annotation management by designing training materials, communicating with labelers, tracking progress, and providing quality assurance. Services include:

  • Regularly validating labeling output and providing targeted feedback as often as needed.
  • Triaging labeler questions: answering routine queries, while understanding when to bring issues to your attention.
  • Escalating technical issues to our product engineers for immediate attention.
  • Providing regular updates on campaign status and expected completion dates.
  • Prioritizing projects or tasks based on your timeline.
  • Developing custom workflows.
  • Re-evaluating campaign design and recommending changes as needed.
Let’s chat

Process
Designed specifically for your needs

Optimizing your machine learning project for annotation starts before data labeling begins, and we’re here to walk you through it.

Chart for understanding whether your use-case is likely to cause labeler fatigue.

Read some of our takeaways from training a data labeling team or tips for optimizing your labeling project on our company blog.

Our team helps you design the right type of annotation campaign, image processing, and tools for your use-case. Most importantly, our labeling lead works directly with you and the team at CloudFactory to keep the project on track and ensure the labels are of the highest quality. Throughout the process, you’ll have GroundWork Pro access, which gives you more room to grow your projects.

Validation
Validated labels boost the performance of your machine learning model

Our labeling workflow ends with the validation of your data. Each labeled task is reviewed a second time and evaluated for accuracy.

It is then accepted, modified, or rejected and sent back to the queue for relabeling. Validators can leave notes documenting any issues and giving instructions for needed adjustments. In our experience, validated labels can provide a significant boost to a machine learning model’s performance.

Screenshot of a labeler validating their data labels in the GroundWork application.

Labelers
Partnering for quality

At the heart of our managed annotation work are CloudFactory data analysts. CloudFactory invests in the professional and personal development of its workforce, and it pays dividends in the quality of their work.

With a team dedicated to GroundWork clients, we can get started on your campaign within days rather than weeks, with no long-term commitment.

Photograph of eight people standing side-by-side in an office building.

The team of data analysts assigned to Azavea in July 2019. Photo by CloudFactory.

Our data analysts are seasoned GroundWork users with in-depth knowledge of its features, maximizing the efficiency of your labeling time. Many have worked with the tool from its inception and have played an integral role in suggesting and developing features that increase productivity and annotation quality. We communicate and problem-solve with them daily.

CloudFactory logo.

Our dedicated team of CloudFactory data analysts has created hundreds of thousands of annotations with GroundWork and can start labeling within days.

Learn more

Case studies
We’ve partnered with a variety of organizations to support their machine learning projects

International Development Organization

We worked with one of the largest international development organizations to create a dataset of urban tree canopy cover in Freetown, Sierra Leone.

In addition to our labeling services, we provided training, tracking, and feedback for their teams of managers and student labelers in Sierra Leone and Tanzania. The labeled data was used to train a feature detection algorithm.

Screenshot of tree canopy labeling in the GroundWork application.

Synthetic Aperture Radar (SAR) Data Provider

We worked with this client to annotate several areas of interest with varying classes. This imagery, which lies outside the visible spectrum, posed several challenges.

To support their data, we provided custom visualization and added a multimodal image labeling feature to GroundWork. Our labeling lead and team of data analysts developed custom workflows and techniques to accommodate and improve the annotation of non-optical imagery.

SAR imagery example.

Cloud Detection Model and Labeling Competition

Cloud cover is one of the most pernicious problems in satellite imagery. One of our machine learning engineers decided to tackle this problem by building a cloud detection model. Our professional annotators produced over 20,000 labels for the model and learned to identify and annotate several different types of cloud cover. We also ran a cloud labeling competition for amateur labelers as part of a partnership with Radiant Earth. Within a week, participants from across the world had mapped around 2.3 million square kilometers of satellite imagery, creating a signature training dataset and raising awareness of the SpatioTemporal Asset Catalog (STAC) spec.

Image of cloud detection model

Tell us your labeling needs

Illustrations of a cooling tower, sunroofs in a field, a winding road, and a forest.

With our customers, we’ve produced hundreds of thousands of labels on low- and high-resolution optical imagery as well as SAR imagery. Those labels cover artificial and natural features alike: cooling towers, solar panels, vehicles, and storage tanks, as well as forests, bodies of water, clouds, and land cover.

Start your project with us

Have labels that need to be cleaned up? Need help creating your machine learning models? We can do that, too!