OpenAI has asked contractors to upload real work they have done in previous jobs, according to a report from Wired. The move is part of a larger trend among AI companies that hire contractors to create high-quality training data. The goal is likely to eventually enable AI models to handle more complex office tasks.
According to the report, OpenAI has asked contractors to describe the tasks they’ve completed at other jobs and to share actual examples of their work. These examples could include Word documents, PDFs, PowerPoint presentations, Excel files or images. The company reportedly specifies that the files must be real outputs, not just summaries of the work.
Contractors are instructed to remove any proprietary or personal information before uploading their work. To help with this, OpenAI reportedly provides a ChatGPT "Superstar Scrubbing" tool.
Despite these precautions, the approach carries risks. Intellectual property lawyer Evan Brown told Wired that any AI lab using this method "is putting itself at great risk" because it relies heavily on contractors to judge what is confidential and what is not.
AI companies have been increasingly looking for ways to make models capable of handling office tasks like creating presentations, drafting emails, or analysing spreadsheets. By studying actual work from humans, companies hope their AI agents can learn from realistic examples rather than artificial or synthetic data.
The practice raises important questions about privacy, intellectual property, and the ethics of using real work for AI training.