XLSX Sheet To JSON Schema

Infer JSON Schema from worksheet headers and sample rows with types, enum candidates, and basic constraints

Infer JSON Schema from spreadsheet samples.

  • Uses header row as property names
  • Infers primitive type and nullable behavior
  • Detects enum candidates and numeric/string constraints
  • Exports schema as JSON file

Example Results

1 example

Infer JSON Schema from Sheet Samples

Read header and sample rows to infer field types, enum values, and required constraints

xlsx-sheet-to-json-schema-example1.json

Input parameters:

{
  "excelFile": "/public/samples/xlsx/workbook-sales.xlsx",
  "sheetName": "Sheet1",
  "headerRow": 1,
  "sampleSize": 100,
  "enumMaxDistinct": 10
}

Maximum file size: 100MB Supported formats: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet, application/vnd.ms-excel

Key Facts

Category
Format Conversion
Input Types
file, text, number, range
Output Type
file
Sample Coverage
4
API Ready
Yes

Overview

The XLSX Sheet To JSON Schema tool generates a structured JSON Schema by analyzing your Excel file's header row and a sample of its data rows. It infers a data type for each column, detects candidate enum values, and derives basic constraints, so that data validated against the generated schema stays consistent and well-formed.

When to Use

  • When you need to convert legacy Excel data into a structured format for API integration.
  • When you want to enforce data validation rules based on existing spreadsheet content.
  • When you need to quickly generate a schema definition for database migration or application development.

How It Works

  • Upload your Excel file and specify the target sheet and header row index.
  • The tool scans the specified sample size to infer data types and identify recurring patterns.
  • It automatically detects enum candidates and applies required field thresholds based on your configuration.
  • Download the generated JSON schema file ready for use in your development environment.
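
The type-inference step described above can be sketched roughly as follows. This is a hypothetical illustration, not the tool's actual implementation: given the sampled cell values for one column, pick a JSON Schema primitive type and mark the field nullable when empty cells appear.

```python
# Hypothetical sketch of per-column type inference (not the tool's
# actual code). Values come from the sampled rows of one column.

def infer_column_type(values):
    """Infer a JSON Schema type for one column from sampled values."""
    nullable = any(v is None or v == "" for v in values)
    non_empty = [v for v in values if v not in (None, "")]
    if not non_empty:
        return {"type": "string"}  # nothing to infer from; fall back
    if all(isinstance(v, bool) for v in non_empty):
        inferred = "boolean"
    elif all(isinstance(v, int) and not isinstance(v, bool) for v in non_empty):
        inferred = "integer"
    elif all(isinstance(v, (int, float)) and not isinstance(v, bool) for v in non_empty):
        inferred = "number"  # mixed ints and floats widen to number
    else:
        inferred = "string"
    return {"type": [inferred, "null"] if nullable else inferred}
```

A column holding `[1, 2, 3]` would come back as `integer`, `[1, 2.5]` widens to `number`, and `["a", None]` becomes a nullable string.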

Use Cases

  • Standardizing spreadsheet data for ingestion into NoSQL databases like MongoDB.
  • Creating validation schemas for frontend forms based on existing Excel templates.
  • Automating the documentation of data structures for team collaboration.

Examples

1. Generating API Schema from Sales Data

Backend Developer
Background
The sales team provides a monthly Excel report that needs to be imported into a new CRM system via an API.
Problem
Manually writing a JSON schema for 50+ columns is error-prone and slow.
How to Use
Upload the sales report, set the header row to 1, and let the tool infer types and constraints.
Example Config
sheetName: 'Sheet1', headerRow: 1, sampleSize: 500, enumMaxDistinct: 10
Outcome
A complete JSON schema file that includes field types, required fields, and enum lists for categorical data like 'Region' or 'Status'.
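
For a sales report like this, the generated schema might look like the fragment below. The field names and constraint values here are hypothetical; the actual output depends on your data.

```json
{
  "title": "Sheet1",
  "type": "object",
  "properties": {
    "Region": { "type": "string", "enum": ["East", "North", "South", "West"] },
    "Status": { "type": "string", "enum": ["Closed", "Open", "Pending"] },
    "Amount": { "type": "number", "minimum": 0 }
  },
  "required": ["Region", "Status", "Amount"]
}
```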

2. Validating Inventory Imports

Data Analyst
Background
Inventory managers upload product lists in Excel, but the data often contains inconsistent types.
Problem
Need a strict schema to validate incoming files before they reach the production database.
How to Use
Use the tool to generate a schema from a clean master template to enforce data integrity.
Example Config
requiredThreshold: 1.0, schemaTitle: 'InventoryProduct'
Outcome
A strict JSON schema that rejects any uploaded file missing required fields or containing invalid data types.

Try with Samples

Sample files are available in json, xml, and xlsx formats.

FAQ

How does the tool determine if a field is required?

It uses a 'Required Threshold' setting; if a column contains data in more than 98% of the sampled rows, it is marked as required.
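
The check can be sketched as below. This is a hypothetical illustration with the default threshold assumed to be 0.98, matching the 98% figure above; the tool's exact comparison (strict vs. inclusive) is not specified here.

```python
# Hypothetical sketch of the required-field check: a column counts as
# required when the fraction of sampled rows holding a value meets the
# requiredThreshold (assumed default 0.98).

def is_required(values, threshold=0.98):
    """Return True when enough sampled rows are populated."""
    if not values:
        return False
    filled = sum(1 for v in values if v not in (None, ""))
    return filled / len(values) >= threshold
```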

Can I limit the number of enum values detected?

Yes, you can adjust the 'Enum Max Distinct' setting to control how many unique values are allowed before a field is treated as a standard string rather than an enum.
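
Enum detection can be sketched like this. It is a hypothetical illustration of the behavior described above: a string column whose distinct non-empty values stay within `enumMaxDistinct` becomes an enum, otherwise it remains a plain string.

```python
# Hypothetical sketch of enum-candidate detection (not the tool's
# actual code). Columns exceeding enum_max_distinct distinct values
# fall back to a plain string type.

def enum_candidate(values, enum_max_distinct=10):
    """Return an enum schema when distinct values fit the limit."""
    distinct = sorted({v for v in values if v not in (None, "")})
    if 0 < len(distinct) <= enum_max_distinct:
        return {"type": "string", "enum": distinct}
    return {"type": "string"}
```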

Does it support multiple sheets in one file?

The tool processes one sheet at a time. You can specify the sheet name in the configuration options.

What happens if my data has empty rows?

The tool ignores empty rows during the inference process and focuses on the populated data within your defined sample size.

Is my data uploaded to a server?

The file is processed to extract the schema structure; no sensitive data is stored permanently after the conversion process is complete.

API Documentation

Request Endpoint

POST /en/api/tools/xlsx-sheet-to-json-schema

Request Parameters

Parameter Name     Type    Required  Description
excelFile          file    Yes       Workbook to analyze (upload required)
sheetName          text    No        Name of the sheet to process
headerRow          number  No        Index of the header row
sampleSize         number  No        Number of rows sampled for inference
enumMaxDistinct    number  No        Max distinct values before a field is treated as a plain string
requiredThreshold  range   No        Fraction of populated sampled rows needed to mark a field required
schemaTitle        text    No        Title assigned to the generated schema

File-type parameters must be uploaded first via POST /upload/xlsx-sheet-to-json-schema to obtain a filePath; pass that filePath in the corresponding file field of the tool request.
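
The two-step flow above can be sketched as follows. The endpoint paths come from this page; everything else (helper name, exact field names beyond filePath) is an assumption for illustration.

```python
# Hypothetical sketch of step 2 of the flow: build the JSON body for
# POST /en/api/tools/xlsx-sheet-to-json-schema, using the filePath
# returned by the upload step. Field names mirror the parameter table
# above; the helper itself is illustrative, not part of the API.
import json

def build_convert_request(file_path, sheet_name="Sheet1", header_row=1,
                          sample_size=100, enum_max_distinct=10):
    """Serialize the conversion request body."""
    return json.dumps({
        "excelFile": file_path,  # filePath from POST /upload/...
        "sheetName": sheet_name,
        "headerRow": header_row,
        "sampleSize": sample_size,
        "enumMaxDistinct": enum_max_distinct,
    })
```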

Response Format

{
  "filePath": "/public/processing/randomid.ext",
  "fileName": "output.ext",
  "contentType": "application/octet-stream",
  "size": 1024,
  "metadata": {
    "key": "value"
  },
  "error": "Error message (optional)",
  "message": "Notification message (optional)"
}
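
A client consuming the response shown above might handle it like this. This is a hypothetical sketch: it checks the optional error field before using filePath to fetch the generated schema file.

```python
# Hypothetical sketch of response handling: surface the optional
# "error" field as an exception, otherwise return the filePath of
# the generated schema.
import json

def extract_schema_path(response_text):
    """Return the schema filePath, or raise if the API reported an error."""
    data = json.loads(response_text)
    if data.get("error"):
        raise RuntimeError(data["error"])
    return data["filePath"]
```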

AI MCP Documentation

Add this tool to your MCP server configuration:

{
  "mcpServers": {
    "elysiatools-xlsx-sheet-to-json-schema": {
      "name": "xlsx-sheet-to-json-schema",
      "description": "Infer JSON Schema from worksheet headers and sample rows with types, enum candidates, and basic constraints",
      "baseUrl": "https://elysiatools.com/mcp/sse?toolId=xlsx-sheet-to-json-schema",
      "command": "",
      "args": [],
      "env": {},
      "isActive": true,
      "type": "sse"
    }
  }
}

You can chain multiple tools, e.g.: `https://elysiatools.com/mcp/sse?toolId=png-to-webp,jpg-to-webp,gif-to-webp`, max 20 tools.

Supports URL file links or Base64 encoding for file parameters.

If you encounter any issues, please contact us at [email protected]