Working layout.

parent b5db7172cf
commit 4a294608b1

@@ -1,20 +1,23 @@
-# Current Focus
+# Chatterbox TTS Migration: Backend Development (FastAPI)
 
-**Date:** 2025-06-05
-
-**Primary Goal:** Initiate the migration of the Chatterbox TTS dialog generator from Gradio to a vanilla JavaScript frontend and FastAPI backend.
+**Primary Goal:** Implement the FastAPI backend for TTS dialog generation.
 
-**Recent Accomplishments:**
-- Set up the `.note/` Memory Bank directory and essential files.
-- Reviewed `gradio_app.py` to understand existing dialog generation logic.
-- Developed a detailed, phased plan for re-implementing the dialog generation functionality with FastAPI and Vanilla JS. This plan has been saved to `.note/detailed_migration_plan.md`.
+**Recent Accomplishments (Phase 1, Step 2 - Speaker Management):**
+- Created Pydantic models for speaker data (`speaker_models.py`).
+- Implemented `SpeakerManagementService` (`speaker_service.py`) for CRUD operations on speakers (metadata in `speakers.yaml`, samples in `speaker_samples/`).
+- Created FastAPI router (`routers/speakers.py`) with endpoints: `GET /api/speakers`, `POST /api/speakers`, `GET /api/speakers/{id}`, `DELETE /api/speakers/{id}`.
+- Integrated speaker router into the main FastAPI app (`main.py`).
+- Successfully tested all speaker API endpoints using `curl`.
 
-**Current Task:**
-- Awaiting your feedback on the detailed migration plan (see `.note/detailed_migration_plan.md`).
+**Current Task (Phase 1, Step 3 - TTS Core):**
+- **Develop `TTSService` in `backend/app/services/tts_service.py`.**
+  - Focus on `ChatterboxTTS` model loading, inference, and critical memory management.
+  - Define methods for speech generation using speaker samples.
+  - Manage TTS parameters (exaggeration, cfg_weight, temperature).
 
-**Next Steps (pending your approval of plan):**
-- Begin Phase 1: Backend API Development (FastAPI).
-- Task 1.1: Project Setup (FastAPI project structure, `requirements.txt`).
+**Next Immediate Steps:**
+1. Finalize and test the initial implementation of `TTSService`.
+2. Proceed to Phase 1, Step 4: Dialog Processing - Implement `DialogProcessorService` including text splitting logic.
@@ -4,9 +4,11 @@ This plan outlines the steps to re-implement the dialog generation features of t
 
 ## 1. Backend (FastAPI) Development
 
-**Objective:** Create a robust API to handle TTS generation, speaker management, and file delivery.
+### Objective
+
+Create a robust API to handle TTS generation, speaker management, and file delivery.
 
-**Key Modules/Components:**
+### Key Modules/Components
 
 * **API Endpoints:**
     * `POST /api/dialog/generate`:
@@ -33,7 +35,7 @@ This plan outlines the steps to re-implement the dialog generation features of t
 * **File Handling:**
     * Strategy for storing and serving generated `.wav` and `.zip` files (e.g., FastAPI `StaticFiles`, temporary directories, or cloud storage).
 
-**Implementation Steps (Phase 1):**
+### Implementation Steps (Phase 1)
 
 1. **Project Setup:** Initialize FastAPI project, define dependencies (`fastapi`, `uvicorn`, `python-multipart`, `pyyaml`, `torch`, `torchaudio`, `chatterbox-tts`).
 2. **Speaker Management:** Implement `SpeakerManagementService` and the `/api/speakers` endpoints.
@@ -46,18 +48,20 @@ This plan outlines the steps to re-implement the dialog generation features of t
 
 ## 2. Frontend (Vanilla JavaScript) Development
 
-**Objective:** Create an intuitive UI for dialog construction, speaker management, and interaction with the backend.
+### Objective
+
+Create an intuitive UI for dialog construction, speaker management, and interaction with the backend.
 
-**Key Modules/Components:**
+### Key Modules/Components
 
 * **HTML (`index.html`):** Structure for dialog editor, speaker controls, results display.
 * **CSS (`style.css`):** Styling for a clean and usable interface.
 * **JavaScript (`app.js`, `api.js`, `ui.js`):**
     * `api.js`: Functions for all backend API communications (`fetch`).
     * `ui.js`: DOM manipulation for dynamic dialog lines, speaker lists, and results rendering.
    * `app.js`: Main application logic, event handling, state management (for dialog lines, speaker data).
 
-**Implementation Steps (Phase 2):**
+### Implementation Steps (Phase 2)
 
 1. **Basic Layout:** Create `index.html` and `style.css`.
 2. **API Client:** Develop `api.js` to interface with all backend endpoints.
@@ -1,5 +1,25 @@
 # Session Log
 
+---
+**Session Start:** 2025-06-05 (Continued)
+
+**Goal:** Progress Phase 1 of Chatterbox TTS backend migration: Initial Project Setup.
+
+**Key Activities & Insights:**
+- Created `backend/app/main.py` with a basic FastAPI application instance.
+- Confirmed user has an existing `.venv` at the project root.
+- Updated `backend/README.md` to reflect usage of the root `.venv` instead of a backend-specific one.
+- Adjusted venv activation paths and command execution locations (project root).
+- Installed backend dependencies from `backend/requirements.txt` into the root `.venv`.
+- Successfully ran the basic FastAPI server using `uvicorn backend.app.main:app --reload --host 0.0.0.0 --port 8000` from the project root.
+- Verified the API is accessible.
+- Confirmed all Memory Bank files are present. Reviewed `current_focus.md` and `session_log.md`.
+
+**Next Steps:**
+- Update `current_focus.md` and `session_log.md`.
+- Proceed to Phase 1, Step 2: Speaker Management.
+
+---
 ---
 **Session Start:** 2025-06-05
@@ -0,0 +1,13 @@
+// babel.config.cjs
+module.exports = {
+  presets: [
+    [
+      '@babel/preset-env',
+      {
+        targets: {
+          node: 'current', // Target the current version of Node.js
+        },
+      },
+    ],
+  ],
+};
@@ -0,0 +1,34 @@
+# Chatterbox TTS Backend
+
+This directory contains the FastAPI backend for the Chatterbox TTS application.
+
+## Project Structure
+
+- `app/`: Contains the main FastAPI application code.
+  - `__init__.py`: Makes `app` a Python package.
+  - `main.py`: FastAPI application instance and core API endpoints.
+  - `services/`: Business logic for TTS, dialog processing, etc.
+  - `models/`: Pydantic models for API request/response.
+  - `utils/`: Utility functions.
+- `requirements.txt`: Project dependencies for the backend.
+- `README.md`: This file.
+
+## Setup & Running
+
+It is assumed you have a Python virtual environment at the project root (e.g., `.venv`).
+
+1. Navigate to the **project root** directory (e.g., `/Volumes/SAM2/CODE/chatterbox-test`).
+2. Activate the existing Python virtual environment:
+   ```bash
+   source .venv/bin/activate  # On macOS/Linux
+   # .\.venv\Scripts\activate  # On Windows
+   ```
+3. Install dependencies (ensure your terminal is in the **project root**):
+   ```bash
+   pip install -r backend/requirements.txt
+   ```
+4. Run the development server (ensure your terminal is in the **project root**):
+   ```bash
+   uvicorn backend.app.main:app --reload --host 0.0.0.0 --port 8000
+   ```
+The API should then be accessible at `http://127.0.0.1:8000`.
@@ -0,0 +1 @@
+
@@ -0,0 +1,19 @@
+from pathlib import Path
+
+# Determine PROJECT_ROOT dynamically.
+# If config.py is at /Volumes/SAM2/CODE/chatterbox-test/backend/app/config.py
+# then PROJECT_ROOT (/Volumes/SAM2/CODE/chatterbox-test) is 2 levels up.
+PROJECT_ROOT = Path(__file__).resolve().parents[2]
+
+# Speaker data paths
+SPEAKER_DATA_BASE_DIR = PROJECT_ROOT / "speaker_data"
+SPEAKER_SAMPLES_DIR = SPEAKER_DATA_BASE_DIR / "speaker_samples"
+SPEAKERS_YAML_FILE = SPEAKER_DATA_BASE_DIR / "speakers.yaml"
+
+# TTS temporary output path (used by DialogProcessorService)
+TTS_TEMP_OUTPUT_DIR = PROJECT_ROOT / "tts_temp_outputs"
+
+# Final dialog output path (used by Dialog router and served by main app)
+# These are stored within the 'backend' directory to be easily servable.
+DIALOG_OUTPUT_PARENT_DIR = PROJECT_ROOT / "backend"
+DIALOG_GENERATED_DIR = DIALOG_OUTPUT_PARENT_DIR / "tts_generated_dialogs"
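As a quick sanity check (illustrative only, not part of the commit), the `parents[2]` arithmetic described in the comments above can be exercised in isolation:

```python
from pathlib import PurePosixPath

# Illustrative only: mirrors the comment in the config module. Starting from
# backend/app/config.py, parents[0] is app/, parents[1] is backend/, and
# parents[2] is the project root.
cfg_path = PurePosixPath("/Volumes/SAM2/CODE/chatterbox-test/backend/app/config.py")
project_root = cfg_path.parents[2]
print(project_root)  # /Volumes/SAM2/CODE/chatterbox-test
```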
@@ -0,0 +1,43 @@
+from fastapi import FastAPI
+from fastapi.staticfiles import StaticFiles
+from fastapi.middleware.cors import CORSMiddleware
+from pathlib import Path
+
+from app.routers import speakers, dialog  # Import the routers
+from app import config
+
+app = FastAPI(
+    title="Chatterbox TTS API",
+    description="API for generating TTS dialogs using Chatterbox TTS.",
+    version="0.1.0",
+)
+
+# CORS Middleware configuration
+origins = [
+    "http://localhost:8001",
+    "http://127.0.0.1:8001",
+    # Add other origins if needed, e.g., your deployed frontend URL
+]
+
+app.add_middleware(
+    CORSMiddleware,
+    allow_origins=origins,
+    allow_credentials=True,
+    allow_methods=["*"],  # Allows all methods
+    allow_headers=["*"],  # Allows all headers
+)
+
+# Include routers
+app.include_router(speakers.router, prefix="/api/speakers", tags=["Speakers"])
+app.include_router(dialog.router, prefix="/api/dialog", tags=["Dialog Generation"])
+
+@app.get("/")
+async def read_root():
+    return {"message": "Welcome to the Chatterbox TTS API!"}
+
+# Ensure the directory for serving generated audio exists
+config.DIALOG_GENERATED_DIR.mkdir(parents=True, exist_ok=True)
+
+# Mount StaticFiles to serve generated dialogs
+app.mount("/generated_audio", StaticFiles(directory=config.DIALOG_GENERATED_DIR), name="generated_audio")
+
+# Further endpoints for speakers, dialog generation, etc., will be added here.
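With the `StaticFiles` mount above, a generated file's public URL is just the mount prefix plus its filename. A small illustrative check (the filename is hypothetical, matching the naming used by the dialog router):

```python
# Illustrative only: a file written to DIALOG_GENERATED_DIR as
# "my_dialog_v1_concatenated.wav" is served under the /generated_audio mount.
filename = "my_dialog_v1_concatenated.wav"  # hypothetical output name
url = f"/generated_audio/{filename}"
print(url)  # /generated_audio/my_dialog_v1_concatenated.wav
```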
@@ -0,0 +1 @@
+
@@ -0,0 +1,43 @@
+from pydantic import BaseModel, Field, validator
+from typing import List, Union, Literal, Optional
+
+class DialogItemBase(BaseModel):
+    type: str
+
+class SpeechItem(DialogItemBase):
+    type: Literal['speech'] = 'speech'
+    speaker_id: str = Field(..., description="ID of the speaker for this speech segment.")
+    text: str = Field(..., description="Text content to be synthesized.")
+    exaggeration: Optional[float] = Field(0.5, description="Controls the expressiveness of the speech. Higher values lead to more exaggerated speech. Default from Gradio.")
+    cfg_weight: Optional[float] = Field(0.5, description="Classifier-Free Guidance weight. Higher values make the speech more aligned with the prompt text and speaker characteristics. Default from Gradio.")
+    temperature: Optional[float] = Field(0.8, description="Controls randomness in generation. Lower values make speech more deterministic, higher values more varied. Default from Gradio.")
+
+class SilenceItem(DialogItemBase):
+    type: Literal['silence'] = 'silence'
+    duration: float = Field(..., gt=0, description="Duration of the silence in seconds.")
+
+class DialogRequest(BaseModel):
+    dialog_items: List[Union[SpeechItem, SilenceItem]] = Field(..., description="A list of speech and silence items.")
+    output_base_name: str = Field(..., description="Base name for the output files (e.g., 'my_dialog_v1'). Extensions will be added automatically.")
+
+    @validator('dialog_items', pre=True, each_item=True)
+    def check_item_type(cls, item):
+        if not isinstance(item, dict):
+            raise ValueError("Each dialog item must be a dictionary.")
+        item_type = item.get('type')
+        if item_type == 'speech':
+            # Pydantic will handle further validation based on SpeechItem model
+            return item
+        elif item_type == 'silence':
+            # Pydantic will handle further validation based on SilenceItem model
+            return item
+        raise ValueError(f"Unknown dialog item type: {item_type}. Must be 'speech' or 'silence'.")
+
+class DialogResponse(BaseModel):
+    log: str = Field(description="Log of the dialog generation process.")
+    # For now, these URLs might be relative paths or placeholders.
+    # Actual serving strategy will determine the final URL format.
+    concatenated_audio_url: Optional[str] = Field(None, description="URL/path to the concatenated audio file.")
+    zip_archive_url: Optional[str] = Field(None, description="URL/path to the ZIP archive of all audio files.")
+    temp_dir_path: Optional[str] = Field(None, description="Path to the temporary directory holding generated files, for server-side reference.")
+    error_message: Optional[str] = Field(None, description="Error message if the process failed globally.")
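For illustration, the type dispatch performed by the `dialog_items` validator boils down to the following stdlib-only sketch (this is not the Pydantic code itself; per-field validation still happens in the models):

```python
# Stdlib-only sketch of the validator's type dispatch: dicts tagged 'speech'
# or 'silence' pass through, anything else is rejected.
def check_item_type(item):
    if not isinstance(item, dict):
        raise ValueError("Each dialog item must be a dictionary.")
    item_type = item.get("type")
    if item_type in ("speech", "silence"):
        return item  # the per-model field validation runs afterwards
    raise ValueError(f"Unknown dialog item type: {item_type}. Must be 'speech' or 'silence'.")

print(check_item_type({"type": "silence", "duration": 0.5})["type"])  # silence
```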
@@ -0,0 +1,20 @@
+from pydantic import BaseModel
+from typing import Optional
+
+class SpeakerBase(BaseModel):
+    name: str
+
+class SpeakerCreate(SpeakerBase):
+    # For receiving speaker name, file will be handled separately by FastAPI's UploadFile
+    pass
+
+class Speaker(SpeakerBase):
+    id: str
+    sample_path: Optional[str] = None  # Path to the speaker's audio sample
+
+    class Config:
+        from_attributes = True  # Replaces orm_mode = True in Pydantic v2
+
+class SpeakerResponse(SpeakerBase):
+    id: str
+    message: Optional[str] = None
@@ -0,0 +1 @@
+
@@ -0,0 +1,189 @@
+from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
+from pathlib import Path
+import shutil
+
+from app.models.dialog_models import DialogRequest, DialogResponse
+from app.services.tts_service import TTSService
+from app.services.speaker_service import SpeakerManagementService
+from app.services.dialog_processor_service import DialogProcessorService
+from app.services.audio_manipulation_service import AudioManipulationService
+from app import config
+
+router = APIRouter()
+
+# --- Dependency Injection for Services ---
+# These can be more sophisticated with a proper DI container or FastAPI's Depends system if services had complex init.
+# For now, direct instantiation or simple Depends is fine.
+
+def get_tts_service():
+    # Consider making device configurable
+    return TTSService(device="mps")
+
+def get_speaker_management_service():
+    return SpeakerManagementService()
+
+def get_dialog_processor_service(
+    tts_service: TTSService = Depends(get_tts_service),
+    speaker_service: SpeakerManagementService = Depends(get_speaker_management_service)
+):
+    return DialogProcessorService(tts_service=tts_service, speaker_service=speaker_service)
+
+def get_audio_manipulation_service():
+    return AudioManipulationService()
+
+# --- Helper function to manage TTS model loading/unloading ---
+async def manage_tts_model_lifecycle(tts_service: TTSService, task_function, *args, **kwargs):
+    """Loads TTS model, executes task, then unloads model."""
+    try:
+        print("API: Loading TTS model...")
+        tts_service.load_model()
+        return await task_function(*args, **kwargs)
+    except Exception as e:
+        # Log or handle specific exceptions if needed before re-raising
+        print(f"API: Error during TTS model lifecycle or task execution: {e}")
+        raise
+    finally:
+        print("API: Unloading TTS model...")
+        tts_service.unload_model()
+
+async def process_dialog_flow(
+    request: DialogRequest,
+    dialog_processor: DialogProcessorService,
+    audio_manipulator: AudioManipulationService,
+    background_tasks: BackgroundTasks
+) -> DialogResponse:
+    """Core logic for processing the dialog request."""
+    processing_log_entries = []
+    concatenated_audio_file_path = None
+    zip_archive_file_path = None
+    final_temp_dir_path_str = None
+
+    try:
+        # 1. Process dialog to generate segments
+        # The DialogProcessorService creates its own temp dir for segments
+        dialog_processing_result = await dialog_processor.process_dialog(
+            dialog_items=[item.model_dump() for item in request.dialog_items],
+            output_base_name=request.output_base_name
+        )
+        processing_log_entries.append(dialog_processing_result['log'])
+        segment_details = dialog_processing_result['segment_files']
+        temp_segment_dir = Path(dialog_processing_result['temp_dir'])
+        final_temp_dir_path_str = str(temp_segment_dir)
+
+        # Filter out error segments for concatenation and zipping
+        valid_segment_paths_for_concat = [
+            Path(s['path']) for s in segment_details
+            if s['type'] == 'speech' and s.get('path') and Path(s['path']).exists()
+        ]
+
+        # Create a list of dicts suitable for concatenation service (speech paths and silence durations)
+        items_for_concatenation = []
+        for s_detail in segment_details:
+            if s_detail['type'] == 'speech' and s_detail.get('path') and Path(s_detail['path']).exists():
+                items_for_concatenation.append({'type': 'speech', 'path': s_detail['path']})
+            elif s_detail['type'] == 'silence' and 'duration' in s_detail:
+                items_for_concatenation.append({'type': 'silence', 'duration': s_detail['duration']})
+            # Errors are already logged by DialogProcessor
+
+        if not any(item['type'] == 'speech' for item in items_for_concatenation):
+            message = "No valid speech segments were generated. Cannot create concatenated audio or ZIP."
+            processing_log_entries.append(message)
+            return DialogResponse(
+                log="\n".join(processing_log_entries),
+                temp_dir_path=final_temp_dir_path_str,
+                error_message=message
+            )
+
+        # 2. Concatenate audio segments
+        config.DIALOG_GENERATED_DIR.mkdir(parents=True, exist_ok=True)
+        concat_filename = f"{request.output_base_name}_concatenated.wav"
+        concatenated_audio_file_path = config.DIALOG_GENERATED_DIR / concat_filename
+
+        audio_manipulator.concatenate_audio_segments(
+            segment_results=items_for_concatenation,
+            output_concatenated_path=concatenated_audio_file_path
+        )
+        processing_log_entries.append(f"Concatenated audio saved to: {concatenated_audio_file_path}")
+
+        # 3. Create ZIP archive
+        zip_filename = f"{request.output_base_name}_dialog_output.zip"
+        zip_archive_path = config.DIALOG_GENERATED_DIR / zip_filename
+
+        # Collect all valid generated speech segment files for zipping
+        individual_segment_paths = [
+            Path(s['path']) for s in segment_details
+            if s['type'] == 'speech' and s.get('path') and Path(s['path']).exists()
+        ]
+
+        # concatenated_audio_file_path is already defined and checked for existence before this block
+
+        audio_manipulator.create_zip_archive(
+            segment_file_paths=individual_segment_paths,
+            concatenated_audio_path=concatenated_audio_file_path,
+            output_zip_path=zip_archive_path
+        )
+        processing_log_entries.append(f"ZIP archive created at: {zip_archive_path}")
+
+        # Schedule cleanup of the temporary segment directory
+        # background_tasks.add_task(shutil.rmtree, temp_segment_dir, ignore_errors=True)
+        # processing_log_entries.append(f"Scheduled cleanup for temporary segment directory: {temp_segment_dir}")
+        # For now, let's not auto-delete, so user can inspect. Cleanup can be a separate endpoint/job.
+        processing_log_entries.append(f"Temporary segment directory for inspection: {temp_segment_dir}")
+
+        return DialogResponse(
+            log="\n".join(processing_log_entries),
+            # URLs should be relative to a static serving path, e.g., /generated_audio/
+            # For now, just returning the name, assuming they are in DIALOG_OUTPUT_DIR
+            concatenated_audio_url=f"/generated_audio/{concat_filename}",
+            zip_archive_url=f"/generated_audio/{zip_filename}",
+            temp_dir_path=final_temp_dir_path_str
+        )
+
+    except FileNotFoundError as e:
+        error_msg = f"File not found during dialog generation: {e}"
+        processing_log_entries.append(error_msg)
+        raise HTTPException(status_code=404, detail=error_msg)
+    except ValueError as e:
+        error_msg = f"Invalid value or configuration: {e}"
+        processing_log_entries.append(error_msg)
+        raise HTTPException(status_code=400, detail=error_msg)
+    except RuntimeError as e:
+        error_msg = f"Runtime error during dialog generation: {e}"
+        processing_log_entries.append(error_msg)
+        # This could be a 500 if it's an unexpected server error
+        raise HTTPException(status_code=500, detail=error_msg)
+    except Exception as e:
+        import traceback
+        error_msg = f"An unexpected error occurred: {e}\n{traceback.format_exc()}"
+        processing_log_entries.append(error_msg)
+        raise HTTPException(status_code=500, detail=error_msg)
+    finally:
+        # Ensure logs are captured even if an early exception occurs before full response construction
+        if not concatenated_audio_file_path and not zip_archive_file_path and processing_log_entries:
+            print("Dialog generation failed. Log: \n" + "\n".join(processing_log_entries))
+
+@router.post("/generate", response_model=DialogResponse)
+async def generate_dialog_endpoint(
+    request: DialogRequest,
+    background_tasks: BackgroundTasks,
+    tts_service: TTSService = Depends(get_tts_service),
+    dialog_processor: DialogProcessorService = Depends(get_dialog_processor_service),
+    audio_manipulator: AudioManipulationService = Depends(get_audio_manipulation_service)
+):
+    """
+    Generates a dialog from a list of speech and silence items.
+    - Processes text into manageable chunks.
+    - Generates speech for each chunk using the specified speaker.
+    - Inserts silences as requested.
+    - Concatenates all audio segments into a single file.
+    - Creates a ZIP archive of all individual segments and the concatenated file.
+    """
+    # Wrap the core processing logic with model loading/unloading
+    return await manage_tts_model_lifecycle(
+        tts_service,
+        process_dialog_flow,
+        request=request,
+        dialog_processor=dialog_processor,
+        audio_manipulator=audio_manipulator,
+        background_tasks=background_tasks
+    )
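An example request body for `POST /api/dialog/generate`, built with the stdlib `json` module. The field names follow the `DialogRequest`/`SpeechItem`/`SilenceItem` models; the speaker IDs ("alice", "bob") are invented for illustration:

```python
import json

# Hypothetical payload for POST /api/dialog/generate. Speaker IDs are
# invented; real IDs come from GET /api/speakers.
payload = {
    "output_base_name": "my_dialog_v1",
    "dialog_items": [
        {"type": "speech", "speaker_id": "alice", "text": "Hello there."},
        {"type": "silence", "duration": 0.5},
        {"type": "speech", "speaker_id": "bob", "text": "Hi! Ready to record?",
         "exaggeration": 0.5, "cfg_weight": 0.5, "temperature": 0.8},
    ],
}
body = json.dumps(payload)  # what a client such as api.js would send
print(len(json.loads(body)["dialog_items"]))  # 3
```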
@@ -0,0 +1,81 @@
+from typing import List, Annotated
+from fastapi import APIRouter, Depends, HTTPException, UploadFile, File, Form
+
+from app.models.speaker_models import Speaker, SpeakerResponse
+from app.services.speaker_service import SpeakerManagementService
+
+router = APIRouter(
+    tags=["Speakers"],
+    responses={404: {"description": "Not found"}},
+)
+
+# Dependency to get the speaker service instance
+# This could be more sophisticated with a proper DI system later
+def get_speaker_service():
+    return SpeakerManagementService()
+
+@router.get("/", response_model=List[Speaker])
+async def get_all_speakers(
+    service: Annotated[SpeakerManagementService, Depends(get_speaker_service)]
+):
+    """
+    Retrieve all available speakers.
+    """
+    return service.get_speakers()
+
+@router.post("/", response_model=SpeakerResponse, status_code=201)
+async def create_new_speaker(
+    name: Annotated[str, Form()],
+    audio_file: Annotated[UploadFile, File()],
+    service: Annotated[SpeakerManagementService, Depends(get_speaker_service)]
+):
+    """
+    Add a new speaker.
+    Requires speaker name (form data) and an audio sample file (file upload).
+    """
+    if not audio_file.filename:
+        raise HTTPException(status_code=400, detail="No audio file provided.")
+    if not audio_file.content_type or not audio_file.content_type.startswith("audio/"):
+        raise HTTPException(status_code=400, detail="Invalid audio file type. Please upload a valid audio file (e.g., WAV, MP3).")
+
+    try:
+        new_speaker = await service.add_speaker(name=name, audio_file=audio_file)
+        return SpeakerResponse(
+            id=new_speaker.id,
+            name=new_speaker.name,
+            message="Speaker added successfully."
+        )
+    except HTTPException as e:
+        # Re-raise HTTPExceptions from the service (e.g., file save error)
+        raise e
+    except Exception as e:
+        # Catch-all for other unexpected errors
+        raise HTTPException(status_code=500, detail=f"An unexpected error occurred: {str(e)}")
+
+@router.get("/{speaker_id}", response_model=Speaker)
+async def get_speaker_details(
+    speaker_id: str,
+    service: Annotated[SpeakerManagementService, Depends(get_speaker_service)]
+):
+    """
+    Get details for a specific speaker by ID.
+    """
+    speaker = service.get_speaker_by_id(speaker_id)
+    if not speaker:
+        raise HTTPException(status_code=404, detail="Speaker not found")
+    return speaker
+
+@router.delete("/{speaker_id}", response_model=dict)
+async def remove_speaker(
+    speaker_id: str,
+    service: Annotated[SpeakerManagementService, Depends(get_speaker_service)]
+):
+    """
+    Delete a speaker by ID.
+    """
+    deleted = service.delete_speaker(speaker_id)
+    if not deleted:
+        raise HTTPException(status_code=404, detail="Speaker not found or could not be deleted.")
+    return {"message": "Speaker deleted successfully"}
@@ -0,0 +1 @@
+
@ -0,0 +1,241 @@
|
||||||
|
import torch
|
||||||
|
import torchaudio
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import List, Dict, Union, Tuple
|
||||||
|
import zipfile
|
||||||
|
|
||||||
|
# Define a common sample rate, e.g., from the TTS model. This should ideally be configurable or dynamically obtained.
# For now, assume the TTS model (ChatterboxTTS) outputs at a known sample rate.
# The ChatterboxTTS model.sr is 24000.
DEFAULT_SAMPLE_RATE = 24000


class AudioManipulationService:
    def __init__(self, default_sample_rate: int = DEFAULT_SAMPLE_RATE):
        self.sample_rate = default_sample_rate

    def _load_audio(self, file_path: Union[str, Path]) -> Tuple[torch.Tensor, int]:
        """Loads an audio file and returns the waveform and sample rate."""
        try:
            waveform, sr = torchaudio.load(file_path)
            return waveform, sr
        except Exception as e:
            raise RuntimeError(f"Error loading audio file {file_path}: {e}")

    def _create_silence(self, duration_seconds: float) -> torch.Tensor:
        """Creates a silent audio tensor of a given duration."""
        num_frames = int(duration_seconds * self.sample_rate)
        return torch.zeros((1, num_frames))  # Mono silence

    def concatenate_audio_segments(
        self,
        segment_results: List[Dict],
        output_concatenated_path: Path
    ) -> Path:
        """
        Concatenates audio segments and silences into a single audio file.

        Args:
            segment_results: A list of dictionaries, where each dict represents an audio
                segment or a silence. Expected format:
                For speech: {'type': 'speech', 'path': 'path/to/audio.wav', ...}
                For silence: {'type': 'silence', 'duration': 0.5, ...}
            output_concatenated_path: The path to save the final concatenated audio file.

        Returns:
            The path to the concatenated audio file.
        """
        all_waveforms: List[torch.Tensor] = []
        current_sample_rate = self.sample_rate  # Assumed initially; verified against the first loaded audio

        for i, segment_info in enumerate(segment_results):
            segment_type = segment_info.get("type")

            if segment_type == "speech":
                audio_path_str = segment_info.get("path")
                if not audio_path_str:
                    print(f"Warning: Speech segment {i} has no path. Skipping.")
                    continue

                audio_path = Path(audio_path_str)
                if not audio_path.exists():
                    print(f"Warning: Audio file {audio_path} for segment {i} not found. Skipping.")
                    continue

                try:
                    waveform, sr = self._load_audio(audio_path)
                    # Keep the sample rate consistent across segments: the first loaded
                    # audio sets the reference rate, and any later segment with a
                    # different rate is resampled to match it.
                    if i == 0 and not all_waveforms:  # First audio segment sets the reference SR if not default
                        current_sample_rate = sr
                        if sr != self.sample_rate:
                            print(f"Warning: First audio segment SR ({sr} Hz) differs from service default SR ({self.sample_rate} Hz). Using segment SR.")

                    if sr != current_sample_rate:
                        print(f"Warning: Sample rate mismatch for {audio_path} ({sr} Hz) vs expected ({current_sample_rate} Hz). Resampling...")
                        resampler = torchaudio.transforms.Resample(orig_freq=sr, new_freq=current_sample_rate)
                        waveform = resampler(waveform)

                    # Ensure mono: if stereo, take the mean of the channels.
                    if waveform.shape[0] > 1:
                        waveform = torch.mean(waveform, dim=0, keepdim=True)

                    all_waveforms.append(waveform)
                except Exception as e:
                    print(f"Error processing speech segment {audio_path}: {e}. Skipping.")

            elif segment_type == "silence":
                duration = segment_info.get("duration")
                if duration is None or not isinstance(duration, (int, float)) or duration < 0:
                    print(f"Warning: Silence segment {i} has invalid duration. Skipping.")
                    continue
                silence_waveform = self._create_silence(float(duration))
                all_waveforms.append(silence_waveform)

            elif segment_type == "error":
                # Errors are already logged by DialogProcessorService; just skip here.
                print(f"Skipping segment {i} due to previous error: {segment_info.get('message')}")
                continue

            else:
                print(f"Warning: Unknown segment type '{segment_type}' at index {i}. Skipping.")

        if not all_waveforms:
            raise ValueError("No valid audio segments or silences found to concatenate.")

        # Concatenate all waveforms along the time axis
        final_waveform = torch.cat(all_waveforms, dim=1)

        # Ensure the output directory exists
        output_concatenated_path.parent.mkdir(parents=True, exist_ok=True)

        # Save the concatenated audio
        try:
            torchaudio.save(str(output_concatenated_path), final_waveform, current_sample_rate)
            print(f"Concatenated audio saved to: {output_concatenated_path}")
            return output_concatenated_path
        except Exception as e:
            raise RuntimeError(f"Error saving concatenated audio to {output_concatenated_path}: {e}")

    def create_zip_archive(
        self,
        segment_file_paths: List[Path],
        concatenated_audio_path: Path,
        output_zip_path: Path
    ) -> Path:
        """
        Creates a ZIP archive containing the individual audio segments and the concatenated audio file.

        Args:
            segment_file_paths: A list of paths to the individual audio segment files.
            concatenated_audio_path: Path to the final concatenated audio file.
            output_zip_path: The path to save the output ZIP archive.

        Returns:
            The path to the created ZIP archive.
        """
        output_zip_path.parent.mkdir(parents=True, exist_ok=True)

        with zipfile.ZipFile(output_zip_path, 'w', zipfile.ZIP_DEFLATED) as zf:
            # Add the concatenated audio
            if concatenated_audio_path.exists():
                zf.write(concatenated_audio_path, arcname=concatenated_audio_path.name)
            else:
                print(f"Warning: Concatenated audio file {concatenated_audio_path} not found for zipping.")

            # Add individual segments under a subdirectory for organization
            segments_dir_name = "segments"
            for file_path in segment_file_paths:
                if file_path.exists() and file_path.is_file():
                    zf.write(file_path, arcname=Path(segments_dir_name) / file_path.name)
                else:
                    print(f"Warning: Segment file {file_path} not found or is not a file. Skipping for zipping.")

        print(f"ZIP archive created at: {output_zip_path}")
        return output_zip_path


# Example Usage (Test Block)
if __name__ == "__main__":
    import tempfile
    import shutil

    # Create a temporary directory for test files
    test_temp_dir = Path(tempfile.mkdtemp(prefix="audio_manip_test_"))
    print(f"Created temporary test directory: {test_temp_dir}")

    # Instance of the service
    audio_service = AudioManipulationService()

    # --- Test Data Setup ---
    # Create dummy audio files (short clips with different names and sample rates)
    dummy_sr = audio_service.sample_rate
    segment1_path = test_temp_dir / "segment1_speech.wav"
    segment2_path = test_temp_dir / "segment2_speech.wav"

    torchaudio.save(str(segment1_path), audio_service._create_silence(1.0), dummy_sr)
    # Create a dummy segment with a different sample rate to test resampling
    dummy_sr_alt = 16000
    temp_waveform_alt_sr = torch.rand((1, int(0.5 * dummy_sr_alt)))  # 0.5s at 16kHz
    torchaudio.save(str(segment2_path), temp_waveform_alt_sr, dummy_sr_alt)

    segment_results_for_concat = [
        {"type": "speech", "path": str(segment1_path), "speaker_id": "spk1", "text_chunk": "Test 1"},
        {"type": "silence", "duration": 0.5},
        {"type": "speech", "path": str(segment2_path), "speaker_id": "spk2", "text_chunk": "Test 2 (alt SR)"},
        {"type": "error", "message": "Simulated error, should be skipped"},
        {"type": "speech", "path": "non_existent_segment.wav"},  # Test non-existent file
        {"type": "silence", "duration": -0.2}  # Test invalid duration
    ]

    concatenated_output_path = test_temp_dir / "final_concatenated_audio.wav"
    zip_output_path = test_temp_dir / "audio_archive.zip"

    all_segment_files_for_zip = [segment1_path, segment2_path]

    try:
        # Test concatenation
        print("\n--- Testing Concatenation ---")
        actual_concat_path = audio_service.concatenate_audio_segments(
            segment_results_for_concat,
            concatenated_output_path
        )
        print(f"Concatenation test successful. Output: {actual_concat_path}")
        assert actual_concat_path.exists()
        # Basic check: load the concatenated audio and verify its approximate duration
        concat_wav, concat_sr = audio_service._load_audio(actual_concat_path)
        expected_duration = 1.0 + 0.5 + 0.5  # seg1 (1.0s) + silence (0.5s) + seg2 (0.5s) = 2.0s
        actual_duration = concat_wav.shape[1] / concat_sr
        print(f"Expected duration (approx): {expected_duration}s, Actual duration: {actual_duration:.2f}s")
        assert abs(actual_duration - expected_duration) < 0.1  # Allow small deviation

        # Test Zipping
        print("\n--- Testing Zipping ---")
        actual_zip_path = audio_service.create_zip_archive(
            all_segment_files_for_zip,
            actual_concat_path,
            zip_output_path
        )
        print(f"Zipping test successful. Output: {actual_zip_path}")
        assert actual_zip_path.exists()
        # Verify zip contents (basic check)
        segments_dir_name = "segments"  # Matches the subdirectory name used by create_zip_archive
        with zipfile.ZipFile(actual_zip_path, 'r') as zf_read:
            zip_contents = zf_read.namelist()
            print(f"ZIP contents: {zip_contents}")
            assert Path(segments_dir_name) / segment1_path.name in [Path(p) for p in zip_contents]
            assert Path(segments_dir_name) / segment2_path.name in [Path(p) for p in zip_contents]
            assert concatenated_output_path.name in zip_contents

        print("\nAll AudioManipulationService tests passed!")

    except Exception as e:
        import traceback
        print("\nAn error occurred during AudioManipulationService tests:")
        traceback.print_exc()
    finally:
        # Clean up temporary directory
        # shutil.rmtree(test_temp_dir)
        # print(f"Cleaned up temporary test directory: {test_temp_dir}")
        print(f"Test files are in {test_temp_dir}. Please inspect and delete manually if needed.")
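A torch-free sketch of the frame arithmetic the service relies on (in `_create_silence` and in the duration check of the test block above): each segment contributes `int(duration * sample_rate)` frames, and the concatenated length is their sum. Names here are illustrative, not part of the service.

```python
# Illustrative sketch of the frame math used by AudioManipulationService.
SAMPLE_RATE = 24000  # matches DEFAULT_SAMPLE_RATE

def frames_for(duration_seconds: float, sample_rate: int = SAMPLE_RATE) -> int:
    """Number of frames in a mono segment of the given duration."""
    return int(duration_seconds * sample_rate)

# seg1 (1.0 s) + silence (0.5 s) + seg2 (0.5 s), as in the test block:
total_frames = frames_for(1.0) + frames_for(0.5) + frames_for(0.5)
total_seconds = total_frames / SAMPLE_RATE  # 2.0
```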
@ -0,0 +1,265 @@
from pathlib import Path
from typing import List, Dict, Any, Union
import re

from .tts_service import TTSService
from .speaker_service import SpeakerManagementService
from app import config
# Potentially import models for the dialog structure if we define them
# from ..models.dialog_models import DialogItem  # Example


class DialogProcessorService:
    def __init__(self, tts_service: TTSService, speaker_service: SpeakerManagementService):
        self.tts_service = tts_service
        self.speaker_service = speaker_service
        # Base directory for storing individual audio segments during processing
        self.temp_audio_dir = config.TTS_TEMP_OUTPUT_DIR
        self.temp_audio_dir.mkdir(parents=True, exist_ok=True)

    def _split_text(self, text: str, max_length: int = 300) -> List[str]:
        """
        Splits text into chunks suitable for TTS processing, attempting to respect sentence boundaries.
        Similar to split_text_at_sentence_boundaries from the original Gradio app.
        max_length is approximate, as the method tries to finish sentences.
        """
        # Basic sentence splitting using common delimiters; more sophisticated NLP could be used.
        # This regex splits after '.', '!', '?', or an ellipsis, followed by whitespace or end of string.
        # It also handles cases where these delimiters are followed by quotes or closing brackets.
        sentences = re.split(r'(?<=[.!?\u2026])\s+|(?<=[.!?\u2026])(?=["\')\]\}\u201d\u2019])|(?<=[.!?\u2026])$', text.strip())
        sentences = [s.strip() for s in sentences if s and s.strip()]

        chunks = []
        current_chunk = ""
        for sentence in sentences:
            if not sentence:
                continue
            if not current_chunk:  # First sentence for this chunk
                current_chunk = sentence
            elif len(current_chunk) + len(sentence) + 1 <= max_length:
                current_chunk += " " + sentence
            else:
                chunks.append(current_chunk)
                current_chunk = sentence

        if current_chunk:  # Add the last chunk
            chunks.append(current_chunk)

        # Further split any chunks that are still too long (e.g., a single very long sentence)
        final_chunks = []
        for chunk in chunks:
            if len(chunk) > max_length:
                # Simple split by length if a sentence itself is too long
                for i in range(0, len(chunk), max_length):
                    final_chunks.append(chunk[i:i+max_length])
            else:
                final_chunks.append(chunk)
        return final_chunks

    async def process_dialog(self, dialog_items: List[Dict[str, Any]], output_base_name: str) -> Dict[str, Any]:
        """
        Processes a list of dialog items (speech or silence) to generate audio segments.

        Args:
            dialog_items: A list of dictionaries, where each item has:
                - 'type': 'speech' or 'silence'
                - For 'speech': 'speaker_id': str, 'text': str
                - For 'silence': 'duration': float (in seconds)
            output_base_name: The base name for the output files.

        Returns:
            A dictionary containing paths to generated segments and other processing info.
            Example: {
                "log": "Processing complete...",
                "segment_files": [
                    {"type": "speech", "path": "/path/to/segment1.wav", "speaker_id": "X", "text_chunk": "..."},
                    {"type": "silence", "duration": 0.5},
                    {"type": "speech", "path": "/path/to/segment2.wav", "speaker_id": "Y", "text_chunk": "..."}
                ],
                "temp_dir": str(self.temp_audio_dir / output_base_name)
            }
        """
        segment_results = []
        processing_log = []

        # Create a unique subdirectory for this dialog's temporary files
        dialog_temp_dir = self.temp_audio_dir / output_base_name
        dialog_temp_dir.mkdir(parents=True, exist_ok=True)
        processing_log.append(f"Created temporary directory for segments: {dialog_temp_dir}")

        segment_idx = 0
        for i, item in enumerate(dialog_items):
            item_type = item.get("type")
            processing_log.append(f"Processing item {i+1}: type='{item_type}'")

            if item_type == "speech":
                speaker_id = item.get("speaker_id")
                text = item.get("text")
                if not speaker_id or not text:
                    processing_log.append(f"Skipping speech item {i+1} due to missing speaker_id or text.")
                    segment_results.append({"type": "error", "message": "Missing speaker_id or text"})
                    continue

                # Validate speaker_id and get the speaker sample path
                speaker_info = self.speaker_service.get_speaker_by_id(speaker_id)
                if not speaker_info:
                    processing_log.append(f"Speaker ID '{speaker_id}' not found. Skipping item {i+1}.")
                    segment_results.append({"type": "error", "message": f"Speaker ID '{speaker_id}' not found"})
                    continue
                if not speaker_info.sample_path:
                    processing_log.append(f"Speaker ID '{speaker_id}' has no sample path defined. Skipping item {i+1}.")
                    segment_results.append({"type": "error", "message": f"Speaker ID '{speaker_id}' has no sample path defined"})
                    continue

                # speaker_info.sample_path is relative to config.SPEAKER_DATA_BASE_DIR
                abs_speaker_sample_path = config.SPEAKER_DATA_BASE_DIR / speaker_info.sample_path
                if not abs_speaker_sample_path.is_file():
                    processing_log.append(f"Speaker sample file not found or is not a file at '{abs_speaker_sample_path}' for speaker ID '{speaker_id}'. Skipping item {i+1}.")
                    segment_results.append({"type": "error", "message": f"Speaker sample not a file or not found: {abs_speaker_sample_path}"})
                    continue

                text_chunks = self._split_text(text)
                processing_log.append(f"Split text for speaker '{speaker_id}' into {len(text_chunks)} chunk(s).")

                for chunk_idx, text_chunk in enumerate(text_chunks):
                    segment_filename_base = f"{output_base_name}_seg{segment_idx}_spk{speaker_id}_chunk{chunk_idx}"
                    processing_log.append(f"Generating speech for chunk: '{text_chunk[:50]}...' using speaker '{speaker_id}'")

                    try:
                        segment_output_path = await self.tts_service.generate_speech(
                            text=text_chunk,
                            speaker_id=speaker_id,  # For metadata; the TTS uses the sample path below
                            speaker_sample_path=str(abs_speaker_sample_path),
                            output_filename_base=segment_filename_base,
                            output_dir=dialog_temp_dir,  # Save to the dialog's temp dir
                            exaggeration=item.get('exaggeration', 0.5),  # Default from Gradio; the Pydantic model should provide this
                            cfg_weight=item.get('cfg_weight', 0.5),  # Default from Gradio; the Pydantic model should provide this
                            temperature=item.get('temperature', 0.8)  # Default from Gradio; the Pydantic model should provide this
                        )
                        segment_results.append({
                            "type": "speech",
                            "path": str(segment_output_path),
                            "speaker_id": speaker_id,
                            "text_chunk": text_chunk
                        })
                        processing_log.append(f"Successfully generated segment: {segment_output_path}")
                    except Exception as e:
                        error_message = f"Error generating speech for chunk '{text_chunk[:50]}...': {repr(e)}"
                        processing_log.append(error_message)
                        segment_results.append({"type": "error", "message": error_message, "text_chunk": text_chunk})
                    segment_idx += 1

            elif item_type == "silence":
                duration = item.get("duration")
                if duration is None or duration < 0:
                    processing_log.append(f"Skipping silence item {i+1} due to invalid duration.")
                    segment_results.append({"type": "error", "message": "Invalid duration for silence"})
                    continue
                segment_results.append({"type": "silence", "duration": float(duration)})
                processing_log.append(f"Added silence of {duration}s.")

            else:
                processing_log.append(f"Unknown item type '{item_type}' at item {i+1}. Skipping.")
                segment_results.append({"type": "error", "message": f"Unknown item type: {item_type}"})

        return {
            "log": "\n".join(processing_log),
            "segment_files": segment_results,
            "temp_dir": str(dialog_temp_dir)  # For cleanup or zipping later
        }

if __name__ == "__main__":
    import asyncio
    import pprint

    async def main_test():
        # Initialize services
        tts_service = TTSService(device="mps")  # or your preferred device
        speaker_service = SpeakerManagementService()
        dialog_processor = DialogProcessorService(tts_service, speaker_service)

        # Ensure the dummy speaker sample exists (the TTSService test block creates it).
        # If it is missing, the 'test_speaker_for_dialog_proc' speaker will fail file validation.
        # Running TTSService's own main_test here is a bit of a hack for testing;
        # ideally, test assets are managed independently.
        try:
            print("Ensuring dummy speaker sample is created by running TTSService's main_test logic...")
            from .tts_service import main_test as tts_main_test
            await tts_main_test()  # This will create the dummy_speaker_test.wav
            print("TTSService main_test completed, dummy sample should exist.")
        except ImportError:
            print("Could not import tts_service.main_test directly. Ensure dummy_speaker_test.wav exists.")
        except Exception as e:
            print(f"Error running tts_service.main_test for dummy sample creation: {e}")
            print("Proceeding, but 'test_speaker_for_dialog_proc' might fail if the sample is missing.")

        sample_dialog_items = [
            {
                "type": "speech",
                "speaker_id": "test_speaker_for_dialog_proc",  # Defined in speakers.yaml
                "text": "Hello world! This is the first speech segment."
            },
            {
                "type": "silence",
                "duration": 0.75
            },
            {
                "type": "speech",
                "speaker_id": "test_speaker_for_dialog_proc",
                "text": "This is a much longer piece of text that should definitely be split into multiple, smaller chunks by the dialog processor. It contains several sentences. Let's see how it handles this. The maximum length is set to 300 characters, but it tries to respect sentence boundaries. This sentence itself is quite long and might even be split mid-sentence if it exceeds the hard limit after sentence splitting. We will observe the output carefully to ensure it works as expected, creating multiple audio files for this single text block if necessary."
            },
            {
                "type": "speech",
                "speaker_id": "non_existent_speaker_id",
                "text": "This should fail because the speaker does not exist."
            },
            {
                "type": "invalid_type",
                "text": "This item has an invalid type."
            },
            {
                "type": "speech",
                "speaker_id": "test_speaker_for_dialog_proc",
                "text": None  # Test missing text
            },
            {
                "type": "speech",
                "speaker_id": None,  # Test missing speaker_id
                "text": "This is a test with a missing speaker ID."
            },
            {
                "type": "silence",
                "duration": -0.5  # Invalid duration
            }
        ]

        output_base_name = "dialog_processor_test_run"

        try:
            print("\nLoading TTS model for DialogProcessorService test...")
            # TTSService's generate_speech will load the model if it is not already loaded,
            # but explicit load/unload is good practice for a test block.
            tts_service.load_model()

            print(f"\nProcessing dialog items with base name: {output_base_name}...")
            results = await dialog_processor.process_dialog(sample_dialog_items, output_base_name)

            print("\n--- Processing Log ---")
            print(results.get("log"))
            print("\n--- Segment Files / Results ---")
            pprint.pprint(results.get("segment_files"))
            print(f"\nTemporary directory used: {results.get('temp_dir')}")
            print("\nPlease check the temporary directory for generated audio segments.")

        except Exception as e:
            import traceback
            print("\nAn error occurred during the DialogProcessorService test:")
            traceback.print_exc()
        finally:
            print("\nUnloading TTS model...")
            tts_service.unload_model()
            print("DialogProcessorService test finished.")

    asyncio.run(main_test())
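The chunking strategy of `_split_text` (greedy packing of sentences up to `max_length`, then a hard split for any oversized chunk) can be exercised standalone. This sketch reproduces the core loop with a simplified delimiter regex; it is an approximation for illustration, not the exact production regex.

```python
import re

def split_text(text: str, max_length: int = 300) -> list:
    """Greedy sentence packing up to max_length, then hard-split oversized chunks."""
    # Simplified sentence split: break after '.', '!' or '?' followed by whitespace.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s.strip()]

    chunks, current = [], ""
    for sentence in sentences:
        if not current:
            current = sentence
        elif len(current) + len(sentence) + 1 <= max_length:
            current += " " + sentence  # sentence still fits in the current chunk
        else:
            chunks.append(current)
            current = sentence
    if current:
        chunks.append(current)

    # Hard-split any chunk that is still too long (e.g., one very long sentence).
    final = []
    for chunk in chunks:
        if len(chunk) > max_length:
            for i in range(0, len(chunk), max_length):
                final.append(chunk[i:i + max_length])
        else:
            final.append(chunk)
    return final
```

For example, `split_text("One. Two. Three.", max_length=10)` packs the first two sentences into one chunk and starts a new chunk for the third.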
@ -0,0 +1,147 @@
import yaml
import uuid
import os
import io  # For BytesIO
import torchaudio  # For audio processing
from pathlib import Path
from typing import List, Dict, Optional, Any

from fastapi import UploadFile, HTTPException
from app.models.speaker_models import Speaker, SpeakerCreate
from app import config


class SpeakerManagementService:
    def __init__(self):
        self._ensure_data_files_exist()
        self.speakers_data = self._load_speakers_data()

    def _ensure_data_files_exist(self):
        """Ensures the speaker data directory and YAML file exist."""
        config.SPEAKER_DATA_BASE_DIR.mkdir(parents=True, exist_ok=True)
        config.SPEAKER_SAMPLES_DIR.mkdir(parents=True, exist_ok=True)
        if not config.SPEAKERS_YAML_FILE.exists():
            with open(config.SPEAKERS_YAML_FILE, 'w') as f:
                yaml.dump({}, f)  # Initialize with an empty dict

    def _load_speakers_data(self) -> Dict[str, Any]:
        """Loads speaker data from the YAML file as a dict keyed by speaker ID."""
        try:
            with open(config.SPEAKERS_YAML_FILE, 'r') as f:
                data = yaml.safe_load(f)
                return data if isinstance(data, dict) else {}  # Ensure it's a dict
        except FileNotFoundError:
            return {}
        except yaml.YAMLError:
            # Handle a corrupted YAML file: log the error and return an empty dict
            print(f"Error: Corrupted speakers YAML file at {config.SPEAKERS_YAML_FILE}")
            return {}

    def _save_speakers_data(self):
        """Saves the current speaker data to the YAML file."""
        with open(config.SPEAKERS_YAML_FILE, 'w') as f:
            yaml.dump(self.speakers_data, f, sort_keys=False)

    def get_speakers(self) -> List[Speaker]:
        """Returns a list of all speakers."""
        # self.speakers_data is a dict: {speaker_id: {name: ..., sample_path: ...}}
        return [Speaker(id=spk_id, **spk_attrs) for spk_id, spk_attrs in self.speakers_data.items()]

    def get_speaker_by_id(self, speaker_id: str) -> Optional[Speaker]:
        """Retrieves a speaker by their ID."""
        if speaker_id in self.speakers_data:
            speaker_attributes = self.speakers_data[speaker_id]
            return Speaker(id=speaker_id, **speaker_attributes)
        return None

    async def add_speaker(self, name: str, audio_file: UploadFile) -> Speaker:
        """Adds a new speaker, converts the sample to WAV, saves it, and updates the YAML file."""
        speaker_id = str(uuid.uuid4())

        # Define the standardized sample filename and path (always WAV)
        sample_filename = f"{speaker_id}.wav"
        sample_path = config.SPEAKER_SAMPLES_DIR / sample_filename

        try:
            content = await audio_file.read()
            # Use BytesIO to hand the in-memory audio data to torchaudio
            audio_buffer = io.BytesIO(content)

            # Load the audio with torchaudio; this handles various formats (MP3, WAV, etc.)
            # waveform is a tensor, sample_rate is an int
            waveform, sample_rate = torchaudio.load(audio_buffer)

            # Save the audio data as WAV
            # Ensure SPEAKER_SAMPLES_DIR exists (though _ensure_data_files_exist should handle it)
            config.SPEAKER_SAMPLES_DIR.mkdir(parents=True, exist_ok=True)
            torchaudio.save(str(sample_path), waveform, sample_rate, format="wav")

        except RuntimeError as e:
            # torchaudio surfaces decoding failures (e.g. unsupported format, corrupted file)
            # as RuntimeError; there is no torchaudio.TorchaudioException class to catch.
            raise HTTPException(status_code=400, detail=f"Error processing audio file: {e}. Ensure it's a valid audio format (e.g., WAV, MP3).")
        except Exception as e:
            # General error handling for other issues (e.g., file system errors)
            raise HTTPException(status_code=500, detail=f"Could not save audio file: {e}")
        finally:
            await audio_file.close()

        # self.speakers_data is a dict keyed by speaker_id; store the sample path
        # relative to the speaker data dir
        self.speakers_data[speaker_id] = {
            "name": name,
            "sample_path": str(sample_path.relative_to(config.SPEAKER_DATA_BASE_DIR))
        }
        self._save_speakers_data()
        # Construct the Speaker model for return, including the ID
        return Speaker(id=speaker_id, name=name, sample_path=str(sample_path.relative_to(config.SPEAKER_DATA_BASE_DIR)))

    def delete_speaker(self, speaker_id: str) -> bool:
        """Deletes a speaker and their audio sample."""
        # Speaker data is a dictionary keyed by speaker_id
        speaker_to_delete = self.speakers_data.pop(speaker_id, None)

        if speaker_to_delete:
            self._save_speakers_data()
            sample_path_str = speaker_to_delete.get("sample_path")
            if sample_path_str:
                # sample_path_str is relative to SPEAKER_DATA_BASE_DIR
                full_sample_path = config.SPEAKER_DATA_BASE_DIR / sample_path_str
                try:
                    if full_sample_path.is_file():  # Check that it's a file before removing
                        os.remove(full_sample_path)
                except OSError as e:
                    # Log the error if file deletion fails, but proceed
                    print(f"Error deleting sample file {full_sample_path}: {e}")
            return True
        return False


# Example usage (for testing, not part of the service itself)
if __name__ == "__main__":
    service = SpeakerManagementService()
    print("Initial speakers:", service.get_speakers())

    # This part would require a mock UploadFile to run directly
    # print("\nAdding a new speaker (manual test setup needed for UploadFile)")
    # class MockUploadFile:
    #     def __init__(self, filename, content):
    #         self.filename = filename
    #         self._content = content
    #     async def read(self): return self._content
    #     async def close(self): pass
    # import asyncio
    # async def test_add():
    #     mock_file = MockUploadFile("test.wav", b"dummy audio content")
    #     new_speaker = await service.add_speaker(name="Test Speaker", audio_file=mock_file)
    #     print("\nAdded speaker:", new_speaker)
    #     print("Speakers after add:", service.get_speakers())
    #     return new_speaker.id
    # speaker_id_to_delete = asyncio.run(test_add())
    # if speaker_id_to_delete:
    #     print(f"\nDeleting speaker {speaker_id_to_delete}")
    #     service.delete_speaker(speaker_id_to_delete)
    #     print("Speakers after delete:", service.get_speakers())
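The `speakers.yaml` storage scheme is a mapping from speaker ID to attributes, and `get_speakers`/`delete_speaker` are thin dict operations over it. This plain-dict sketch illustrates the schema and those two operations; the data and helper names are hypothetical, not the service code.

```python
# Hypothetical in-memory mirror of speakers.yaml: {speaker_id: {name, sample_path}}
speakers_data = {
    "id-1": {"name": "Alice", "sample_path": "speaker_samples/id-1.wav"},
    "id-2": {"name": "Bob", "sample_path": "speaker_samples/id-2.wav"},
}

def list_speakers(data: dict) -> list:
    """Flatten the id-keyed mapping into speaker records, as get_speakers does."""
    return [{"id": spk_id, **attrs} for spk_id, attrs in data.items()]

def remove_speaker(data: dict, speaker_id: str) -> bool:
    """Pop an entry by id; mirrors delete_speaker's dict.pop(speaker_id, None) check."""
    return data.pop(speaker_id, None) is not None
```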
@@ -0,0 +1,155 @@
import torch
import torchaudio
from typing import Optional
from chatterbox.tts import ChatterboxTTS
from pathlib import Path
import gc  # Garbage collector for memory management

# Define a directory for TTS model outputs; could be temporary or configurable
TTS_OUTPUT_DIR = Path("/Volumes/SAM2/CODE/chatterbox-test/tts_outputs")  # Example path


class TTSService:
    def __init__(self, device: str = "mps"):  # Default to MPS for Macs; can be "cpu" or "cuda"
        self.device = device
        self.model = None
        self._ensure_output_dir_exists()

    def _ensure_output_dir_exists(self):
        """Ensures the TTS output directory exists."""
        TTS_OUTPUT_DIR.mkdir(parents=True, exist_ok=True)

    def load_model(self):
        """Loads the ChatterboxTTS model."""
        if self.model is None:
            print(f"Loading ChatterboxTTS model to device: {self.device}...")
            try:
                self.model = ChatterboxTTS.from_pretrained(device=self.device)
                print("ChatterboxTTS model loaded successfully.")
            except Exception as e:
                print(f"Error loading ChatterboxTTS model: {e}")
                # Potentially raise an exception or handle appropriately
                raise
        else:
            print("ChatterboxTTS model already loaded.")

    def unload_model(self):
        """Unloads the model and clears memory."""
        if self.model is not None:
            print("Unloading ChatterboxTTS model and clearing cache...")
            del self.model
            self.model = None
            if self.device == "cuda":
                torch.cuda.empty_cache()
            elif self.device == "mps":
                if hasattr(torch.mps, "empty_cache"):  # Check if empty_cache is available for MPS
                    torch.mps.empty_cache()
            gc.collect()  # Explicitly run garbage collection
            print("Model unloaded and memory cleared.")

    async def generate_speech(
        self,
        text: str,
        speaker_sample_path: str,  # Absolute path to the speaker's audio sample
        output_filename_base: str,  # e.g., "dialog_line_1_spk_X_chunk_0"
        speaker_id: Optional[str] = None,  # Optional, mainly for logging; filename base is primary
        output_dir: Optional[Path] = None,  # Optional, defaults to TTS_OUTPUT_DIR from this module
        exaggeration: float = 0.5,  # Default from Gradio
        cfg_weight: float = 0.5,  # Default from Gradio
        temperature: float = 0.8,  # Default from Gradio
    ) -> Path:
        """
        Generates speech from text using the loaded TTS model and a speaker sample.
        Saves the output to a .wav file.
        """
        if self.model is None:
            self.load_model()

        if self.model is None:  # Check again if loading failed
            raise RuntimeError("TTS model is not loaded. Cannot generate speech.")

        # Ensure speaker_sample_path is valid
        speaker_sample_p = Path(speaker_sample_path)
        if not speaker_sample_p.exists() or not speaker_sample_p.is_file():
            raise FileNotFoundError(f"Speaker sample audio file not found: {speaker_sample_path}")

        target_output_dir = output_dir if output_dir is not None else TTS_OUTPUT_DIR
        target_output_dir.mkdir(parents=True, exist_ok=True)
        # output_filename_base from DialogProcessorService is expected to be comprehensive
        # (e.g., includes speaker_id and segment info)
        output_file_path = target_output_dir / f"{output_filename_base}.wav"

        print(f"Generating audio for text: \"{text[:50]}...\" with speaker sample: {speaker_sample_path}")
        try:
            with torch.no_grad():  # Important for inference
                wav = self.model.generate(
                    text=text,
                    audio_prompt_path=str(speaker_sample_p),  # Must be a string path
                    exaggeration=exaggeration,
                    cfg_weight=cfg_weight,
                    temperature=temperature,
                )

            torchaudio.save(str(output_file_path), wav, self.model.sr)
            print(f"Audio saved to: {output_file_path}")
            return output_file_path
        except Exception as e:
            print(f"Error during TTS generation or saving: {e}")
            raise
        finally:
            # For now, we keep the model loaded. Memory management might need refinement.
            pass


# Example usage (for testing, not part of the service itself)
if __name__ == "__main__":
    async def main_test():
        tts_service = TTSService(device="mps")
        try:
            tts_service.load_model()

            dummy_speaker_root = Path("/Volumes/SAM2/CODE/chatterbox-test/speaker_data/speaker_samples")
            dummy_speaker_root.mkdir(parents=True, exist_ok=True)
            dummy_sample_file = dummy_speaker_root / "dummy_speaker_test.wav"
            import os  # Added for os.remove
            # Always try to remove an existing dummy file to ensure a fresh one is created
            if dummy_sample_file.exists():
                try:
                    os.remove(dummy_sample_file)
                    print(f"Removed existing dummy sample: {dummy_sample_file}")
                except OSError as e:
                    print(f"Error removing existing dummy sample {dummy_sample_file}: {e}")
                    # Proceeding, but torchaudio.save might fail or overwrite

            print(f"Creating new dummy speaker sample: {dummy_sample_file}")
            # Create a minimal, silent WAV file for testing
            sample_rate = 22050
            duration = 1  # seconds
            num_channels = 1
            num_frames = sample_rate * duration
            audio_data = torch.zeros((num_channels, num_frames))
            try:
                torchaudio.save(str(dummy_sample_file), audio_data, sample_rate)
                print(f"Dummy sample created successfully: {dummy_sample_file}")
            except Exception as save_e:
                print(f"Could not create dummy sample: {save_e}")
                # If creation fails, the subsequent generation test will likely also fail or be skipped.

            if dummy_sample_file.exists():
                output_path = await tts_service.generate_speech(
                    text="Hello, this is a test of the Text-to-Speech service.",
                    speaker_id="test_speaker",
                    speaker_sample_path=str(dummy_sample_file),
                    output_filename_base="test_generation"
                )
                print(f"Test generation output: {output_path}")
            else:
                print(f"Skipping generation test as dummy sample {dummy_sample_file} not found.")

        except Exception:
            import traceback
            print("Error during TTS test run:")
            traceback.print_exc()
        finally:
            tts_service.unload_model()

    import asyncio
    asyncio.run(main_test())
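The `load_model`/`unload_model` pair above is easy to misuse if an exception skips the unload call. One way to make that lifecycle exception-safe is a small context manager; the sketch below is illustrative only and uses a stand-in service class (the real `TTSService` and its torch-specific cache clearing are not reproduced here):

```python
import gc
from contextlib import contextmanager

class FakeTTSService:
    """Stand-in with the same load/unload shape as TTSService (hypothetical)."""
    def __init__(self):
        self.model = None

    def load_model(self):
        if self.model is None:
            self.model = object()  # placeholder for the real ChatterboxTTS model

    def unload_model(self):
        if self.model is not None:
            del self.model
            self.model = None
            gc.collect()  # mirror the explicit GC in the real service

@contextmanager
def loaded(service):
    """Guarantee unload_model runs even if generation raises."""
    service.load_model()
    try:
        yield service
    finally:
        service.unload_model()

svc = FakeTTSService()
try:
    with loaded(svc) as s:
        assert s.model is not None
        raise RuntimeError("simulated generation failure")
except RuntimeError:
    pass
print(svc.model is None)  # → True: model released despite the error
```

The same pattern would wrap each request (or batch of requests) in the FastAPI layer, so MPS/CUDA memory is reclaimed on every exit path.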
@@ -0,0 +1,7 @@
fastapi
uvicorn[standard]
python-multipart
PyYAML
torch
torchaudio
chatterbox-tts
@@ -0,0 +1,108 @@
import requests
import json
from pathlib import Path
import time

# Configuration
API_BASE_URL = "http://localhost:8000/api/dialog"
ENDPOINT_URL = f"{API_BASE_URL}/generate"

# Define project root relative to this test script (assuming it's in backend/)
PROJECT_ROOT = Path(__file__).resolve().parent
GENERATED_DIALOGS_DIR = PROJECT_ROOT / "tts_generated_dialogs"

DIALOG_PAYLOAD = {
    "output_base_name": "test_dialog_from_script",
    "dialog_items": [
        {
            "type": "speech",
            "speaker_id": "dummy_speaker",  # Ensure this speaker exists in your speakers.yaml and has a sample .wav
            "text": "This is a test from the Python script. One, two, three.",
            "exaggeration": 1.5,
            "cfg_weight": 4.0,
            "temperature": 0.5
        },
        {
            "type": "silence",
            "duration": 0.5
        },
        {
            "type": "speech",
            "speaker_id": "dummy_speaker",
            "text": "Testing complete. All systems nominal."
        },
        {
            "type": "speech",
            "speaker_id": "non_existent_speaker",  # Test case for invalid speaker
            "text": "This should produce an error for this segment."
        },
        {
            "type": "silence",
            "duration": 0.25  # Changed to valid duration
        }
    ]
}

def run_test():
    print(f"Sending POST request to: {ENDPOINT_URL}")
    print("Payload:")
    print(json.dumps(DIALOG_PAYLOAD, indent=2))
    print("-" * 50)

    try:
        start_time = time.time()
        response = requests.post(ENDPOINT_URL, json=DIALOG_PAYLOAD, timeout=120)  # Increased timeout for TTS processing
        end_time = time.time()

        print(f"Response received in {end_time - start_time:.2f} seconds.")
        print(f"Status Code: {response.status_code}")
        print("-" * 50)

        if response.content:
            try:
                response_data = response.json()
                print("Response JSON:")
                print(json.dumps(response_data, indent=2))
                print("-" * 50)

                if response.status_code == 200:
                    print("Test PASSED (HTTP 200 OK)")
                    concatenated_url = response_data.get("concatenated_audio_url")
                    zip_url = response_data.get("zip_archive_url")
                    temp_dir = response_data.get("temp_dir_path")

                    if concatenated_url:
                        print(f"Concatenated audio URL: http://localhost:8000{concatenated_url}")
                    if zip_url:
                        print(f"ZIP archive URL: http://localhost:8000{zip_url}")
                    if temp_dir:
                        print(f"Temporary segment directory: {temp_dir}")

                    print("\nTo verify, check the generated files in:")
                    print(f"  Concatenated/ZIP: {GENERATED_DIALOGS_DIR}")
                    print(f"  Individual segments (if not cleaned up): {temp_dir}")
                else:
                    print(f"Test FAILED (HTTP {response.status_code})")
                    if response_data.get("detail"):
                        print(f"Error Detail: {response_data.get('detail')}")

            except json.JSONDecodeError:
                print("Response content is not valid JSON:")
                print(response.text)
                print("Test FAILED (Invalid JSON Response)")
        else:
            print("Response content is empty.")
            print(f"Test FAILED (Empty Response, HTTP {response.status_code})")

    except requests.exceptions.ConnectionError as e:
        print(f"Connection Error: {e}")
        print("Test FAILED (Could not connect to the server. Is it running?)")
    except requests.exceptions.Timeout as e:
        print(f"Request Timeout: {e}")
        print("Test FAILED (The request timed out. TTS processing might be too slow or stuck.)")
    except Exception as e:
        print(f"An unexpected error occurred: {e}")
        print("Test FAILED (Unexpected error)")

if __name__ == "__main__":
    run_test()
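The payload in the script above deliberately includes a non-existent speaker and a once-invalid silence duration to exercise per-segment error handling. A minimal client-side sanity check for dialog items could look like this (illustrative only; the backend's actual Pydantic validation rules are not reproduced here):

```python
def validate_dialog_item(item: dict) -> list[str]:
    """Return a list of problems with one dialog item (empty list = OK)."""
    errors = []
    kind = item.get("type")
    if kind == "speech":
        # Speech items need a speaker reference and some text to synthesize.
        if not item.get("speaker_id"):
            errors.append("speech item requires a speaker_id")
        if not item.get("text"):
            errors.append("speech item requires non-empty text")
    elif kind == "silence":
        # Silence items need a strictly positive duration in seconds.
        duration = item.get("duration")
        if not isinstance(duration, (int, float)) or duration <= 0:
            errors.append("silence item requires a positive duration")
    else:
        errors.append(f"unknown item type: {kind!r}")
    return errors

print(validate_dialog_item({"type": "silence", "duration": 0.25}))  # → []
print(validate_dialog_item({"type": "silence", "duration": 0}))
```

Note that a well-formed item like `non_existent_speaker` still passes this shape check; only the backend, which knows `speakers.yaml`, can reject an unknown speaker ID.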
@@ -1,74 +1,255 @@
-/* Basic styles - to be expanded */
+/* Modern, clean, and accessible UI styles for Chatterbox TTS */
 body {
-  font-family: sans-serif;
-  line-height: 1.6;
+  font-family: 'Segoe UI', 'Roboto', 'Arial', sans-serif;
+  line-height: 1.7;
   margin: 0;
   padding: 0;
-  background-color: #f4f4f4;
-  color: #333;
+  background-color: #f7f9fa;
+  color: #222;
 }
 
+.container {
+  max-width: 1100px;
+  margin: 0 auto;
+  padding: 0 18px;
+}
+
 header {
-  background: #333;
+  background: #222e3a;
   color: #fff;
-  padding: 1rem 0;
+  padding: 1.5rem 0 1rem 0;
   text-align: center;
+  border-bottom: 3px solid #4a90e2;
 }
 
+h1 {
+  font-size: 2.4rem;
+  margin: 0;
+  letter-spacing: 1px;
+}
+
 main {
-  padding: 20px;
-  max-width: 960px;
-  margin: auto;
+  margin-top: 30px;
+  margin-bottom: 30px;
 }
 
+.panel-grid {
+  display: flex;
+  flex-wrap: wrap;
+  gap: 28px;
+  justify-content: space-between;
+}
+
+.panel {
+  flex: 1 1 320px;
+  min-width: 320px;
+  background: none;
+  box-shadow: none;
+  border: none;
+  padding: 0;
+}
+
+#results-display.panel {
+  flex: 1 1 100%;
+  min-width: 0;
+  margin-top: 32px;
+}
+
+/* Dialog Table Styles */
+#dialog-items-table {
+  width: 100%;
+  border-collapse: collapse;
+  background: #fff;
+  border-radius: 8px;
+  overflow: hidden;
+  font-size: 1rem;
+  margin-bottom: 0;
+}
+#dialog-items-table th, #dialog-items-table td {
+  padding: 10px 12px;
+  border-bottom: 1px solid #e3e3e3;
+  text-align: left;
+}
+#dialog-items-table th {
+  background: #f3f7fa;
+  color: #4a90e2;
+  font-weight: 600;
+  font-size: 1.05rem;
+}
+#dialog-items-table tr:last-child td {
+  border-bottom: none;
+}
+#dialog-items-table td.actions {
+  text-align: center;
+  min-width: 90px;
+}
+
+/* Collapsible log details */
+details#generation-log-details {
+  margin-bottom: 0;
+  border-radius: 4px;
+  background: #f3f5f7;
+  box-shadow: 0 1px 3px rgba(44,62,80,0.04);
+  padding: 0 0 0 0;
+  transition: box-shadow 0.15s;
+}
+details#generation-log-details[open] {
+  box-shadow: 0 2px 8px rgba(44,62,80,0.07);
+  background: #f9fafb;
+}
+details#generation-log-details summary {
+  font-size: 1rem;
+  color: #357ab8;
+  padding: 10px 0 6px 0;
+  outline: none;
+}
+details#generation-log-details summary:focus {
+  outline: 2px solid #4a90e2;
+  border-radius: 3px;
+}
+
+@media (max-width: 900px) {
+  .panel-grid {
+    display: block;
+    gap: 0;
+  }
+  .panel, .full-width-panel {
+    min-width: 0;
+    width: 100%;
+    flex: 1 1 100%;
+  }
+  #dialog-items-table th, #dialog-items-table td {
+    font-size: 0.97rem;
+    padding: 7px 8px;
+  }
+  #speaker-management.panel {
+    margin-bottom: 36px;
+    width: 100%;
+    max-width: 100%;
+    flex: 1 1 100%;
+  }
+}
+
+.card {
+  background: #fff;
+  border-radius: 8px;
+  box-shadow: 0 2px 8px rgba(44,62,80,0.07);
+  padding: 18px 20px;
+  margin-bottom: 18px;
+}
+
 section {
-  background: #fff;
-  padding: 20px;
-  margin-bottom: 20px;
-  border-radius: 5px;
+  margin-bottom: 0;
+  border-radius: 0;
+  padding: 0;
+  background: none;
 }
 
 hr {
-  margin: 20px 0;
-  border: 0;
-  border-top: 1px solid #eee;
+  display: none;
 }
 
+h2 {
+  font-size: 1.5rem;
+  margin-top: 0;
+  margin-bottom: 16px;
+  color: #4a90e2;
+  letter-spacing: 0.5px;
+}
+
+h3 {
+  font-size: 1.1rem;
+  margin-bottom: 10px;
+  color: #333;
+}
+
+.x-remove-btn {
+  background: #e74c3c;
+  color: #fff;
+  border: none;
+  border-radius: 50%;
+  width: 28px;
+  height: 28px;
+  font-size: 1.2rem;
+  line-height: 1;
+  display: inline-flex;
+  align-items: center;
+  justify-content: center;
+  cursor: pointer;
+  transition: background 0.15s;
+  margin: 0 2px;
+  box-shadow: 0 1px 2px rgba(44,62,80,0.06);
+  outline: none;
+  padding: 0;
+}
+.x-remove-btn:hover, .x-remove-btn:focus {
+  background: #c0392b;
+  color: #fff;
+  outline: 2px solid #e74c3c;
+}
+
+.form-row {
+  display: flex;
+  align-items: center;
+  gap: 12px;
+  margin-bottom: 14px;
+}
+
+label {
+  min-width: 120px;
+  font-weight: 500;
+  margin-bottom: 0;
+}
+
+input[type='text'], input[type='file'] {
+  padding: 8px 10px;
+  border: 1px solid #cfd8dc;
+  border-radius: 4px;
+  font-size: 1rem;
+  width: 100%;
+  box-sizing: border-box;
+}
+
+input[type='file'] {
+  background: #f7f7f7;
+  font-size: 0.97rem;
+}
+
 button {
-  padding: 10px 15px;
-  background: #333;
+  padding: 9px 18px;
+  background: #4a90e2;
   color: #fff;
   border: none;
   border-radius: 5px;
   cursor: pointer;
-  margin-right: 5px; /* Add some margin between buttons */
+  font-size: 1rem;
+  font-weight: 500;
+  transition: background 0.15s;
+  margin-right: 10px;
 }
 
-button:hover {
-  background: #555;
+button:hover, button:focus {
+  background: #357ab8;
+  outline: none;
 }
 
-input[type='text'], input[type='file'] {
-  padding: 8px;
+.dialog-controls {
   margin-bottom: 10px;
-  border: 1px solid #ddd;
-  border-radius: 4px;
-  width: calc(100% - 20px); /* Adjust width considering padding */
-}
-
-label {
-  display: block;
-  margin-bottom: 5px;
 }
 
 #speaker-list {
   list-style: none;
   padding: 0;
+  margin: 0;
 }
 
 #speaker-list li {
-  padding: 5px 0;
-  border-bottom: 1px dotted #eee;
+  padding: 7px 0;
+  border-bottom: 1px solid #e3e3e3;
+  display: flex;
+  justify-content: space-between;
+  align-items: center;
 }
 
 #speaker-list li:last-child {
@@ -76,17 +257,74 @@ label {
 }
 
 pre {
-  background: #eee;
-  padding: 10px;
+  background: #f3f5f7;
+  padding: 12px;
   border-radius: 4px;
-  white-space: pre-wrap; /* Allow wrapping */
-  word-wrap: break-word; /* Break long words */
+  font-size: 0.98rem;
+  white-space: pre-wrap;
+  word-wrap: break-word;
+  margin: 0;
 }
 
+audio {
+  width: 100%;
+  margin-top: 8px;
+  margin-bottom: 8px;
+}
+
+#zip-archive-link {
+  display: inline-block;
+  margin-right: 10px;
+  color: #fff;
+  background: #4a90e2;
+  padding: 7px 16px;
+  border-radius: 4px;
+  text-decoration: none;
+  font-weight: 500;
+  transition: background 0.15s;
+}
+
+#zip-archive-link:hover, #zip-archive-link:focus {
+  background: #357ab8;
+}
+
 footer {
   text-align: center;
-  padding: 20px;
-  background: #333;
+  padding: 20px 0;
+  background: #222e3a;
   color: #fff;
-  margin-top: 30px;
+  margin-top: 40px;
+  font-size: 1rem;
+  border-top: 3px solid #4a90e2;
 }
+
+@media (max-width: 900px) {
+  .panel-grid {
+    flex-direction: column;
+    gap: 22px;
+  }
+  .panel {
+    min-width: 0;
+  }
+}
+
+/* Simple side-by-side layout for speaker management */
+.speaker-mgmt-row {
+  display: flex;
+  gap: 20px;
+}
+
+.speaker-mgmt-row .card {
+  flex: 1;
+  width: 50%;
+}
+
+/* Stack on mobile */
+@media (max-width: 768px) {
+  .speaker-mgmt-row {
+    flex-direction: column;
+  }
+  .speaker-mgmt-row .card {
+    width: 100%;
+  }
+}
@@ -8,77 +8,92 @@
 </head>
 <body>
     <header>
+        <div class="container">
         <h1>Chatterbox TTS</h1>
+        </div>
     </header>
 
-    <main>
-        <section id="speaker-management">
-            <h2>Speaker Management</h2>
-            <div id="speaker-list-container">
+    <main class="container" role="main">
+        <div class="panel-grid">
+            <section id="dialog-editor" class="panel full-width-panel" aria-labelledby="dialog-editor-title">
+                <h2 id="dialog-editor-title">Dialog Editor</h2>
+                <div class="card">
+                    <table id="dialog-items-table">
+                        <thead>
+                            <tr>
+                                <th>Type</th>
+                                <th>Speaker</th>
+                                <th>Text / Duration</th>
+                                <th>Actions</th>
+                            </tr>
+                        </thead>
+                        <tbody id="dialog-items-container">
+                            <!-- Dialog items will be rendered here by JavaScript as <tr> -->
+                        </tbody>
+                    </table>
+                </div>
+                <div id="temp-input-area" class="card">
+                    <!-- Temporary inputs for speech/silence will go here -->
+                </div>
+                <div class="dialog-controls form-row">
+                    <button id="add-speech-line-btn">Add Speech Line</button>
+                    <button id="add-silence-line-btn">Add Silence Line</button>
+                </div>
+                <div class="dialog-controls form-row">
+                    <label for="output-base-name">Output Base Name:</label>
+                    <input type="text" id="output-base-name" name="output-base-name" value="dialog_output" required>
+                </div>
+                <button id="generate-dialog-btn">Generate Dialog</button>
+            </section>
+        </div>
+        <!-- Results below -->
+        <section id="results-display" class="panel" aria-labelledby="results-display-title">
+            <h2 id="results-display-title">Results</h2>
+            <div class="card">
+                <details id="generation-log-details">
+                    <summary style="cursor:pointer;font-weight:500;">Show Generation Log</summary>
+                    <pre id="generation-log-content" style="margin-top:12px;">(Generation log will appear here)</pre>
+                </details>
+            </div>
+            <div class="card">
+                <h3>Concatenated Audio:</h3>
+                <audio id="concatenated-audio-player" controls src=""></audio>
+            </div>
+            <div class="card">
+                <h3>Download Archive:</h3>
+                <a id="zip-archive-link" href="#" download style="display: none;">Download ZIP</a>
+                <p id="zip-archive-placeholder">(ZIP download link will appear here)</p>
+            </div>
+        </section>
+        <!-- Speaker management row below Results, side by side -->
+        <div class="speaker-mgmt-row">
+            <div id="speaker-list-container" class="card">
                 <h3>Available Speakers</h3>
                 <ul id="speaker-list">
                     <!-- Speakers will be populated here by JavaScript -->
                 </ul>
             </div>
-            <div id="add-speaker-container">
+            <div id="add-speaker-container" class="card">
                 <h3>Add New Speaker</h3>
                 <form id="add-speaker-form">
-                    <div>
+                    <div class="form-row">
                         <label for="speaker-name">Speaker Name:</label>
                         <input type="text" id="speaker-name" name="name" required>
                     </div>
-                    <div>
+                    <div class="form-row">
                         <label for="speaker-sample">Audio Sample (WAV or MP3):</label>
                         <input type="file" id="speaker-sample" name="audio_file" accept=".wav,.mp3" required>
                     </div>
                     <button type="submit">Add Speaker</button>
                 </form>
             </div>
-        </section>
-
-        <hr>
-
-        <section id="dialog-editor">
-            <h2>Dialog Editor</h2>
-            <div id="dialog-items-container">
-                <!-- Dialog items will be added here by JavaScript -->
-            </div>
-            <div id="temp-input-area">
-                <!-- Temporary inputs for speech/silence will go here -->
-            </div>
-            <div class="dialog-controls">
-                <button id="add-speech-line-btn">Add Speech Line</button>
-                <button id="add-silence-line-btn">Add Silence Line</button>
-            </div>
-            <div class="dialog-controls">
-                <label for="output-base-name">Output Base Name:</label>
-                <input type="text" id="output-base-name" name="output-base-name" value="dialog_output" required>
-            </div>
-            <button id="generate-dialog-btn">Generate Dialog</button>
-        </section>
-
-        <hr>
-
-        <section id="results-display">
-            <h2>Results</h2>
-            <div>
-                <h3>Log:</h3>
-                <pre id="generation-log-content">(Generation log will appear here)</pre>
-            </div>
-            <div>
-                <h3>Concatenated Audio:</h3>
-                <audio id="concatenated-audio-player" controls src=""></audio>
-            </div>
-            <div>
-                <h3>Download Archive:</h3>
-                <a id="zip-archive-link" href="#" download style="display: none;">Download ZIP</a>
-                <p id="zip-archive-placeholder">(ZIP download link will appear here)</p>
-            </div>
-        </section>
+        </div>
     </main>
 
     <footer>
+        <div class="container">
         <p>&copy; 2024 Chatterbox TTS</p>
+        </div>
     </footer>
 
     <script src="js/api.js" type="module"></script>
@@ -114,30 +114,54 @@ function initializeDialogEditor() {
     let dialogItems = [];
     let availableSpeakersCache = []; // Cache for speaker names and IDs
 
-    // Function to render the current dialogItems array to the DOM
+    // Function to render the current dialogItems array to the DOM as table rows
     function renderDialogItems() {
         if (!dialogItemsContainer) return;
-        dialogItemsContainer.innerHTML = ''; // Clear existing items
+        dialogItemsContainer.innerHTML = '';
         dialogItems.forEach((item, index) => {
-            const li = document.createElement('li');
-            li.classList.add('dialog-item');
+            const tr = document.createElement('tr');
+
+            // Type column
+            const typeTd = document.createElement('td');
+            typeTd.textContent = item.type === 'speech' ? 'Speech' : 'Silence';
+            tr.appendChild(typeTd);
+
+            // Speaker column
+            const speakerTd = document.createElement('td');
             if (item.type === 'speech') {
                 const speaker = availableSpeakersCache.find(s => s.id === item.speaker_id);
-                const speakerName = speaker ? speaker.name : 'Unknown Speaker';
-                li.textContent = `Speech: [${speakerName}] "${item.text.substring(0, 30)}${item.text.length > 30 ? '...' : ''}"`;
-            } else if (item.type === 'silence') {
-                li.textContent = `Silence: ${item.duration}s`;
+                speakerTd.textContent = speaker ? speaker.name : 'Unknown Speaker';
+            } else {
+                speakerTd.textContent = '—';
             }
+            tr.appendChild(speakerTd);
+
+            // Text/Duration column
+            const textTd = document.createElement('td');
+            if (item.type === 'speech') {
+                let txt = item.text.length > 60 ? item.text.substring(0, 57) + '…' : item.text;
+                textTd.textContent = `"${txt}"`;
+            } else {
+                textTd.textContent = `${item.duration}s`;
+            }
+            tr.appendChild(textTd);
+
+            // Actions column
+            const actionsTd = document.createElement('td');
+            actionsTd.classList.add('actions');
             const removeBtn = document.createElement('button');
-            removeBtn.textContent = 'Remove';
-            removeBtn.classList.add('remove-dialog-item-btn');
+            removeBtn.innerHTML = '×'; // Unicode multiplication sign (X)
+            removeBtn.classList.add('remove-dialog-item-btn', 'x-remove-btn');
+            removeBtn.setAttribute('aria-label', 'Remove dialog line');
+            removeBtn.title = 'Remove';
             removeBtn.onclick = () => {
                 dialogItems.splice(index, 1);
                 renderDialogItems();
             };
-            li.appendChild(removeBtn);
-            dialogItemsContainer.appendChild(li);
+            actionsTd.appendChild(removeBtn);
+            tr.appendChild(actionsTd);
+
+            dialogItemsContainer.appendChild(tr);
         });
     }
@@ -0,0 +1 @@
../browserslist/cli.js
@@ -0,0 +1 @@
../create-jest/bin/create-jest.js
@@ -0,0 +1 @@
../esprima/bin/esparse.js
@@ -0,0 +1 @@
../esprima/bin/esvalidate.js
@@ -0,0 +1 @@
../import-local/fixtures/cli.js
@@ -0,0 +1 @@
../jest/bin/jest.js
@@ -0,0 +1 @@
../js-yaml/bin/js-yaml.js
@@ -0,0 +1 @@
../jsesc/bin/jsesc
@@ -0,0 +1 @@
../json5/lib/cli.js
@@ -0,0 +1 @@
../which/bin/node-which
@@ -0,0 +1 @@
../@babel/parser/bin/babel-parser.js
@@ -0,0 +1 @@
../regjsparser/bin/parser
@@ -0,0 +1 @@
../resolve/bin/resolve
@@ -0,0 +1 @@
../semver/bin/semver.js
@@ -0,0 +1 @@
../update-browserslist-db/cli.js
File diff suppressed because it is too large
@@ -0,0 +1,202 @@
                                 Apache License
                           Version 2.0, January 2004
                        http://www.apache.org/licenses/

   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION

   1. Definitions.

      "License" shall mean the terms and conditions for use, reproduction,
      and distribution as defined by Sections 1 through 9 of this document.

      "Licensor" shall mean the copyright owner or entity authorized by
      the copyright owner that is granting the License.

      "Legal Entity" shall mean the union of the acting entity and all
      other entities that control, are controlled by, or are under common
      control with that entity. For the purposes of this definition,
      "control" means (i) the power, direct or indirect, to cause the
      direction or management of such entity, whether by contract or
      otherwise, or (ii) ownership of fifty percent (50%) or more of the
      outstanding shares, or (iii) beneficial ownership of such entity.

      "You" (or "Your") shall mean an individual or Legal Entity
      exercising permissions granted by this License.

      "Source" form shall mean the preferred form for making modifications,
      including but not limited to software source code, documentation
      source, and configuration files.

      "Object" form shall mean any form resulting from mechanical
      transformation or translation of a Source form, including but
      not limited to compiled object code, generated documentation,
      and conversions to other media types.

      "Work" shall mean the work of authorship, whether in Source or
      Object form, made available under the License, as indicated by a
      copyright notice that is included in or attached to the work
      (an example is provided in the Appendix below).

      "Derivative Works" shall mean any work, whether in Source or Object
      form, that is based on (or derived from) the Work and for which the
      editorial revisions, annotations, elaborations, or other modifications
      represent, as a whole, an original work of authorship. For the purposes
      of this License, Derivative Works shall not include works that remain
      separable from, or merely link (or bind by name) to the interfaces of,
      the Work and Derivative Works thereof.

      "Contribution" shall mean any work of authorship, including
      the original version of the Work and any modifications or additions
      to that Work or Derivative Works thereof, that is intentionally
      submitted to Licensor for inclusion in the Work by the copyright owner
      or by an individual or Legal Entity authorized to submit on behalf of
      the copyright owner. For the purposes of this definition, "submitted"
      means any form of electronic, verbal, or written communication sent
      to the Licensor or its representatives, including but not limited to
      communication on electronic mailing lists, source code control systems,
      and issue tracking systems that are managed by, or on behalf of, the
      Licensor for the purpose of discussing and improving the Work, but
      excluding communication that is conspicuously marked or otherwise
      designated in writing by the copyright owner as "Not a Contribution."

      "Contributor" shall mean Licensor and any individual or Legal Entity
      on behalf of whom a Contribution has been received by Licensor and
      subsequently incorporated within the Work.

   2. Grant of Copyright License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      copyright license to reproduce, prepare Derivative Works of,
      publicly display, publicly perform, sublicense, and distribute the
      Work and such Derivative Works in Source or Object form.

   3. Grant of Patent License. Subject to the terms and conditions of
      this License, each Contributor hereby grants to You a perpetual,
      worldwide, non-exclusive, no-charge, royalty-free, irrevocable
      (except as stated in this section) patent license to make, have made,
      use, offer to sell, sell, import, and otherwise transfer the Work,
      where such license applies only to those patent claims licensable
      by such Contributor that are necessarily infringed by their
      Contribution(s) alone or by combination of their Contribution(s)
      with the Work to which such Contribution(s) was submitted. If You
      institute patent litigation against any entity (including a
      cross-claim or counterclaim in a lawsuit) alleging that the Work
      or a Contribution incorporated within the Work constitutes direct
      or contributory patent infringement, then any patent licenses
      granted to You under this License for that Work shall terminate
      as of the date such litigation is filed.

   4. Redistribution. You may reproduce and distribute copies of the
      Work or Derivative Works thereof in any medium, with or without
      modifications, and in Source or Object form, provided that You
      meet the following conditions:

      (a) You must give any other recipients of the Work or
          Derivative Works a copy of this License; and

      (b) You must cause any modified files to carry prominent notices
          stating that You changed the files; and

      (c) You must retain, in the Source form of any Derivative Works
          that You distribute, all copyright, patent, trademark, and
          attribution notices from the Source form of the Work,
          excluding those notices that do not pertain to any part of
          the Derivative Works; and

      (d) If the Work includes a "NOTICE" text file as part of its
          distribution, then any Derivative Works that You distribute must
          include a readable copy of the attribution notices contained
          within such NOTICE file, excluding those notices that do not
          pertain to any part of the Derivative Works, in at least one
          of the following places: within a NOTICE text file distributed
          as part of the Derivative Works; within the Source form or
          documentation, if provided along with the Derivative Works; or,
          within a display generated by the Derivative Works, if and
          wherever such third-party notices normally appear. The contents
          of the NOTICE file are for informational purposes only and
          do not modify the License. You may add Your own attribution
          notices within Derivative Works that You distribute, alongside
          or as an addendum to the NOTICE text from the Work, provided
          that such additional attribution notices cannot be construed
          as modifying the License.

      You may add Your own copyright statement to Your modifications and
      may provide additional or different license terms and conditions
      for use, reproduction, or distribution of Your modifications, or
      for any such Derivative Works as a whole, provided Your use,
      reproduction, and distribution of the Work otherwise complies with
      the conditions stated in this License.

   5. Submission of Contributions. Unless You explicitly state otherwise,
      any Contribution intentionally submitted for inclusion in the Work
      by You to the Licensor shall be under the terms and conditions of
      this License, without any additional terms or conditions.
      Notwithstanding the above, nothing herein shall supersede or modify
      the terms of any separate license agreement you may have executed
      with Licensor regarding such Contributions.

   6. Trademarks. This License does not grant permission to use the trade
      names, trademarks, service marks, or product names of the Licensor,
      except as required for reasonable and customary use in describing the
      origin of the Work and reproducing the content of the NOTICE file.

   7. Disclaimer of Warranty. Unless required by applicable law or
      agreed to in writing, Licensor provides the Work (and each
      Contributor provides its Contributions) on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or
      implied, including, without limitation, any warranties or conditions
      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A
      PARTICULAR PURPOSE. You are solely responsible for determining the
      appropriateness of using or redistributing the Work and assume any
      risks associated with Your exercise of permissions under this License.

   8. Limitation of Liability. In no event and under no legal theory,
      whether in tort (including negligence), contract, or otherwise,
      unless required by applicable law (such as deliberate and grossly
      negligent acts) or agreed to in writing, shall any Contributor be
      liable to You for damages, including any direct, indirect, special,
      incidental, or consequential damages of any character arising as a
      result of this License or out of the use or inability to use the
      Work (including but not limited to damages for loss of goodwill,
      work stoppage, computer failure or malfunction, or any and all
      other commercial damages or losses), even if such Contributor
      has been advised of the possibility of such damages.

   9. Accepting Warranty or Additional Liability. While redistributing
      the Work or Derivative Works thereof, You may choose to offer,
      and charge a fee for, acceptance of support, warranty, indemnity,
      or other liability obligations and/or rights consistent with this
      License. However, in accepting such obligations, You may act only
      on Your own behalf and on Your sole responsibility, not on behalf
      of any other Contributor, and only if You agree to indemnify,
      defend, and hold each Contributor harmless for any liability
      incurred by, or claims asserted against, such Contributor by reason
      of your accepting any such warranty or additional liability.

   END OF TERMS AND CONDITIONS

   APPENDIX: How to apply the Apache License to your work.

      To apply the Apache License to your work, attach the following
      boilerplate notice, with the fields enclosed by brackets "[]"
      replaced with your own identifying information. (Don't include
      the brackets!) The text should be enclosed in the appropriate
      comment syntax for the file format. We also recommend that a
      file or class name and description of purpose be included on the
      same "printed page" as the copyright notice for easier
      identification within third-party archives.

   Copyright [yyyy] [name of copyright owner]

   Licensed under the Apache License, Version 2.0 (the "License");
   you may not use this file except in compliance with the License.
   You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
@@ -0,0 +1,218 @@
# @ampproject/remapping

> Remap sequential sourcemaps through transformations to point at the original source code

Remapping allows you to take the sourcemaps generated through transforming your code and "remap"
them to the original source locations. Think "my minified code, transformed with babel and bundled
with webpack", all pointing to the correct location in your original source code.

With remapping, none of your source code transformations need to be aware of the input's sourcemap,
they only need to generate an output sourcemap. This greatly simplifies building custom
transformations (think a find-and-replace).

## Installation

```sh
npm install @ampproject/remapping
```

## Usage

```typescript
function remapping(
  map: SourceMap | SourceMap[],
  loader: (file: string, ctx: LoaderContext) => (SourceMap | null | undefined),
  options?: { excludeContent: boolean, decodedMappings: boolean }
): SourceMap;

// LoaderContext gives the loader the importing sourcemap, tree depth, the ability to override the
// "source" location (where child sources are resolved relative to, or the location of original
// source), and the ability to override the "content" of an original source for inclusion in the
// output sourcemap.
type LoaderContext = {
  readonly importer: string;
  readonly depth: number;
  source: string;
  content: string | null | undefined;
}
```

`remapping` takes the final output sourcemap, and a `loader` function. For every source file pointer
in the sourcemap, the `loader` will be called with the resolved path. If the path itself represents
a transformed file (it has a sourcemap associated with it), then the `loader` should return that
sourcemap. If not, the path will be treated as original, untransformed source code.

```js
// Babel transformed "helloworld.js" into "transformed.js"
const transformedMap = JSON.stringify({
  file: 'transformed.js',
  // 1st column of 2nd line of output file translates into the 1st source
  // file, line 3, column 2
  mappings: ';CAEE',
  sources: ['helloworld.js'],
  version: 3,
});

// Uglify minified "transformed.js" into "transformed.min.js"
const minifiedTransformedMap = JSON.stringify({
  file: 'transformed.min.js',
  // 0th column of 1st line of output file translates into the 1st source
  // file, line 2, column 1.
  mappings: 'AACC',
  names: [],
  sources: ['transformed.js'],
  version: 3,
});

const remapped = remapping(
  minifiedTransformedMap,
  (file, ctx) => {

    // The "transformed.js" file is a transformed file.
    if (file === 'transformed.js') {
      // The root importer is empty.
      console.assert(ctx.importer === '');
      // The depth in the sourcemap tree we're currently loading.
      // The root `minifiedTransformedMap` is depth 0, and its source children are depth 1, etc.
      console.assert(ctx.depth === 1);

      return transformedMap;
    }

    // Loader will be called to load transformedMap's source file pointers as well.
    console.assert(file === 'helloworld.js');
    // `transformed.js`'s sourcemap points into `helloworld.js`.
    console.assert(ctx.importer === 'transformed.js');
    // This is a source child of `transformed`, which is a source child of `minifiedTransformedMap`.
    console.assert(ctx.depth === 2);
    return null;
  }
);

console.log(remapped);
// {
//   file: 'transpiled.min.js',
//   mappings: 'AAEE',
//   sources: ['helloworld.js'],
//   version: 3,
// };
```

In this example, `loader` will be called twice:

1. `"transformed.js"`, the first source file pointer in the `minifiedTransformedMap`. We return the
   associated sourcemap for it (it's a transformed file, after all) so that sourcemap locations can
   be traced through it into the source files it represents.
2. `"helloworld.js"`, our original, unmodified source code. This file does not have a sourcemap, so
   we return `null`.

The `remapped` sourcemap now points from `transformed.min.js` into locations in `helloworld.js`. If
you were to read the `mappings`, it says "0th column of the first output line points to the 1st
column of the 2nd line of the file `helloworld.js`".

### Multiple transformations of a file

As a convenience, if you have multiple single-source transformations of a file, you may pass an
array of sourcemap files in the order of most-recent transformation sourcemap first. Note that this
changes the `importer` and `depth` of each call to our loader. So our above example could have been
written as:

```js
const remapped = remapping(
  [minifiedTransformedMap, transformedMap],
  () => null
);

console.log(remapped);
// {
//   file: 'transpiled.min.js',
//   mappings: 'AAEE',
//   sources: ['helloworld.js'],
//   version: 3,
// };
```

### Advanced control of the loading graph

#### `source`

The `source` property can be overridden to any value to change the location of the current load. Eg,
for an original source file, it allows us to change the location to the original source regardless
of what the sourcemap source entry says. And for transformed files, it allows us to change the
relative resolving location for child sources of the loaded sourcemap.

```js
const remapped = remapping(
  minifiedTransformedMap,
  (file, ctx) => {

    if (file === 'transformed.js') {
      // We pretend the transformed.js file actually exists in the 'src/' directory. When the nested
      // source files are loaded, they will now be relative to `src/`.
      ctx.source = 'src/transformed.js';
      return transformedMap;
    }

    console.assert(file === 'src/helloworld.js');
    // We could further change the source of this original file, eg, to be inside a nested directory
    // itself. This will be reflected in the remapped sourcemap.
    ctx.source = 'src/nested/helloworld.js';
    return null;
  }
);

console.log(remapped);
// {
//   …,
//   sources: ['src/nested/helloworld.js'],
// };
```

#### `content`

The `content` property can be overridden when we encounter an original source file. Eg, this allows
you to manually provide the source content of the original file regardless of whether the
`sourcesContent` field is present in the parent sourcemap. It can also be set to `null` to remove
the source content.

```js
const remapped = remapping(
  minifiedTransformedMap,
  (file, ctx) => {

    if (file === 'transformed.js') {
      // transformedMap does not include a `sourcesContent` field, so usually the remapped sourcemap
      // would not include any `sourcesContent` values.
      return transformedMap;
    }

    console.assert(file === 'helloworld.js');
    // We can read the file to provide the source content.
    ctx.content = fs.readFileSync(file, 'utf8');
    return null;
  }
);

console.log(remapped);
// {
//   …,
//   sourcesContent: [
//     'console.log("Hello world!")',
//   ],
// };
```

### Options

#### excludeContent

By default, `excludeContent` is `false`. Passing `{ excludeContent: true }` will exclude the
`sourcesContent` field from the returned sourcemap. This is mainly useful when you want to reduce
the size of the sourcemap.

#### decodedMappings

By default, `decodedMappings` is `false`. Passing `{ decodedMappings: true }` will leave the
`mappings` field in a [decoded state](https://github.com/rich-harris/sourcemap-codec) instead of
encoding into a VLQ string.
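With `decodedMappings: true`, the `mappings` field is an array of numeric segments rather than the Base64 VLQ string form used above. As an illustration of what that encoding holds, here is a minimal single-segment VLQ decoder; `decodeSegment` is an illustrative helper written for this note, not part of the `@ampproject/remapping` API, and `'AACC'`/`'CAEE'` are the segments from the usage examples above:

```javascript
// Minimal Base64 VLQ decoder for one sourcemap segment (illustrative only).
const BASE64 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function decodeSegment(str) {
  const out = [];
  let value = 0;
  let shift = 0;
  for (const ch of str) {
    const digit = BASE64.indexOf(ch);
    value += (digit & 31) << shift; // low 5 bits carry data
    if (digit & 32) {
      shift += 5; // continuation bit set: more digits follow
    } else {
      // Lowest bit of the assembled value is the sign bit.
      out.push(value & 1 ? -(value >>> 1) : value >>> 1);
      value = 0;
      shift = 0;
    }
  }
  return out;
}

// [generatedColumn, sourceIndex, sourceLineDelta, sourceColumnDelta]
console.log(decodeSegment('AACC')); // [0, 0, 1, 1] -> line 2, column 1
console.log(decodeSegment('CAEE')); // [1, 0, 2, 2] -> line 3, column 2
```

The decoded deltas match the comments in the example maps above: `'AACC'` maps the 0th output column to line 2, column 1 of the source.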
@@ -0,0 +1,197 @@
import { decodedMappings, traceSegment, TraceMap } from '@jridgewell/trace-mapping';
import { GenMapping, maybeAddSegment, setSourceContent, setIgnore, toDecodedMap, toEncodedMap } from '@jridgewell/gen-mapping';

const SOURCELESS_MAPPING = /* #__PURE__ */ SegmentObject('', -1, -1, '', null, false);
const EMPTY_SOURCES = [];
function SegmentObject(source, line, column, name, content, ignore) {
    return { source, line, column, name, content, ignore };
}
function Source(map, sources, source, content, ignore) {
    return {
        map,
        sources,
        source,
        content,
        ignore,
    };
}
/**
 * MapSource represents a single sourcemap, with the ability to trace mappings into its child nodes
 * (which may themselves be SourceMapTrees).
 */
function MapSource(map, sources) {
    return Source(map, sources, '', null, false);
}
/**
 * A "leaf" node in the sourcemap tree, representing an original, unmodified source file. Recursive
 * segment tracing ends at the `OriginalSource`.
 */
function OriginalSource(source, content, ignore) {
    return Source(null, EMPTY_SOURCES, source, content, ignore);
}
/**
 * traceMappings is only called on the root level SourceMapTree, and begins the process of
 * resolving each mapping in terms of the original source files.
 */
function traceMappings(tree) {
    // TODO: Eventually support sourceRoot, which has to be removed because the sources are already
    // fully resolved. We'll need to make sources relative to the sourceRoot before adding them.
    const gen = new GenMapping({ file: tree.map.file });
    const { sources: rootSources, map } = tree;
    const rootNames = map.names;
    const rootMappings = decodedMappings(map);
    for (let i = 0; i < rootMappings.length; i++) {
        const segments = rootMappings[i];
        for (let j = 0; j < segments.length; j++) {
            const segment = segments[j];
            const genCol = segment[0];
            let traced = SOURCELESS_MAPPING;
            // 1-length segments only move the current generated column, there's no source information
            // to gather from it.
            if (segment.length !== 1) {
                const source = rootSources[segment[1]];
                traced = originalPositionFor(source, segment[2], segment[3], segment.length === 5 ? rootNames[segment[4]] : '');
                // If the trace is invalid, then the trace ran into a sourcemap that doesn't contain a
                // respective segment into an original source.
                if (traced == null)
                    continue;
            }
            const { column, line, name, content, source, ignore } = traced;
            maybeAddSegment(gen, i, genCol, source, line, column, name);
            if (source && content != null)
                setSourceContent(gen, source, content);
            if (ignore)
                setIgnore(gen, source, true);
        }
    }
    return gen;
}
/**
 * originalPositionFor is only called on children SourceMapTrees. It recurses down into its own
 * child SourceMapTrees, until we find the original source map.
 */
function originalPositionFor(source, line, column, name) {
    if (!source.map) {
        return SegmentObject(source.source, line, column, name, source.content, source.ignore);
    }
    const segment = traceSegment(source.map, line, column);
    // If we couldn't find a segment, then this doesn't exist in the sourcemap.
    if (segment == null)
        return null;
    // 1-length segments only move the current generated column, there's no source information
    // to gather from it.
    if (segment.length === 1)
        return SOURCELESS_MAPPING;
    return originalPositionFor(source.sources[segment[1]], segment[2], segment[3], segment.length === 5 ? source.map.names[segment[4]] : name);
}

function asArray(value) {
    if (Array.isArray(value))
        return value;
    return [value];
}
/**
 * Recursively builds a tree structure out of sourcemap files, with each node
 * being either an `OriginalSource` "leaf" or a `SourceMapTree` composed of
 * `OriginalSource`s and `SourceMapTree`s.
 *
 * Every sourcemap is composed of a collection of source files and mappings
 * into locations of those source files. When we generate a `SourceMapTree` for
 * the sourcemap, we attempt to load each source file's own sourcemap. If it
 * does not have an associated sourcemap, it is considered an original,
 * unmodified source file.
 */
function buildSourceMapTree(input, loader) {
    const maps = asArray(input).map((m) => new TraceMap(m, ''));
    const map = maps.pop();
    for (let i = 0; i < maps.length; i++) {
        if (maps[i].sources.length > 1) {
            throw new Error(`Transformation map ${i} must have exactly one source file.\n` +
                'Did you specify these with the most recent transformation maps first?');
        }
    }
    let tree = build(map, loader, '', 0);
    for (let i = maps.length - 1; i >= 0; i--) {
        tree = MapSource(maps[i], [tree]);
    }
    return tree;
}
function build(map, loader, importer, importerDepth) {
    const { resolvedSources, sourcesContent, ignoreList } = map;
    const depth = importerDepth + 1;
    const children = resolvedSources.map((sourceFile, i) => {
        // The loading context gives the loader more information about why this file is being loaded
        // (eg, from which importer). It also allows the loader to override the location of the loaded
        // sourcemap/original source, or to override the content in the sourcesContent field if it's
        // an unmodified source file.
        const ctx = {
            importer,
            depth,
            source: sourceFile || '',
            content: undefined,
            ignore: undefined,
        };
        // Use the provided loader callback to retrieve the file's sourcemap.
        // TODO: We should eventually support async loading of sourcemap files.
        const sourceMap = loader(ctx.source, ctx);
        const { source, content, ignore } = ctx;
        // If there is a sourcemap, then we need to recurse into it to load its source files.
        if (sourceMap)
            return build(new TraceMap(sourceMap, source), loader, source, depth);
        // Else, it's an unmodified source file.
        // The contents of this unmodified source file can be overridden via the loader context,
        // allowing it to be explicitly null or a string. If it remains undefined, we fall back to
        // the importing sourcemap's `sourcesContent` field.
        const sourceContent = content !== undefined ? content : sourcesContent ? sourcesContent[i] : null;
        const ignored = ignore !== undefined ? ignore : ignoreList ? ignoreList.includes(i) : false;
        return OriginalSource(source, sourceContent, ignored);
    });
    return MapSource(map, children);
}

/**
 * A SourceMap v3 compatible sourcemap, which only includes fields that were
 * provided to it.
 */
class SourceMap {
    constructor(map, options) {
        const out = options.decodedMappings ? toDecodedMap(map) : toEncodedMap(map);
        this.version = out.version; // SourceMap spec says this should be first.
        this.file = out.file;
        this.mappings = out.mappings;
        this.names = out.names;
        this.ignoreList = out.ignoreList;
        this.sourceRoot = out.sourceRoot;
        this.sources = out.sources;
        if (!options.excludeContent) {
            this.sourcesContent = out.sourcesContent;
        }
    }
    toString() {
        return JSON.stringify(this);
    }
}

/**
 * Traces through all the mappings in the root sourcemap, through the sources
 * (and their sourcemaps), all the way back to the original source location.
 *
 * `loader` will be called every time we encounter a source file. If it returns
 * a sourcemap, we will recurse into that sourcemap to continue the trace. If
 * it returns a falsey value, that source file is treated as an original,
 * unmodified source file.
 *
 * Pass `excludeContent` to exclude any self-containing source file content
 * from the output sourcemap.
 *
 * Pass `decodedMappings` to receive a SourceMap with decoded (instead of
 * VLQ encoded) mappings.
 */
function remapping(input, loader, options) {
    const opts = typeof options === 'object' ? options : { excludeContent: !!options, decodedMappings: false };
    const tree = buildSourceMapTree(input, loader);
    return new SourceMap(traceMappings(tree), opts);
}

export { remapping as default };
//# sourceMappingURL=remapping.mjs.map
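One detail worth noting in the module above: `remapping`'s third argument is normalized so that a bare boolean (the legacy form meaning `excludeContent`) still works alongside the options object. A standalone sketch of just that normalization step, mirroring the ternary in `remapping()`:

```javascript
// Mirrors the options handling inside remapping() above: a non-object third
// argument is treated as the legacy excludeContent boolean flag.
function normalizeOptions(options) {
  return typeof options === 'object'
    ? options
    : { excludeContent: !!options, decodedMappings: false };
}

console.log(normalizeOptions(true));
// { excludeContent: true, decodedMappings: false }
console.log(normalizeOptions({ decodedMappings: true }));
// passed through unchanged
```

This is why both `remapping(map, loader, true)` and `remapping(map, loader, { excludeContent: true })` drop `sourcesContent` from the output.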
File diff suppressed because one or more lines are too long
@@ -0,0 +1,202 @@
(function (global, factory) {
    typeof exports === 'object' && typeof module !== 'undefined' ? module.exports = factory(require('@jridgewell/trace-mapping'), require('@jridgewell/gen-mapping')) :
    typeof define === 'function' && define.amd ? define(['@jridgewell/trace-mapping', '@jridgewell/gen-mapping'], factory) :
    (global = typeof globalThis !== 'undefined' ? globalThis : global || self, global.remapping = factory(global.traceMapping, global.genMapping));
})(this, (function (traceMapping, genMapping) { 'use strict';

    const SOURCELESS_MAPPING = /* #__PURE__ */ SegmentObject('', -1, -1, '', null, false);
    const EMPTY_SOURCES = [];
    function SegmentObject(source, line, column, name, content, ignore) {
        return { source, line, column, name, content, ignore };
    }
    function Source(map, sources, source, content, ignore) {
        return {
            map,
            sources,
            source,
            content,
            ignore,
        };
    }
    /**
     * MapSource represents a single sourcemap, with the ability to trace mappings into its child nodes
     * (which may themselves be SourceMapTrees).
     */
    function MapSource(map, sources) {
        return Source(map, sources, '', null, false);
    }
    /**
     * A "leaf" node in the sourcemap tree, representing an original, unmodified source file. Recursive
     * segment tracing ends at the `OriginalSource`.
     */
    function OriginalSource(source, content, ignore) {
        return Source(null, EMPTY_SOURCES, source, content, ignore);
    }
    /**
     * traceMappings is only called on the root level SourceMapTree, and begins the process of
     * resolving each mapping in terms of the original source files.
     */
    function traceMappings(tree) {
        // TODO: Eventually support sourceRoot, which has to be removed because the sources are already
        // fully resolved. We'll need to make sources relative to the sourceRoot before adding them.
        const gen = new genMapping.GenMapping({ file: tree.map.file });
        const { sources: rootSources, map } = tree;
        const rootNames = map.names;
        const rootMappings = traceMapping.decodedMappings(map);
        for (let i = 0; i < rootMappings.length; i++) {
            const segments = rootMappings[i];
            for (let j = 0; j < segments.length; j++) {
                const segment = segments[j];
                const genCol = segment[0];
                let traced = SOURCELESS_MAPPING;
                // 1-length segments only move the current generated column, there's no source information
                // to gather from it.
                if (segment.length !== 1) {
                    const source = rootSources[segment[1]];
                    traced = originalPositionFor(source, segment[2], segment[3], segment.length === 5 ? rootNames[segment[4]] : '');
                    // If the trace is invalid, then the trace ran into a sourcemap that doesn't contain a
                    // respective segment into an original source.
                    if (traced == null)
                        continue;
                }
                const { column, line, name, content, source, ignore } = traced;
                genMapping.maybeAddSegment(gen, i, genCol, source, line, column, name);
                if (source && content != null)
                    genMapping.setSourceContent(gen, source, content);
                if (ignore)
                    genMapping.setIgnore(gen, source, true);
            }
        }
        return gen;
    }
    /**
     * originalPositionFor is only called on children SourceMapTrees. It recurses down into its own
     * child SourceMapTrees, until we find the original source map.
     */
    function originalPositionFor(source, line, column, name) {
        if (!source.map) {
            return SegmentObject(source.source, line, column, name, source.content, source.ignore);
        }
        const segment = traceMapping.traceSegment(source.map, line, column);
        // If we couldn't find a segment, then this doesn't exist in the sourcemap.
        if (segment == null)
            return null;
        // 1-length segments only move the current generated column, there's no source information
        // to gather from it.
        if (segment.length === 1)
            return SOURCELESS_MAPPING;
        return originalPositionFor(source.sources[segment[1]], segment[2], segment[3], segment.length === 5 ? source.map.names[segment[4]] : name);
    }

    function asArray(value) {
        if (Array.isArray(value))
            return value;
        return [value];
    }
    /**
     * Recursively builds a tree structure out of sourcemap files, with each node
     * being either an `OriginalSource` "leaf" or a `SourceMapTree` composed of
     * `OriginalSource`s and `SourceMapTree`s.
     *
     * Every sourcemap is composed of a collection of source files and mappings
     * into locations of those source files. When we generate a `SourceMapTree` for
     * the sourcemap, we attempt to load each source file's own sourcemap. If it
     * does not have an associated sourcemap, it is considered an original,
     * unmodified source file.
     */
    function buildSourceMapTree(input, loader) {
        const maps = asArray(input).map((m) => new traceMapping.TraceMap(m, ''));
        const map = maps.pop();
        for (let i = 0; i < maps.length; i++) {
            if (maps[i].sources.length > 1) {
                throw new Error(`Transformation map ${i} must have exactly one source file.\n` +
                    'Did you specify these with the most recent transformation maps first?');
            }
        }
        let tree = build(map, loader, '', 0);
        for (let i = maps.length - 1; i >= 0; i--) {
            tree = MapSource(maps[i], [tree]);
        }
        return tree;
    }
    function build(map, loader, importer, importerDepth) {
        const { resolvedSources, sourcesContent, ignoreList } = map;
        const depth = importerDepth + 1;
        const children = resolvedSources.map((sourceFile, i) => {
            // The loading context gives the loader more information about why this file is being loaded
            // (eg, from which importer). It also allows the loader to override the location of the loaded
            // sourcemap/original source, or to override the content in the sourcesContent field if it's
            // an unmodified source file.
            const ctx = {
                importer,
                depth,
                source: sourceFile || '',
                content: undefined,
                ignore: undefined,
            };
            // Use the provided loader callback to retrieve the file's sourcemap.
            // TODO: We should eventually support async loading of sourcemap files.
            const sourceMap = loader(ctx.source, ctx);
            const { source, content, ignore } = ctx;
            // If there is a sourcemap, then we need to recurse into it to load its source files.
            if (sourceMap)
                return build(new traceMapping.TraceMap(sourceMap, source), loader, source, depth);
            // Else, it's an unmodified source file.
            // The contents of this unmodified source file can be overridden via the loader context,
            // allowing it to be explicitly null or a string. If it remains undefined, we fall back to
            // the importing sourcemap's `sourcesContent` field.
            const sourceContent = content !== undefined ? content : sourcesContent ? sourcesContent[i] : null;
            const ignored = ignore !== undefined ? ignore : ignoreList ? ignoreList.includes(i) : false;
            return OriginalSource(source, sourceContent, ignored);
        });
        return MapSource(map, children);
    }

    /**
     * A SourceMap v3 compatible sourcemap, which only includes fields that were
     * provided to it.
     */
    class SourceMap {
        constructor(map, options) {
            const out = options.decodedMappings ? genMapping.toDecodedMap(map) : genMapping.toEncodedMap(map);
            this.version = out.version; // SourceMap spec says this should be first.
            this.file = out.file;
            this.mappings = out.mappings;
            this.names = out.names;
            this.ignoreList = out.ignoreList;
            this.sourceRoot = out.sourceRoot;
            this.sources = out.sources;
            if (!options.excludeContent) {
                this.sourcesContent = out.sourcesContent;
            }
        }
        toString() {
            return JSON.stringify(this);
        }
    }

    /**
     * Traces through all the mappings in the root sourcemap, through the sources
     * (and their sourcemaps), all the way back to the original source location.
     *
     * `loader` will be called every time we encounter a source file. If it returns
     * a sourcemap, we will recurse into that sourcemap to continue the trace. If
     * it returns a falsey value, that source file is treated as an original,
     * unmodified source file.
     *
     * Pass `excludeContent` to exclude any self-containing source file content
     * from the output sourcemap.
     *
     * Pass `decodedMappings` to receive a SourceMap with decoded (instead of
     * VLQ encoded) mappings.
     */
    function remapping(input, loader, options) {
        const opts = typeof options === 'object' ? options : { excludeContent: !!options, decodedMappings: false };
        const tree = buildSourceMapTree(input, loader);
        return new SourceMap(traceMappings(tree), opts);
    }

    return remapping;

}));
//# sourceMappingURL=remapping.umd.js.map
File diff suppressed because one or more lines are too long
14 node_modules/@ampproject/remapping/dist/types/build-source-map-tree.d.ts generated vendored Normal file
@@ -0,0 +1,14 @@
import type { MapSource as MapSourceType } from './source-map-tree';
import type { SourceMapInput, SourceMapLoader } from './types';
/**
 * Recursively builds a tree structure out of sourcemap files, with each node
 * being either an `OriginalSource` "leaf" or a `SourceMapTree` composed of
 * `OriginalSource`s and `SourceMapTree`s.
 *
 * Every sourcemap is composed of a collection of source files and mappings
 * into locations of those source files. When we generate a `SourceMapTree` for
 * the sourcemap, we attempt to load each source file's own sourcemap. If it
 * does not have an associated sourcemap, it is considered an original,
 * unmodified source file.
 */
export default function buildSourceMapTree(input: SourceMapInput | SourceMapInput[], loader: SourceMapLoader): MapSourceType;
@@ -0,0 +1,20 @@
import SourceMap from './source-map';
import type { SourceMapInput, SourceMapLoader, Options } from './types';
export type { SourceMapSegment, EncodedSourceMap, EncodedSourceMap as RawSourceMap, DecodedSourceMap, SourceMapInput, SourceMapLoader, LoaderContext, Options, } from './types';
export type { SourceMap };
/**
 * Traces through all the mappings in the root sourcemap, through the sources
 * (and their sourcemaps), all the way back to the original source location.
 *
 * `loader` will be called every time we encounter a source file. If it returns
 * a sourcemap, we will recurse into that sourcemap to continue the trace. If
 * it returns a falsey value, that source file is treated as an original,
 * unmodified source file.
 *
 * Pass `excludeContent` to exclude any self-containing source file content
 * from the output sourcemap.
 *
 * Pass `decodedMappings` to receive a SourceMap with decoded (instead of
 * VLQ encoded) mappings.
 */
export default function remapping(input: SourceMapInput | SourceMapInput[], loader: SourceMapLoader, options?: boolean | Options): SourceMap;
45 node_modules/@ampproject/remapping/dist/types/source-map-tree.d.ts generated vendored Normal file
@@ -0,0 +1,45 @@
import { GenMapping } from '@jridgewell/gen-mapping';
import type { TraceMap } from '@jridgewell/trace-mapping';
export declare type SourceMapSegmentObject = {
    column: number;
    line: number;
    name: string;
    source: string;
    content: string | null;
    ignore: boolean;
};
export declare type OriginalSource = {
    map: null;
    sources: Sources[];
    source: string;
    content: string | null;
    ignore: boolean;
};
export declare type MapSource = {
    map: TraceMap;
    sources: Sources[];
    source: string;
    content: null;
    ignore: false;
};
export declare type Sources = OriginalSource | MapSource;
/**
 * MapSource represents a single sourcemap, with the ability to trace mappings into its child nodes
 * (which may themselves be SourceMapTrees).
 */
export declare function MapSource(map: TraceMap, sources: Sources[]): MapSource;
/**
 * A "leaf" node in the sourcemap tree, representing an original, unmodified source file. Recursive
 * segment tracing ends at the `OriginalSource`.
 */
export declare function OriginalSource(source: string, content: string | null, ignore: boolean): OriginalSource;
/**
 * traceMappings is only called on the root level SourceMapTree, and begins the process of
 * resolving each mapping in terms of the original source files.
 */
export declare function traceMappings(tree: MapSource): GenMapping;
/**
 * originalPositionFor is only called on children SourceMapTrees. It recurses down into its own
 * child SourceMapTrees, until we find the original source map.
 */
export declare function originalPositionFor(source: Sources, line: number, column: number, name: string): SourceMapSegmentObject | null;
@@ -0,0 +1,18 @@
import type { GenMapping } from '@jridgewell/gen-mapping';
import type { DecodedSourceMap, EncodedSourceMap, Options } from './types';
/**
 * A SourceMap v3 compatible sourcemap, which only includes fields that were
 * provided to it.
 */
export default class SourceMap {
    file?: string | null;
    mappings: EncodedSourceMap['mappings'] | DecodedSourceMap['mappings'];
    sourceRoot?: string;
    names: string[];
    sources: (string | null)[];
    sourcesContent?: (string | null)[];
    version: 3;
    ignoreList: number[] | undefined;
    constructor(map: GenMapping, options: Options);
    toString(): string;
}
@@ -0,0 +1,15 @@
import type { SourceMapInput } from '@jridgewell/trace-mapping';
export type { SourceMapSegment, DecodedSourceMap, EncodedSourceMap, } from '@jridgewell/trace-mapping';
export type { SourceMapInput };
export declare type LoaderContext = {
    readonly importer: string;
    readonly depth: number;
    source: string;
    content: string | null | undefined;
    ignore: boolean | undefined;
};
export declare type SourceMapLoader = (file: string, ctx: LoaderContext) => SourceMapInput | null | undefined | void;
export declare type Options = {
    excludeContent?: boolean;
    decodedMappings?: boolean;
};
@@ -0,0 +1,75 @@
{
  "name": "@ampproject/remapping",
  "version": "2.3.0",
  "description": "Remap sequential sourcemaps through transformations to point at the original source code",
  "keywords": [
    "source",
    "map",
    "remap"
  ],
  "main": "dist/remapping.umd.js",
  "module": "dist/remapping.mjs",
  "types": "dist/types/remapping.d.ts",
  "exports": {
    ".": [
      {
        "types": "./dist/types/remapping.d.ts",
        "browser": "./dist/remapping.umd.js",
        "require": "./dist/remapping.umd.js",
        "import": "./dist/remapping.mjs"
      },
      "./dist/remapping.umd.js"
    ],
    "./package.json": "./package.json"
  },
  "files": [
    "dist"
  ],
  "author": "Justin Ridgewell <jridgewell@google.com>",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/ampproject/remapping.git"
  },
  "license": "Apache-2.0",
  "engines": {
    "node": ">=6.0.0"
  },
  "scripts": {
    "build": "run-s -n build:*",
    "build:rollup": "rollup -c rollup.config.js",
    "build:ts": "tsc --project tsconfig.build.json",
    "lint": "run-s -n lint:*",
    "lint:prettier": "npm run test:lint:prettier -- --write",
    "lint:ts": "npm run test:lint:ts -- --fix",
    "prebuild": "rm -rf dist",
    "prepublishOnly": "npm run preversion",
    "preversion": "run-s test build",
    "test": "run-s -n test:lint test:only",
    "test:debug": "node --inspect-brk node_modules/.bin/jest --runInBand",
    "test:lint": "run-s -n test:lint:*",
    "test:lint:prettier": "prettier --check '{src,test}/**/*.ts'",
    "test:lint:ts": "eslint '{src,test}/**/*.ts'",
    "test:only": "jest --coverage",
    "test:watch": "jest --coverage --watch"
  },
  "devDependencies": {
    "@rollup/plugin-typescript": "8.3.2",
    "@types/jest": "27.4.1",
    "@typescript-eslint/eslint-plugin": "5.20.0",
    "@typescript-eslint/parser": "5.20.0",
    "eslint": "8.14.0",
    "eslint-config-prettier": "8.5.0",
    "jest": "27.5.1",
    "jest-config": "27.5.1",
    "npm-run-all": "4.1.5",
    "prettier": "2.6.2",
    "rollup": "2.70.2",
    "ts-jest": "27.1.4",
    "tslib": "2.4.0",
    "typescript": "4.6.3"
  },
  "dependencies": {
    "@jridgewell/gen-mapping": "^0.3.5",
    "@jridgewell/trace-mapping": "^0.3.24"
  }
}
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,19 @@
# @babel/code-frame

> Generate errors that contain a code frame that point to source locations.

See our website [@babel/code-frame](https://babeljs.io/docs/babel-code-frame) for more information.

## Install

Using npm:

```sh
npm install --save-dev @babel/code-frame
```

or using yarn:

```sh
yarn add @babel/code-frame --dev
```
@@ -0,0 +1,216 @@
'use strict';

Object.defineProperty(exports, '__esModule', { value: true });

var picocolors = require('picocolors');
var jsTokens = require('js-tokens');
var helperValidatorIdentifier = require('@babel/helper-validator-identifier');

function isColorSupported() {
  return (typeof process === "object" && (process.env.FORCE_COLOR === "0" || process.env.FORCE_COLOR === "false") ? false : picocolors.isColorSupported);
}
const compose = (f, g) => v => f(g(v));
function buildDefs(colors) {
  return {
    keyword: colors.cyan,
    capitalized: colors.yellow,
    jsxIdentifier: colors.yellow,
    punctuator: colors.yellow,
    number: colors.magenta,
    string: colors.green,
    regex: colors.magenta,
    comment: colors.gray,
    invalid: compose(compose(colors.white, colors.bgRed), colors.bold),
    gutter: colors.gray,
    marker: compose(colors.red, colors.bold),
    message: compose(colors.red, colors.bold),
    reset: colors.reset
  };
}
const defsOn = buildDefs(picocolors.createColors(true));
const defsOff = buildDefs(picocolors.createColors(false));
function getDefs(enabled) {
  return enabled ? defsOn : defsOff;
}

const sometimesKeywords = new Set(["as", "async", "from", "get", "of", "set"]);
const NEWLINE$1 = /\r\n|[\n\r\u2028\u2029]/;
const BRACKET = /^[()[\]{}]$/;
let tokenize;
{
  const JSX_TAG = /^[a-z][\w-]*$/i;
  const getTokenType = function (token, offset, text) {
    if (token.type === "name") {
      if (helperValidatorIdentifier.isKeyword(token.value) || helperValidatorIdentifier.isStrictReservedWord(token.value, true) || sometimesKeywords.has(token.value)) {
        return "keyword";
      }
      if (JSX_TAG.test(token.value) && (text[offset - 1] === "<" || text.slice(offset - 2, offset) === "</")) {
        return "jsxIdentifier";
      }
      if (token.value[0] !== token.value[0].toLowerCase()) {
        return "capitalized";
      }
    }
    if (token.type === "punctuator" && BRACKET.test(token.value)) {
      return "bracket";
    }
    if (token.type === "invalid" && (token.value === "@" || token.value === "#")) {
      return "punctuator";
    }
    return token.type;
  };
  tokenize = function* (text) {
    let match;
    while (match = jsTokens.default.exec(text)) {
      const token = jsTokens.matchToToken(match);
      yield {
        type: getTokenType(token, match.index, text),
        value: token.value
      };
    }
  };
}
function highlight(text) {
  if (text === "") return "";
  const defs = getDefs(true);
  let highlighted = "";
  for (const {
    type,
    value
  } of tokenize(text)) {
    if (type in defs) {
      highlighted += value.split(NEWLINE$1).map(str => defs[type](str)).join("\n");
    } else {
      highlighted += value;
    }
  }
  return highlighted;
}

let deprecationWarningShown = false;
const NEWLINE = /\r\n|[\n\r\u2028\u2029]/;
function getMarkerLines(loc, source, opts) {
  const startLoc = Object.assign({
    column: 0,
    line: -1
  }, loc.start);
  const endLoc = Object.assign({}, startLoc, loc.end);
  const {
    linesAbove = 2,
    linesBelow = 3
  } = opts || {};
  const startLine = startLoc.line;
  const startColumn = startLoc.column;
  const endLine = endLoc.line;
  const endColumn = endLoc.column;
  let start = Math.max(startLine - (linesAbove + 1), 0);
  let end = Math.min(source.length, endLine + linesBelow);
  if (startLine === -1) {
    start = 0;
  }
  if (endLine === -1) {
    end = source.length;
  }
  const lineDiff = endLine - startLine;
  const markerLines = {};
  if (lineDiff) {
    for (let i = 0; i <= lineDiff; i++) {
      const lineNumber = i + startLine;
      if (!startColumn) {
        markerLines[lineNumber] = true;
      } else if (i === 0) {
        const sourceLength = source[lineNumber - 1].length;
        markerLines[lineNumber] = [startColumn, sourceLength - startColumn + 1];
      } else if (i === lineDiff) {
        markerLines[lineNumber] = [0, endColumn];
      } else {
        const sourceLength = source[lineNumber - i].length;
        markerLines[lineNumber] = [0, sourceLength];
      }
    }
  } else {
    if (startColumn === endColumn) {
      if (startColumn) {
        markerLines[startLine] = [startColumn, 0];
      } else {
        markerLines[startLine] = true;
      }
    } else {
      markerLines[startLine] = [startColumn, endColumn - startColumn];
    }
  }
  return {
    start,
    end,
    markerLines
  };
}
function codeFrameColumns(rawLines, loc, opts = {}) {
  const shouldHighlight = opts.forceColor || isColorSupported() && opts.highlightCode;
  const defs = getDefs(shouldHighlight);
  const lines = rawLines.split(NEWLINE);
  const {
    start,
    end,
    markerLines
  } = getMarkerLines(loc, lines, opts);
  const hasColumns = loc.start && typeof loc.start.column === "number";
  const numberMaxWidth = String(end).length;
  const highlightedLines = shouldHighlight ? highlight(rawLines) : rawLines;
  let frame = highlightedLines.split(NEWLINE, end).slice(start, end).map((line, index) => {
    const number = start + 1 + index;
    const paddedNumber = ` ${number}`.slice(-numberMaxWidth);
    const gutter = ` ${paddedNumber} |`;
    const hasMarker = markerLines[number];
    const lastMarkerLine = !markerLines[number + 1];
    if (hasMarker) {
      let markerLine = "";
      if (Array.isArray(hasMarker)) {
        const markerSpacing = line.slice(0, Math.max(hasMarker[0] - 1, 0)).replace(/[^\t]/g, " ");
        const numberOfMarkers = hasMarker[1] || 1;
        markerLine = ["\n ", defs.gutter(gutter.replace(/\d/g, " ")), " ", markerSpacing, defs.marker("^").repeat(numberOfMarkers)].join("");
        if (lastMarkerLine && opts.message) {
          markerLine += " " + defs.message(opts.message);
        }
      }
      return [defs.marker(">"), defs.gutter(gutter), line.length > 0 ? ` ${line}` : "", markerLine].join("");
    } else {
      return ` ${defs.gutter(gutter)}${line.length > 0 ? ` ${line}` : ""}`;
    }
  }).join("\n");
  if (opts.message && !hasColumns) {
    frame = `${" ".repeat(numberMaxWidth + 1)}${opts.message}\n${frame}`;
  }
  if (shouldHighlight) {
    return defs.reset(frame);
  } else {
    return frame;
  }
}
function index (rawLines, lineNumber, colNumber, opts = {}) {
  if (!deprecationWarningShown) {
    deprecationWarningShown = true;
    const message = "Passing lineNumber and colNumber is deprecated to @babel/code-frame. Please use `codeFrameColumns`.";
    if (process.emitWarning) {
      process.emitWarning(message, "DeprecationWarning");
    } else {
      const deprecationError = new Error(message);
      deprecationError.name = "DeprecationWarning";
      console.warn(new Error(message));
    }
  }
  colNumber = Math.max(colNumber, 0);
  const location = {
    start: {
      column: colNumber,
      line: lineNumber
    }
  };
  return codeFrameColumns(rawLines, location, opts);
}

exports.codeFrameColumns = codeFrameColumns;
exports.default = index;
exports.highlight = highlight;
//# sourceMappingURL=index.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,31 @@
{
  "name": "@babel/code-frame",
  "version": "7.27.1",
  "description": "Generate errors that contain a code frame that point to source locations.",
  "author": "The Babel Team (https://babel.dev/team)",
  "homepage": "https://babel.dev/docs/en/next/babel-code-frame",
  "bugs": "https://github.com/babel/babel/issues?utf8=%E2%9C%93&q=is%3Aissue+is%3Aopen",
  "license": "MIT",
  "publishConfig": {
    "access": "public"
  },
  "repository": {
    "type": "git",
    "url": "https://github.com/babel/babel.git",
    "directory": "packages/babel-code-frame"
  },
  "main": "./lib/index.js",
  "dependencies": {
    "@babel/helper-validator-identifier": "^7.27.1",
    "js-tokens": "^4.0.0",
    "picocolors": "^1.1.1"
  },
  "devDependencies": {
    "import-meta-resolve": "^4.1.0",
    "strip-ansi": "^4.0.0"
  },
  "engines": {
    "node": ">=6.9.0"
  },
  "type": "commonjs"
}
@@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@@ -0,0 +1,19 @@
# @babel/compat-data

> The compat-data to determine required Babel plugins

See our website [@babel/compat-data](https://babeljs.io/docs/babel-compat-data) for more information.

## Install

Using npm:

```sh
npm install --save @babel/compat-data
```

or using yarn:

```sh
yarn add @babel/compat-data
```
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file as Babel 8 drop support of core-js 2
module.exports = require("./data/corejs2-built-ins.json");
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file now that it is included in babel-plugin-polyfill-corejs3
module.exports = require("./data/corejs3-shipped-proposals.json");
File diff suppressed because it is too large
5
node_modules/@babel/compat-data/data/corejs3-shipped-proposals.json
generated
vendored
Normal file
5
node_modules/@babel/compat-data/data/corejs3-shipped-proposals.json
generated
vendored
Normal file
@@ -0,0 +1,5 @@
[
  "esnext.promise.all-settled",
  "esnext.string.match-all",
  "esnext.global-this"
]
@@ -0,0 +1,18 @@
{
  "es6.module": {
    "chrome": "61",
    "and_chr": "61",
    "edge": "16",
    "firefox": "60",
    "and_ff": "60",
    "node": "13.2.0",
    "opera": "48",
    "op_mob": "45",
    "safari": "10.1",
    "ios": "10.3",
    "samsung": "8.2",
    "android": "61",
    "electron": "2.0",
    "ios_saf": "10.3"
  }
}
@@ -0,0 +1,35 @@
{
  "transform-async-to-generator": [
    "bugfix/transform-async-arrows-in-class"
  ],
  "transform-parameters": [
    "bugfix/transform-edge-default-parameters",
    "bugfix/transform-safari-id-destructuring-collision-in-function-expression"
  ],
  "transform-function-name": [
    "bugfix/transform-edge-function-name"
  ],
  "transform-block-scoping": [
    "bugfix/transform-safari-block-shadowing",
    "bugfix/transform-safari-for-shadowing"
  ],
  "transform-template-literals": [
    "bugfix/transform-tagged-template-caching"
  ],
  "transform-optional-chaining": [
    "bugfix/transform-v8-spread-parameters-in-optional-chaining"
  ],
  "proposal-optional-chaining": [
    "bugfix/transform-v8-spread-parameters-in-optional-chaining"
  ],
  "transform-class-properties": [
    "bugfix/transform-v8-static-class-fields-redefine-readonly",
    "bugfix/transform-firefox-class-in-computed-class-key",
    "bugfix/transform-safari-class-field-initializer-scope"
  ],
  "proposal-class-properties": [
    "bugfix/transform-v8-static-class-fields-redefine-readonly",
    "bugfix/transform-firefox-class-in-computed-class-key",
    "bugfix/transform-safari-class-field-initializer-scope"
  ]
}
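A sketch of what this overlap map is for (an assumption about the consumer's merge strategy, not code from this commit): `overlapping-plugins.json` records which narrow `bugfix/` plugins cover subsets of a broad transform, so a compiler with a "bugfixes" mode can drop the broad transform and keep only the bugfix variants when the targets support the base feature natively. `applyBugfixes` is a hypothetical helper name; the map excerpt is copied from the file above.

```javascript
// Excerpt copied from overlapping-plugins.json.
const overlapping = {
  "transform-block-scoping": [
    "bugfix/transform-safari-block-shadowing",
    "bugfix/transform-safari-for-shadowing",
  ],
};

// Replace a base plugin by its narrower bugfix variants when the base
// feature itself is natively supported by all targets.
function applyBugfixes(requested, baseSupported) {
  return requested.flatMap((name) =>
    baseSupported.has(name) && overlapping[name] ? overlapping[name] : [name]
  );
}

const out = applyBugfixes(
  ["transform-block-scoping", "transform-classes"],
  new Set(["transform-block-scoping"])
);
console.log(out);
// The broad block-scoping transform is swapped for its two Safari bugfixes;
// transform-classes is untouched.
```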
@@ -0,0 +1,203 @@
{
  "bugfix/transform-async-arrows-in-class": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "11",
    "node": "7.6",
    "deno": "1",
    "ios": "11",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "bugfix/transform-edge-default-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "18",
    "firefox": "52",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-edge-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "79",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "bugfix/transform-safari-block-shadowing": {
    "chrome": "49",
    "opera": "36",
    "edge": "12",
    "firefox": "44",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ie": "11",
    "ios": "11",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-safari-for-shadowing": {
    "chrome": "49",
    "opera": "36",
    "edge": "12",
    "firefox": "4",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ie": "11",
    "ios": "11",
    "samsung": "5",
    "rhino": "1.7.13",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-safari-id-destructuring-collision-in-function-expression": {
    "chrome": "49",
    "opera": "36",
    "edge": "14",
    "firefox": "2",
    "safari": "16.3",
    "node": "6",
    "deno": "1",
    "ios": "16.3",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "bugfix/transform-tagged-template-caching": {
    "chrome": "41",
    "opera": "28",
    "edge": "12",
    "firefox": "34",
    "safari": "13",
    "node": "4",
    "deno": "1",
    "ios": "13",
    "samsung": "3.4",
    "rhino": "1.7.14",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "bugfix/transform-v8-spread-parameters-in-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-optional-chaining": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "74",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "proposal-optional-chaining": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "74",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "transform-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "15",
    "firefox": "52",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-async-to-generator": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "10.1",
    "node": "7.6",
    "deno": "1",
    "ios": "10.3",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "transform-template-literals": {
    "chrome": "41",
    "opera": "28",
    "edge": "13",
    "firefox": "34",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "14",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-block-scoping": {
    "chrome": "50",
    "opera": "37",
    "edge": "14",
    "firefox": "53",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  }
}
@@ -0,0 +1,831 @@
{
  "transform-duplicate-named-capturing-groups-regex": {
    "chrome": "126",
    "opera": "112",
    "edge": "126",
    "firefox": "129",
    "safari": "17.4",
    "node": "23",
    "ios": "17.4",
    "electron": "31.0"
  },
  "transform-regexp-modifiers": {
    "chrome": "125",
    "opera": "111",
    "edge": "125",
    "firefox": "132",
    "node": "23",
    "samsung": "27",
    "electron": "31.0"
  },
  "transform-unicode-sets-regex": {
    "chrome": "112",
    "opera": "98",
    "edge": "112",
    "firefox": "116",
    "safari": "17",
    "node": "20",
    "deno": "1.32",
    "ios": "17",
    "samsung": "23",
    "opera_mobile": "75",
    "electron": "24.0"
  },
  "bugfix/transform-v8-static-class-fields-redefine-readonly": {
    "chrome": "98",
    "opera": "84",
    "edge": "98",
    "firefox": "75",
    "safari": "15",
    "node": "12",
    "deno": "1.18",
    "ios": "15",
    "samsung": "11",
    "opera_mobile": "52",
    "electron": "17.0"
  },
  "bugfix/transform-firefox-class-in-computed-class-key": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "126",
    "safari": "16",
    "node": "12",
    "deno": "1",
    "ios": "16",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "bugfix/transform-safari-class-field-initializer-scope": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "69",
    "safari": "16",
    "node": "12",
    "deno": "1",
    "ios": "16",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "transform-class-static-block": {
    "chrome": "94",
    "opera": "80",
    "edge": "94",
    "firefox": "93",
    "safari": "16.4",
    "node": "16.11",
    "deno": "1.14",
    "ios": "16.4",
    "samsung": "17",
    "opera_mobile": "66",
    "electron": "15.0"
  },
  "proposal-class-static-block": {
    "chrome": "94",
    "opera": "80",
    "edge": "94",
    "firefox": "93",
    "safari": "16.4",
    "node": "16.11",
    "deno": "1.14",
    "ios": "16.4",
    "samsung": "17",
    "opera_mobile": "66",
    "electron": "15.0"
  },
  "transform-private-property-in-object": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "90",
    "safari": "15",
    "node": "16.9",
    "deno": "1.9",
    "ios": "15",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "proposal-private-property-in-object": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "90",
    "safari": "15",
    "node": "16.9",
    "deno": "1.9",
    "ios": "15",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-class-properties": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "90",
    "safari": "14.1",
    "node": "12",
    "deno": "1",
    "ios": "14.5",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "proposal-class-properties": {
    "chrome": "74",
    "opera": "62",
    "edge": "79",
    "firefox": "90",
    "safari": "14.1",
    "node": "12",
    "deno": "1",
    "ios": "14.5",
    "samsung": "11",
    "opera_mobile": "53",
    "electron": "6.0"
  },
  "transform-private-methods": {
    "chrome": "84",
    "opera": "70",
    "edge": "84",
    "firefox": "90",
    "safari": "15",
    "node": "14.6",
    "deno": "1",
    "ios": "15",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "proposal-private-methods": {
    "chrome": "84",
    "opera": "70",
    "edge": "84",
    "firefox": "90",
    "safari": "15",
    "node": "14.6",
    "deno": "1",
    "ios": "15",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "transform-numeric-separator": {
    "chrome": "75",
    "opera": "62",
    "edge": "79",
    "firefox": "70",
    "safari": "13",
    "node": "12.5",
    "deno": "1",
    "ios": "13",
    "samsung": "11",
    "rhino": "1.7.14",
    "opera_mobile": "54",
    "electron": "6.0"
  },
  "proposal-numeric-separator": {
    "chrome": "75",
    "opera": "62",
    "edge": "79",
    "firefox": "70",
    "safari": "13",
    "node": "12.5",
    "deno": "1",
    "ios": "13",
    "samsung": "11",
    "rhino": "1.7.14",
    "opera_mobile": "54",
    "electron": "6.0"
  },
  "transform-logical-assignment-operators": {
    "chrome": "85",
    "opera": "71",
    "edge": "85",
    "firefox": "79",
    "safari": "14",
    "node": "15",
    "deno": "1.2",
    "ios": "14",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "proposal-logical-assignment-operators": {
    "chrome": "85",
    "opera": "71",
    "edge": "85",
    "firefox": "79",
    "safari": "14",
    "node": "15",
    "deno": "1.2",
    "ios": "14",
    "samsung": "14",
    "opera_mobile": "60",
    "electron": "10.0"
  },
  "transform-nullish-coalescing-operator": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "72",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "proposal-nullish-coalescing-operator": {
    "chrome": "80",
    "opera": "67",
    "edge": "80",
    "firefox": "72",
    "safari": "13.1",
    "node": "14",
    "deno": "1",
    "ios": "13.4",
    "samsung": "13",
    "rhino": "1.8",
    "opera_mobile": "57",
    "electron": "8.0"
  },
  "transform-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "proposal-optional-chaining": {
    "chrome": "91",
    "opera": "77",
    "edge": "91",
    "firefox": "74",
    "safari": "13.1",
    "node": "16.9",
    "deno": "1.9",
    "ios": "13.4",
    "samsung": "16",
    "opera_mobile": "64",
    "electron": "13.0"
  },
  "transform-json-strings": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "62",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "9",
    "rhino": "1.7.14",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-json-strings": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "62",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "9",
    "rhino": "1.7.14",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-optional-catch-binding": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "58",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-optional-catch-binding": {
    "chrome": "66",
    "opera": "53",
    "edge": "79",
    "firefox": "58",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-parameters": {
    "chrome": "49",
    "opera": "36",
    "edge": "18",
    "firefox": "52",
    "safari": "16.3",
    "node": "6",
    "deno": "1",
    "ios": "16.3",
    "samsung": "5",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-async-generator-functions": {
    "chrome": "63",
    "opera": "50",
    "edge": "79",
    "firefox": "57",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "8",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "proposal-async-generator-functions": {
    "chrome": "63",
    "opera": "50",
    "edge": "79",
    "firefox": "57",
    "safari": "12",
    "node": "10",
    "deno": "1",
    "ios": "12",
    "samsung": "8",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "transform-object-rest-spread": {
    "chrome": "60",
    "opera": "47",
    "edge": "79",
    "firefox": "55",
    "safari": "11.1",
    "node": "8.3",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "opera_mobile": "44",
    "electron": "2.0"
  },
  "proposal-object-rest-spread": {
    "chrome": "60",
    "opera": "47",
    "edge": "79",
    "firefox": "55",
    "safari": "11.1",
    "node": "8.3",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "opera_mobile": "44",
    "electron": "2.0"
  },
  "transform-dotall-regex": {
    "chrome": "62",
    "opera": "49",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "8.10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "8",
    "rhino": "1.7.15",
    "opera_mobile": "46",
    "electron": "3.0"
  },
  "transform-unicode-property-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "proposal-unicode-property-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-named-capturing-groups-regex": {
    "chrome": "64",
    "opera": "51",
    "edge": "79",
    "firefox": "78",
    "safari": "11.1",
    "node": "10",
    "deno": "1",
    "ios": "11.3",
    "samsung": "9",
    "opera_mobile": "47",
    "electron": "3.0"
  },
  "transform-async-to-generator": {
    "chrome": "55",
    "opera": "42",
    "edge": "15",
    "firefox": "52",
    "safari": "11",
    "node": "7.6",
    "deno": "1",
    "ios": "11",
    "samsung": "6",
    "opera_mobile": "42",
    "electron": "1.6"
  },
  "transform-exponentiation-operator": {
    "chrome": "52",
    "opera": "39",
    "edge": "14",
    "firefox": "52",
    "safari": "10.1",
    "node": "7",
    "deno": "1",
    "ios": "10.3",
    "samsung": "6",
    "rhino": "1.7.14",
    "opera_mobile": "41",
    "electron": "1.3"
  },
  "transform-template-literals": {
    "chrome": "41",
    "opera": "28",
    "edge": "13",
    "firefox": "34",
    "safari": "13",
    "node": "4",
    "deno": "1",
    "ios": "13",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-literals": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "53",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.15",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-function-name": {
    "chrome": "51",
    "opera": "38",
    "edge": "79",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-arrow-functions": {
    "chrome": "47",
    "opera": "34",
    "edge": "13",
    "firefox": "43",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "rhino": "1.7.13",
    "opera_mobile": "34",
    "electron": "0.36"
  },
  "transform-block-scoped-functions": {
    "chrome": "41",
    "opera": "28",
    "edge": "12",
    "firefox": "46",
    "safari": "10",
    "node": "4",
    "deno": "1",
    "ie": "11",
    "ios": "10",
    "samsung": "3.4",
    "opera_mobile": "28",
    "electron": "0.21"
  },
  "transform-classes": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-object-super": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-shorthand-properties": {
    "chrome": "43",
    "opera": "30",
    "edge": "12",
    "firefox": "33",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.14",
    "opera_mobile": "30",
    "electron": "0.27"
  },
  "transform-duplicate-keys": {
    "chrome": "42",
    "opera": "29",
    "edge": "12",
    "firefox": "34",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "3.4",
    "opera_mobile": "29",
    "electron": "0.25"
  },
  "transform-computed-properties": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "34",
    "safari": "7.1",
    "node": "4",
    "deno": "1",
    "ios": "8",
    "samsung": "4",
    "rhino": "1.8",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-for-of": {
    "chrome": "51",
    "opera": "38",
    "edge": "15",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-sticky-regex": {
    "chrome": "49",
    "opera": "36",
    "edge": "13",
    "firefox": "3",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "rhino": "1.7.15",
    "opera_mobile": "36",
    "electron": "0.37"
  },
  "transform-unicode-escapes": {
    "chrome": "44",
    "opera": "31",
    "edge": "12",
    "firefox": "53",
    "safari": "9",
    "node": "4",
    "deno": "1",
    "ios": "9",
    "samsung": "4",
    "rhino": "1.7.15",
    "opera_mobile": "32",
    "electron": "0.30"
  },
  "transform-unicode-regex": {
    "chrome": "50",
    "opera": "37",
    "edge": "13",
    "firefox": "46",
    "safari": "12",
    "node": "6",
    "deno": "1",
    "ios": "12",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-spread": {
    "chrome": "46",
    "opera": "33",
    "edge": "13",
    "firefox": "45",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-destructuring": {
    "chrome": "51",
    "opera": "38",
    "edge": "15",
    "firefox": "53",
    "safari": "10",
    "node": "6.5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "41",
    "electron": "1.2"
  },
  "transform-block-scoping": {
    "chrome": "50",
    "opera": "37",
    "edge": "14",
    "firefox": "53",
    "safari": "11",
    "node": "6",
    "deno": "1",
    "ios": "11",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-typeof-symbol": {
    "chrome": "48",
    "opera": "35",
    "edge": "12",
    "firefox": "36",
    "safari": "9",
    "node": "6",
    "deno": "1",
    "ios": "9",
    "samsung": "5",
    "rhino": "1.8",
    "opera_mobile": "35",
    "electron": "0.37"
  },
  "transform-new-target": {
    "chrome": "46",
    "opera": "33",
    "edge": "14",
    "firefox": "41",
    "safari": "10",
    "node": "5",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "33",
    "electron": "0.36"
  },
  "transform-regenerator": {
    "chrome": "50",
    "opera": "37",
    "edge": "13",
    "firefox": "53",
    "safari": "10",
    "node": "6",
    "deno": "1",
    "ios": "10",
    "samsung": "5",
    "opera_mobile": "37",
    "electron": "1.1"
  },
  "transform-member-expression-literals": {
    "chrome": "7",
    "opera": "12",
    "edge": "12",
    "firefox": "2",
    "safari": "5.1",
    "node": "0.4",
    "deno": "1",
    "ie": "9",
    "android": "4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "12",
    "electron": "0.20"
  },
  "transform-property-literals": {
    "chrome": "7",
    "opera": "12",
    "edge": "12",
    "firefox": "2",
    "safari": "5.1",
    "node": "0.4",
    "deno": "1",
    "ie": "9",
    "android": "4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "12",
    "electron": "0.20"
  },
  "transform-reserved-words": {
    "chrome": "13",
    "opera": "10.50",
    "edge": "12",
    "firefox": "2",
    "safari": "3.1",
    "node": "0.6",
    "deno": "1",
    "ie": "9",
    "android": "4.4",
    "ios": "6",
    "phantom": "1.9",
    "samsung": "1",
    "rhino": "1.7.13",
    "opera_mobile": "10.1",
    "electron": "0.20"
  },
  "transform-export-namespace-from": {
    "chrome": "72",
    "deno": "1.0",
    "edge": "79",
    "firefox": "80",
    "node": "13.2.0",
    "opera": "60",
    "opera_mobile": "51",
    "safari": "14.1",
    "ios": "14.5",
    "samsung": "11.0",
    "android": "72",
    "electron": "5.0"
  },
  "proposal-export-namespace-from": {
    "chrome": "72",
    "deno": "1.0",
    "edge": "79",
    "firefox": "80",
    "node": "13.2.0",
    "opera": "60",
    "opera_mobile": "51",
    "safari": "14.1",
    "ios": "14.5",
    "samsung": "11.0",
    "android": "72",
    "electron": "5.0"
  }
}
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/native-modules.json");
@@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/overlapping-plugins.json");
@@ -0,0 +1,40 @@
{
  "name": "@babel/compat-data",
  "version": "7.27.5",
  "author": "The Babel Team (https://babel.dev/team)",
  "license": "MIT",
  "description": "The compat-data to determine required Babel plugins",
  "repository": {
    "type": "git",
    "url": "https://github.com/babel/babel.git",
    "directory": "packages/babel-compat-data"
  },
  "publishConfig": {
    "access": "public"
  },
  "exports": {
    "./plugins": "./plugins.js",
    "./native-modules": "./native-modules.js",
    "./corejs2-built-ins": "./corejs2-built-ins.js",
    "./corejs3-shipped-proposals": "./corejs3-shipped-proposals.js",
    "./overlapping-plugins": "./overlapping-plugins.js",
    "./plugin-bugfixes": "./plugin-bugfixes.js"
  },
  "scripts": {
    "build-data": "./scripts/download-compat-table.sh && node ./scripts/build-data.mjs && node ./scripts/build-modules-support.mjs && node ./scripts/build-bugfixes-targets.mjs"
  },
  "keywords": [
    "babel",
    "compat-table",
    "compat-data"
  ],
  "devDependencies": {
    "@mdn/browser-compat-data": "^6.0.8",
    "core-js-compat": "^3.41.0",
    "electron-to-chromium": "^1.5.140"
  },
  "engines": {
    "node": ">=6.9.0"
  },
  "type": "commonjs"
}
@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/plugin-bugfixes.json");
@ -0,0 +1,2 @@
// Todo (Babel 8): remove this file, in Babel 8 users import the .json directly
module.exports = require("./data/plugins.json");
@ -0,0 +1,22 @@
MIT License

Copyright (c) 2014-present Sebastian McKenzie and other contributors

Permission is hereby granted, free of charge, to any person obtaining
a copy of this software and associated documentation files (the
"Software"), to deal in the Software without restriction, including
without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:

The above copyright notice and this permission notice shall be
included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND
NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE
LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION
OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION
WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
@ -0,0 +1,19 @@
# @babel/core

> Babel compiler core.

See our website [@babel/core](https://babeljs.io/docs/babel-core) for more information or the [issues](https://github.com/babel/babel/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22pkg%3A%20core%22+is%3Aopen) associated with this package.

## Install

Using npm:

```sh
npm install --save-dev @babel/core
```

or using yarn:

```sh
yarn add @babel/core --dev
```
@ -0,0 +1,3 @@
0 && 0;

//# sourceMappingURL=cache-contexts.js.map
@ -0,0 +1 @@
{"version":3,"names":[],"sources":["../../src/config/cache-contexts.ts"],"sourcesContent":["import type { Targets } from \"@babel/helper-compilation-targets\";\n\nimport type { ConfigContext } from \"./config-chain.ts\";\nimport type { CallerMetadata } from \"./validation/options.ts\";\n\nexport type { ConfigContext as FullConfig };\n\nexport type FullPreset = {\n targets: Targets;\n} & ConfigContext;\nexport type FullPlugin = {\n assumptions: { [name: string]: boolean };\n} & FullPreset;\n\n// Context not including filename since it is used in places that cannot\n// process 'ignore'/'only' and other filename-based logic.\nexport type SimpleConfig = {\n envName: string;\n caller: CallerMetadata | undefined;\n};\nexport type SimplePreset = {\n targets: Targets;\n} & SimpleConfig;\nexport type SimplePlugin = {\n assumptions: {\n [name: string]: boolean;\n };\n} & SimplePreset;\n"],"mappings":"","ignoreList":[]}
@ -0,0 +1,261 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.assertSimpleType = assertSimpleType;
exports.makeStrongCache = makeStrongCache;
exports.makeStrongCacheSync = makeStrongCacheSync;
exports.makeWeakCache = makeWeakCache;
exports.makeWeakCacheSync = makeWeakCacheSync;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _async = require("../gensync-utils/async.js");
var _util = require("./util.js");
const synchronize = gen => {
  return _gensync()(gen).sync;
};
function* genTrue() {
  return true;
}
function makeWeakCache(handler) {
  return makeCachedFunction(WeakMap, handler);
}
function makeWeakCacheSync(handler) {
  return synchronize(makeWeakCache(handler));
}
function makeStrongCache(handler) {
  return makeCachedFunction(Map, handler);
}
function makeStrongCacheSync(handler) {
  return synchronize(makeStrongCache(handler));
}
function makeCachedFunction(CallCache, handler) {
  const callCacheSync = new CallCache();
  const callCacheAsync = new CallCache();
  const futureCache = new CallCache();
  return function* cachedFunction(arg, data) {
    const asyncContext = yield* (0, _async.isAsync)();
    const callCache = asyncContext ? callCacheAsync : callCacheSync;
    const cached = yield* getCachedValueOrWait(asyncContext, callCache, futureCache, arg, data);
    if (cached.valid) return cached.value;
    const cache = new CacheConfigurator(data);
    const handlerResult = handler(arg, cache);
    let finishLock;
    let value;
    if ((0, _util.isIterableIterator)(handlerResult)) {
      value = yield* (0, _async.onFirstPause)(handlerResult, () => {
        finishLock = setupAsyncLocks(cache, futureCache, arg);
      });
    } else {
      value = handlerResult;
    }
    updateFunctionCache(callCache, cache, arg, value);
    if (finishLock) {
      futureCache.delete(arg);
      finishLock.release(value);
    }
    return value;
  };
}
function* getCachedValue(cache, arg, data) {
  const cachedValue = cache.get(arg);
  if (cachedValue) {
    for (const {
      value,
      valid
    } of cachedValue) {
      if (yield* valid(data)) return {
        valid: true,
        value
      };
    }
  }
  return {
    valid: false,
    value: null
  };
}
function* getCachedValueOrWait(asyncContext, callCache, futureCache, arg, data) {
  const cached = yield* getCachedValue(callCache, arg, data);
  if (cached.valid) {
    return cached;
  }
  if (asyncContext) {
    const cached = yield* getCachedValue(futureCache, arg, data);
    if (cached.valid) {
      const value = yield* (0, _async.waitFor)(cached.value.promise);
      return {
        valid: true,
        value
      };
    }
  }
  return {
    valid: false,
    value: null
  };
}
function setupAsyncLocks(config, futureCache, arg) {
  const finishLock = new Lock();
  updateFunctionCache(futureCache, config, arg, finishLock);
  return finishLock;
}
function updateFunctionCache(cache, config, arg, value) {
  if (!config.configured()) config.forever();
  let cachedValue = cache.get(arg);
  config.deactivate();
  switch (config.mode()) {
    case "forever":
      cachedValue = [{
        value,
        valid: genTrue
      }];
      cache.set(arg, cachedValue);
      break;
    case "invalidate":
      cachedValue = [{
        value,
        valid: config.validator()
      }];
      cache.set(arg, cachedValue);
      break;
    case "valid":
      if (cachedValue) {
        cachedValue.push({
          value,
          valid: config.validator()
        });
      } else {
        cachedValue = [{
          value,
          valid: config.validator()
        }];
        cache.set(arg, cachedValue);
      }
  }
}
class CacheConfigurator {
  constructor(data) {
    this._active = true;
    this._never = false;
    this._forever = false;
    this._invalidate = false;
    this._configured = false;
    this._pairs = [];
    this._data = void 0;
    this._data = data;
  }
  simple() {
    return makeSimpleConfigurator(this);
  }
  mode() {
    if (this._never) return "never";
    if (this._forever) return "forever";
    if (this._invalidate) return "invalidate";
    return "valid";
  }
  forever() {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._never) {
      throw new Error("Caching has already been configured with .never()");
    }
    this._forever = true;
    this._configured = true;
  }
  never() {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._forever) {
      throw new Error("Caching has already been configured with .forever()");
    }
    this._never = true;
    this._configured = true;
  }
  using(handler) {
    if (!this._active) {
      throw new Error("Cannot change caching after evaluation has completed.");
    }
    if (this._never || this._forever) {
      throw new Error("Caching has already been configured with .never or .forever()");
    }
    this._configured = true;
    const key = handler(this._data);
    const fn = (0, _async.maybeAsync)(handler, `You appear to be using an async cache handler, but Babel has been called synchronously`);
    if ((0, _async.isThenable)(key)) {
      return key.then(key => {
        this._pairs.push([key, fn]);
        return key;
      });
    }
    this._pairs.push([key, fn]);
    return key;
  }
  invalidate(handler) {
    this._invalidate = true;
    return this.using(handler);
  }
  validator() {
    const pairs = this._pairs;
    return function* (data) {
      for (const [key, fn] of pairs) {
        if (key !== (yield* fn(data))) return false;
      }
      return true;
    };
  }
  deactivate() {
    this._active = false;
  }
  configured() {
    return this._configured;
  }
}
function makeSimpleConfigurator(cache) {
  function cacheFn(val) {
    if (typeof val === "boolean") {
      if (val) cache.forever();else cache.never();
      return;
    }
    return cache.using(() => assertSimpleType(val()));
  }
  cacheFn.forever = () => cache.forever();
  cacheFn.never = () => cache.never();
  cacheFn.using = cb => cache.using(() => assertSimpleType(cb()));
  cacheFn.invalidate = cb => cache.invalidate(() => assertSimpleType(cb()));
  return cacheFn;
}
function assertSimpleType(value) {
  if ((0, _async.isThenable)(value)) {
    throw new Error(`You appear to be using an async cache handler, ` + `which your current version of Babel does not support. ` + `We may add support for this in the future, ` + `but if you're on the most recent version of @babel/core and still ` + `seeing this error, then you'll need to synchronously handle your caching logic.`);
  }
  if (value != null && typeof value !== "string" && typeof value !== "boolean" && typeof value !== "number") {
    throw new Error("Cache keys must be either string, boolean, number, null, or undefined.");
  }
  return value;
}
class Lock {
  constructor() {
    this.released = false;
    this.promise = void 0;
    this._resolve = void 0;
    this.promise = new Promise(resolve => {
      this._resolve = resolve;
    });
  }
  release(value) {
    this.released = true;
    this._resolve(value);
  }
}
0 && 0;

//# sourceMappingURL=caching.js.map
File diff suppressed because one or more lines are too long
@ -0,0 +1,469 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.buildPresetChain = buildPresetChain;
exports.buildPresetChainWalker = void 0;
exports.buildRootChain = buildRootChain;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
var _options = require("./validation/options.js");
var _patternToRegex = require("./pattern-to-regex.js");
var _printer = require("./printer.js");
var _rewriteStackTrace = require("../errors/rewrite-stack-trace.js");
var _configError = require("../errors/config-error.js");
var _index = require("./files/index.js");
var _caching = require("./caching.js");
var _configDescriptors = require("./config-descriptors.js");
const debug = _debug()("babel:config:config-chain");
function* buildPresetChain(arg, context) {
  const chain = yield* buildPresetChainWalker(arg, context);
  if (!chain) return null;
  return {
    plugins: dedupDescriptors(chain.plugins),
    presets: dedupDescriptors(chain.presets),
    options: chain.options.map(o => normalizeOptions(o)),
    files: new Set()
  };
}
const buildPresetChainWalker = exports.buildPresetChainWalker = makeChainWalker({
  root: preset => loadPresetDescriptors(preset),
  env: (preset, envName) => loadPresetEnvDescriptors(preset)(envName),
  overrides: (preset, index) => loadPresetOverridesDescriptors(preset)(index),
  overridesEnv: (preset, index, envName) => loadPresetOverridesEnvDescriptors(preset)(index)(envName),
  createLogger: () => () => {}
});
const loadPresetDescriptors = (0, _caching.makeWeakCacheSync)(preset => buildRootDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors));
const loadPresetEnvDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(envName => buildEnvDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, envName)));
const loadPresetOverridesDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(index => buildOverrideDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, index)));
const loadPresetOverridesEnvDescriptors = (0, _caching.makeWeakCacheSync)(preset => (0, _caching.makeStrongCacheSync)(index => (0, _caching.makeStrongCacheSync)(envName => buildOverrideEnvDescriptors(preset, preset.alias, _configDescriptors.createUncachedDescriptors, index, envName))));
function* buildRootChain(opts, context) {
  let configReport, babelRcReport;
  const programmaticLogger = new _printer.ConfigPrinter();
  const programmaticChain = yield* loadProgrammaticChain({
    options: opts,
    dirname: context.cwd
  }, context, undefined, programmaticLogger);
  if (!programmaticChain) return null;
  const programmaticReport = yield* programmaticLogger.output();
  let configFile;
  if (typeof opts.configFile === "string") {
    configFile = yield* (0, _index.loadConfig)(opts.configFile, context.cwd, context.envName, context.caller);
  } else if (opts.configFile !== false) {
    configFile = yield* (0, _index.findRootConfig)(context.root, context.envName, context.caller);
  }
  let {
    babelrc,
    babelrcRoots
  } = opts;
  let babelrcRootsDirectory = context.cwd;
  const configFileChain = emptyChain();
  const configFileLogger = new _printer.ConfigPrinter();
  if (configFile) {
    const validatedFile = validateConfigFile(configFile);
    const result = yield* loadFileChain(validatedFile, context, undefined, configFileLogger);
    if (!result) return null;
    configReport = yield* configFileLogger.output();
    if (babelrc === undefined) {
      babelrc = validatedFile.options.babelrc;
    }
    if (babelrcRoots === undefined) {
      babelrcRootsDirectory = validatedFile.dirname;
      babelrcRoots = validatedFile.options.babelrcRoots;
    }
    mergeChain(configFileChain, result);
  }
  let ignoreFile, babelrcFile;
  let isIgnored = false;
  const fileChain = emptyChain();
  if ((babelrc === true || babelrc === undefined) && typeof context.filename === "string") {
    const pkgData = yield* (0, _index.findPackageData)(context.filename);
    if (pkgData && babelrcLoadEnabled(context, pkgData, babelrcRoots, babelrcRootsDirectory)) {
      ({
        ignore: ignoreFile,
        config: babelrcFile
      } = yield* (0, _index.findRelativeConfig)(pkgData, context.envName, context.caller));
      if (ignoreFile) {
        fileChain.files.add(ignoreFile.filepath);
      }
      if (ignoreFile && shouldIgnore(context, ignoreFile.ignore, null, ignoreFile.dirname)) {
        isIgnored = true;
      }
      if (babelrcFile && !isIgnored) {
        const validatedFile = validateBabelrcFile(babelrcFile);
        const babelrcLogger = new _printer.ConfigPrinter();
        const result = yield* loadFileChain(validatedFile, context, undefined, babelrcLogger);
        if (!result) {
          isIgnored = true;
        } else {
          babelRcReport = yield* babelrcLogger.output();
          mergeChain(fileChain, result);
        }
      }
      if (babelrcFile && isIgnored) {
        fileChain.files.add(babelrcFile.filepath);
      }
    }
  }
  if (context.showConfig) {
    console.log(`Babel configs on "${context.filename}" (ascending priority):\n` + [configReport, babelRcReport, programmaticReport].filter(x => !!x).join("\n\n") + "\n-----End Babel configs-----");
  }
  const chain = mergeChain(mergeChain(mergeChain(emptyChain(), configFileChain), fileChain), programmaticChain);
  return {
    plugins: isIgnored ? [] : dedupDescriptors(chain.plugins),
    presets: isIgnored ? [] : dedupDescriptors(chain.presets),
    options: isIgnored ? [] : chain.options.map(o => normalizeOptions(o)),
    fileHandling: isIgnored ? "ignored" : "transpile",
    ignore: ignoreFile || undefined,
    babelrc: babelrcFile || undefined,
    config: configFile || undefined,
    files: chain.files
  };
}
function babelrcLoadEnabled(context, pkgData, babelrcRoots, babelrcRootsDirectory) {
  if (typeof babelrcRoots === "boolean") return babelrcRoots;
  const absoluteRoot = context.root;
  if (babelrcRoots === undefined) {
    return pkgData.directories.includes(absoluteRoot);
  }
  let babelrcPatterns = babelrcRoots;
  if (!Array.isArray(babelrcPatterns)) {
    babelrcPatterns = [babelrcPatterns];
  }
  babelrcPatterns = babelrcPatterns.map(pat => {
    return typeof pat === "string" ? _path().resolve(babelrcRootsDirectory, pat) : pat;
  });
  if (babelrcPatterns.length === 1 && babelrcPatterns[0] === absoluteRoot) {
    return pkgData.directories.includes(absoluteRoot);
  }
  return babelrcPatterns.some(pat => {
    if (typeof pat === "string") {
      pat = (0, _patternToRegex.default)(pat, babelrcRootsDirectory);
    }
    return pkgData.directories.some(directory => {
      return matchPattern(pat, babelrcRootsDirectory, directory, context);
    });
  });
}
const validateConfigFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("configfile", file.options, file.filepath)
}));
const validateBabelrcFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("babelrcfile", file.options, file.filepath)
}));
const validateExtendFile = (0, _caching.makeWeakCacheSync)(file => ({
  filepath: file.filepath,
  dirname: file.dirname,
  options: (0, _options.validate)("extendsfile", file.options, file.filepath)
}));
const loadProgrammaticChain = makeChainWalker({
  root: input => buildRootDescriptors(input, "base", _configDescriptors.createCachedDescriptors),
  env: (input, envName) => buildEnvDescriptors(input, "base", _configDescriptors.createCachedDescriptors, envName),
  overrides: (input, index) => buildOverrideDescriptors(input, "base", _configDescriptors.createCachedDescriptors, index),
  overridesEnv: (input, index, envName) => buildOverrideEnvDescriptors(input, "base", _configDescriptors.createCachedDescriptors, index, envName),
  createLogger: (input, context, baseLogger) => buildProgrammaticLogger(input, context, baseLogger)
});
const loadFileChainWalker = makeChainWalker({
  root: file => loadFileDescriptors(file),
  env: (file, envName) => loadFileEnvDescriptors(file)(envName),
  overrides: (file, index) => loadFileOverridesDescriptors(file)(index),
  overridesEnv: (file, index, envName) => loadFileOverridesEnvDescriptors(file)(index)(envName),
  createLogger: (file, context, baseLogger) => buildFileLogger(file.filepath, context, baseLogger)
});
function* loadFileChain(input, context, files, baseLogger) {
  const chain = yield* loadFileChainWalker(input, context, files, baseLogger);
  chain == null || chain.files.add(input.filepath);
  return chain;
}
const loadFileDescriptors = (0, _caching.makeWeakCacheSync)(file => buildRootDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors));
const loadFileEnvDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(envName => buildEnvDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, envName)));
const loadFileOverridesDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(index => buildOverrideDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, index)));
const loadFileOverridesEnvDescriptors = (0, _caching.makeWeakCacheSync)(file => (0, _caching.makeStrongCacheSync)(index => (0, _caching.makeStrongCacheSync)(envName => buildOverrideEnvDescriptors(file, file.filepath, _configDescriptors.createUncachedDescriptors, index, envName))));
function buildFileLogger(filepath, context, baseLogger) {
  if (!baseLogger) {
    return () => {};
  }
  return baseLogger.configure(context.showConfig, _printer.ChainFormatter.Config, {
    filepath
  });
}
function buildRootDescriptors({
  dirname,
  options
}, alias, descriptors) {
  return descriptors(dirname, options, alias);
}
function buildProgrammaticLogger(_, context, baseLogger) {
  var _context$caller;
  if (!baseLogger) {
    return () => {};
  }
  return baseLogger.configure(context.showConfig, _printer.ChainFormatter.Programmatic, {
    callerName: (_context$caller = context.caller) == null ? void 0 : _context$caller.name
  });
}
function buildEnvDescriptors({
  dirname,
  options
}, alias, descriptors, envName) {
  var _options$env;
  const opts = (_options$env = options.env) == null ? void 0 : _options$env[envName];
  return opts ? descriptors(dirname, opts, `${alias}.env["${envName}"]`) : null;
}
function buildOverrideDescriptors({
  dirname,
  options
}, alias, descriptors, index) {
  var _options$overrides;
  const opts = (_options$overrides = options.overrides) == null ? void 0 : _options$overrides[index];
  if (!opts) throw new Error("Assertion failure - missing override");
  return descriptors(dirname, opts, `${alias}.overrides[${index}]`);
}
function buildOverrideEnvDescriptors({
  dirname,
  options
}, alias, descriptors, index, envName) {
  var _options$overrides2, _override$env;
  const override = (_options$overrides2 = options.overrides) == null ? void 0 : _options$overrides2[index];
  if (!override) throw new Error("Assertion failure - missing override");
  const opts = (_override$env = override.env) == null ? void 0 : _override$env[envName];
  return opts ? descriptors(dirname, opts, `${alias}.overrides[${index}].env["${envName}"]`) : null;
}
function makeChainWalker({
  root,
  env,
  overrides,
  overridesEnv,
  createLogger
}) {
  return function* chainWalker(input, context, files = new Set(), baseLogger) {
    const {
      dirname
    } = input;
    const flattenedConfigs = [];
    const rootOpts = root(input);
    if (configIsApplicable(rootOpts, dirname, context, input.filepath)) {
      flattenedConfigs.push({
        config: rootOpts,
        envName: undefined,
        index: undefined
      });
      const envOpts = env(input, context.envName);
      if (envOpts && configIsApplicable(envOpts, dirname, context, input.filepath)) {
        flattenedConfigs.push({
          config: envOpts,
          envName: context.envName,
          index: undefined
        });
      }
      (rootOpts.options.overrides || []).forEach((_, index) => {
        const overrideOps = overrides(input, index);
        if (configIsApplicable(overrideOps, dirname, context, input.filepath)) {
          flattenedConfigs.push({
            config: overrideOps,
            index,
            envName: undefined
          });
          const overrideEnvOpts = overridesEnv(input, index, context.envName);
          if (overrideEnvOpts && configIsApplicable(overrideEnvOpts, dirname, context, input.filepath)) {
            flattenedConfigs.push({
              config: overrideEnvOpts,
              index,
              envName: context.envName
            });
          }
        }
      });
    }
    if (flattenedConfigs.some(({
      config: {
        options: {
          ignore,
          only
        }
      }
    }) => shouldIgnore(context, ignore, only, dirname))) {
      return null;
    }
    const chain = emptyChain();
    const logger = createLogger(input, context, baseLogger);
    for (const {
      config,
      index,
      envName
    } of flattenedConfigs) {
      if (!(yield* mergeExtendsChain(chain, config.options, dirname, context, files, baseLogger))) {
        return null;
      }
      logger(config, index, envName);
      yield* mergeChainOpts(chain, config);
    }
    return chain;
  };
}
function* mergeExtendsChain(chain, opts, dirname, context, files, baseLogger) {
  if (opts.extends === undefined) return true;
  const file = yield* (0, _index.loadConfig)(opts.extends, dirname, context.envName, context.caller);
  if (files.has(file)) {
    throw new Error(`Configuration cycle detected loading ${file.filepath}.\n` + `File already loaded following the config chain:\n` + Array.from(files, file => ` - ${file.filepath}`).join("\n"));
  }
  files.add(file);
  const fileChain = yield* loadFileChain(validateExtendFile(file), context, files, baseLogger);
  files.delete(file);
  if (!fileChain) return false;
  mergeChain(chain, fileChain);
  return true;
}
function mergeChain(target, source) {
  target.options.push(...source.options);
  target.plugins.push(...source.plugins);
  target.presets.push(...source.presets);
  for (const file of source.files) {
    target.files.add(file);
  }
  return target;
}
function* mergeChainOpts(target, {
  options,
  plugins,
  presets
}) {
  target.options.push(options);
  target.plugins.push(...(yield* plugins()));
  target.presets.push(...(yield* presets()));
  return target;
}
function emptyChain() {
  return {
    options: [],
    presets: [],
    plugins: [],
    files: new Set()
  };
}
function normalizeOptions(opts) {
  const options = Object.assign({}, opts);
  delete options.extends;
  delete options.env;
  delete options.overrides;
  delete options.plugins;
  delete options.presets;
  delete options.passPerPreset;
  delete options.ignore;
  delete options.only;
  delete options.test;
  delete options.include;
  delete options.exclude;
  if (hasOwnProperty.call(options, "sourceMap")) {
    options.sourceMaps = options.sourceMap;
    delete options.sourceMap;
  }
  return options;
}
function dedupDescriptors(items) {
  const map = new Map();
  const descriptors = [];
  for (const item of items) {
    if (typeof item.value === "function") {
      const fnKey = item.value;
      let nameMap = map.get(fnKey);
      if (!nameMap) {
        nameMap = new Map();
        map.set(fnKey, nameMap);
      }
      let desc = nameMap.get(item.name);
      if (!desc) {
        desc = {
          value: item
        };
        descriptors.push(desc);
        if (!item.ownPass) nameMap.set(item.name, desc);
      } else {
        desc.value = item;
      }
    } else {
      descriptors.push({
        value: item
      });
    }
  }
  return descriptors.reduce((acc, desc) => {
    acc.push(desc.value);
    return acc;
  }, []);
}
function configIsApplicable({
  options
}, dirname, context, configName) {
  return (options.test === undefined || configFieldIsApplicable(context, options.test, dirname, configName)) && (options.include === undefined || configFieldIsApplicable(context, options.include, dirname, configName)) && (options.exclude === undefined || !configFieldIsApplicable(context, options.exclude, dirname, configName));
}
function configFieldIsApplicable(context, test, dirname, configName) {
  const patterns = Array.isArray(test) ? test : [test];
  return matchesPatterns(context, patterns, dirname, configName);
}
function ignoreListReplacer(_key, value) {
  if (value instanceof RegExp) {
    return String(value);
  }
  return value;
}
function shouldIgnore(context, ignore, only, dirname) {
  if (ignore && matchesPatterns(context, ignore, dirname)) {
    var _context$filename;
    const message = `No config is applied to "${(_context$filename = context.filename) != null ? _context$filename : "(unknown)"}" because it matches one of \`ignore: ${JSON.stringify(ignore, ignoreListReplacer)}\` from "${dirname}"`;
|
||||||
|
debug(message);
|
||||||
|
if (context.showConfig) {
|
||||||
|
console.log(message);
|
||||||
|
}
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
if (only && !matchesPatterns(context, only, dirname)) {
|
||||||
|
var _context$filename2;
|
||||||
|
const message = `No config is applied to "${(_context$filename2 = context.filename) != null ? _context$filename2 : "(unknown)"}" because it fails to match one of \`only: ${JSON.stringify(only, ignoreListReplacer)}\` from "${dirname}"`;
|
||||||
|
debug(message);
|
||||||
|
if (context.showConfig) {
|
||||||
|
console.log(message);
|
||||||
|
}
|
||||||
|
return true;
|
||||||
|
}
|
||||||
|
return false;
|
||||||
|
}
|
||||||
|
function matchesPatterns(context, patterns, dirname, configName) {
|
||||||
|
return patterns.some(pattern => matchPattern(pattern, dirname, context.filename, context, configName));
|
||||||
|
}
|
||||||
|
function matchPattern(pattern, dirname, pathToTest, context, configName) {
|
||||||
|
if (typeof pattern === "function") {
|
||||||
|
return !!(0, _rewriteStackTrace.endHiddenCallStack)(pattern)(pathToTest, {
|
||||||
|
dirname,
|
||||||
|
envName: context.envName,
|
||||||
|
caller: context.caller
|
||||||
|
});
|
||||||
|
}
|
||||||
|
if (typeof pathToTest !== "string") {
|
||||||
|
throw new _configError.default(`Configuration contains string/RegExp pattern, but no filename was passed to Babel`, configName);
|
||||||
|
}
|
||||||
|
if (typeof pattern === "string") {
|
||||||
|
pattern = (0, _patternToRegex.default)(pattern, dirname);
|
||||||
|
}
|
||||||
|
return pattern.test(pathToTest);
|
||||||
|
}
|
||||||
|
0 && 0;
|
||||||
|
|
||||||
|
//# sourceMappingURL=config-chain.js.map
|
File diff suppressed because one or more lines are too long
@@ -0,0 +1,190 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.createCachedDescriptors = createCachedDescriptors;
exports.createDescriptor = createDescriptor;
exports.createUncachedDescriptors = createUncachedDescriptors;
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _functional = require("../gensync-utils/functional.js");
var _index = require("./files/index.js");
var _item = require("./item.js");
var _caching = require("./caching.js");
var _resolveTargets = require("./resolve-targets.js");
function isEqualDescriptor(a, b) {
  var _a$file, _b$file, _a$file2, _b$file2;
  return a.name === b.name && a.value === b.value && a.options === b.options && a.dirname === b.dirname && a.alias === b.alias && a.ownPass === b.ownPass && ((_a$file = a.file) == null ? void 0 : _a$file.request) === ((_b$file = b.file) == null ? void 0 : _b$file.request) && ((_a$file2 = a.file) == null ? void 0 : _a$file2.resolved) === ((_b$file2 = b.file) == null ? void 0 : _b$file2.resolved);
}
function* handlerOf(value) {
  return value;
}
function optionsWithResolvedBrowserslistConfigFile(options, dirname) {
  if (typeof options.browserslistConfigFile === "string") {
    options.browserslistConfigFile = (0, _resolveTargets.resolveBrowserslistConfigFile)(options.browserslistConfigFile, dirname);
  }
  return options;
}
function createCachedDescriptors(dirname, options, alias) {
  const {
    plugins,
    presets,
    passPerPreset
  } = options;
  return {
    options: optionsWithResolvedBrowserslistConfigFile(options, dirname),
    plugins: plugins ? () => createCachedPluginDescriptors(plugins, dirname)(alias) : () => handlerOf([]),
    presets: presets ? () => createCachedPresetDescriptors(presets, dirname)(alias)(!!passPerPreset) : () => handlerOf([])
  };
}
function createUncachedDescriptors(dirname, options, alias) {
  return {
    options: optionsWithResolvedBrowserslistConfigFile(options, dirname),
    plugins: (0, _functional.once)(() => createPluginDescriptors(options.plugins || [], dirname, alias)),
    presets: (0, _functional.once)(() => createPresetDescriptors(options.presets || [], dirname, alias, !!options.passPerPreset))
  };
}
const PRESET_DESCRIPTOR_CACHE = new WeakMap();
const createCachedPresetDescriptors = (0, _caching.makeWeakCacheSync)((items, cache) => {
  const dirname = cache.using(dir => dir);
  return (0, _caching.makeStrongCacheSync)(alias => (0, _caching.makeStrongCache)(function* (passPerPreset) {
    const descriptors = yield* createPresetDescriptors(items, dirname, alias, passPerPreset);
    return descriptors.map(desc => loadCachedDescriptor(PRESET_DESCRIPTOR_CACHE, desc));
  }));
});
const PLUGIN_DESCRIPTOR_CACHE = new WeakMap();
const createCachedPluginDescriptors = (0, _caching.makeWeakCacheSync)((items, cache) => {
  const dirname = cache.using(dir => dir);
  return (0, _caching.makeStrongCache)(function* (alias) {
    const descriptors = yield* createPluginDescriptors(items, dirname, alias);
    return descriptors.map(desc => loadCachedDescriptor(PLUGIN_DESCRIPTOR_CACHE, desc));
  });
});
const DEFAULT_OPTIONS = {};
function loadCachedDescriptor(cache, desc) {
  const {
    value,
    options = DEFAULT_OPTIONS
  } = desc;
  if (options === false) return desc;
  let cacheByOptions = cache.get(value);
  if (!cacheByOptions) {
    cacheByOptions = new WeakMap();
    cache.set(value, cacheByOptions);
  }
  let possibilities = cacheByOptions.get(options);
  if (!possibilities) {
    possibilities = [];
    cacheByOptions.set(options, possibilities);
  }
  if (!possibilities.includes(desc)) {
    const matches = possibilities.filter(possibility => isEqualDescriptor(possibility, desc));
    if (matches.length > 0) {
      return matches[0];
    }
    possibilities.push(desc);
  }
  return desc;
}
function* createPresetDescriptors(items, dirname, alias, passPerPreset) {
  return yield* createDescriptors("preset", items, dirname, alias, passPerPreset);
}
function* createPluginDescriptors(items, dirname, alias) {
  return yield* createDescriptors("plugin", items, dirname, alias);
}
function* createDescriptors(type, items, dirname, alias, ownPass) {
  const descriptors = yield* _gensync().all(items.map((item, index) => createDescriptor(item, dirname, {
    type,
    alias: `${alias}$${index}`,
    ownPass: !!ownPass
  })));
  assertNoDuplicates(descriptors);
  return descriptors;
}
function* createDescriptor(pair, dirname, {
  type,
  alias,
  ownPass
}) {
  const desc = (0, _item.getItemDescriptor)(pair);
  if (desc) {
    return desc;
  }
  let name;
  let options;
  let value = pair;
  if (Array.isArray(value)) {
    if (value.length === 3) {
      [value, options, name] = value;
    } else {
      [value, options] = value;
    }
  }
  let file = undefined;
  let filepath = null;
  if (typeof value === "string") {
    if (typeof type !== "string") {
      throw new Error("To resolve a string-based item, the type of item must be given");
    }
    const resolver = type === "plugin" ? _index.loadPlugin : _index.loadPreset;
    const request = value;
    ({
      filepath,
      value
    } = yield* resolver(value, dirname));
    file = {
      request,
      resolved: filepath
    };
  }
  if (!value) {
    throw new Error(`Unexpected falsy value: ${String(value)}`);
  }
  if (typeof value === "object" && value.__esModule) {
    if (value.default) {
      value = value.default;
    } else {
      throw new Error("Must export a default export when using ES6 modules.");
    }
  }
  if (typeof value !== "object" && typeof value !== "function") {
    throw new Error(`Unsupported format: ${typeof value}. Expected an object or a function.`);
  }
  if (filepath !== null && typeof value === "object" && value) {
    throw new Error(`Plugin/Preset files are not allowed to export objects, only functions. In ${filepath}`);
  }
  return {
    name,
    alias: filepath || alias,
    value,
    options,
    dirname,
    ownPass,
    file
  };
}
function assertNoDuplicates(items) {
  const map = new Map();
  for (const item of items) {
    if (typeof item.value !== "function") continue;
    let nameMap = map.get(item.value);
    if (!nameMap) {
      nameMap = new Set();
      map.set(item.value, nameMap);
    }
    if (nameMap.has(item.name)) {
      const conflicts = items.filter(i => i.value === item.value);
      throw new Error([`Duplicate plugin/preset detected.`, `If you'd like to use two separate instances of a plugin,`, `they need separate names, e.g.`, ``, `  plugins: [`, `    ['some-plugin', {}],`, `    ['some-plugin', {}, 'some unique name'],`, `  ]`, ``, `Duplicates detected are:`, `${JSON.stringify(conflicts, null, 2)}`].join("\n"));
    }
    nameMap.add(item.name);
  }
}
0 && 0;

//# sourceMappingURL=config-descriptors.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,290 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ROOT_CONFIG_FILENAMES = void 0;
exports.findConfigUpwards = findConfigUpwards;
exports.findRelativeConfig = findRelativeConfig;
exports.findRootConfig = findRootConfig;
exports.loadConfig = loadConfig;
exports.resolveShowConfigPath = resolveShowConfigPath;
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
function _fs() {
  const data = require("fs");
  _fs = function () {
    return data;
  };
  return data;
}
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _json() {
  const data = require("json5");
  _json = function () {
    return data;
  };
  return data;
}
function _gensync() {
  const data = require("gensync");
  _gensync = function () {
    return data;
  };
  return data;
}
var _caching = require("../caching.js");
var _configApi = require("../helpers/config-api.js");
var _utils = require("./utils.js");
var _moduleTypes = require("./module-types.js");
var _patternToRegex = require("../pattern-to-regex.js");
var _configError = require("../../errors/config-error.js");
var fs = require("../../gensync-utils/fs.js");
require("module");
var _rewriteStackTrace = require("../../errors/rewrite-stack-trace.js");
var _async = require("../../gensync-utils/async.js");
const debug = _debug()("babel:config:loading:files:configuration");
const ROOT_CONFIG_FILENAMES = exports.ROOT_CONFIG_FILENAMES = ["babel.config.js", "babel.config.cjs", "babel.config.mjs", "babel.config.json", "babel.config.cts"];
const RELATIVE_CONFIG_FILENAMES = [".babelrc", ".babelrc.js", ".babelrc.cjs", ".babelrc.mjs", ".babelrc.json", ".babelrc.cts"];
const BABELIGNORE_FILENAME = ".babelignore";
const runConfig = (0, _caching.makeWeakCache)(function* runConfig(options, cache) {
  yield* [];
  return {
    options: (0, _rewriteStackTrace.endHiddenCallStack)(options)((0, _configApi.makeConfigAPI)(cache)),
    cacheNeedsConfiguration: !cache.configured()
  };
});
function* readConfigCode(filepath, data) {
  if (!_fs().existsSync(filepath)) return null;
  let options = yield* (0, _moduleTypes.default)(filepath, (yield* (0, _async.isAsync)()) ? "auto" : "require", "You appear to be using a native ECMAScript module configuration " + "file, which is only supported when running Babel asynchronously " + "or when using the Node.js `--experimental-require-module` flag.", "You appear to be using a configuration file that contains top-level " + "await, which is only supported when running Babel asynchronously.");
  let cacheNeedsConfiguration = false;
  if (typeof options === "function") {
    ({
      options,
      cacheNeedsConfiguration
    } = yield* runConfig(options, data));
  }
  if (!options || typeof options !== "object" || Array.isArray(options)) {
    throw new _configError.default(`Configuration should be an exported JavaScript object.`, filepath);
  }
  if (typeof options.then === "function") {
    options.catch == null || options.catch(() => {});
    throw new _configError.default(`You appear to be using an async configuration, ` + `which your current version of Babel does not support. ` + `We may add support for this in the future, ` + `but if you're on the most recent version of @babel/core and still ` + `seeing this error, then you'll need to synchronously return your config.`, filepath);
  }
  if (cacheNeedsConfiguration) throwConfigError(filepath);
  return buildConfigFileObject(options, filepath);
}
const cfboaf = new WeakMap();
function buildConfigFileObject(options, filepath) {
  let configFilesByFilepath = cfboaf.get(options);
  if (!configFilesByFilepath) {
    cfboaf.set(options, configFilesByFilepath = new Map());
  }
  let configFile = configFilesByFilepath.get(filepath);
  if (!configFile) {
    configFile = {
      filepath,
      dirname: _path().dirname(filepath),
      options
    };
    configFilesByFilepath.set(filepath, configFile);
  }
  return configFile;
}
const packageToBabelConfig = (0, _caching.makeWeakCacheSync)(file => {
  const babel = file.options.babel;
  if (babel === undefined) return null;
  if (typeof babel !== "object" || Array.isArray(babel) || babel === null) {
    throw new _configError.default(`.babel property must be an object`, file.filepath);
  }
  return {
    filepath: file.filepath,
    dirname: file.dirname,
    options: babel
  };
});
const readConfigJSON5 = (0, _utils.makeStaticFileCache)((filepath, content) => {
  let options;
  try {
    options = _json().parse(content);
  } catch (err) {
    throw new _configError.default(`Error while parsing config - ${err.message}`, filepath);
  }
  if (!options) throw new _configError.default(`No config detected`, filepath);
  if (typeof options !== "object") {
    throw new _configError.default(`Config returned typeof ${typeof options}`, filepath);
  }
  if (Array.isArray(options)) {
    throw new _configError.default(`Expected config object but found array`, filepath);
  }
  delete options.$schema;
  return {
    filepath,
    dirname: _path().dirname(filepath),
    options
  };
});
const readIgnoreConfig = (0, _utils.makeStaticFileCache)((filepath, content) => {
  const ignoreDir = _path().dirname(filepath);
  const ignorePatterns = content.split("\n").map(line => line.replace(/#.*$/, "").trim()).filter(Boolean);
  for (const pattern of ignorePatterns) {
    if (pattern[0] === "!") {
      throw new _configError.default(`Negation of file paths is not supported.`, filepath);
    }
  }
  return {
    filepath,
    dirname: _path().dirname(filepath),
    ignore: ignorePatterns.map(pattern => (0, _patternToRegex.default)(pattern, ignoreDir))
  };
});
function findConfigUpwards(rootDir) {
  let dirname = rootDir;
  for (;;) {
    for (const filename of ROOT_CONFIG_FILENAMES) {
      if (_fs().existsSync(_path().join(dirname, filename))) {
        return dirname;
      }
    }
    const nextDir = _path().dirname(dirname);
    if (dirname === nextDir) break;
    dirname = nextDir;
  }
  return null;
}
function* findRelativeConfig(packageData, envName, caller) {
  let config = null;
  let ignore = null;
  const dirname = _path().dirname(packageData.filepath);
  for (const loc of packageData.directories) {
    if (!config) {
      var _packageData$pkg;
      config = yield* loadOneConfig(RELATIVE_CONFIG_FILENAMES, loc, envName, caller, ((_packageData$pkg = packageData.pkg) == null ? void 0 : _packageData$pkg.dirname) === loc ? packageToBabelConfig(packageData.pkg) : null);
    }
    if (!ignore) {
      const ignoreLoc = _path().join(loc, BABELIGNORE_FILENAME);
      ignore = yield* readIgnoreConfig(ignoreLoc);
      if (ignore) {
        debug("Found ignore %o from %o.", ignore.filepath, dirname);
      }
    }
  }
  return {
    config,
    ignore
  };
}
function findRootConfig(dirname, envName, caller) {
  return loadOneConfig(ROOT_CONFIG_FILENAMES, dirname, envName, caller);
}
function* loadOneConfig(names, dirname, envName, caller, previousConfig = null) {
  const configs = yield* _gensync().all(names.map(filename => readConfig(_path().join(dirname, filename), envName, caller)));
  const config = configs.reduce((previousConfig, config) => {
    if (config && previousConfig) {
      throw new _configError.default(`Multiple configuration files found. Please remove one:\n` + ` - ${_path().basename(previousConfig.filepath)}\n` + ` - ${config.filepath}\n` + `from ${dirname}`);
    }
    return config || previousConfig;
  }, previousConfig);
  if (config) {
    debug("Found configuration %o from %o.", config.filepath, dirname);
  }
  return config;
}
function* loadConfig(name, dirname, envName, caller) {
  const filepath = (((v, w) => (v = v.split("."), w = w.split("."), +v[0] > +w[0] || v[0] == w[0] && +v[1] >= +w[1]))(process.versions.node, "8.9") ? require.resolve : (r, {
    paths: [b]
  }, M = require("module")) => {
    let f = M._findPath(r, M._nodeModulePaths(b).concat(b));
    if (f) return f;
    f = new Error(`Cannot resolve module '${r}'`);
    f.code = "MODULE_NOT_FOUND";
    throw f;
  })(name, {
    paths: [dirname]
  });
  const conf = yield* readConfig(filepath, envName, caller);
  if (!conf) {
    throw new _configError.default(`Config file contains no configuration data`, filepath);
  }
  debug("Loaded config %o from %o.", name, dirname);
  return conf;
}
function readConfig(filepath, envName, caller) {
  const ext = _path().extname(filepath);
  switch (ext) {
    case ".js":
    case ".cjs":
    case ".mjs":
    case ".ts":
    case ".cts":
    case ".mts":
      return readConfigCode(filepath, {
        envName,
        caller
      });
    default:
      return readConfigJSON5(filepath);
  }
}
function* resolveShowConfigPath(dirname) {
  const targetPath = process.env.BABEL_SHOW_CONFIG_FOR;
  if (targetPath != null) {
    const absolutePath = _path().resolve(dirname, targetPath);
    const stats = yield* fs.stat(absolutePath);
    if (!stats.isFile()) {
      throw new Error(`${absolutePath}: BABEL_SHOW_CONFIG_FOR must refer to a regular file, directories are not supported.`);
    }
    return absolutePath;
  }
  return null;
}
function throwConfigError(filepath) {
  throw new _configError.default(`\
Caching was left unconfigured. Babel's plugins, presets, and .babelrc.js files can be configured
for various types of caching, using the first param of their handler functions:

module.exports = function(api) {
  // The API exposes the following:

  // Cache the returned value forever and don't call this function again.
  api.cache(true);

  // Don't cache at all. Not recommended because it will be very slow.
  api.cache(false);

  // Cached based on the value of some function. If this function returns a value different from
  // a previously-encountered value, the plugins will re-evaluate.
  var env = api.cache(() => process.env.NODE_ENV);

  // If testing for a specific env, we recommend specifics to avoid instantiating a plugin for
  // any possible NODE_ENV value that might come up during plugin execution.
  var isProd = api.cache(() => process.env.NODE_ENV === "production");

  // .cache(fn) will perform a linear search though instances to find the matching plugin based
  // based on previous instantiated plugins. If you want to recreate the plugin and discard the
  // previous instance whenever something changes, you may use:
  var isProd = api.cache.invalidate(() => process.env.NODE_ENV === "production");

  // Note, we also expose the following more-verbose versions of the above examples:
  api.cache.forever(); // api.cache(true)
  api.cache.never(); // api.cache(false)
  api.cache.using(fn); // api.cache(fn)

  // Return the value that will be cached.
  return { };
};`, filepath);
}
0 && 0;

//# sourceMappingURL=configuration.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,6 @@
module.exports = function import_(filepath) {
  return import(filepath);
};
0 && 0;

//# sourceMappingURL=import.cjs.map
@@ -0,0 +1 @@
{"version":3,"names":["module","exports","import_","filepath"],"sources":["../../../src/config/files/import.cjs"],"sourcesContent":["// We keep this in a separate file so that in older node versions, where\n// import() isn't supported, we can try/catch around the require() call\n// when loading this file.\n\nmodule.exports = function import_(filepath) {\n return import(filepath);\n};\n"],"mappings":"AAIAA,MAAM,CAACC,OAAO,GAAG,SAASC,OAAOA,CAACC,QAAQ,EAAE;EAC1C,OAAO,OAAOA,QAAQ,CAAC;AACzB,CAAC;AAAC","ignoreList":[]}
@@ -0,0 +1,58 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.ROOT_CONFIG_FILENAMES = void 0;
exports.findConfigUpwards = findConfigUpwards;
exports.findPackageData = findPackageData;
exports.findRelativeConfig = findRelativeConfig;
exports.findRootConfig = findRootConfig;
exports.loadConfig = loadConfig;
exports.loadPlugin = loadPlugin;
exports.loadPreset = loadPreset;
exports.resolvePlugin = resolvePlugin;
exports.resolvePreset = resolvePreset;
exports.resolveShowConfigPath = resolveShowConfigPath;
function findConfigUpwards(rootDir) {
  return null;
}
function* findPackageData(filepath) {
  return {
    filepath,
    directories: [],
    pkg: null,
    isPackage: false
  };
}
function* findRelativeConfig(pkgData, envName, caller) {
  return {
    config: null,
    ignore: null
  };
}
function* findRootConfig(dirname, envName, caller) {
  return null;
}
function* loadConfig(name, dirname, envName, caller) {
  throw new Error(`Cannot load ${name} relative to ${dirname} in a browser`);
}
function* resolveShowConfigPath(dirname) {
  return null;
}
const ROOT_CONFIG_FILENAMES = exports.ROOT_CONFIG_FILENAMES = [];
function resolvePlugin(name, dirname) {
  return null;
}
function resolvePreset(name, dirname) {
  return null;
}
function loadPlugin(name, dirname) {
  throw new Error(`Cannot load plugin ${name} relative to ${dirname} in a browser`);
}
function loadPreset(name, dirname) {
  throw new Error(`Cannot load preset ${name} relative to ${dirname} in a browser`);
}
0 && 0;

//# sourceMappingURL=index-browser.js.map
@ -0,0 +1 @@
|
||||||
|
{"version":3,"names":["findConfigUpwards","rootDir","findPackageData","filepath","directories","pkg","isPackage","findRelativeConfig","pkgData","envName","caller","config","ignore","findRootConfig","dirname","loadConfig","name","Error","resolveShowConfigPath","ROOT_CONFIG_FILENAMES","exports","resolvePlugin","resolvePreset","loadPlugin","loadPreset"],"sources":["../../../src/config/files/index-browser.ts"],"sourcesContent":["/* c8 ignore start */\n\nimport type { Handler } from \"gensync\";\n\nimport type {\n ConfigFile,\n IgnoreFile,\n RelativeConfig,\n FilePackageData,\n} from \"./types.ts\";\n\nimport type { CallerMetadata } from \"../validation/options.ts\";\n\nexport type { ConfigFile, IgnoreFile, RelativeConfig, FilePackageData };\n\nexport function findConfigUpwards(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n rootDir: string,\n): string | null {\n return null;\n}\n\n// eslint-disable-next-line require-yield\nexport function* findPackageData(filepath: string): Handler<FilePackageData> {\n return {\n filepath,\n directories: [],\n pkg: null,\n isPackage: false,\n };\n}\n\n// eslint-disable-next-line require-yield\nexport function* findRelativeConfig(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n pkgData: FilePackageData,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<RelativeConfig> {\n return { config: null, ignore: null };\n}\n\n// eslint-disable-next-line require-yield\nexport function* findRootConfig(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n dirname: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<ConfigFile | null> {\n return null;\n}\n\n// eslint-disable-next-line require-yield\nexport 
function* loadConfig(\n name: string,\n dirname: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n envName: string,\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n caller: CallerMetadata | undefined,\n): Handler<ConfigFile> {\n throw new Error(`Cannot load ${name} relative to ${dirname} in a browser`);\n}\n\n// eslint-disable-next-line require-yield\nexport function* resolveShowConfigPath(\n // eslint-disable-next-line @typescript-eslint/no-unused-vars\n dirname: string,\n): Handler<string | null> {\n return null;\n}\n\nexport const ROOT_CONFIG_FILENAMES: string[] = [];\n\ntype Resolved =\n | { loader: \"require\"; filepath: string }\n | { loader: \"import\"; filepath: string };\n\n// eslint-disable-next-line @typescript-eslint/no-unused-vars\nexport function resolvePlugin(name: string, dirname: string): Resolved | null {\n return null;\n}\n\n// eslint-disable-next-line @typescript-eslint/no-unused-vars\nexport function resolvePreset(name: string, dirname: string): Resolved | null {\n return null;\n}\n\nexport function loadPlugin(\n name: string,\n dirname: string,\n): Handler<{\n filepath: string;\n value: unknown;\n}> {\n throw new Error(\n `Cannot load plugin ${name} relative to ${dirname} in a browser`,\n );\n}\n\nexport function loadPreset(\n name: string,\n dirname: string,\n): Handler<{\n filepath: string;\n value: unknown;\n}> {\n throw new Error(\n `Cannot load preset ${name} relative to ${dirname} in a browser`,\n 
);\n}\n"],"mappings":";;;;;;;;;;;;;;;;AAeO,SAASA,iBAAiBA,CAE/BC,OAAe,EACA;EACf,OAAO,IAAI;AACb;AAGO,UAAUC,eAAeA,CAACC,QAAgB,EAA4B;EAC3E,OAAO;IACLA,QAAQ;IACRC,WAAW,EAAE,EAAE;IACfC,GAAG,EAAE,IAAI;IACTC,SAAS,EAAE;EACb,CAAC;AACH;AAGO,UAAUC,kBAAkBA,CAEjCC,OAAwB,EAExBC,OAAe,EAEfC,MAAkC,EACT;EACzB,OAAO;IAAEC,MAAM,EAAE,IAAI;IAAEC,MAAM,EAAE;EAAK,CAAC;AACvC;AAGO,UAAUC,cAAcA,CAE7BC,OAAe,EAEfL,OAAe,EAEfC,MAAkC,EACN;EAC5B,OAAO,IAAI;AACb;AAGO,UAAUK,UAAUA,CACzBC,IAAY,EACZF,OAAe,EAEfL,OAAe,EAEfC,MAAkC,EACb;EACrB,MAAM,IAAIO,KAAK,CAAC,eAAeD,IAAI,gBAAgBF,OAAO,eAAe,CAAC;AAC5E;AAGO,UAAUI,qBAAqBA,CAEpCJ,OAAe,EACS;EACxB,OAAO,IAAI;AACb;AAEO,MAAMK,qBAA+B,GAAAC,OAAA,CAAAD,qBAAA,GAAG,EAAE;AAO1C,SAASE,aAAaA,CAACL,IAAY,EAAEF,OAAe,EAAmB;EAC5E,OAAO,IAAI;AACb;AAGO,SAASQ,aAAaA,CAACN,IAAY,EAAEF,OAAe,EAAmB;EAC5E,OAAO,IAAI;AACb;AAEO,SAASS,UAAUA,CACxBP,IAAY,EACZF,OAAe,EAId;EACD,MAAM,IAAIG,KAAK,CACb,sBAAsBD,IAAI,gBAAgBF,OAAO,eACnD,CAAC;AACH;AAEO,SAASU,UAAUA,CACxBR,IAAY,EACZF,OAAe,EAId;EACD,MAAM,IAAIG,KAAK,CACb,sBAAsBD,IAAI,gBAAgBF,OAAO,eACnD,CAAC;AACH;AAAC","ignoreList":[]}
@@ -0,0 +1,78 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
Object.defineProperty(exports, "ROOT_CONFIG_FILENAMES", {
  enumerable: true,
  get: function () {
    return _configuration.ROOT_CONFIG_FILENAMES;
  }
});
Object.defineProperty(exports, "findConfigUpwards", {
  enumerable: true,
  get: function () {
    return _configuration.findConfigUpwards;
  }
});
Object.defineProperty(exports, "findPackageData", {
  enumerable: true,
  get: function () {
    return _package.findPackageData;
  }
});
Object.defineProperty(exports, "findRelativeConfig", {
  enumerable: true,
  get: function () {
    return _configuration.findRelativeConfig;
  }
});
Object.defineProperty(exports, "findRootConfig", {
  enumerable: true,
  get: function () {
    return _configuration.findRootConfig;
  }
});
Object.defineProperty(exports, "loadConfig", {
  enumerable: true,
  get: function () {
    return _configuration.loadConfig;
  }
});
Object.defineProperty(exports, "loadPlugin", {
  enumerable: true,
  get: function () {
    return _plugins.loadPlugin;
  }
});
Object.defineProperty(exports, "loadPreset", {
  enumerable: true,
  get: function () {
    return _plugins.loadPreset;
  }
});
Object.defineProperty(exports, "resolvePlugin", {
  enumerable: true,
  get: function () {
    return _plugins.resolvePlugin;
  }
});
Object.defineProperty(exports, "resolvePreset", {
  enumerable: true,
  get: function () {
    return _plugins.resolvePreset;
  }
});
Object.defineProperty(exports, "resolveShowConfigPath", {
  enumerable: true,
  get: function () {
    return _configuration.resolveShowConfigPath;
  }
});
var _package = require("./package.js");
var _configuration = require("./configuration.js");
var _plugins = require("./plugins.js");
({});
0 && 0;

//# sourceMappingURL=index.js.map
@@ -0,0 +1 @@
{"version":3,"names":["_package","require","_configuration","_plugins"],"sources":["../../../src/config/files/index.ts"],"sourcesContent":["type indexBrowserType = typeof import(\"./index-browser\");\ntype indexType = typeof import(\"./index\");\n\n// Kind of gross, but essentially asserting that the exports of this module are the same as the\n// exports of index-browser, since this file may be replaced at bundle time with index-browser.\n({}) as any as indexBrowserType as indexType;\n\nexport { findPackageData } from \"./package.ts\";\n\nexport {\n findConfigUpwards,\n findRelativeConfig,\n findRootConfig,\n loadConfig,\n resolveShowConfigPath,\n ROOT_CONFIG_FILENAMES,\n} from \"./configuration.ts\";\nexport type {\n ConfigFile,\n IgnoreFile,\n RelativeConfig,\n FilePackageData,\n} from \"./types.ts\";\nexport {\n loadPlugin,\n loadPreset,\n resolvePlugin,\n resolvePreset,\n} from \"./plugins.ts\";\n"],"mappings":";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAOA,IAAAA,QAAA,GAAAC,OAAA;AAEA,IAAAC,cAAA,GAAAD,OAAA;AAcA,IAAAE,QAAA,GAAAF,OAAA;AAlBA,CAAC,CAAC,CAAC;AAA0C","ignoreList":[]}
@@ -0,0 +1,206 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.default = loadCodeDefault;
exports.supportsESM = void 0;
var _async = require("../../gensync-utils/async.js");
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
function _url() {
  const data = require("url");
  _url = function () {
    return data;
  };
  return data;
}
require("module");
function _semver() {
  const data = require("semver");
  _semver = function () {
    return data;
  };
  return data;
}
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
var _rewriteStackTrace = require("../../errors/rewrite-stack-trace.js");
var _configError = require("../../errors/config-error.js");
var _transformFile = require("../../transform-file.js");
function asyncGeneratorStep(n, t, e, r, o, a, c) { try { var i = n[a](c), u = i.value; } catch (n) { return void e(n); } i.done ? t(u) : Promise.resolve(u).then(r, o); }
function _asyncToGenerator(n) { return function () { var t = this, e = arguments; return new Promise(function (r, o) { var a = n.apply(t, e); function _next(n) { asyncGeneratorStep(a, r, o, _next, _throw, "next", n); } function _throw(n) { asyncGeneratorStep(a, r, o, _next, _throw, "throw", n); } _next(void 0); }); }; }
const debug = _debug()("babel:config:loading:files:module-types");
{
  try {
    var import_ = require("./import.cjs");
  } catch (_unused) {}
}
const supportsESM = exports.supportsESM = _semver().satisfies(process.versions.node, "^12.17 || >=13.2");
const LOADING_CJS_FILES = new Set();
function loadCjsDefault(filepath) {
  if (LOADING_CJS_FILES.has(filepath)) {
    debug("Auto-ignoring usage of config %o.", filepath);
    return {};
  }
  let module;
  try {
    LOADING_CJS_FILES.add(filepath);
    module = (0, _rewriteStackTrace.endHiddenCallStack)(require)(filepath);
  } finally {
    LOADING_CJS_FILES.delete(filepath);
  }
  {
    return module != null && (module.__esModule || module[Symbol.toStringTag] === "Module") ? module.default || (arguments[1] ? module : undefined) : module;
  }
}
const loadMjsFromPath = (0, _rewriteStackTrace.endHiddenCallStack)(function () {
  var _loadMjsFromPath = _asyncToGenerator(function* (filepath) {
    const url = (0, _url().pathToFileURL)(filepath).toString() + "?import";
    {
      if (!import_) {
        throw new _configError.default("Internal error: Native ECMAScript modules aren't supported by this platform.\n", filepath);
      }
      return yield import_(url);
    }
  });
  function loadMjsFromPath(_x) {
    return _loadMjsFromPath.apply(this, arguments);
  }
  return loadMjsFromPath;
}());
const SUPPORTED_EXTENSIONS = {
  ".js": "unknown",
  ".mjs": "esm",
  ".cjs": "cjs",
  ".ts": "unknown",
  ".mts": "esm",
  ".cts": "cjs"
};
const asyncModules = new Set();
function* loadCodeDefault(filepath, loader, esmError, tlaError) {
  let async;
  const ext = _path().extname(filepath);
  const isTS = ext === ".ts" || ext === ".cts" || ext === ".mts";
  const type = SUPPORTED_EXTENSIONS[hasOwnProperty.call(SUPPORTED_EXTENSIONS, ext) ? ext : ".js"];
  const pattern = `${loader} ${type}`;
  switch (pattern) {
    case "require cjs":
    case "auto cjs":
      if (isTS) {
        return ensureTsSupport(filepath, ext, () => loadCjsDefault(filepath));
      } else {
        return loadCjsDefault(filepath, arguments[2]);
      }
    case "auto unknown":
    case "require unknown":
    case "require esm":
      try {
        if (isTS) {
          return ensureTsSupport(filepath, ext, () => loadCjsDefault(filepath));
        } else {
          return loadCjsDefault(filepath, arguments[2]);
        }
      } catch (e) {
        if (e.code === "ERR_REQUIRE_ASYNC_MODULE" || e.code === "ERR_REQUIRE_CYCLE_MODULE" && asyncModules.has(filepath)) {
          asyncModules.add(filepath);
          if (!(async != null ? async : async = yield* (0, _async.isAsync)())) {
            throw new _configError.default(tlaError, filepath);
          }
        } else if (e.code === "ERR_REQUIRE_ESM" || type === "esm") {} else {
          throw e;
        }
      }
    case "auto esm":
      if (async != null ? async : async = yield* (0, _async.isAsync)()) {
        const promise = isTS ? ensureTsSupport(filepath, ext, () => loadMjsFromPath(filepath)) : loadMjsFromPath(filepath);
        return (yield* (0, _async.waitFor)(promise)).default;
      }
      throw new _configError.default(esmError, filepath);
    default:
      throw new Error("Internal Babel error: unreachable code.");
  }
}
function ensureTsSupport(filepath, ext, callback) {
  if (require.extensions[".ts"] || require.extensions[".cts"] || require.extensions[".mts"]) {
    return callback();
  }
  if (ext !== ".cts") {
    throw new _configError.default(`\
You are using a ${ext} config file, but Babel only supports transpiling .cts configs. Either:
- Use a .cts config file
- Update to Node.js 23.6.0, which has native TypeScript support
- Install ts-node to transpile ${ext} files on the fly\
`, filepath);
  }
  const opts = {
    babelrc: false,
    configFile: false,
    sourceType: "unambiguous",
    sourceMaps: "inline",
    sourceFileName: _path().basename(filepath),
    presets: [[getTSPreset(filepath), Object.assign({
      onlyRemoveTypeImports: true,
      optimizeConstEnums: true
    }, {
      allowDeclareFields: true
    })]]
  };
  let handler = function (m, filename) {
    if (handler && filename.endsWith(".cts")) {
      try {
        return m._compile((0, _transformFile.transformFileSync)(filename, Object.assign({}, opts, {
          filename
        })).code, filename);
      } catch (error) {
        const packageJson = require("@babel/preset-typescript/package.json");
        if (_semver().lt(packageJson.version, "7.21.4")) {
          console.error("`.cts` configuration file failed to load, please try to update `@babel/preset-typescript`.");
        }
        throw error;
      }
    }
    return require.extensions[".js"](m, filename);
  };
  require.extensions[ext] = handler;
  try {
    return callback();
  } finally {
    if (require.extensions[ext] === handler) delete require.extensions[ext];
    handler = undefined;
  }
}
function getTSPreset(filepath) {
  try {
    return require("@babel/preset-typescript");
  } catch (error) {
    if (error.code !== "MODULE_NOT_FOUND") throw error;
    let message = "You appear to be using a .cts file as Babel configuration, but the `@babel/preset-typescript` package was not found: please install it!";
    {
      if (process.versions.pnp) {
        message += `
If you are using Yarn Plug'n'Play, you may also need to add the following configuration to your .yarnrc.yml file:

packageExtensions:
\t"@babel/core@*":
\t\tpeerDependencies:
\t\t\t"@babel/preset-typescript": "*"
`;
      }
    }
    throw new _configError.default(message, filepath);
  }
}
0 && 0;

//# sourceMappingURL=module-types.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,61 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.findPackageData = findPackageData;
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _utils = require("./utils.js");
var _configError = require("../../errors/config-error.js");
const PACKAGE_FILENAME = "package.json";
const readConfigPackage = (0, _utils.makeStaticFileCache)((filepath, content) => {
  let options;
  try {
    options = JSON.parse(content);
  } catch (err) {
    throw new _configError.default(`Error while parsing JSON - ${err.message}`, filepath);
  }
  if (!options) throw new Error(`${filepath}: No config detected`);
  if (typeof options !== "object") {
    throw new _configError.default(`Config returned typeof ${typeof options}`, filepath);
  }
  if (Array.isArray(options)) {
    throw new _configError.default(`Expected config object but found array`, filepath);
  }
  return {
    filepath,
    dirname: _path().dirname(filepath),
    options
  };
});
function* findPackageData(filepath) {
  let pkg = null;
  const directories = [];
  let isPackage = true;
  let dirname = _path().dirname(filepath);
  while (!pkg && _path().basename(dirname) !== "node_modules") {
    directories.push(dirname);
    pkg = yield* readConfigPackage(_path().join(dirname, PACKAGE_FILENAME));
    const nextLoc = _path().dirname(dirname);
    if (dirname === nextLoc) {
      isPackage = false;
      break;
    }
    dirname = nextLoc;
  }
  return {
    filepath,
    directories,
    pkg,
    isPackage
  };
}
0 && 0;

//# sourceMappingURL=package.js.map
@@ -0,0 +1 @@
{"version":3,"names":["_path","data","require","_utils","_configError","PACKAGE_FILENAME","readConfigPackage","makeStaticFileCache","filepath","content","options","JSON","parse","err","ConfigError","message","Error","Array","isArray","dirname","path","findPackageData","pkg","directories","isPackage","basename","push","join","nextLoc"],"sources":["../../../src/config/files/package.ts"],"sourcesContent":["import path from \"node:path\";\nimport type { Handler } from \"gensync\";\nimport { makeStaticFileCache } from \"./utils.ts\";\n\nimport type { ConfigFile, FilePackageData } from \"./types.ts\";\n\nimport ConfigError from \"../../errors/config-error.ts\";\n\nconst PACKAGE_FILENAME = \"package.json\";\n\nconst readConfigPackage = makeStaticFileCache(\n (filepath, content): ConfigFile => {\n let options;\n try {\n options = JSON.parse(content) as unknown;\n } catch (err) {\n throw new ConfigError(\n `Error while parsing JSON - ${err.message}`,\n filepath,\n );\n }\n\n if (!options) throw new Error(`${filepath}: No config detected`);\n\n if (typeof options !== \"object\") {\n throw new ConfigError(\n `Config returned typeof ${typeof options}`,\n filepath,\n );\n }\n if (Array.isArray(options)) {\n throw new ConfigError(`Expected config object but found array`, filepath);\n }\n\n return {\n filepath,\n dirname: path.dirname(filepath),\n options,\n };\n },\n);\n\n/**\n * Find metadata about the package that this file is inside of. 
Resolution\n * of Babel's config requires general package information to decide when to\n * search for .babelrc files\n */\nexport function* findPackageData(filepath: string): Handler<FilePackageData> {\n let pkg = null;\n const directories = [];\n let isPackage = true;\n\n let dirname = path.dirname(filepath);\n while (!pkg && path.basename(dirname) !== \"node_modules\") {\n directories.push(dirname);\n\n pkg = yield* readConfigPackage(path.join(dirname, PACKAGE_FILENAME));\n\n const nextLoc = path.dirname(dirname);\n if (dirname === nextLoc) {\n isPackage = false;\n break;\n }\n dirname = nextLoc;\n }\n\n return { filepath, directories, pkg, isPackage };\n}\n"],"mappings":";;;;;;AAAA,SAAAA,MAAA;EAAA,MAAAC,IAAA,GAAAC,OAAA;EAAAF,KAAA,YAAAA,CAAA;IAAA,OAAAC,IAAA;EAAA;EAAA,OAAAA,IAAA;AAAA;AAEA,IAAAE,MAAA,GAAAD,OAAA;AAIA,IAAAE,YAAA,GAAAF,OAAA;AAEA,MAAMG,gBAAgB,GAAG,cAAc;AAEvC,MAAMC,iBAAiB,GAAG,IAAAC,0BAAmB,EAC3C,CAACC,QAAQ,EAAEC,OAAO,KAAiB;EACjC,IAAIC,OAAO;EACX,IAAI;IACFA,OAAO,GAAGC,IAAI,CAACC,KAAK,CAACH,OAAO,CAAY;EAC1C,CAAC,CAAC,OAAOI,GAAG,EAAE;IACZ,MAAM,IAAIC,oBAAW,CACnB,8BAA8BD,GAAG,CAACE,OAAO,EAAE,EAC3CP,QACF,CAAC;EACH;EAEA,IAAI,CAACE,OAAO,EAAE,MAAM,IAAIM,KAAK,CAAC,GAAGR,QAAQ,sBAAsB,CAAC;EAEhE,IAAI,OAAOE,OAAO,KAAK,QAAQ,EAAE;IAC/B,MAAM,IAAII,oBAAW,CACnB,0BAA0B,OAAOJ,OAAO,EAAE,EAC1CF,QACF,CAAC;EACH;EACA,IAAIS,KAAK,CAACC,OAAO,CAACR,OAAO,CAAC,EAAE;IAC1B,MAAM,IAAII,oBAAW,CAAC,wCAAwC,EAAEN,QAAQ,CAAC;EAC3E;EAEA,OAAO;IACLA,QAAQ;IACRW,OAAO,EAAEC,MAAGA,CAAC,CAACD,OAAO,CAACX,QAAQ,CAAC;IAC/BE;EACF,CAAC;AACH,CACF,CAAC;AAOM,UAAUW,eAAeA,CAACb,QAAgB,EAA4B;EAC3E,IAAIc,GAAG,GAAG,IAAI;EACd,MAAMC,WAAW,GAAG,EAAE;EACtB,IAAIC,SAAS,GAAG,IAAI;EAEpB,IAAIL,OAAO,GAAGC,MAAGA,CAAC,CAACD,OAAO,CAACX,QAAQ,CAAC;EACpC,OAAO,CAACc,GAAG,IAAIF,MAAGA,CAAC,CAACK,QAAQ,CAACN,OAAO,CAAC,KAAK,cAAc,EAAE;IACxDI,WAAW,CAACG,IAAI,CAACP,OAAO,CAAC;IAEzBG,GAAG,GAAG,OAAOhB,iBAAiB,CAACc,MAAGA,CAAC,CAACO,IAAI,CAACR,OAAO,EAAEd,gBAAgB,CAAC,CAAC;IAEpE,MAAMuB,OAAO,GAAGR,MAAGA,CAAC,CAACD,OAAO,CAACA,OAAO,CAAC;IACrC,IAAIA,OAAO,K
AAKS,OAAO,EAAE;MACvBJ,SAAS,GAAG,KAAK;MACjB;IACF;IACAL,OAAO,GAAGS,OAAO;EACnB;EAEA,OAAO;IAAEpB,QAAQ;IAAEe,WAAW;IAAED,GAAG;IAAEE;EAAU,CAAC;AAClD;AAAC","ignoreList":[]}
@@ -0,0 +1,230 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.loadPlugin = loadPlugin;
exports.loadPreset = loadPreset;
exports.resolvePreset = exports.resolvePlugin = void 0;
function _debug() {
  const data = require("debug");
  _debug = function () {
    return data;
  };
  return data;
}
function _path() {
  const data = require("path");
  _path = function () {
    return data;
  };
  return data;
}
var _async = require("../../gensync-utils/async.js");
var _moduleTypes = require("./module-types.js");
function _url() {
  const data = require("url");
  _url = function () {
    return data;
  };
  return data;
}
var _importMetaResolve = require("../../vendor/import-meta-resolve.js");
require("module");
function _fs() {
  const data = require("fs");
  _fs = function () {
    return data;
  };
  return data;
}
const debug = _debug()("babel:config:loading:files:plugins");
const EXACT_RE = /^module:/;
const BABEL_PLUGIN_PREFIX_RE = /^(?!@|module:|[^/]+\/|babel-plugin-)/;
const BABEL_PRESET_PREFIX_RE = /^(?!@|module:|[^/]+\/|babel-preset-)/;
const BABEL_PLUGIN_ORG_RE = /^(@babel\/)(?!plugin-|[^/]+\/)/;
const BABEL_PRESET_ORG_RE = /^(@babel\/)(?!preset-|[^/]+\/)/;
const OTHER_PLUGIN_ORG_RE = /^(@(?!babel\/)[^/]+\/)(?![^/]*babel-plugin(?:-|\/|$)|[^/]+\/)/;
const OTHER_PRESET_ORG_RE = /^(@(?!babel\/)[^/]+\/)(?![^/]*babel-preset(?:-|\/|$)|[^/]+\/)/;
const OTHER_ORG_DEFAULT_RE = /^(@(?!babel$)[^/]+)$/;
const resolvePlugin = exports.resolvePlugin = resolveStandardizedName.bind(null, "plugin");
const resolvePreset = exports.resolvePreset = resolveStandardizedName.bind(null, "preset");
function* loadPlugin(name, dirname) {
  const {
    filepath,
    loader
  } = resolvePlugin(name, dirname, yield* (0, _async.isAsync)());
  const value = yield* requireModule("plugin", loader, filepath);
  debug("Loaded plugin %o from %o.", name, dirname);
  return {
    filepath,
    value
  };
}
function* loadPreset(name, dirname) {
  const {
    filepath,
    loader
  } = resolvePreset(name, dirname, yield* (0, _async.isAsync)());
  const value = yield* requireModule("preset", loader, filepath);
  debug("Loaded preset %o from %o.", name, dirname);
  return {
    filepath,
    value
  };
}
function standardizeName(type, name) {
  if (_path().isAbsolute(name)) return name;
  const isPreset = type === "preset";
  return name.replace(isPreset ? BABEL_PRESET_PREFIX_RE : BABEL_PLUGIN_PREFIX_RE, `babel-${type}-`).replace(isPreset ? BABEL_PRESET_ORG_RE : BABEL_PLUGIN_ORG_RE, `$1${type}-`).replace(isPreset ? OTHER_PRESET_ORG_RE : OTHER_PLUGIN_ORG_RE, `$1babel-${type}-`).replace(OTHER_ORG_DEFAULT_RE, `$1/babel-${type}`).replace(EXACT_RE, "");
}
function* resolveAlternativesHelper(type, name) {
  const standardizedName = standardizeName(type, name);
  const {
    error,
    value
  } = yield standardizedName;
  if (!error) return value;
  if (error.code !== "MODULE_NOT_FOUND") throw error;
  if (standardizedName !== name && !(yield name).error) {
    error.message += `\n- If you want to resolve "${name}", use "module:${name}"`;
  }
  if (!(yield standardizeName(type, "@babel/" + name)).error) {
    error.message += `\n- Did you mean "@babel/${name}"?`;
  }
  const oppositeType = type === "preset" ? "plugin" : "preset";
  if (!(yield standardizeName(oppositeType, name)).error) {
    error.message += `\n- Did you accidentally pass a ${oppositeType} as a ${type}?`;
  }
  if (type === "plugin") {
    const transformName = standardizedName.replace("-proposal-", "-transform-");
    if (transformName !== standardizedName && !(yield transformName).error) {
      error.message += `\n- Did you mean "${transformName}"?`;
    }
  }
  error.message += `\n
Make sure that all the Babel plugins and presets you are using
are defined as dependencies or devDependencies in your package.json
file. It's possible that the missing plugin is loaded by a preset
you are using that forgot to add the plugin to its dependencies: you
can workaround this problem by explicitly adding the missing package
to your top-level package.json.
`;
  throw error;
}
function tryRequireResolve(id, dirname) {
  try {
    if (dirname) {
      return {
        error: null,
        value: (((v, w) => (v = v.split("."), w = w.split("."), +v[0] > +w[0] || v[0] == w[0] && +v[1] >= +w[1]))(process.versions.node, "8.9") ? require.resolve : (r, {
          paths: [b]
        }, M = require("module")) => {
          let f = M._findPath(r, M._nodeModulePaths(b).concat(b));
          if (f) return f;
          f = new Error(`Cannot resolve module '${r}'`);
          f.code = "MODULE_NOT_FOUND";
          throw f;
        })(id, {
          paths: [dirname]
        })
      };
    } else {
      return {
        error: null,
        value: require.resolve(id)
      };
    }
  } catch (error) {
    return {
      error,
      value: null
    };
  }
}
function tryImportMetaResolve(id, options) {
  try {
    return {
      error: null,
      value: (0, _importMetaResolve.resolve)(id, options)
    };
  } catch (error) {
    return {
      error,
      value: null
    };
  }
}
function resolveStandardizedNameForRequire(type, name, dirname) {
  const it = resolveAlternativesHelper(type, name);
  let res = it.next();
  while (!res.done) {
    res = it.next(tryRequireResolve(res.value, dirname));
  }
  return {
    loader: "require",
    filepath: res.value
  };
}
function resolveStandardizedNameForImport(type, name, dirname) {
  const parentUrl = (0, _url().pathToFileURL)(_path().join(dirname, "./babel-virtual-resolve-base.js")).href;
  const it = resolveAlternativesHelper(type, name);
  let res = it.next();
  while (!res.done) {
    res = it.next(tryImportMetaResolve(res.value, parentUrl));
  }
  return {
    loader: "auto",
    filepath: (0, _url().fileURLToPath)(res.value)
  };
}
function resolveStandardizedName(type, name, dirname, allowAsync) {
  if (!_moduleTypes.supportsESM || !allowAsync) {
    return resolveStandardizedNameForRequire(type, name, dirname);
  }
  try {
    const resolved = resolveStandardizedNameForImport(type, name, dirname);
    if (!(0, _fs().existsSync)(resolved.filepath)) {
      throw Object.assign(new Error(`Could not resolve "${name}" in file ${dirname}.`), {
        type: "MODULE_NOT_FOUND"
      });
    }
    return resolved;
  } catch (e) {
    try {
      return resolveStandardizedNameForRequire(type, name, dirname);
    } catch (e2) {
      if (e.type === "MODULE_NOT_FOUND") throw e;
      if (e2.type === "MODULE_NOT_FOUND") throw e2;
      throw e;
    }
  }
}
{
  var LOADING_MODULES = new Set();
}
function* requireModule(type, loader, name) {
  {
    if (!(yield* (0, _async.isAsync)()) && LOADING_MODULES.has(name)) {
      throw new Error(`Reentrant ${type} detected trying to load "${name}". This module is not ignored ` + "and is trying to load itself while compiling itself, leading to a dependency cycle. " + 'We recommend adding it to your "ignore" list in your babelrc, or to a .babelignore.');
    }
  }
  try {
    {
      LOADING_MODULES.add(name);
    }
    {
      return yield* (0, _moduleTypes.default)(name, loader, `You appear to be using a native ECMAScript module ${type}, ` + "which is only supported when running Babel asynchronously " + "or when using the Node.js `--experimental-require-module` flag.", `You appear to be using a ${type} that contains top-level await, ` + "which is only supported when running Babel asynchronously.", true);
    }
  } catch (err) {
    err.message = `[BABEL]: ${err.message} (While processing: ${name})`;
    throw err;
  } finally {
    {
      LOADING_MODULES.delete(name);
    }
  }
}
0 && 0;

//# sourceMappingURL=plugins.js.map
File diff suppressed because one or more lines are too long
@@ -0,0 +1,3 @@
0 && 0;

//# sourceMappingURL=types.js.map
@@ -0,0 +1 @@
{"version":3,"names":[],"sources":["../../../src/config/files/types.ts"],"sourcesContent":["import type { InputOptions } from \"../index.ts\";\n\nexport type ConfigFile = {\n filepath: string;\n dirname: string;\n options: InputOptions & { babel?: unknown };\n};\n\nexport type IgnoreFile = {\n filepath: string;\n dirname: string;\n ignore: Array<RegExp>;\n};\n\nexport type RelativeConfig = {\n // The actual config, either from package.json#babel, .babelrc, or\n // .babelrc.js, if there was one.\n config: ConfigFile | null;\n // The .babelignore, if there was one.\n ignore: IgnoreFile | null;\n};\n\nexport type FilePackageData = {\n // The file in the package.\n filepath: string;\n // Any ancestor directories of the file that are within the package.\n directories: Array<string>;\n // The contents of the package.json. May not be found if the package just\n // terminated at a node_modules folder without finding one.\n pkg: ConfigFile | null;\n // True if a package.json or node_modules folder was found while traversing\n // the directory structure.\n isPackage: boolean;\n};\n"],"mappings":"","ignoreList":[]}
@@ -0,0 +1,36 @@
"use strict";

Object.defineProperty(exports, "__esModule", {
  value: true
});
exports.makeStaticFileCache = makeStaticFileCache;
var _caching = require("../caching.js");
var fs = require("../../gensync-utils/fs.js");
function _fs2() {
  const data = require("fs");
  _fs2 = function () {
    return data;
  };
  return data;
}
function makeStaticFileCache(fn) {
  return (0, _caching.makeStrongCache)(function* (filepath, cache) {
    const cached = cache.invalidate(() => fileMtime(filepath));
    if (cached === null) {
      return null;
    }
    return fn(filepath, yield* fs.readFile(filepath, "utf8"));
  });
}
function fileMtime(filepath) {
  if (!_fs2().existsSync(filepath)) return null;
  try {
    return +_fs2().statSync(filepath).mtime;
  } catch (e) {
    if (e.code !== "ENOENT" && e.code !== "ENOTDIR") throw e;
  }
  return null;
}
0 && 0;

//# sourceMappingURL=utils.js.map
@@ -0,0 +1 @@
{"version":3,"names":["_caching","require","fs","_fs2","data","makeStaticFileCache","fn","makeStrongCache","filepath","cache","cached","invalidate","fileMtime","readFile","nodeFs","existsSync","statSync","mtime","e","code"],"sources":["../../../src/config/files/utils.ts"],"sourcesContent":["import type { Handler } from \"gensync\";\n\nimport { makeStrongCache } from \"../caching.ts\";\nimport type { CacheConfigurator } from \"../caching.ts\";\nimport * as fs from \"../../gensync-utils/fs.ts\";\nimport nodeFs from \"node:fs\";\n\nexport function makeStaticFileCache<T>(\n fn: (filepath: string, contents: string) => T,\n) {\n return makeStrongCache(function* (\n filepath: string,\n cache: CacheConfigurator<void>,\n ): Handler<null | T> {\n const cached = cache.invalidate(() => fileMtime(filepath));\n\n if (cached === null) {\n return null;\n }\n\n return fn(filepath, yield* fs.readFile(filepath, \"utf8\"));\n });\n}\n\nfunction fileMtime(filepath: string): number | null {\n if (!nodeFs.existsSync(filepath)) return null;\n\n try {\n return +nodeFs.statSync(filepath).mtime;\n } catch (e) {\n if (e.code !== \"ENOENT\" && e.code !== \"ENOTDIR\") throw e;\n }\n\n return 
null;\n}\n"],"mappings":";;;;;;AAEA,IAAAA,QAAA,GAAAC,OAAA;AAEA,IAAAC,EAAA,GAAAD,OAAA;AACA,SAAAE,KAAA;EAAA,MAAAC,IAAA,GAAAH,OAAA;EAAAE,IAAA,YAAAA,CAAA;IAAA,OAAAC,IAAA;EAAA;EAAA,OAAAA,IAAA;AAAA;AAEO,SAASC,mBAAmBA,CACjCC,EAA6C,EAC7C;EACA,OAAO,IAAAC,wBAAe,EAAC,WACrBC,QAAgB,EAChBC,KAA8B,EACX;IACnB,MAAMC,MAAM,GAAGD,KAAK,CAACE,UAAU,CAAC,MAAMC,SAAS,CAACJ,QAAQ,CAAC,CAAC;IAE1D,IAAIE,MAAM,KAAK,IAAI,EAAE;MACnB,OAAO,IAAI;IACb;IAEA,OAAOJ,EAAE,CAACE,QAAQ,EAAE,OAAON,EAAE,CAACW,QAAQ,CAACL,QAAQ,EAAE,MAAM,CAAC,CAAC;EAC3D,CAAC,CAAC;AACJ;AAEA,SAASI,SAASA,CAACJ,QAAgB,EAAiB;EAClD,IAAI,CAACM,KAAKA,CAAC,CAACC,UAAU,CAACP,QAAQ,CAAC,EAAE,OAAO,IAAI;EAE7C,IAAI;IACF,OAAO,CAACM,KAAKA,CAAC,CAACE,QAAQ,CAACR,QAAQ,CAAC,CAACS,KAAK;EACzC,CAAC,CAAC,OAAOC,CAAC,EAAE;IACV,IAAIA,CAAC,CAACC,IAAI,KAAK,QAAQ,IAAID,CAAC,CAACC,IAAI,KAAK,SAAS,EAAE,MAAMD,CAAC;EAC1D;EAEA,OAAO,IAAI;AACb;AAAC","ignoreList":[]}
Some files were not shown because too many files have changed in this diff.