I have a script. I just need someone to run it following these instructions:

## 1. Python Environment Setup

You'll need *Python 3.9+* (preferably 3.10 or higher). Check:

```bash
python --version
```

If not installed:

- [Download Python](https://www.python.org/downloads/) or install via your package manager.

Create a *virtual environment* (recommended):

```bash
python -m venv venv
source venv/bin/activate   # macOS/Linux
venv\Scripts\activate      # Windows
```

---

## 2. Install Dependencies

Since this script uses:

- Async HTTP requests (likely `aiohttp`)
- Data storage (maybe `sqlite3` or `aiosqlite`)
- Possibly `argparse` for command-line arguments
- The Google Places API (via HTTP requests)

your `requirements.txt` should include at least:

```txt
aiohttp
aiosqlite
requests
python-dotenv  # if environment variables are used
```

Then install them:

```bash
pip install -r requirements.txt
```

or manually:

```bash
pip install aiohttp aiosqlite requests python-dotenv
```

---

## 3. Google Places API Key (Optional)

The script checks:

```python
if not mobile_found and args.google_places_key:
```

That means you can optionally supply a *Google Places API key* to enrich missing mobile data. To get one:

1. Go to the [Google Cloud Console](https://console.cloud.google.com/)
2. Enable the *Places API*
3. Create an *API key*
4. Pass it to the script on the command line, e.g.:

```bash
python advanced_tile_grout_extractor_google_email_fallback.py \
  --google_places_key YOUR_API_KEY
```

---

## 4. Database Configuration

The line:

```python
await save_listing(db, rec)
```

suggests the script writes results to a database (`db`), probably an SQLite file (such as `data.db`). Check whether the script contains:

```python
db = await aiosqlite.connect("data.db")
```

If so, you're good: the file will be created automatically. If it expects a different database (PostgreSQL, etc.), you'll need connection credentials.

---

## 5. Input Data or Directory

The function `crawl_directory_worker` implies the script crawls through a *directory of listings* (maybe HTML, JSON, or CSV files).
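Without seeing the script's source, the name `crawl_directory_worker` suggests the common asyncio pattern of a queue feeding a pool of concurrent workers. Here is a minimal sketch of that pattern under that assumption; everything except the function name `crawl_directory_worker` (the queue setup, sentinels, and `crawl_directory` driver) is hypothetical, and the real parsing logic is replaced by a placeholder.

```python
# Sketch of a queue-based directory crawl, assuming the script uses the
# usual asyncio worker-pool pattern. Names other than
# crawl_directory_worker are hypothetical.
import asyncio
from pathlib import Path


async def crawl_directory_worker(queue: asyncio.Queue, results: list) -> None:
    """Pull file paths off the queue until a None sentinel arrives."""
    while True:
        path = await queue.get()
        if path is None:           # sentinel: no more work for this worker
            queue.task_done()
            break
        # Placeholder for the real listing parsing/extraction logic.
        results.append(path.name)
        queue.task_done()


async def crawl_directory(input_dir: str, num_workers: int = 4) -> list:
    queue: asyncio.Queue = asyncio.Queue()
    results: list = []
    for path in sorted(Path(input_dir).glob("*")):
        queue.put_nowait(path)
    for _ in range(num_workers):
        queue.put_nowait(None)     # one sentinel per worker
    workers = [
        asyncio.create_task(crawl_directory_worker(queue, results))
        for _ in range(num_workers)
    ]
    await queue.join()
    await asyncio.gather(*workers)
    return results
```

If the real script follows this shape, pointing `--input` at a directory simply fills the queue with that directory's files.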
Usually, you'll run it like:

```bash
python advanced_tile_grout_extractor_google_email_fallback.py --input my_directory/
```

or (if it's web-based):

```bash
python advanced_tile_grout_extractor_google_email_fallback.py --query "tile and grout cleaning" --country "AU"
```

Check the top of the file for `argparse` usage to see what options it expects (`--input`, `--output`, `--google_places_key`, etc.).

---

## 6. Run and Monitor

To run normally:

```bash
python advanced_tile_grout_extractor_google_email_fallback.py
```

If the script is asynchronous (`asyncio.run(main())`), you'll see it handle multiple listings concurrently. You might also want to add logging to see progress:

```bash
python advanced_tile_grout_extractor_google_email_fallback.py --verbose
```

---

## Summary

| Requirement | Description |
| --- | --- |
| *Python 3.9+* | Installed and in PATH |
| *Dependencies* | `aiohttp`, `aiosqlite`, `requests`, etc. |
| *Google Places API key (optional)* | For mobile enrichment |
| *Database (likely SQLite)* | Automatically created as `data.db` |
| *Input source* | Directory, query, or listings |
| *Run command* | `python advanced_tile_grout_extractor_google_email_fallback.py [options]` |
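The flags above are inferred, not confirmed, so here is a sketch of what the script's `argparse` setup might look like if it accepts them. Every flag name and default below is an assumption based on the options mentioned in these instructions; check the script's actual `argparse` block before relying on any of them.

```python
# Hypothetical parser matching the flags mentioned above (--input,
# --query, --country, --google_places_key, --verbose). This is an
# assumption about the script's CLI, not its confirmed interface.
import argparse


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(
        description="Extract tile-and-grout listings with optional "
                    "Google Places enrichment.")
    parser.add_argument("--input",
                        help="directory of listing files (HTML/JSON/CSV)")
    parser.add_argument("--query",
                        help="search query, e.g. 'tile and grout cleaning'")
    parser.add_argument("--country", default="AU",
                        help="two-letter country code")
    parser.add_argument("--google_places_key",
                        help="optional Google Places API key for "
                             "mobile-number enrichment")
    parser.add_argument("--verbose", action="store_true",
                        help="enable progress logging")
    return parser
```

Running the script with `--help` is the quickest way to confirm which of these options actually exist.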