Laravel Developer – Automated Lead Aggregation System (Low-Voltage Industry)

Client: AI | Published: 05.03.2026
Budget: $1,500

Overview

We are seeking an experienced Laravel developer / automation engineer to architect and build a self-sustaining system that gathers publicly available contact and professional information for individuals working in the low-voltage industry (e.g., low-voltage technicians, structured cabling installers, fire alarm techs, security system installers, access control technicians). The system will collect relevant professional data from platforms such as LinkedIn and additional job boards where users publicly publish resume and profile information.

The final solution must:
- Operate autonomously once deployed
- Allow geo-targeted searches by specific city/state
- Store structured data in a database designed like a spreadsheet
- Provide exportable, filterable datasets (CSV/Excel)
- Operate within an ongoing subscription/infrastructure budget of approximately $100/month or less

Core Responsibilities

1. System Architecture & Automation
- Design and implement a fully automated data aggregation pipeline.
- Build a Laravel-based backend system that:
  - Handles scheduled crawling/scraping jobs
  - Queues tasks using Laravel queues (Redis or database driver)
  - Implements retry logic and error handling
- Ensure system stability with minimal manual oversight.

2. Data Collection & Source Integration
Develop compliant data extraction processes for:
- LinkedIn (public profile data only)
- Job boards where users publish resumes/profile data publicly
- Industry-specific directories (if applicable)

Key requirements:
- Respect robots.txt and platform compliance policies
- Avoid rate-limit violations
- Implement proxy rotation (if required)
- Use headless browser automation (Puppeteer/Playwright via an API service if necessary)
- Implement intelligent throttling
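The queueing, retry, and throttling requirements above can be sketched as a single queued Laravel job. This is a minimal illustration, not a prescribed implementation: the class name, the `scraping` limiter name, and the backoff timings are assumptions an applicant would adapt.

```php
<?php
// app/Jobs/ScrapeProfilePage.php — illustrative sketch of a queued scraping job
// with retry logic and intelligent throttling. Names and timings are assumptions.

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\Middleware\RateLimited;

class ScrapeProfilePage implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    public int $tries = 3;                  // retry each failed job up to 3 times
    public array $backoff = [60, 300, 900]; // wait 1, 5, then 15 minutes between attempts

    public function __construct(public string $profileUrl) {}

    // Throttle all scraping jobs through a named rate limiter, defined elsewhere
    // (e.g. RateLimiter::for('scraping', fn () => Limit::perMinute(10)) in a
    // service provider) so the crawl never exceeds platform rate limits.
    public function middleware(): array
    {
        return [new RateLimited('scraping')];
    }

    public function handle(): void
    {
        // Fetch the public page (HTTP client or a headless-browser API), parse
        // the fields listed in the data model, and persist them. Exceptions
        // thrown here trigger the retry/backoff behaviour configured above.
    }
}
```

Dispatching is then `ScrapeProfilePage::dispatch($url)`; the Redis or database queue driver named in the brief works with this unchanged.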
3. Geo-Targeted Search Capability
Build an admin interface that allows:
- Input of a specific city, state, and optional zip code radius
- Selection of job titles/keywords
- Triggering automated data collection for those parameters

The system must support:
- On-demand search campaigns
- Scheduled recurring searches
- Multi-location campaigns

4. Data Structuring & Storage
Design a relational database structured similarly to a spreadsheet.

Minimum data columns:
- First Name
- Last Name
- Job Title
- Company
- City
- State
- Email (if publicly available)
- Phone (if publicly available)
- Profile URL
- Source Platform
- Date Collected
- Tags / Campaign Identifier

Database requirements:
- Deduplication logic
- Email/phone normalization
- Indexing for fast filtering
- Soft delete capability

5. Admin Dashboard (Laravel + Blade or Livewire)
Build a simple, clean UI where I can:
- Filter by city/state
- Filter by job title
- Filter by source
- Search by keyword
- Export filtered results to CSV/Excel
- Tag or categorize contacts
- View collection statistics

The interface should feel like a spreadsheet (sortable columns, filterable fields).

6. Export Functionality
Provide:
- CSV export
- Excel (.xlsx) export
- Filter-based export (not full-database export only)
- Ability to export by location, campaign, date range, or job title

7. Cost Constraints
Ongoing monthly costs must not exceed $100/month, including:
- Hosting (VPS preferred: DigitalOcean, Vultr, etc.)
- Proxy services (if required)
- Automation APIs (if necessary)
- Queue services (if external)

Preferred stack for cost control:
- Laravel 10+
- MySQL or PostgreSQL
- Redis (optional but preferred)
- VPS-based deployment (no heavy SaaS dependencies)
- Supervisor for queue workers
- Cron-based scheduling

Avoid expensive scraping SaaS platforms unless justified and within budget.
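The minimum data columns and database requirements listed above map naturally onto a single Laravel migration. The sketch below is one reasonable reading of the spec, not a mandated schema; table and column names are assumptions.

```php
<?php
// database/migrations/xxxx_create_contacts_table.php — illustrative schema sketch
// covering the listed columns, indexing, and soft-delete requirements.

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('contacts', function (Blueprint $table) {
            $table->id();
            $table->string('first_name');
            $table->string('last_name');
            $table->string('job_title')->nullable()->index();   // fast title filtering
            $table->string('company')->nullable();
            $table->string('city')->index();                    // geo filtering
            $table->string('state', 2)->index();
            $table->string('email')->nullable()->unique();      // normalized before insert
            $table->string('phone', 20)->nullable();            // digits-only, normalized
            $table->string('profile_url');
            $table->string('source_platform')->index();
            $table->string('campaign_tag')->nullable()->index(); // tags / campaign identifier
            $table->timestamp('collected_at');                  // date collected
            $table->softDeletes();                              // soft delete capability
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('contacts');
    }
};
```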
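For the deduplication and normalization requirements, one possible approach is to normalize email and phone before insert and derive a dedup key from them. This plain-PHP sketch is illustrative; the helper names and the US-number assumption are mine, not from the brief.

```php
<?php
// Illustrative normalization + dedup-key helpers. Assumes US phone numbers.

function normalizeEmail(string $email): string
{
    // Lowercase and trim so "Jane@Example.com " and "jane@example.com" collide.
    return strtolower(trim($email));
}

function normalizePhone(string $phone): string
{
    // Keep digits only; strip a leading US country code "1" from 11-digit numbers.
    $digits = preg_replace('/\D+/', '', $phone);
    if (strlen($digits) === 11 && $digits[0] === '1') {
        $digits = substr($digits, 1);
    }
    return $digits;
}

// Dedup key: prefer normalized email, fall back to normalized phone + name.
function dedupKey(array $contact): string
{
    if (!empty($contact['email'])) {
        return 'e:' . normalizeEmail($contact['email']);
    }
    return 'p:' . normalizePhone($contact['phone'] ?? '')
         . '|' . strtolower(($contact['first_name'] ?? '') . ' ' . ($contact['last_name'] ?? ''));
}
```

In practice the key would back a unique index, with `upsert()` (or `updateOrCreate()`) keeping re-crawled profiles from producing duplicate rows.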
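The filter-based CSV export can be met without paid packages by streaming rows from the database in chunks, which keeps memory flat on large result sets. A hedged sketch: the `Contact` model, filter names, and column selection are assumptions.

```php
<?php
// Illustrative streamed, filter-based CSV export (no external export packages).

use App\Models\Contact;
use Illuminate\Http\Request;
use Symfony\Component\HttpFoundation\StreamedResponse;

function exportContactsCsv(Request $request): StreamedResponse
{
    // Apply only the filters the user actually supplied.
    $query = Contact::query()
        ->when($request->input('city'), fn ($q, $city) => $q->where('city', $city))
        ->when($request->input('state'), fn ($q, $state) => $q->where('state', $state))
        ->when($request->input('job_title'), fn ($q, $t) => $q->where('job_title', 'like', "%{$t}%"));

    return response()->streamDownload(function () use ($query) {
        $out = fopen('php://output', 'w');
        fputcsv($out, ['First Name', 'Last Name', 'Job Title', 'City', 'State', 'Email']);

        // chunkById avoids loading the whole filtered set into memory at once.
        $query->chunkById(500, function ($contacts) use ($out) {
            foreach ($contacts as $c) {
                fputcsv($out, [$c->first_name, $c->last_name, $c->job_title, $c->city, $c->state, $c->email]);
            }
        });

        fclose($out);
    }, 'contacts.csv');
}
```

An .xlsx variant would typically swap in a spreadsheet writer package, budget permitting.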
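The "Supervisor for queue workers" item in the preferred stack usually amounts to a short config like the one below; the paths, user, and process count are placeholders to adapt, not requirements from the brief.

```ini
; /etc/supervisor/conf.d/scraper-worker.conf — illustrative worker config
[program:scraper-worker]
process_name=%(program_name)s_%(process_num)02d
command=php /var/www/app/artisan queue:work redis --sleep=3 --tries=3 --max-time=3600
numprocs=2
autostart=true
autorestart=true
user=www-data
redirect_stderr=true
stdout_logfile=/var/log/supervisor/scraper-worker.log
```

Supervisor restarts crashed workers automatically, which directly serves the "minimal manual oversight" goal.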
8. Automation Requirements
The system must:
- Run scheduled searches automatically
- Monitor failed jobs
- Log scraping errors
- Send optional email alerts for failures
- Self-heal where possible (retry logic)

Minimal manual oversight should be required after deployment.

Required Experience
- 5+ years of Laravel experience
- Strong experience with web scraping & automation, headless browsers, rate limiting & anti-bot handling, and proxy rotation
- Experience building admin dashboards
- Strong database design skills
- Experience building export systems (CSV/Excel)
- Experience with queues & background workers
- VPS deployment & server configuration
- Understanding of compliance and ethical data collection practices

Preferred Experience
- LinkedIn data extraction (public data only)
- Puppeteer or Playwright
- Laravel Horizon
- Scalable scraping architectures
- Familiarity with data normalization and deduplication techniques

Deliverables
- Fully functional Laravel application
- Database schema and migration files
- Automated data collection system
- Admin dashboard
- Export functionality
- Deployment documentation
- Cost breakdown documentation (showing < $100/month ongoing costs)
- Basic user documentation

Performance Metrics
- System uptime
- Data accuracy
- Deduplication efficiency
- Export performance
- Ability to handle multiple city/state campaigns
- Monthly infrastructure cost compliance

Important Notes
- This role requires someone who understands both Laravel development and automation engineering.
- The build must prioritize long-term sustainability and cost control.
- The system must be modular to allow additional data sources in the future.
- Clean, well-documented code is mandatory.

If you are experienced in Laravel automation systems and can architect a reliable, self-running lead aggregation platform within tight cost constraints, we encourage you to apply.
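The scheduled-search and failure-alert requirements fit Laravel's built-in scheduler. In this sketch, `campaigns:run` and `scraper:report-failures` are hypothetical artisan commands the implementer would write; only `queue:retry` is a stock Laravel command.

```php
<?php
// app/Console/Kernel.php (Laravel 10 style) — illustrative schedule for recurring
// search campaigns plus self-healing and failure alerts. Custom command names
// (campaigns:run, scraper:report-failures) are assumptions, not stock commands.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule): void
    {
        // Run every active campaign's collection nightly, off-peak.
        $schedule->command('campaigns:run')->dailyAt('02:00');

        // Self-heal: push anything in failed_jobs back onto the queue.
        $schedule->command('queue:retry all')->dailyAt('06:00');

        // Optional email alert summarizing whatever still failed.
        $schedule->command('scraper:report-failures')->dailyAt('06:30');
    }
}
```

A single system cron entry then drives everything, matching the cron-based scheduling preference: `* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1`.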