I have a medium-sized, structured dataset (roughly 100 MB – 1 GB) that needs to travel the full machine-learning journey: from raw CSV/SQL tables to a trained, evaluated model and clear hand-off documentation. Here is the flow I’d like you to cover:

• Data cleaning and preprocessing so the tables are analysis-ready
• Solid exploratory data analysis to surface patterns and guide feature work
• Meaningful feature engineering that actually lifts model quality
• Model development for either regression or classification, followed by evaluation with Accuracy, RMSE, Precision, Recall, and ROC-AUC as appropriate
• A concise walk-through of basic deployment steps; just enough so I can push the model into a simple production environment later

Python is the language of choice, and the usual suspects (Pandas, NumPy, Scikit-learn, Matplotlib/Seaborn inside a Jupyter Notebook) should be more than enough.

Deliverables:

1. Well-structured, fully commented Python notebooks or scripts
2. The final trained model saved in a standard, reloadable format
3. A short PDF or README that explains the workflow, key decisions, and instructions to reproduce the results
4. (Optional) Quick notes or code snippets showing how to expose the model behind an endpoint or batch job

I’d like to wrap this up within the next few weeks; finishing closer to 7–10 days would be ideal if your schedule allows. Let me know your approach, any clarifying questions, and the earliest you can start.
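To make the scope concrete, here is a minimal sketch of the train/evaluate/save shape I have in mind. It uses synthetic data (`make_classification`) as a stand-in for the real tables, which you would instead load with `pd.read_csv` or `pd.read_sql`; the file name `model.joblib` is just a placeholder:

```python
import os
import tempfile

import joblib
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real dataset; replace with pd.read_csv(...)/read_sql(...).
X, y = make_classification(n_samples=1000, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Preprocessing and model bundled together so the saved artifact is self-contained.
pipe = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
pipe.fit(X_train, y_train)

# Evaluate with the metrics named above (Accuracy and ROC-AUC for classification).
acc = accuracy_score(y_test, pipe.predict(X_test))
auc = roc_auc_score(y_test, pipe.predict_proba(X_test)[:, 1])
print(f"accuracy={acc:.3f}  roc_auc={auc:.3f}")

# Save in a standard, reloadable format (deliverable 2).
model_path = os.path.join(tempfile.gettempdir(), "model.joblib")
joblib.dump(pipe, model_path)
reloaded = joblib.load(model_path)
```

Swapping `LogisticRegression` for a regressor plus RMSE would cover the regression case; the pipeline-and-joblib pattern stays the same.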
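For the optional endpoint snippet (deliverable 4), something in this spirit would be plenty. This is only a sketch: the route name `/predict`, the JSON shape, and the inline-trained model (standing in for `joblib.load("model.joblib")`) are all assumptions, not requirements:

```python
import numpy as np
from flask import Flask, jsonify, request
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Stand-in for the real trained artifact so the sketch runs end to end;
# in practice you would joblib.load("model.joblib") here instead.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X, y)

app = Flask(__name__)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect JSON like {"features": [[0.1, 0.2, 0.3, 0.4], ...]}.
    payload = request.get_json(force=True)
    features = np.asarray(payload["features"], dtype=float)
    return jsonify({
        "predictions": model.predict(features).tolist(),
        "probabilities": model.predict_proba(features)[:, 1].tolist(),
    })

# To serve locally: app.run(host="0.0.0.0", port=8000)
```

A batch-job variant would be even simpler: load the model, score a DataFrame, write a predictions CSV on a schedule.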