I need a synthetic yet physically consistent power-flow dataset for the standard 33-bus radial distribution network (the Baran-Wu test case), with my specified distributed-generation (DG) siting already included. Every scenario in the dataset must stress the feeders with unexpected load spikes and drops so that the final collection reflects realistic worst-case and recovery conditions.

The workflow has two phases:

• Phase 1 – Data generation. Write reproducible code (MATLAB, Python + pandapower/OpenDSS, or a comparable solver) that iterates over thousands of stochastic load-change events. For each run I expect full node voltage profiles, line currents, active/reactive power injections, and total losses, stored in an easy-to-parse format such as CSV or MAT. All assumptions (DG sizes and placement, load multipliers, and random-number seeding) must be documented so the study is traceable; a sketch of the kind of structure I have in mind follows this list.

• Phase 2 – Deep-learning model. Using the dataset, train a neural network that learns to predict the voltage magnitude at every bus from the perturbed load vector. The model may be built in TensorFlow or PyTorch; just include the training script, a saved model file, and a short notebook showing inference on unseen cases, with the usual train/validation split and error metrics. A matching training sketch also follows below.

Deliverables are the code, the generated dataset, the trained network, and a concise README explaining how to rerun everything end-to-end. Please highlight relevant past work when you reply, especially examples where you handled power-system simulations or machine-learning models for electrical data.
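To make Phase 1 concrete, here is a minimal, non-authoritative sketch using pandapower's built-in 33-bus case (pandapower.networks.case33bw). The DG buses and sizes, the multiplier range, the scenario count, the output filename, and the seed are all illustrative assumptions to be replaced by the agreed specification, not the final design.

```python
# Phase 1 sketch: stochastic load perturbations on the 33-bus network.
# DG siting, multiplier range, scenario count, and seed are assumptions.
import numpy as np
import pandas as pd
import pandapower as pp
import pandapower.networks as pn
from pandapower.powerflow import LoadflowNotConverged

RNG = np.random.default_rng(42)      # documented seed for traceability
N_SCENARIOS = 1000                   # scale up to thousands as needed
DG_SITES = {17: 0.5, 32: 0.5}        # hypothetical bus -> MW placement

net = pn.case33bw()                  # Baran-Wu 33-bus radial feeder
base_p = net.load["p_mw"].copy()
base_q = net.load["q_mvar"].copy()
for bus, p_mw in DG_SITES.items():
    pp.create_sgen(net, bus=bus, p_mw=p_mw, q_mvar=0.0, name=f"DG_{bus}")

rows = []
for s in range(N_SCENARIOS):
    # Spikes and drops: uniform multipliers in [0.3, 1.8] per load,
    # applied to both P and Q to keep the power factor unchanged.
    mult = RNG.uniform(0.3, 1.8, size=len(base_p))
    net.load["p_mw"] = base_p.to_numpy() * mult
    net.load["q_mvar"] = base_q.to_numpy() * mult
    try:
        pp.runpp(net)
    except LoadflowNotConverged:
        continue                     # log/skip non-convergent worst cases
    rows.append({
        "scenario": s,
        **{f"mult_{i}": m for i, m in enumerate(mult)},
        **{f"vm_{b}": v for b, v in net.res_bus.vm_pu.items()},
        **{f"i_ka_{l}": i for l, i in net.res_line.i_ka.items()},
        "p_loss_mw": net.res_line.pl_mw.sum(),   # total active line losses
    })

pd.DataFrame(rows).to_csv("pf_33bus_scenarios.csv", index=False)
```

One CSV row per scenario keeps the dataset trivially parseable from pandas or MATLAB; reactive injections and any skipped non-convergent cases would be logged the same way in the full deliverable.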
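And a minimal PyTorch sketch for Phase 2, assuming the CSV schema produced by the Phase 1 sketch above (mult_* columns as model inputs, vm_* columns as targets). The architecture, hyperparameters, and filenames are starting points under those assumptions, not a tuned design.

```python
# Phase 2 sketch: MLP mapping perturbed load vector -> bus voltage magnitudes.
import pandas as pd
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset, random_split

df = pd.read_csv("pf_33bus_scenarios.csv")
X = torch.tensor(df.filter(like="mult_").to_numpy(), dtype=torch.float32)
Y = torch.tensor(df.filter(like="vm_").to_numpy(), dtype=torch.float32)

# 80/20 train/validation split with a fixed seed for reproducibility.
ds = TensorDataset(X, Y)
n_val = int(0.2 * len(ds))
train_ds, val_ds = random_split(
    ds, [len(ds) - n_val, n_val],
    generator=torch.Generator().manual_seed(42))
train_dl = DataLoader(train_ds, batch_size=64, shuffle=True)
val_dl = DataLoader(val_ds, batch_size=256)

model = nn.Sequential(
    nn.Linear(X.shape[1], 128), nn.ReLU(),
    nn.Linear(128, 128), nn.ReLU(),
    nn.Linear(128, Y.shape[1]),      # one voltage magnitude per bus
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(100):
    model.train()
    for xb, yb in train_dl:
        opt.zero_grad()
        loss_fn(model(xb), yb).backward()
        opt.step()
    model.eval()
    with torch.no_grad():            # sample-weighted mean of batch MSEs
        val_mse = sum(loss_fn(model(xb), yb).item() * len(xb)
                      for xb, yb in val_dl) / n_val
    print(f"epoch {epoch:3d}  val MSE {val_mse:.3e}")

torch.save(model.state_dict(), "voltage_mlp.pt")
```

A plain MLP is a reasonable baseline here because the load-to-voltage mapping of a single feeder is smooth; architectures that exploit the radial topology (e.g. graph networks) would be a natural extension, and the inference notebook would report per-bus MAE alongside MSE.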