Either create a new actor or take our existing Python code, edit it as needed, and set up the actor in Apify to scrape the website. For that we will need a Python Dockerfile, so the actor runs in a system with Python installed.
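A minimal Dockerfile sketch for such an actor (assuming the entry script is called main.py and dependencies are listed in requirements.txt — both names are placeholders, adjust to the actual code):

```dockerfile
# Start from a slim official Python base image
FROM python:3.9-slim

WORKDIR /usr/src/app

# Install dependencies first so Docker can cache this layer
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

# Copy the scraper code and run it
COPY . .
CMD ["python", "main.py"]
```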
Then, to pass some input in, save the data (for example to a dataset or key-value store), and use a proxy for the requests, we will need two things:
- Access environment variables - things like APIFY_TOKEN and APIFY_PROXY_PASSWORD are stored there, so we don't need to hard-code them in the code.
- Use the Apify API for INPUT/OUTPUT. Apify doesn't have a Python API client yet, so we will need to call the HTTP API directly. In any case, we are mainly interested in the key-value store (get record) for input and the dataset (put items) for output.
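The two points above can be sketched with only the standard library. This is a rough sketch, not a final implementation: the endpoint paths and environment variable names (APIFY_DEFAULT_KEY_VALUE_STORE_ID, APIFY_DEFAULT_DATASET_ID, the "auto" proxy username) are my understanding of the Apify platform defaults and should be checked against the API docs:

```python
import json
import os
import urllib.request

API_BASE = "https://api.apify.com/v2"


def record_url(store_id, key, token):
    """URL for fetching one record from a key-value store (e.g. the INPUT)."""
    return f"{API_BASE}/key-value-stores/{store_id}/records/{key}?token={token}"


def dataset_items_url(dataset_id, token):
    """URL for appending items to a dataset."""
    return f"{API_BASE}/datasets/{dataset_id}/items?token={token}"


def get_input():
    """Read the actor INPUT record from the default key-value store."""
    url = record_url(os.environ["APIFY_DEFAULT_KEY_VALUE_STORE_ID"],
                     "INPUT", os.environ["APIFY_TOKEN"])
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)


def push_items(items):
    """Append a list of scraped items to the actor's default dataset."""
    url = dataset_items_url(os.environ["APIFY_DEFAULT_DATASET_ID"],
                            os.environ["APIFY_TOKEN"])
    req = urllib.request.Request(
        url,
        data=json.dumps(items).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


def proxy_url():
    """Apify proxy URL built from the APIFY_PROXY_PASSWORD environment variable."""
    return f"http://auto:{os.environ['APIFY_PROXY_PASSWORD']}@proxy.apify.com:8000"
```

The scraper would then call get_input() at startup, pass proxy_url() to its HTTP client, and push_items() after each batch of results.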
That said, I look forward to your suggestions. Please confirm in your first line whether you have experience with Apify or a similar alternative.