We are looking for a skilled MongoDB developer to build a role-based access control (RBAC) system with live editing capabilities for our application. The goal is a secure, scalable RBAC system for our MongoDB database that our team can easily manage.
The RBAC system should let us define roles and permissions for users, create groups, and assign users to them. It should authenticate users and authorize them based on their assigned roles and permissions.
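As a rough illustration of the requirement above, the roles, groups, and permissions might be modeled as MongoDB documents like the following; the collection shapes and field names here are assumptions, and the database calls are replaced by in-memory dicts for brevity:

```python
# Hypothetical document shapes for a MongoDB-backed RBAC system.
# In a real deployment these dicts would be documents in "roles",
# "groups", and "users" collections queried via pymongo.
ROLES = {
    "editor": {"_id": "editor", "permissions": ["articles:read", "articles:write"]},
    "viewer": {"_id": "viewer", "permissions": ["articles:read"]},
}

GROUPS = {
    "content-team": {"_id": "content-team", "roles": ["editor"]},
}

USERS = {
    "alice": {"_id": "alice", "roles": ["viewer"], "groups": ["content-team"]},
}


def effective_permissions(user_id: str) -> set:
    """Union of permissions from the user's direct roles and group roles."""
    user = USERS[user_id]
    role_ids = set(user.get("roles", []))
    for group_id in user.get("groups", []):
        role_ids.update(GROUPS[group_id].get("roles", []))
    perms = set()
    for role_id in role_ids:
        perms.update(ROLES[role_id]["permissions"])
    return perms


def is_authorized(user_id: str, permission: str) -> bool:
    return permission in effective_permissions(user_id)
```

Here "alice" gains write access through her group's "editor" role even though her direct role is only "viewer", which is the group-assignment behavior the posting asks for.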
The live editing capabilities should allow authorized users to make changes to the database in real time, without manual updates. The system should track changes made by users and provide auditing and logging capabilities for security purposes.
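One common way to meet the change-tracking requirement is to record an audit entry alongside every write. A minimal sketch follows; the field names are assumptions, the in-memory list stands in for an audit collection, and a production system might instead consume MongoDB change streams:

```python
import datetime

AUDIT_LOG = []  # stand-in for an "audit_log" collection


def apply_change(user_id, collection, doc_id, changes, store):
    """Apply a field-level update and record who changed what, and when."""
    before = dict(store.get(doc_id, {}))
    store.setdefault(doc_id, {}).update(changes)
    AUDIT_LOG.append({
        "user": user_id,
        "collection": collection,
        "doc_id": doc_id,
        "before": before,
        "after": dict(store[doc_id]),
        "at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    })
```

Storing both the before and after snapshots makes the log self-contained for audits, at the cost of some extra storage per write.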
The ideal candidate should have experience with MongoDB and RBAC systems, as well as expertise in server-side programming languages such as Node.js or Python. Familiarity with web technologies such as HTML, CSS, and JavaScript is also preferred.
In addition to developing the RBAC system, the candidate should also provide clear and comprehensive documentation, as well as support and troubleshooting assistance during the implementation phase.
Hello,
I am a professional Python developer. My main specializations are automation, web scrapers, and bot development.
I have already developed over 200 scrapers, from simple ones (for example, a competitor price collector) to complex parsers (with authorization, CAPTCHA bypassing, rotating IPs, and more) that can collect millions of products from Amazon.
I have built web scrapers for:
- Amazon
- Instagram
- Facebook
- Google
- Twitter
- LinkedIn
- Pinterest
- Walmart
- And many others
For scraping I use:
- Python
- Requests
- BeautifulSoup
- Selenium
- Scrapy
- PyAutoGUI
- undetected-chromedriver
- Rotating IPs
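As a small taste of the Requests/BeautifulSoup pattern above, here is a stdlib-only sketch using html.parser in place of BeautifulSoup; the sample HTML and the "price" class name are made up for illustration:

```python
from html.parser import HTMLParser

SAMPLE_PAGE = """
<html><body>
  <div class="product"><span class="price">$19.99</span></div>
  <div class="product"><span class="price">$4.50</span></div>
</body></html>
"""


class PriceParser(HTMLParser):
    """Collects the text of <span class="price"> elements."""

    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and dict(attrs).get("class") == "price":
            self.in_price = True

    def handle_data(self, data):
        if self.in_price:
            self.prices.append(data.strip())

    def handle_endtag(self, tag):
        if tag == "span":
            self.in_price = False


parser = PriceParser()
parser.feed(SAMPLE_PAGE)
```

In practice I would fetch the page with Requests (or drive a browser with Selenium for JavaScript-heavy sites) and feed the response body to the parser.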
I can bypass:
- CloudFlare
- IP blocking
- CAPTCHAs
- Login walls / required authorization
- Other limitations
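IP rotation, for instance, usually just means cycling outbound requests through a proxy pool. A minimal round-robin sketch follows; the proxy addresses are placeholders, and in practice the selected proxy would be passed to Requests or Scrapy:

```python
import itertools

# Placeholder proxy pool; real addresses would come from a proxy provider.
PROXIES = [
    "http://proxy1:8080",
    "http://proxy2:8080",
    "http://proxy3:8080",
]
_pool = itertools.cycle(PROXIES)


def next_proxy() -> str:
    """Return the next proxy in round-robin order."""
    return next(_pool)
```

Each request then uses `next_proxy()` so that consecutive requests leave from different IP addresses.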
Django / PostgreSQL
For big scraping projects I usually use Django with PostgreSQL, which lets us store the scraped data in a database for further processing and use. I also set up an administration area where we can review the data and manage scraper configurations.
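The store-then-process pattern described above can be sketched with Python's built-in sqlite3 module as a stand-in for PostgreSQL; the table and column names are illustrative, and in a Django project these would be models managed through the admin:

```python
import sqlite3

# In-memory SQLite as a stand-in for a PostgreSQL connection.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE scraped_products (
        id INTEGER PRIMARY KEY,
        source TEXT NOT NULL,
        title TEXT NOT NULL,
        price_cents INTEGER
    )
""")


def save_product(source: str, title: str, price_cents: int) -> None:
    """Persist one scraped record for later processing."""
    conn.execute(
        "INSERT INTO scraped_products (source, title, price_cents) VALUES (?, ?, ?)",
        (source, title, price_cents),
    )


save_product("amazon", "USB cable", 799)
rows = conn.execute(
    "SELECT source, title, price_cents FROM scraped_products"
).fetchall()
```

Keeping scraped rows in a relational table like this is what makes dedup, re-processing, and admin review straightforward later.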
If you need a professional solution in this area, I am ready to cooperate, and I can prepare a sample script before we start.
Regards, Oleg