Currently pursuing my Master's in Applied Computing at the University of Windsor, Canada. I hold a Bachelor's degree in Computer Science from NMAM Institute of Technology with a CGPA of 9.25/10. I bring backend development experience from Clover Bay Technologies, where I specialized in reactive microservices using Spring and Couchbase and was recognized as "Employee of the Year 2021". I am an ambitious, performance-driven software developer, comfortable managing multiple projects with strict timelines while adhering to brand identity, with a strong capacity for innovation and problem-solving in backend development. I also hold the AWS Certified Cloud Practitioner certification, validating foundational cloud skills.
Working as a lead backend developer on a B2B e-commerce web application serving millions of business users around the world.
Led the development of Qalara, a B2B e-commerce platform, owning diverse features including orders management, the quality inspectors' platform, the purchase orders panel, reports, a product recommendation system, payments integration, and database integration. Gained proficiency in reactive Spring APIs, Spring Security, SES mailing, AWS Lambda functions, asset management using buckets, PDF and Excel invoice generation, and Couchbase.
Implemented a recurring cron job that dispatches personalized emails for abandoned carts, driving a 40% surge in sales over the following months.
In pursuit of the ambitious goal of Zero Hunger by 2030, the "Produce Sync" initiative aims to increase the income of small-scale food producers.
The project strategically integrates MongoDB and the OpenCage API, fostering geospatial connectivity to empower local farmers, engage buyers directly, and minimize logistic costs.
Addressing the critical issue of food waste, the initiative employs computer vision to assess fruit ripeness and dynamically adjust prices through categorical price segmentation.
The system ensures timely sales, reducing waste and optimizing profits, aligning with the broader vision of sustainable agricultural practices.
I developed a portfolio website showcasing a selection of my personal projects, built with HTML.
The website is hosted on Netlify, ensuring seamless accessibility and a user-friendly experience.
Through this initiative, I actively demonstrate my skills and accomplishments in a visually appealing and accessible format.
An app built using React.js with a face-detection API, a backend using Node.js and Express.js, and a PostgreSQL database hosted on Heroku. Provide an image URL and let the app detect the face.
A dynamic web page that displays the RGB values of a selected color and instantly changes the page background to match, offering an interactive, real-time color experience.
Developed a React project using Create React App to create interactive robot cards, allowing users to easily search and access details of dummy robots.
Designed and implemented a database project using XAMPP and PHP for Pizza Management, offering features such as adding pizzas, deleting pizzas, and allowing users to write and store their own recipes.
Developed a comprehensive Farmer Loan Management System as a database project using XAMPP and PHP. The system empowers farmers to efficiently manage, apply for, and pay their loans through a user-friendly interface.
Clients connect to a server to request files, and the server responds by searching for files and returning them or an appropriate message. The response includes a compressed tar archive containing all files found based on the client's command.
The server (serverw24) runs alongside two mirrors (mirror1 and mirror2), forks a child process to handle each client request, and uses a `crequest()` function to process client commands. The server exits upon receiving the `quitc` command.
The client (clientw24) operates in an infinite loop, waiting for user commands, verifies command syntax before sending them to the server, and supports various commands such as `dirlist`, `w24fn`, `w24fz`, `w24ft`, `w24fdb`, `w24fda`, and `quitc`.
To ensure load balancing, initial client connections are handled by serverw24, with subsequent connections alternating between serverw24, mirror1, and mirror2.
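The alternation described above can be sketched as a simple round-robin dispatcher. This is a Python illustration only; the actual project is written in C, and the server names come straight from the description:

```python
from itertools import cycle

# Server pool mirroring the serverw24/mirror1/mirror2 setup.
SERVERS = ["serverw24", "mirror1", "mirror2"]

def make_dispatcher(servers):
    """Return a function that assigns each new client connection
    to the next server in round-robin order."""
    pool = cycle(servers)
    return lambda: next(pool)

dispatch = make_dispatcher(SERVERS)
# Six consecutive connections alternate across the pool.
assignments = [dispatch() for _ in range(6)]
```

In the real system each server forks a child per connection; the round-robin decision only picks which process accepts the client.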
Background execution allows running commands with `&`, and foreground execution uses the `fg` command to bring background processes to the foreground.
The shell supports opening new instances within the current shell using the `newt` command and concatenating text files into a single output file with the `cat` command.
Pipe operations and redirection are supported, enabling the output of one command to serve as input to another with `|`, and redirecting standard input/output using `>`, `>>`, and `<` operators.
Conditional execution with `&&` and `||`, sequential execution with `;`, and signal handling for terminating background processes with Ctrl+C (SIGINT) are also featured.
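The pipe mechanism works exactly as described: each command's stdout becomes the next command's stdin. A minimal Python sketch of that wiring (the shell itself is implemented in C with fork/exec/pipe; this only illustrates the data flow):

```python
import shlex
import subprocess

def run_pipeline(cmdline: str) -> str:
    """Run a 'cmd1 | cmd2 | ...' pipeline, feeding each command's
    stdout into the next command's stdin, and return the final output."""
    stages = [shlex.split(part) for part in cmdline.split("|")]
    prev = None
    procs = []
    for argv in stages:
        p = subprocess.Popen(argv,
                             stdin=prev.stdout if prev else None,
                             stdout=subprocess.PIPE)
        if prev:
            prev.stdout.close()  # let the upstream command see SIGPIPE
        procs.append(p)
        prev = p
    return procs[-1].communicate()[0].decode()
```

For example, `run_pipeline("echo hello | tr a-z A-Z")` uppercases the output of `echo` through `tr`.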
Comprehensive Backup Strategy: AutoBackupPro offers complete, incremental, and differential backups of files within the /home/username directory, ensuring all changes are regularly saved.
Automated Backup Process: The script runs continuously in the background, performing backups at specified intervals without requiring manual intervention.
Backup Naming Convention: Backup files are named systematically, distinguishing between complete, incremental, and differential backups, and are appended with a sequence number for tracking purposes.
Detailed Logging: AutoBackupPro maintains a log file (backup.log) with timestamps and filenames of created backup files, providing a record of backup activities for monitoring and troubleshooting purposes.
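The naming convention and logging can be sketched in a few lines. The prefixes below (`cbw24`, `ibw24`, `dbw24`) are hypothetical placeholders, not necessarily the script's actual scheme, and the real AutoBackupPro is a shell script:

```python
import itertools
from datetime import datetime

# One independent sequence counter per backup type.
_counters = {"complete": itertools.count(1),
             "incremental": itertools.count(1),
             "differential": itertools.count(1)}

def backup_name(kind: str) -> str:
    """Build a backup filename encoding the backup type plus a
    per-type sequence number for tracking, e.g. 'cbw24-1.tar'."""
    prefix = {"complete": "cbw24",       # hypothetical prefixes
              "incremental": "ibw24",
              "differential": "dbw24"}[kind]
    return f"{prefix}-{next(_counters[kind])}.tar"

def log_line(filename: str) -> str:
    """A backup.log entry: timestamp plus the created backup filename."""
    return f"{datetime.now():%Y-%m-%d %H:%M:%S} {filename}"
```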
Applied machine learning algorithms, including Logistic Regression, Support Vector Machine, and Random Forest Classifier, to analyze the MSK-MET dataset for predicting overall survival in metastatic cancer patients.
Explored and visualized the distribution of distant metastases, addressing missing values and examining relationships between overall survival and demographic factors such as sex, race category, and cancer type.
Implemented PCA for dimensionality reduction, capturing essential information and reducing dataset complexity to enhance the efficiency of predictive modeling.
Optimized the dataset for predictive modeling by employing SelectKBest and Mutual Information feature selection techniques, contributing to the development of a robust predictive model for mortality in a pan-cancer cohort.
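Mutual-information feature selection scores each feature by how much knowing it reduces uncertainty about the label. A minimal pure-Python version of that score for discrete features; the project itself uses scikit-learn's SelectKBest, and this sketch only shows the underlying idea:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Estimate I(X;Y) = sum over (x,y) of p(x,y) * log(p(x,y)/(p(x)p(y)))
    from paired samples of a discrete feature and a discrete label."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        # (c/n) / ((px/n)*(py/n)) simplifies to c*n / (px*py)
        mi += (c / n) * math.log(c * n / (px[x] * py[y]))
    return mi

def select_k_best(feature_columns, labels, k):
    """Rank feature columns (one list of values per feature) by mutual
    information with the labels and keep the indices of the top k."""
    scores = [(mutual_information(col, labels), i)
              for i, col in enumerate(feature_columns)]
    scores.sort(reverse=True)
    return [i for _, i in scores[:k]]
```

A feature identical to the label scores log 2 ≈ 0.69 nats on balanced binary data, while an independent feature scores 0.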
The project collected phishing and legitimate URLs from PhishTank and the University of New Brunswick, ensuring a comprehensive representation of web traffic.
Features were extracted from URLs using a combination of techniques, including parsing the address bar for domain-related information and analyzing HTML & JavaScript elements for behavioral indicators such as iframe presence and right-click disablement.
By training and evaluating various machine learning models such as Decision Trees, Random Forests, and XGBoost, the project identified XGBoost as the most effective in detecting phishing websites.
The project's insights highlight the importance of machine learning in cybersecurity, offering a proactive defense mechanism against online threats through accurate phishing website detection.
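Address-bar feature extraction of the kind described can be sketched as below. The thresholds and feature names are illustrative choices, not necessarily the project's exact set, and the HTML/JavaScript checks (iframe presence, right-click disablement) are omitted:

```python
import re
from urllib.parse import urlparse

def extract_url_features(url: str) -> dict:
    """A few lexical features commonly used for phishing detection."""
    parsed = urlparse(url)
    host = parsed.netloc.split(":")[0]
    return {
        # A raw IP address in place of a domain is a classic phishing sign.
        "has_ip": bool(re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", host)),
        # Very long URLs help hide the true destination (54 is illustrative).
        "long_url": len(url) >= 54,
        # '@' makes browsers ignore everything before it in the URL.
        "has_at": "@" in url,
        # Many dots can signal subdomain tricks mimicking a brand.
        "dot_count": host.count("."),
        # Hyphenated domains often imitate legitimate names.
        "has_hyphen": "-" in host,
        "uses_https": parsed.scheme == "https",
    }
```

Each URL becomes a numeric feature vector that the Decision Tree, Random Forest, and XGBoost models can consume.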
The project starts by cleaning the SMS Spam Collection dataset, removing punctuation, and stemming words.
We train Random Forest and Gradient Boosting models on the TF-IDF vectorized data, evaluating their performance based on fit time, predict time, and accuracy.
Precision is emphasized to reduce false positives, preventing legitimate emails from being misclassified as spam.
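Emphasizing precision means asking: of everything flagged as spam, how much really was spam? A small pure-Python illustration of the two metrics involved (the project itself computes these with scikit-learn on the TF-IDF features):

```python
def precision(y_true, y_pred, positive="spam"):
    """Precision = TP / (TP + FP). High precision means few legitimate
    messages are lost to the spam folder (few false positives)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t != positive)
    return tp / (tp + fp) if (tp + fp) else 0.0

def recall(y_true, y_pred, positive="spam"):
    """Recall = TP / (TP + FN): how much of the real spam was caught."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if p == positive and t == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if p != positive and t == positive)
    return tp / (tp + fn) if (tp + fn) else 0.0
```

Tuning a model's decision threshold trades recall for precision; for spam filtering the cost of a false positive (a lost legitimate email) justifies favoring precision.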
The model utilizes a majority-based approach to accurately categorize tomato images by ripeness, leveraging transfer learning with ResNet architecture.
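The majority-based categorization can be sketched as follows: several per-image predictions (for example from multiple crops or augmentations, as I understand the approach) are collapsed into one label by majority vote, and a hypothetical price multiplier is applied per ripeness class. The ResNet transfer-learning model itself is omitted:

```python
from collections import Counter

def majority_vote(predictions):
    """Collapse several class predictions for one tomato image
    into a single ripeness label by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

def price_for(label, base_price):
    """Hypothetical categorical price segmentation: discount fruit
    that must sell quickly before it spoils."""
    multiplier = {"unripe": 0.9, "semi-ripe": 1.0, "ripe": 0.7}
    return round(base_price * multiplier[label], 2)
```

The class names and multipliers are illustrative, not taken from the project.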
This project optimizes job scraping through efficient HTML parsing with Jsoup and speeds up word retrieval using inverted indexing built on a trie data structure. It surfaces the highest-paying jobs and ranks results for user-friendly job market exploration.
Inverted Indexing for Efficient Retrieval: The Trie Data Structure serves as the backbone for inverted indexing, providing efficient storage and retrieval of words from job descriptions. TrieNodes storing job IDs enable swift identification of jobs containing specific terms, optimizing search functionalities.
Word Frequency Analysis: a TrieNode coupled with a TreeMap (`wordFrequency`) facilitates systematic counting of word occurrences in job descriptions.
Dynamic Search Frequency Monitoring: Utilizing LRUCache, the system tracks and updates search term frequencies. Paired with a SortedArray, it offers a concise and sorted view of recent search activities, providing insights into user search patterns.
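The trie-backed inverted index works as described: walking the trie letter by letter lands on a node holding the IDs of every job whose description contains that word, plus an occurrence count. A Python sketch of the idea (the project itself is in Java, with a TreeMap for word frequency and an LRU cache for search terms):

```python
class TrieNode:
    def __init__(self):
        self.children = {}   # next letter -> TrieNode
        self.job_ids = set() # jobs whose description contains this word
        self.word_count = 0  # occurrences across all descriptions

class JobIndex:
    """Inverted index over job descriptions built on a trie."""
    def __init__(self):
        self.root = TrieNode()

    def add(self, job_id, description):
        for word in description.lower().split():
            node = self.root
            for ch in word:
                node = node.children.setdefault(ch, TrieNode())
            node.job_ids.add(job_id)
            node.word_count += 1

    def search(self, word):
        """Walk the trie; the terminal node knows every matching job."""
        node = self.root
        for ch in word.lower():
            node = node.children.get(ch)
            if node is None:
                return set()
        return node.job_ids
```

Lookup cost is proportional to the word's length, independent of how many jobs are indexed.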
Automated Extraction: This Java program uses Selenium WebDriver to gather plant information from Better Homes & Gardens. By launching a ChromeDriver instance and navigating the site, it collects 378 plant names and links, streamlining data extraction for analysis.
Table Data Extraction: Using advanced Selenium commands, the program targets specific table information by searching for identifiers like "Genus Name" and extracting associated values such as "Arctotis."
Paragraph Retrieval: Leveraging Selenium, the program extracts data from paragraphs by locating headings like "Soil and Water" and retrieving subsequent paragraphs with the "following-sibling::p" command.
Facebook Sharing Automation: Utilizing Selenium's capabilities, the program automates sharing a specific plant entry, Baby's Breath, on Facebook.
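The heading-then-paragraphs pattern can be illustrated without a browser. The sketch below mimics the spirit of Selenium's `following-sibling::p` axis by walking siblings manually with ElementTree (which lacks sibling axes); the sample HTML and heading text are invented for the example:

```python
import xml.etree.ElementTree as ET

HTML = """<div>
  <h2>Soil and Water</h2>
  <p>Plant in well-drained soil.</p>
  <p>Water weekly.</p>
  <h2>Light</h2>
  <p>Full sun.</p>
</div>"""

def paragraphs_after(heading_text, html):
    """Collect the <p> elements that follow a given heading until the
    next heading -- similar in spirit to locating a heading and reading
    its 'following-sibling::p' nodes in Selenium."""
    root = ET.fromstring(html)
    out, collecting = [], False
    for child in root:
        if child.tag == "h2":
            collecting = (child.text == heading_text)
        elif child.tag == "p" and collecting:
            out.append(child.text)
    return out
```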
Spearheaded the development of a Product Recommendation System, leveraging semantic web technology and RDF graphs to represent intricate data relationships.
Executed a Proof of Concept (POC) to generate an RDF graph, showcasing the system's capability to understand and link product data effectively.
Chose Virtuoso as the technology backbone, ensuring scalability and high-performance for efficient storage and retrieval of RDF data.
Demonstrated the feasibility of the system, emphasizing the ability to obtain precise product recommendations through the implementation of Virtuoso and RDF graphs.
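The recommendation idea rests on RDF's (subject, predicate, object) triples: products sharing attributes are linked through the graph. A toy in-memory version of that pattern matching (Virtuoso and SPARQL replace this in the real system; all identifiers below are invented for illustration):

```python
# Hypothetical product facts as RDF-style triples.
triples = {
    ("product:Rug42", "rdf:type", "cat:HomeDecor"),
    ("product:Lamp7", "rdf:type", "cat:HomeDecor"),
    ("product:Rug42", "schema:material", "mat:Jute"),
    ("product:Mat9",  "schema:material", "mat:Jute"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None is a wildcard, like a
    variable in a SPARQL basic graph pattern."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

def related_products(product):
    """Recommend products sharing any predicate/object with `product`."""
    recs = set()
    for _, p, o in match(s=product):
        recs |= {s for s, _, _ in match(p=p, o=o)}
    recs.discard(product)
    return recs
```

Here a rug is linked to a lamp through a shared category and to a mat through a shared material, which is precisely the kind of relationship the RDF graph makes cheap to traverse.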
The Skipjack encryption has been fortified by enlarging the block size from 64 to 80 bits. This modification aims to bolster security by increasing the complexity of the encryption process.
To improve the frequency distribution within the cipher, the block division in Skipjack has been refined. Instead of four 16-bit words, the block is now divided into five 20-bit words.
A modular multiplicative inverse technique has been introduced to the encryption process. Operating on the principle of the modular multiplicative inverse, the new function transforms an 80-bit block into four distinct 20-bit enciphered text blocks, adding a layer of complexity that makes decryption more challenging for potential attackers.
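The modular multiplicative inverse at the heart of the new step can be illustrated in a few lines: for a word x and a modulus m coprime to it, the inverse y satisfies x·y ≡ 1 (mod m), and inverting twice recovers x. A toy sketch only, not the actual modified-Skipjack round function (whose exact modulus and parameters the description does not give); the Fermat prime 65537 is an illustrative modulus:

```python
MODULUS = 65537  # illustrative prime modulus, not the paper's choice

def mod_inverse_encipher(word, modulus=MODULUS):
    """Map a word to its multiplicative inverse modulo a prime.
    pow(x, -1, m) computes the modular inverse (Python 3.8+)."""
    return pow(word, -1, modulus)

def mod_inverse_decipher(value, modulus=MODULUS):
    """Inverting twice recovers the original word: (x^-1)^-1 = x."""
    return pow(value, -1, modulus)
```

Because inversion is a bijection on the nonzero residues, the transform is cleanly reversible for the legitimate key holder while scrambling the block's value.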
Developed a customized Flappy Bird game using libGDX within the Android SDK, providing a unique and challenging gaming experience.
Take on the challenge by downloading the app and enjoying the thrill of this personalized version.
A clone of the Uber app, built around both the driver and rider experiences.
A native app where you can save and delete your notes, built with React Native.
A personality test app built with Flutter. Answer simple questions and see how the app profiles you.
A game where you guess the celebrity from their image.
Leveraging the Hortonworks Docker sandbox, we used the Ambari web service for our project, focusing on analyzing a sizable dataset collected from diverse Facebook users.
Employing tools such as MapReduce with Python, Spark, Hive, and Zeppelin, we executed operations to extract insights and identify new opportunities from the data.
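The MapReduce pattern behind those jobs can be sketched in plain Python: a map phase emits (key, 1) pairs, a shuffle groups pairs by key, and a reduce phase sums each group. This is a local illustration only; the actual jobs ran on the Hortonworks sandbox via Hadoop, and the sample posts are invented:

```python
from collections import defaultdict

def map_phase(records):
    """Emit (word, 1) for every word, mimicking a streaming mapper."""
    for line in records:
        for word in line.lower().split():
            yield word, 1

def shuffle(pairs):
    """Group values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Sum each key's values, mimicking a streaming reducer."""
    return {key: sum(values) for key, values in groups.items()}

posts = ["great photo", "great trip great friends"]
word_counts = reduce_phase(shuffle(map_phase(posts)))
# {'great': 3, 'photo': 1, 'trip': 1, 'friends': 1}
```

The same three-phase shape applies whether the reducer counts words, likes, or post categories across the Facebook dataset.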
Email: deonvictorlobo@gmail.com