Machine Learning-Driven Optimization of Wireless Communication in Smart Ecosystems: Comparative Analysis of Image Denoising on Oracle Cloud Platforms

Authors

  • Ishaan Vivek Chatterjee, Department of Computer Science and Engineering, SYCET, Aurangabad, India.

DOI:

https://doi.org/10.21590/

Keywords:

Wireless Mobile Communication, Smart Connect Ecosystems, Image Denoising, Oracle Cloud Database, Edge vs Cloud Processing, PSNR, SSIM, Latency & Energy Trade-off.

Abstract

Wireless mobile communication within Smart Connect ecosystems (such as IoT networks, smart cities, connected vehicles, and mobile sensor arrays) is often challenged by noise in imaging data, limited bandwidth, latency constraints, and resource limitations on client devices. This paper investigates how machine learning (ML) techniques combined with Oracle Cloud database services can optimize mobile wireless communication, especially when image data transmitted over noisy or constrained channels must be denoised efficiently. We perform a comparative study of several image denoising techniques — traditional filters (Gaussian, median, bilateral), non-local means (NLM), BM3D, and deep learning methods (DnCNN, FFDNet) — in the context of mobile-edge-cloud pipelines. We propose a system architecture in which denoising can be performed on the mobile device, at the edge, or in the Oracle Cloud, depending on network conditions, energy budgets, and latency requirements. Oracle's Autonomous Database and Oracle Machine Learning services are used to store denoised and raw image data, run ML model inference, monitor performance, and support adaptive decision-making about where denoising should take place. The experimental evaluation is conducted on real and synthetic image datasets corrupted with different noise types (Gaussian, salt-and-pepper, speckle), across different mobile wireless link qualities (varying SNR and bandwidth) and with varying compute capabilities on mobile/edge nodes. Key metrics include Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index (SSIM), inference latency, energy consumption, and total communication cost (data transmitted). Results show that the deep learning models (DnCNN, FFDNet) provide superior image quality (up to ~2–4 dB PSNR gain over traditional filters) but at higher computational cost. Hybrid schemes (e.g., partial denoising on the mobile device followed by refinement in the cloud) can balance these trade-offs, achieving near-cloud quality while reducing transmitted data by up to ~50% and lowering end-to-end latency under moderate network degradation. We discuss the benefits and drawbacks, the trade-offs between image quality, energy, latency, and cloud versus edge processing, and propose guidelines for deployment.
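PSNR, the primary quality metric above, compares a denoised image against its clean reference on a logarithmic decibel scale. A minimal sketch of the standard definition (this is the textbook formula, not code from the paper):

```python
import numpy as np

def psnr(clean: np.ndarray, test: np.ndarray, max_val: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB: 10*log10(MAX^2 / MSE)."""
    mse = np.mean((clean.astype(np.float64) - test.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images: infinite PSNR
    return 10.0 * np.log10((max_val ** 2) / mse)

# Example: a uniform offset of 10 grey levels gives MSE = 100,
# so PSNR = 10*log10(255^2 / 100) ≈ 28.13 dB.
clean = np.zeros((64, 64))
noisy = clean + 10.0
print(round(psnr(clean, noisy), 2))  # → 28.13
```

A ~2–4 dB gain, as reported for the deep models, corresponds to roughly a 1.6×–2.5× reduction in mean squared error.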
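The adaptive placement decision described in the abstract — choosing whether denoising runs on the device, at the edge, or in the cloud based on network conditions and energy budgets — could be sketched as a simple policy function. The thresholds and field names below are illustrative placeholders, not values or interfaces from the paper:

```python
from dataclasses import dataclass

@dataclass
class LinkState:
    snr_db: float            # measured channel SNR (hypothetical field)
    bandwidth_mbps: float    # available uplink bandwidth
    device_battery_pct: float

def choose_denoising_site(link: LinkState) -> str:
    """Return 'device', 'edge', or 'cloud'. Thresholds are illustrative only."""
    if link.device_battery_pct < 20:
        # Preserve device energy: offload whenever the battery is low.
        return "cloud" if link.bandwidth_mbps >= 5 else "edge"
    if link.snr_db < 10 or link.bandwidth_mbps < 2:
        # Poor channel: denoise (and compress) locally before transmitting.
        return "device"
    # Healthy link and battery: cloud inference offers the best quality.
    return "cloud"

print(choose_denoising_site(LinkState(snr_db=5, bandwidth_mbps=1.5,
                                      device_battery_pct=80)))  # → device
```

In a deployment like the one proposed, such a policy could itself be learned from the latency/energy telemetry the paper stores in the Autonomous Database, rather than hand-tuned.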

Published

2024-06-30
