Among the deadliest diseases, cancer remains one of the leading causes of death, and it can arise in many forms and in many regions of the human body. It develops when cells escape normal regulation, begin dividing uncontrollably, and behave abnormally compared with the surrounding tissue. When this happens in the lungs, the condition is especially severe, as lung cancer has one of the lowest survival rates of any cancer. According to the global cancer statistics report published in 2021, new cancer cases worldwide reached 19.3 million, and lung cancer was the second most commonly diagnosed cancer after female breast cancer. Several research studies have found that early detection of cancer can increase the chances of long-term survival.

Furthermore, studies indicate that low-dose CT scanning, together with automated analysis, is one of the most promising routes to early detection: unlike invasive procedures, it does not put the body at risk. Until recently, the analysis of CT images for internal structure and abnormalities was carried out almost entirely by experts and radiologists. Thanks to the sustained efforts of researchers, however, artificial intelligence (AI) based disease detection systems have been drawing the attention of clinical practitioners and medical institutes for the past two decades. As structured data in the medical field grows, so do the opportunities for finding precise and straightforward methods of diagnosis. The modern world is only just embarking on a long and exciting data science journey that can carry us towards unimaginable peaks of automated technological solutions. It is not yet clear how precise medical technology will become at detecting and diagnosing diseases in their early stages. Perhaps an advanced algorithm will one day flag health issues before they appear in the body, allowing a person to adjust their lifestyle in time to prevent a critical situation. Everything we imagine today may be possible in the near future, but that requires consistent effort to develop efficient techniques and algorithms.

Through the collaborative and individual efforts of various scientific and research groups, algorithms are advancing rapidly by adopting the most trustworthy and least time-consuming methodologies. Deep learning and fine-tuned neural networks play an essential role in this context, as they open new ways to modify existing techniques and obtain promising results. It is hard to know exactly how the components inside a deep neural network behave and interact, but systematic hyperparameter selection and fine-tuning can improve outcomes and performance. Some key hyperparameters are the learning rate, optimization algorithm, activation function, number of hidden layers, number of activation units, kernel size, pooling size, batch size, and number of epochs.
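To make these terms concrete, the sketch below shows where each of the listed hyperparameters appears in a small convolutional classifier for CT image patches. It is a minimal illustration only, not the author's actual network; the architecture, the 64x64 patch size, and every value are placeholder assumptions.

```python
# A minimal, hypothetical Keras CNN showing where the listed hyperparameters sit.
# All values are illustrative placeholders, not tuned settings.
from tensorflow import keras
from tensorflow.keras import layers

learning_rate = 1e-3          # learning rate
batch_size = 32               # batch size
epochs = 20                   # number of epochs

model = keras.Sequential([
    keras.Input(shape=(64, 64, 1)),                       # grayscale CT patch (assumed size)
    layers.Conv2D(32, kernel_size=3, activation="relu"),  # kernel size, activation function
    layers.MaxPooling2D(pool_size=2),                     # pooling size
    layers.Conv2D(64, kernel_size=3, activation="relu"),  # additional hidden layer
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),                 # number of activation units
    layers.Dense(1, activation="sigmoid"),                # suspicious vs. normal patch
])

model.compile(
    optimizer=keras.optimizers.Adam(learning_rate=learning_rate),  # optimization algorithm
    loss="binary_crossentropy",
    metrics=["accuracy"],
)
# model.fit(x_train, y_train, batch_size=batch_size, epochs=epochs)
```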

Apart from these hyperparameters, the most crucial decision is the number of layers needed for the network to perform at its best. Balancing all of these parameters is complex and requires repeated testing and experimentation; it takes considerable effort and time to trade one quantity off against another to improve a deep learning network's performance. This problem can be addressed by employing an additional algorithm that performs the required analysis and chooses the best hyperparameters and values, establishing a fully automated, self-improving deep learning network. Perhaps the better strategy is to implement the most adaptive and highest-performing AI algorithm, one that helps detect and diagnose disease on both the individual and the societal scale.
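One concrete way to realize such an additional algorithm is a simple random search over the hyperparameter space. The sketch below is only an illustration of that idea; the search space, the number of trials, and the build_and_evaluate helper (shown here as a dummy stub) are assumptions rather than part of the proposed method.

```python
# Hypothetical random search over a few hyperparameters.
import random

def build_and_evaluate(config):
    """Stand-in for training a network with `config` and returning a
    validation score; in practice this would build, train, and validate
    a CNN like the one sketched above."""
    return random.random()  # dummy score so the example runs end to end

search_space = {
    "learning_rate": [1e-2, 1e-3, 1e-4],
    "num_conv_layers": [2, 3, 4],
    "kernel_size": [3, 5],
    "batch_size": [16, 32, 64],
}

best_score, best_config = -1.0, None
for _ in range(20):                                   # 20 random trials
    config = {k: random.choice(v) for k, v in search_space.items()}
    score = build_and_evaluate(config)
    if score > best_score:
        best_score, best_config = score, config

print("Best configuration:", best_config, "score:", best_score)
```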

With this article, I would like to propose "partial naturally randomized deep learning layers (PNRDL)" for improving the performance of an automated detection system. The procedure has not been tested yet, but combining optimization with a small relaxation, in the form of partially, naturally randomized weights, may bring the analysis of CT images for lung cancer or its early signs closer to human-level performance. In everyday life, a relaxing situation is when someone takes a break from the pursuit of a goal and spends some time on other activities. Many philosophical studies describe this period as the best time for finding the new, innovative, astonishing ideas that redirect the path towards the desired goal; sometimes it yields better outcomes than anyone ever imagined. I consider this not a coincidence but a part of natural computing, through which anyone and everyone are connected. Billions of neurons inside the human brain continuously acquire weights by taking in insights or parametric values. These values or parameters, generated by nature, add to the brain as essential insights for the computation of various tasks, which sometimes reveal exceptional outcomes.

I believe this is a substantial phenomenon that looks random but is not actually random, since everything in nature runs on natural computing. To introduce a relaxing period inside the machine, I propose relaxed-parameter-based "partial naturally randomized deep learning layers" that take random values for updating deep learning weights. The random values used in these experiments will be obtained directly from naturally randomized numbers generated and collected during the analysis of each individual image. After such a procedure, it may be possible to give the automated detection system a deeper connection to natural computing, where the randomization of parameters plays the role that a relaxing situation plays for a human being. Through this procedure, I believe the development of automated disease detection systems could become more realistic and precise. The idea behind this type of system may work well, and the procedure discussed for disease detection may render outstanding results if employed for other research and development purposes. Figure 1 below shows a short description of the proposed algorithm.
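The article does not specify an implementation, but one possible reading of a partially, naturally randomized weight update is a step in which a small, randomly chosen fraction of a layer's weights is nudged by externally sourced "natural" values rather than by the gradient alone. The NumPy sketch below is purely illustrative: the 10% fraction, the scale factor, and the natural_values list are assumptions, not the author's specification.

```python
import numpy as np

def partially_randomize(weights, natural_values, fraction=0.1, scale=0.01):
    """Nudge a randomly chosen `fraction` of the weights using externally
    supplied `natural_values` (standing in for the "naturally randomized
    numbers" gathered during individual image analysis).

    The fraction, scale, and injection rule are illustrative assumptions.
    """
    flat = weights.ravel().copy()
    n_selected = max(1, int(fraction * flat.size))
    idx = np.random.choice(flat.size, size=n_selected, replace=False)
    # Draw one natural value per selected weight, cycling if the list is short.
    values = np.resize(np.asarray(natural_values, dtype=float), n_selected)
    flat[idx] += scale * values          # the "relaxation" nudge
    return flat.reshape(weights.shape)

# Toy usage: a 4x4 weight matrix and a handful of placeholder "natural" numbers.
w = np.random.randn(4, 4)
natural = [0.141, 0.592, 0.653]
w = partially_randomize(w, natural)
```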

Figure 1: Short description of the proposed algorithm

However, the proposed model is still under exploration, as the procedure for obtaining a truly randomized number remains at the experimental stage. The first attempts to generate random values were conducted by taking the never-repeating decimal digits of "pi", although any sequence of them is still finite at any given moment. Further trials to generate a naturally randomized number that can introduce relaxation sessions into machine-learning operations are ongoing. The study aims to take insights from nature, which is always trying to tell us something about every event and situation. The American biologist Barry Commoner once said that "everything is connected to everything else".
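To illustrate the pi-digit idea (not the author's exact procedure), the sketch below uses the mpmath library to compute decimal digits of pi and group them into values in [0, 1). The grouping of four digits per value and the precision handling are assumptions made only for this example.

```python
from mpmath import mp

def pi_digit_values(n_values, digits_per_value=4):
    """Turn consecutive decimal digits of pi into pseudo-random values in [0, 1).

    The choice of four digits per value is an illustrative assumption."""
    total = n_values * digits_per_value
    mp.dps = total + 10                      # compute a few extra digits for safety
    pi_str = mp.nstr(mp.pi, total + 5)       # e.g. "3.14159265..."
    decimals = pi_str.split(".")[1][:total]  # digits after the decimal point
    return [
        int(decimals[i * digits_per_value:(i + 1) * digits_per_value])
        / 10 ** digits_per_value
        for i in range(n_values)
    ]

print(pi_digit_values(5))   # values built from the digit groups 1415, 9265, 3589, ...
```

As the article notes, any such sequence is finite and fully deterministic, which is why further trials towards a more natural source of randomness are ongoing.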

It means the present case, past conditions, and future outcomes are not distinct. It is simply our inability to process the subtle insights of our surroundings that leads us to dismiss them as random, non-valuable information. Processing every element of information is not possible, but taking a small piece and extracting valuable insights from it is the way to achieve the desired outcomes. The experimental scope of this research focuses on taking a machine into the time-space where some of its weight functions process very subtle natural parameter events to extract essential values. The outcome of this process will generate a piece of information that helps to implement all-inclusive, nature-driven algorithmic results. Unlike existing computation systems, it performs its task by combining insights from the input data with the instincts of its current surroundings. It is an entirely radical approach to using AI techniques; it can be time-consuming, but taking the chance to develop and work on this methodology may undoubtedly be helpful. After its successful implementation, it may be possible to build a deeper bond with nature and harness the power of robust natural computing, or actual computing.

Deep learning in AI also suggests that the deeper and more closely connected nodes are the ones that dominate the outcomes. Similarly, the most dominant presence around us, nature, should be deeply and strongly connected to every possible node (e.g., deep learning algorithms and layers) to obtain the best possible results today and in the near future.

Composed by: Mr. Resham Raj Shivwanshi is pursuing a PhD at the Department of Biomedical Engineering, NIT Raipur. He is currently working on medical imaging, CT scan analysis, machine learning, and AI methodologies.
