Search results for: elliptic curve digital signature algorithm
896 Advances in Mathematical Sciences: Unveiling the Power of Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid advancements in data collection, storage, and processing capabilities have led to an explosion of data in various domains. In this era of big data, mathematical sciences play a crucial role in uncovering valuable insights and driving informed decision-making through data analytics. The purpose of this abstract is to present the latest advances in mathematical sciences and their application in harnessing the power of data analytics. This abstract highlights the interdisciplinary nature of data analytics, showcasing how mathematics intersects with statistics, computer science, and other related fields to develop cutting-edge methodologies. It explores key mathematical techniques such as optimization, mathematical modeling, network analysis, and computational algorithms that underpin effective data analysis and interpretation. The abstract emphasizes the role of mathematical sciences in addressing real-world challenges across different sectors, including finance, healthcare, engineering, social sciences, and beyond. It showcases how mathematical models and statistical methods extract meaningful insights from complex datasets, facilitating evidence-based decision-making and driving innovation. Furthermore, the abstract emphasizes the importance of collaboration and knowledge exchange among researchers, practitioners, and industry professionals. It recognizes the value of interdisciplinary collaborations and the need to bridge the gap between academia and industry to ensure the practical application of mathematical advancements in data analytics. The abstract highlights the significance of ongoing research in mathematical sciences and its impact on data analytics. It emphasizes the need for continued exploration and innovation in mathematical methodologies to tackle emerging challenges in the era of big data and digital transformation. 
In summary, this abstract sheds light on the advances in mathematical sciences and their pivotal role in unveiling the power of data analytics. It calls for interdisciplinary collaboration, knowledge exchange, and ongoing research to further unlock the potential of mathematical methodologies in addressing complex problems and driving data-driven decision-making in various domains.
Keywords: mathematical sciences, data analytics, advances, unveiling
Procedia PDF Downloads 93
895 High Aspect Ratio Micropillar Array Based Microfluidic Viscometer
Authors: Ahmet Erten, Adil Mustafa, Ayşenur Eser, Özlem Yalçın
Abstract:
We present a new viscometer based on a microfluidic chip with elastic high aspect ratio micropillar arrays. The displacement of pillar tips in the flow direction can be used to analyze the viscosity of a liquid. In our work, Computational Fluid Dynamics (CFD) is used to analyze the pillar displacement of various micropillar array configurations in the flow direction at different viscosities. Following CFD optimization, micro-CNC based rapid prototyping is used to fabricate molds for the microfluidic chips. The microfluidic chips are fabricated out of polydimethylsiloxane (PDMS) using soft lithography methods with molds machined out of aluminum. Tip displacements of the micropillar array (300 µm in diameter and 1400 µm in height) in the flow direction are recorded using a microscope-mounted camera, and the displacements are analyzed using image processing with an algorithm written in MATLAB. Experiments are performed with water-glycerol solutions mixed at four different ratios to attain viscosities of 1 cP, 5 cP, 10 cP and 15 cP at room temperature. The prepared solutions are injected into the microfluidic chips using a syringe pump at flow rates from 10 to 100 mL/hr, and the displacement versus flow rate is plotted for different viscosities. A displacement of around 1.5 µm was observed for the 15 cP solution at 60 mL/hr, while only a 1 µm displacement was observed for the 10 cP solution. Optimization of the presented viscometer design is still in progress for better sensitivity and accuracy. Our microfluidic viscometer platform has potential for tailor-made microfluidic chips to enable real-time observation and control of viscosity changes in biological or chemical reactions.
Keywords: Computational Fluid Dynamics (CFD), high aspect ratio, micropillar array, viscometer
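The two calibration points quoted in the abstract (about 1.0 µm of tip displacement at 10 cP and 1.5 µm at 15 cP, both at 60 mL/hr) suggest an approximately linear displacement-viscosity relation at a fixed flow rate. A minimal calibration sketch under that linearity assumption is shown below; the helper names are hypothetical, not from the paper:

```python
# Least-squares line fit of (viscosity, tip displacement) calibration
# points at a fixed flow rate, then inverted to estimate an unknown
# viscosity from a measured displacement. Linearity is an assumption.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx  # (slope, intercept)

def viscosity_from_displacement(disp_um, slope, intercept):
    # Invert the calibration line: viscosity = (displacement - b) / m
    return (disp_um - intercept) / slope
```

Fitting the two quoted points gives a slope of about 0.1 µm per cP, so a measured displacement of 1.2 µm would map to roughly 12 cP under this sketch.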
Procedia PDF Downloads 246
894 Dynamic Web-Based 2D Medical Image Visualization and Processing Software
Authors: Abdelhalim N. Mohammed, Mohammed Y. Esmail
Abstract:
For decades, medical imaging was dominated by the use of costly film media for the review and archival of medical investigations; however, due to developments in network technologies and the common acceptance of the standard for digital imaging and communications in medicine (DICOM), another approach based on the World Wide Web was produced. Web technologies have been successfully used in telemedicine applications, and the combination of web technologies with DICOM was used to design a web-based, open-source DICOM viewer. The web server allows query and retrieval of images, and the images are viewed and manipulated inside a web browser without the need to preinstall any software. The dynamic page for medical image visualization and processing was created using JavaScript and HTML5. The XAMPP Apache server is used to create a local web server for testing and deployment of the dynamic site. The web-based viewer is connected to multiple devices through a local area network (LAN) to distribute the images inside healthcare facilities. The system offers several advantages over ordinary picture archiving and communication systems (PACS): it is easy to install and maintain, platform independent, allows images to be displayed and manipulated efficiently, and is user-friendly and easy to integrate with existing systems that already make use of web technologies. A wavelet-based image compression technique is applied, in which the 2-D discrete wavelet transform decomposes the image; the wavelet coefficients are then thresholded and transmitted with entropy encoding to decrease transmission time, storage cost, and capacity. The performance of the compression was estimated using image quality metrics such as mean square error (MSE), peak signal-to-noise ratio (PSNR), and compression ratio (CR), which achieved 83.86% when the 'coif3' wavelet filter was used.
Keywords: DICOM, discrete wavelet transform, PACS, HIS, LAN
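The quality metrics named above (MSE, PSNR, CR) can be sketched in a few lines. The snippet below assumes 8-bit pixels (peak value 255) and treats the compression ratio as a percent size reduction, which is one common convention for figures like the 83.86% quoted above; the wavelet decomposition step itself is not reproduced here:

```python
# Minimal, library-free image quality metrics over flattened pixel lists.
import math

def mse(original, reconstructed):
    # Mean of squared pixel differences
    n = len(original)
    return sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / n

def psnr(original, reconstructed, peak=255.0):
    # Peak signal-to-noise ratio in dB; infinite for identical images
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10 * math.log10(peak ** 2 / m)

def compression_ratio(original_bytes, compressed_bytes):
    # Percent size reduction (one of several CR conventions)
    return (1 - compressed_bytes / original_bytes) * 100
```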
Procedia PDF Downloads 160
893 Efficacy and Mechanisms of Acupuncture for Depression: A Meta-Analysis of Clinical and Preclinical Evidence
Authors: Yimeng Zhang
Abstract:
Major depressive disorder (MDD) is a prevalent mental health condition with a substantial economic impact and limited treatment options. Acupuncture has gained attention as a promising non-pharmacological intervention for alleviating depressive symptoms. However, its mechanisms and clinical effectiveness remain incompletely understood. This meta-analysis aims to (1) synthesize existing evidence on the mechanisms and clinical effectiveness of acupuncture for depression and (2) compare these findings with pharmacological interventions, providing insights for future research. Evidence from animal models and clinical studies indicates that acupuncture may enhance hippocampal and network neuroplasticity and reduce brain inflammation, potentially alleviating depressive disorders. Clinical studies suggest that acupuncture can effectively relieve primary depression, particularly in milder cases, and is beneficial in managing post-stroke depression, pain-related depression, and postpartum depression, both as a standalone and adjunctive treatment. Notably, combining acupuncture with antidepressant pharmacotherapy appears to enhance treatment outcomes and reduce medication side effects, addressing the high dropout rates that are a critical issue in conventional drug therapy. This meta-analysis, encompassing 12 studies and 710 participants, draws data from eight digital databases (PubMed, EMBASE, Web of Science, EBSCOhost, CNKI, CBM, Wanfang, and CQVIP) covering the period from 2012 to 2022. Using Stata 15.0, the meta-analysis employed random-effects and fixed-effects models to assess the distribution of depression in Traditional Chinese Medicine (TCM). The results underscore the substantial evidence supporting acupuncture's beneficial effects on depression.
However, the small sample sizes of many clinical trials raise concerns about the generalizability of the findings, indicating a need for further research to validate these outcomes and optimize acupuncture's role in treating depression.
Keywords: Chinese medicine, acupuncture, depression, meta-analysis
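The fixed-effects pooling mentioned above can be illustrated with a minimal inverse-variance sketch; the study effects and variances in the test are illustrative numbers, not data from the included trials:

```python
# Inverse-variance fixed-effect pooling, the basic operation behind
# fixed-effects meta-analysis: each study contributes an effect
# estimate and its variance, and more precise studies (smaller
# variance) receive proportionally larger weight.

def fixed_effect_pool(effects, variances):
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = (1.0 / sum(weights)) ** 0.5
    return pooled, standard_error
```

With equal variances this reduces to a plain average; with unequal variances the pooled estimate is pulled toward the more precise study.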
Procedia PDF Downloads 35
892 Strong Ground Motion Characteristics Revealed by Accelerograms in Ms8.0 Wenchuan Earthquake
Authors: Jie Su, Zhenghua Zhou, Yushi Wang, Yongyi Li
Abstract:
The ground motion characteristics revealed by the analysis of acceleration records underlie the formulation and revision of seismic design codes for structural engineering. The China Digital Strong Motion Network recorded a large number of mainshock accelerograms from 478 permanent seismic stations during the Ms8.0 Wenchuan earthquake on 12 May 2008. These accelerograms provided a large amount of essential data for the analysis of the ground motion characteristics of the event. The spatial distribution characteristics, rupture directivity effect, and hanging-wall and footwall effects were studied based on these acceleration records. The results showed that the contours of horizontal peak ground acceleration and peak velocity were approximately parallel to the seismogenic fault, which demonstrated that the distribution of ground motion intensity was clearly controlled by the spatial extension direction of the seismogenic fault. Compared with the peak ground acceleration (PGA) recorded at sites the rupture front propagated away from, the PGA recorded at sites the rupture front propagated toward had larger amplitudes and shorter durations, indicating a significant rupture directivity effect. At similar fault distances, the PGA on the hanging wall is apparently greater than that on the footwall, while the peak velocity does not follow this rule. Taking into account the seismic intensity distribution of the Wenchuan Ms8.0 earthquake, the shape of the strong ground motion contours was significantly affected by the directivity effect in regions with Chinese seismic intensity levels VI to VIII.
However, in regions where the Chinese seismic intensity level is VIII or greater, the mutual positional relationship between the strong ground motion contours and the surface outcrop trace of the fault was evidently influenced by the hanging-wall and footwall effect.
Keywords: hanging-wall and foot-wall effect, peak ground acceleration, rupture directivity effect, strong ground motion
Procedia PDF Downloads 350
891 Food Security and Utilization in Ethiopia
Authors: Tuji Jemal Ahmed
Abstract:
Food security and utilization are critical aspects of ensuring the well-being and prosperity of a nation. This paper examines the current state of food security and utilization in Ethiopia, focusing on the challenges, opportunities, and strategies employed to address the issue. Ethiopia, a country in East Africa, has made significant progress in recent years to improve food security and utilization for its population. However, persistent challenges such as recurrent droughts, limited access to resources, and low agricultural productivity continue to pose obstacles to achieving sustainable food security. The paper begins by providing an overview of the concept of food security, emphasizing its multidimensional nature and the importance of access, availability, utilization, and stability. It then explores the specific factors influencing food security and utilization in Ethiopia, including natural resources, climate variability, agricultural practices, infrastructure, and socio-economic factors. Furthermore, the paper highlights the initiatives and interventions implemented by the Ethiopian government, non-governmental organizations, and international partners to enhance food security and utilization. These efforts include agricultural extension programs, irrigation projects, investments in rural infrastructure, and social safety nets to protect vulnerable populations. The study also examines the role of technology and innovation in improving food security and utilization in Ethiopia. It explores the potential of sustainable agricultural practices, such as conservation agriculture, improved seed varieties, and precision farming techniques. Additionally, it discusses the role of digital technologies in enhancing access to market information, financial services, and agricultural inputs for smallholder farmers. 
Finally, the paper discusses the importance of collaboration and partnerships between stakeholders, including government agencies, development organizations, research institutions, and communities, in addressing food security and utilization challenges. It emphasizes the need for integrated and holistic approaches that consider both production and consumption aspects of the food system.
Keywords: food security, utilization, Ethiopia, challenges
Procedia PDF Downloads 120
889 Error Detection and Correction for Onboard Satellite Computers Using Hamming Code
Authors: Rafsan Al Mamun, Md. Motaharul Islam, Rabana Tajrin, Nabiha Noor, Shafinaz Qader
Abstract:
In an attempt to enrich the lives of billions of people by providing proper information, security, and a way of communicating with others, the need for efficient and improved satellites is constantly growing. Thus, there is an increasing demand for better error detection and correction (EDAC) schemes that are capable of protecting the data onboard satellites. This paper is aimed at detecting and correcting such errors using the Hamming code, which uses the concept of parity bits to correct single-bit errors onboard a satellite in Low Earth Orbit. The paper focuses on the study of Low Earth Orbit satellites and the process of generating the Hamming code matrix to be used for EDAC using computer programs. The most effective version of the Hamming code generated was the Hamming (16, 11, 4) version using MATLAB, and the paper compares this particular scheme with other EDAC mechanisms, including other versions of Hamming codes and the Cyclic Redundancy Check (CRC), and discusses the limitations of this scheme. This particular version of the Hamming code guarantees single-bit error correction as well as double-bit error detection. Furthermore, this version of the Hamming code has proved to be fast, with a checking time of 5.669 nanoseconds, has a relatively higher code rate and lower bit overhead compared to the other versions, and can detect a greater percentage of errors per length of code than other EDAC schemes with similar capabilities. In conclusion, with proper implementation of the system, it is quite possible to ensure a relatively uncorrupted satellite storage system.
Keywords: bit-flips, Hamming code, low earth orbit, parity bits, satellite, single error upset
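As a small-scale illustration of the parity-check mechanism described above, a classic Hamming(7,4) encoder/decoder is sketched below; the paper's Hamming(16,11,4) scheme is analogous but operates on 11 data bits with an extra overall parity bit for double-error detection:

```python
# Hamming(7,4): parity bits sit at positions 1, 2, and 4; the syndrome
# computed at the receiver gives the 1-based position of a single
# flipped bit (0 means no error detected).

def encode(d):
    # d: four data bits [d1, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4  # parity over codeword positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4  # parity over codeword positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4  # parity over codeword positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def decode(c):
    # c: 7-bit codeword; returns the corrected four data bits
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 * 1 + s2 * 2 + s3 * 4
    if syndrome:
        c[syndrome - 1] ^= 1  # correct the single-bit error in place
    return [c[2], c[4], c[5], c[6]]
```

Every single-bit flip in any codeword is corrected, which is exactly the single-error-upset protection the abstract targets.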
Procedia PDF Downloads 130
888 Pilot-free Image Transmission System of Joint Source Channel Based on Multi-Level Semantic Information
Authors: Linyu Wang, Liguo Qiao, Jianhong Xiang, Hao Xu
Abstract:
In semantic communication, existing joint source-channel coding (JSCC) wireless communication systems without pilots have unstable transmission performance and cannot effectively capture the global information and location information of images. In this paper, a pilot-free image transmission system of joint source-channel coding based on multi-level semantic information (multi-level JSCC) is proposed. The transmitter of the system is composed of two networks. The feature extraction network is used to extract the high-level semantic features of the image, compressing the information to be transmitted and improving bandwidth utilization. The feature retention network is used to preserve low-level semantic features and image details to improve communication quality. The receiver is also composed of two networks. The received high-level semantic features are fused, in the same dimension, with the low-level semantic features after a feature enhancement network; the image dimension is then restored through a feature recovery network, and the image location information is effectively used for image reconstruction. This paper verifies that the proposed multi-level JSCC algorithm can effectively transmit and recover image information in both AWGN and Rayleigh fading channels, and the peak signal-to-noise ratio (PSNR) is improved by 1 to 2 dB compared with other algorithms under the same simulation conditions.
Keywords: deep learning, JSCC, pilot-free picture transmission, multilevel semantic information, robustness
Procedia PDF Downloads 120
887 Calling the Shots: How Others’ Mistakes May Influence Vaccine Take-up
Authors: Elizabeth Perry, Jylana Sheats
Abstract:
Scholars posit that there is an overlap between the fields of Behavioral Economics (BE) and Behavior Science (BSci), and that consideration of concepts from both may facilitate a greater understanding of health decision-making processes. For example, the 'intention-action gap' is one BE concept used to explain sub-optimal decision-making; it describes having knowledge that does not translate into behavior. Complementary best BSci practices may provide insights into behavioral determinants and relevant behavior change techniques (BCT). Within the context of BSci, this exploratory study aimed to apply a BE concept with demonstrated effectiveness in financial decision-making to a health behavior: influenza (flu) vaccine uptake. Adults aged >18 years were recruited on Amazon’s Mechanical Turk, a digital labor market where anonymous users perform simple tasks at low cost. Eligible participants were randomized into 2 groups, reviewed a scenario, and then completed a survey on the likelihood of receiving a flu shot. The 'usual care' group’s scenario included standard CDC guidance that supported the behavior. The 'intervention' group’s scenario included messaging about people who did not receive the flu shot. The framing was such that participants could learn from others’ (strangers') mistakes and the subsequent health consequences: 'Last year, other people who didn’t get the vaccine were about twice as likely to get the flu, and a number of them were hospitalized or even died. Don’t risk it.' Descriptive statistics and chi-square analyses were performed on the sample. There were 648 participants (usual care, n=326; intervention, n=322). Among racial/ethnic minorities (n=169; 57% aged < 40), the intervention group was 22% more likely to report that they were 'extremely' or 'moderately' likely to get the flu vaccine (p = 0.11).
While not statistically significant, the findings suggest that framing messages from the perspective of learning from the mistakes of unknown others, coupled with the BCT 'knowledge about the health consequences', may help influence flu vaccine uptake among the study population. With the widely documented disparities in vaccine uptake, exploration of the complementary application of these concepts and strategies may be critical.
Keywords: public health, decision-making, vaccination, behavioral science
Procedia PDF Downloads 41
886 Simple Finite-Element Procedure for Modeling Crack Propagation in Reinforced Concrete Bridge Deck under Repetitive Moving Truck Wheel Loads
Authors: Rajwanlop Kumpoopong, Sukit Yindeesuk, Pornchai Silarom
Abstract:
Modeling cracks in concrete is complicated by its strain-softening behavior, which requires the use of sophisticated energy criteria of fracture mechanics to assure stable and convergent solutions in finite-element (FE) analysis, particularly for relatively large structures. However, for small-scale structures such as beams and slabs, a simpler approach that relies on retaining some shear stiffness in the cracking plane has been adopted in the literature to model the strain-softening behavior of concrete under monotonically increased loading. According to the shear retaining approach, each element is assumed to be an isotropic material prior to cracking of the concrete. Once an element is cracked, the isotropic element is replaced with an orthotropic element in which the new orthotropic stiffness matrix is formulated with respect to the crack orientation. A shear transfer factor of 0.5 is used parallel to the crack plane. The shear retaining approach is adopted in this research to model cracks in an RC bridge deck, with some modifications to take into account the effect of repetitive moving truck wheel loads, as they cause fatigue cracking of the concrete. The first modification is the introduction of fatigue tests of concrete and reinforcing steel and the Palmgren-Miner linear criterion of cumulative damage into the conventional FE analysis. For a certain loading, the number of cycles to failure of each concrete or RC element can be calculated from the fatigue or S-N curves of concrete and reinforcing steel. The elements with the minimum number of cycles to failure are the failed elements. For the elements that do not fail, the damage is accumulated according to the Palmgren-Miner linear criterion of cumulative damage. The stiffness of each failed element is modified and the procedure is repeated until the deck slab fails. The total number of load cycles to failure of the deck slab can then be obtained, from which the S-N curve of the deck slab can be simulated.
The second modification is a modification of the shear transfer factor. Moving loading causes continuous rubbing of the crack interfaces, which greatly reduces the shear transfer mechanism. It is therefore conservatively assumed in this study that the analysis is conducted with a shear transfer factor of zero for the case of moving loading. A customized FE program has been developed using the MATLAB software to accommodate these modifications. The developed procedure has been validated with the fatigue test of the 1/6.6-scale AASHTO bridge deck under the application of both fixed-point repetitive loading and moving loading presented in the literature. Results show good agreement both between experimental and simulated S-N curves and between observed and simulated crack patterns. A significant contribution of the developed procedure is a series of S-N relations which can now be simulated at any desired level of cracking, in addition to the experimentally derived S-N relation at the failure of the deck slab. This permits the systematic investigation of crack propagation or deterioration of RC bridge decks, which is useful information for highway agencies seeking to prolong the life of their bridge decks.
Keywords: bridge deck, cracking, deterioration, fatigue, finite-element, moving truck, reinforced concrete
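The cumulative damage step described above can be sketched with the Palmgren-Miner linear rule; the S-N curve used here is a hypothetical log-linear placeholder, not the concrete or reinforcing steel curves from the paper:

```python
# Palmgren-Miner linear damage accumulation: each loading block
# contributes n_i / N_i to the damage sum D, and the element is
# considered failed once D >= 1.

def cycles_to_failure(stress_ratio, a=20.0, b=18.0):
    # Hypothetical log-linear S-N curve: log10(N) = a - b * S
    return 10 ** (a - b * stress_ratio)

def miner_damage(load_blocks):
    """load_blocks: list of (stress_ratio, applied_cycles) pairs.
    Returns cumulative damage D = sum(n_i / N_i)."""
    return sum(n / cycles_to_failure(s) for s, n in load_blocks)
```

In the FE procedure above, a block for which D reaches 1 corresponds to a failed element whose stiffness is then degraded before the next pass.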
Procedia PDF Downloads 257
885 Using Geo-Statistical Techniques and Machine Learning Algorithms to Model the Spatiotemporal Heterogeneity of Land Surface Temperature and its Relationship with Land Use Land Cover
Authors: Javed Mallick
Abstract:
In metropolitan areas, rapid changes in land use and land cover (LULC) have ecological and environmental consequences. Saudi Arabia's cities have experienced tremendous urban growth since the 1990s, resulting in urban heat islands, groundwater depletion, air pollution, loss of ecosystem services, and so on. From 1990 to 2020, this study examines the variance and heterogeneity in land surface temperature (LST) caused by LULC changes in Abha-Khamis Mushyet, Saudi Arabia. LULC was mapped using the support vector machine (SVM). The mono-window algorithm was used to calculate the LST. To identify LST clusters, the local indicator of spatial associations (LISA) model was applied to spatiotemporal LST maps. In addition, the parallel coordinate (PCP) method was used to investigate the relationship between LST clusters and urban biophysical variables as a proxy for LULC. According to LULC maps, urban areas increased by more than 330% between 1990 and 2018. Between 1990 and 2018, built-up areas had an 83.6% transitional probability. Furthermore, between 1990 and 2020, vegetation and agricultural land were converted into built-up areas at a rate of 17.9% and 21.8%, respectively. Uneven LULC changes in built-up areas result in more LST hotspots. LST hotspots were associated with high NDBI but not NDWI or NDVI. This study could assist policymakers in developing mitigation strategies for urban heat islands.
Keywords: land use land cover mapping, land surface temperature, support vector machine, LISA model, parallel coordinate plot
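The spectral indices referred to above (NDVI, NDWI, NDBI) are all normalized band differences; a minimal band-math sketch, with illustrative reflectance values rather than data from the study, is:

```python
# Normalized-difference spectral indices. Each index contrasts two
# bands: vegetation is bright in NIR and dark in red (NDVI), built-up
# surfaces are bright in SWIR relative to NIR (NDBI), and water is
# bright in green relative to NIR (NDWI).

def normalized_difference(a, b):
    return (a - b) / (a + b) if (a + b) != 0 else 0.0

def ndvi(nir, red):
    return normalized_difference(nir, red)

def ndbi(swir, nir):
    return normalized_difference(swir, nir)

def ndwi(green, nir):
    return normalized_difference(green, nir)
```

High NDBI with low NDVI at a pixel is the signature of the built-up LST hotspots reported above.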
Procedia PDF Downloads 78
884 Outcome of Bowel Management Program in Patient with Spinal Cord Injury
Authors: Roongtiwa Chobchuen, Angkana Srikhan, Pattra Wattanapan
Abstract:
Background: Neurogenic bowel is a common condition after spinal cord injury. Most spinal cord injured patients have motor weakness and mobility impairment, which lead to constipation. Moreover, the neural pathway involving bowel function is interrupted. Therefore, a bowel management program should be implemented in nursing care at the earliest time after the onset of the disease to prevent morbidity and mortality. Objective: To study the outcome of a bowel management program for patients with spinal cord injury admitted for a rehabilitation program. Study design: Descriptive study. Setting: Rehabilitation ward in Srinagarind Hospital. Population: patients with subacute to chronic spinal cord injury admitted to the rehabilitation ward, Srinagarind Hospital, aged over 18 years. Instrument: The neurogenic bowel dysfunction score (NBDS) was used to determine the severity of neurogenic bowel. Procedure and statistical analysis: All participants were asked to complete the demographic data: age, gender, duration of disease, and diagnosis. Individual bowel function was assessed using the NBDS at admission. The patients and caregivers were trained by nurses in the bowel management program, which consisted of diet modification, abdominal massage, digital stimulation, and stool evacuation, including medication and physical activity. The outcome of the bowel management program was assessed by the NBDS at discharge. The chi-square test was used to detect the difference in severity of neurogenic bowel between admission and discharge. Results: Sixteen spinal cord injured patients were enrolled in the study (age 45 ± 17 years; 69% were male). Half of them (50%) were tetraplegic. On admission, 12.5%, 12.5%, 43.75% and 31.25% were categorized as very minor (NBDS 0-6), minor (NBDS 7-9), moderate (NBDS 10-13) and severe (NBDS 14+), respectively.
The severity of neurogenic bowel was decreased significantly at discharge (56.25%, 18.75%, 18.75% and 6.25% for the very minor, minor, moderate and severe groups, respectively; p < 0.001) compared with the NBDS at admission. Conclusions: Implementation of an effective bowel program decreases the severity of neurogenic bowel in patients with spinal cord injury.
Keywords: neurogenic bowel, NBDS, spinal cord injury, bowel program
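The NBDS severity bands quoted above (very minor 0-6, minor 7-9, moderate 10-13, severe 14+) can be captured in a small helper:

```python
# Map a neurogenic bowel dysfunction score (NBDS) to the severity
# category used in the study above.

def nbds_severity(score):
    if score <= 6:
        return "very minor"
    elif score <= 9:
        return "minor"
    elif score <= 13:
        return "moderate"
    return "severe"
```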
Procedia PDF Downloads 243883 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering
Abstract:
Carbon fiber reinforced composites (CFRP) used as aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure can be damaged due to excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology for both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
Keywords: carbon composite, fault detection, fault identification, particle filter
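A bootstrap particle filter of the kind used in the FDI module above can be sketched in a few lines. The scalar "resistance" state, the Gaussian measurement model, and all numbers below are illustrative stand-ins for the paper's resistor-network model:

```python
# Minimal bootstrap particle filter estimating a hidden scalar
# parameter from noisy observations: particles hypothesize values,
# are weighted by measurement likelihood, resampled, and jittered.
import math
import random
import statistics

def particle_filter(observations, n_particles=2000, obs_noise=0.5, seed=1):
    rng = random.Random(seed)
    # Initialize particles uniformly over a plausible parameter range.
    particles = [rng.uniform(0.0, 20.0) for _ in range(n_particles)]
    for z in observations:
        # Gaussian likelihood of observation z under each particle.
        weights = [
            math.exp(-((z - p) ** 2) / (2 * obs_noise ** 2))
            for p in particles
        ]
        if sum(weights) == 0:
            continue  # degenerate step: keep the particle cloud as-is
        # Multinomial resampling proportional to the weights.
        particles = rng.choices(particles, weights=weights, k=n_particles)
        # Small jitter (roughening) to avoid sample impoverishment.
        particles = [p + rng.gauss(0.0, 0.05) for p in particles]
    return statistics.mean(particles)
```

In the paper's setting the particles would carry damage-mode hypotheses and the likelihood would come from the resistor-network model rather than this toy Gaussian.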
Procedia PDF Downloads 195882 Pattern of Adverse Drug Reactions with Platinum Compounds in Cancer Chemotherapy at a Tertiary Care Hospital in South India
Authors: Meena Kumari, Ajitha Sharma, Mohan Babu Amberkar, Hasitha Manohar, Joseph Thomas, K. L. Bairy
Abstract:
Aim: To evaluate the pattern of occurrence of adverse drug reactions (ADRs) with platinum compounds in cancer chemotherapy at a tertiary care hospital. Methods: This was a retrospective, descriptive case record study of patients admitted to the medical oncology ward of Kasturba Hospital, Manipal, from July to November 2012. The inclusion criteria comprised patients of both sexes and all ages diagnosed with cancer who were on platinum compounds and developed at least one adverse drug reaction during or after the treatment period. The CDSCO proforma was used for reporting ADRs. Causality was assessed using the Naranjo algorithm. Results: A total of 65 patients were included in the study. Females comprised 67.69% and the rest were males. Around 49.23% of the ADRs were seen in the age group of 41-60 years, followed by 20% in 21-40 years, 18.46% in patients over 60 years, and 12.31% in the 1-20 years age group. The anticancer agents which caused adverse drug reactions in our study were carboplatin (41.54%), cisplatin (36.92%) and oxaliplatin (21.54%). The most common adverse drug reactions observed were oral candidiasis (21.53%), vomiting (16.92%), anaemia (12.3%), diarrhoea (12.3%) and febrile neutropenia (0.08%). The causality assessment of most of the cases was rated as probable. Conclusion: The adverse effects of chemotherapeutic agents are a matter of concern in the pharmacological management of cancer, as they affect the quality of life of patients. This information would be useful in identifying and minimizing preventable adverse drug reactions while generally enhancing the knowledge of prescribers to deal with these adverse drug reactions more efficiently.
Keywords: adverse drug reactions, platinum compounds, cancer, chemotherapy
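The Naranjo algorithm mentioned above scores answers to a ten-question questionnaire and maps the total to a causality category using standard cut-offs (>=9 definite, 5-8 probable, 1-4 possible, <=0 doubtful). A small helper encoding those cut-offs:

```python
# Map a Naranjo questionnaire total score to the standard ADR
# causality category. The per-question scoring itself is a clinical
# questionnaire and is not reproduced here.

def naranjo_category(total_score):
    if total_score >= 9:
        return "definite"
    elif total_score >= 5:
        return "probable"
    elif total_score >= 1:
        return "possible"
    return "doubtful"
```

Most cases in the study above fell in the 5-8 range, i.e. "probable".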
Procedia PDF Downloads 429
881 Heuristics for Optimizing Power Consumption in the Smart Grid
Authors: Zaid Jamal Saeed Almahmoud
Abstract:
Our increasing reliance on electricity, combined with inefficient consumption trends, has resulted in several economic and environmental threats. These threats include wasting billions of dollars, draining limited resources, and elevating the impact of climate change. As a solution, the smart grid is emerging as the future power grid, with smart techniques to optimize power consumption and electricity generation. Minimizing the peak power consumption under a fixed delay requirement is a significant problem in the smart grid. In addition, matching demand to supply is a key requirement for the success of the future electricity grid. In this work, we consider the problem of minimizing the peak demand under appliance constraints by scheduling power jobs with uniform release dates and deadlines. As the problem is known to be NP-hard, we propose two versions of a heuristic algorithm for solving it. Our theoretical analysis and experimental results show that our proposed heuristics outperform existing methods by providing a better approximation to the optimal solution. In addition, we consider dynamic pricing methods to minimize the peak load and match demand to supply in the smart grid. Our contribution is the proposal of generic, as well as customized, pricing heuristics to minimize the peak demand and match demand with supply. In addition, we propose optimal pricing algorithms that can be used when the maximum deadline period of the power jobs is relatively small. Finally, we provide theoretical analysis and conduct several experiments to evaluate the performance of the proposed algorithms.
Keywords: heuristics, optimization, smart grid, peak demand, power supply
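The abstract does not spell out the two heuristics themselves; as an illustrative sketch of the underlying idea only (the greedy rule, function name, and job encoding below are our own assumptions, not the authors' method), a peak-shaving scheduler for power jobs with release dates and deadlines might look like:

```python
def schedule_peak(jobs, horizon):
    """Greedy peak-shaving sketch. Each job is a (release, deadline,
    power, duration) tuple; its `duration` slots of `power` load are
    placed into the least-loaded feasible time slots."""
    load = [0.0] * horizon
    # handle the tightest windows (and, on ties, heaviest jobs) first
    for release, deadline, power, duration in sorted(
            jobs, key=lambda j: (j[1] - j[0], -j[2])):
        window = range(release, deadline)
        # pick the `duration` currently least-loaded slots in the window
        chosen = sorted(window, key=lambda t: load[t])[:duration]
        for t in chosen:
            load[t] += power
    return max(load), load
```

With uniform release dates and deadlines, the greedy rule simply spreads overlapping jobs across the window so no single slot accumulates all the load.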
Procedia PDF Downloads 88
880 Applying Kinect on the Development of a Customized 3D Mannequin
Authors: Shih-Wen Hsiao, Rong-Qi Chen
Abstract:
In the field of fashion design, the 3D mannequin is an assisting tool that can rapidly realize design concepts. When the concept of the 3D mannequin is applied to computer-aided fashion design, it connects with the development and application of design platforms and systems. It is therefore critical to develop a 3D mannequin module that corresponds to the needs of fashion design. This research proposes a concrete plan for developing and constructing a 3D mannequin system with Kinect. Ergonomic measurements of an actual human figure are attained in real time through the depth camera of the Kinect, and mesh morphing is then implemented by transforming the locations of the control points on the model according to those ergonomic data, yielding an exclusive 3D mannequin model. In the proposed methodology, after the points scanned by the Kinect are revised for accuracy and smoothed, a complete human figure is reconstructed by the ICP algorithm together with image processing methods. The human figure can also be recognized and analyzed to obtain real measurements. Furthermore, the ergonomic measurements can be applied to shape morphing for the subdivision of the 3D mannequin reconstructed from feature curves. Since a standardized and customer-oriented 3D mannequin is generated through subdivision, this research can be applied to fashion design or to the presentation and display of 3D virtual clothes. In order to examine the practicality of the research structure, a 3D mannequin system was constructed with a JAVA program, and experiments were carried out to verify its practicability.
Keywords: 3D mannequin, Kinect scanner, iterative closest point, shape morphing, subdivision
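The ICP step used here can be illustrated with a toy 2-D version (the paper works on full 3-D Kinect scans; the point-to-point matching, closed-form 2-D Procrustes solve, and function name below are our own illustrative choices):

```python
import math

def icp_2d(source, target, iterations=10):
    """Minimal 2-D iterative closest point: match each source point to
    its nearest target point, solve the optimal rigid transform in
    closed form, apply it, and repeat."""
    src = [tuple(p) for p in source]
    n = len(src)
    for _ in range(iterations):
        pairs = [(p, min(target, key=lambda q: (q[0] - p[0]) ** 2 +
                                               (q[1] - p[1]) ** 2))
                 for p in src]
        # centroids of the matched point sets
        csx = sum(p[0] for p, _ in pairs) / n
        csy = sum(p[1] for p, _ in pairs) / n
        ctx = sum(q[0] for _, q in pairs) / n
        cty = sum(q[1] for _, q in pairs) / n
        # rotation angle from dot/cross accumulators (2-D Procrustes)
        dot = sum((p[0] - csx) * (q[0] - ctx) + (p[1] - csy) * (q[1] - cty)
                  for p, q in pairs)
        cross = sum((p[0] - csx) * (q[1] - cty) - (p[1] - csy) * (q[0] - ctx)
                    for p, q in pairs)
        ang = math.atan2(cross, dot)
        c, s = math.cos(ang), math.sin(ang)
        tx, ty = ctx - (c * csx - s * csy), cty - (s * csx + c * csy)
        src = [(c * x - s * y + tx, s * x + c * y + ty) for x, y in src]
    return src
```

For a small initial misalignment the nearest-neighbour correspondences are already correct, so the filter converges in very few iterations; 3-D ICP differs only in solving the rotation via SVD.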
Procedia PDF Downloads 306
879 Prediction Modeling of Alzheimer’s Disease and Its Prodromal Stages from Multimodal Data with Missing Values
Authors: M. Aghili, S. Tabarestani, C. Freytes, M. Shojaie, M. Cabrerizo, A. Barreto, N. Rishe, R. E. Curiel, D. Loewenstein, R. Duara, M. Adjouadi
Abstract:
A major challenge in medical studies, especially longitudinal ones, is the problem of missing measurements, which hinders the effective application of many machine learning algorithms. Furthermore, recent Alzheimer's disease studies have focused on the delineation of Early Mild Cognitive Impairment (EMCI) and Late Mild Cognitive Impairment (LMCI) from cognitively normal controls (CN), which is essential for developing effective and early treatment methods. To address these challenges, this paper explores the potential of the eXtreme Gradient Boosting (XGBoost) algorithm for handling missing values in multiclass classification. We seek a generalized classification scheme where all prodromal stages of the disease are considered simultaneously in the classification and decision-making processes. Given the large number of subjects (1631) included in this study and the presence of almost 28% missing values, we investigated the performance of XGBoost on the classification of the four classes AD, CN, EMCI, and LMCI. Using a 10-fold cross-validation technique, XGBoost is shown to outperform other state-of-the-art classification algorithms by 3% in terms of accuracy and F-score. Our model achieved an accuracy of 80.52%, a precision of 80.62% and a recall of 80.51%, supporting the more natural and promising multiclass classification.
Keywords: eXtreme gradient boosting, missing data, Alzheimer disease, early mild cognitive impairment, late mild cognitive impairment, multiclass classification, ADNI, support vector machine, random forest
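XGBoost's tolerance of missing values rests on learning a per-split "default direction" for samples whose feature is absent. The stand-alone sketch below (our own simplification for a single regression split, not the paper's code) shows that idea: missing samples are routed to whichever branch gives the lower training loss.

```python
def missing_default_direction(xs, ys, threshold):
    """Sparsity-aware split in the spirit of XGBoost: samples with a
    missing feature (None) are routed to whichever branch of the split
    yields the lower total squared error; that branch becomes the
    learned 'default direction' for this split."""
    def sq_error(group):
        if not group:
            return 0.0
        mean = sum(group) / len(group)
        return sum((y - mean) ** 2 for y in group)

    def loss(missing_left):
        left = [y for x, y in zip(xs, ys)
                if (x is None and missing_left)
                or (x is not None and x < threshold)]
        right = [y for x, y in zip(xs, ys)
                 if (x is None and not missing_left)
                 or (x is not None and x >= threshold)]
        return sq_error(left) + sq_error(right)

    return "left" if loss(True) <= loss(False) else "right"
```

Because the default direction is chosen from the data at every split, no imputation step is needed before training, which is why the method suits a cohort with 28% missing measurements.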
Procedia PDF Downloads 188
878 Fragment Domination for Many-Objective Decision-Making Problems
Authors: Boris Djartov, Sanaz Mostaghim
Abstract:
This paper presents a number-based dominance method. The main idea is to fragment the many attributes of the problem into subsets suitable for the well-established concept of Pareto dominance. Although other similar methods can be found in the literature, they focus on comparing the solutions one objective at a time, while the focus of this method is to compare entire subsets of the objective vector. Given the nature of the method, it is computationally costlier than other methods, and thus it is geared more towards selecting an option from a finite set of alternatives, where each solution is defined by multiple objectives. The need for this method was motivated by dynamic alternate airport selection (DAAS). In DAAS, pilots, while en route to their destination, can find themselves in a situation where they need to select a new landing airport. In such a predicament, they need to consider multiple alternatives with many different characteristics, such as wind conditions, available landing distance, the fuel needed to reach it, etc. Hence, this method is primarily aimed at human decision-makers. Many methods within the field of multi-objective and many-objective decision-making rely on the decision maker to initially provide the algorithm with preference points and weight vectors; however, this method aims to omit this very difficult step, especially when the number of objectives is large. The proposed method will be compared to the Favour, (1 − k)-Dom, and L-dominance (LD) methods. The test will be conducted using well-established test problems from the literature, such as the DTLZ problems. The proposed method is expected to outperform the currently available methods in the literature and hopefully provide future decision-makers and pilots with support when dealing with many-objective optimization problems.
Keywords: multi-objective decision-making, many-objective decision-making, multi-objective optimization, many-objective optimization
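The abstract does not give the formal dominance definition; one plausible reading of "fragmenting the objectives and applying Pareto dominance per fragment" can be sketched as follows (the win-counting rule between fragments is our own assumption, made purely for illustration):

```python
def pareto_dominates(a, b):
    """Standard Pareto dominance for minimisation: a is no worse than b
    everywhere and strictly better somewhere."""
    return (all(x <= y for x, y in zip(a, b)) and
            any(x < y for x, y in zip(a, b)))

def fragment_dominates(a, b, fragments):
    """Compare two objective vectors fragment by fragment: here a is
    preferred to b if it Pareto-dominates b on more fragments than b
    dominates a (one possible interpretation of fragment domination)."""
    wins_a = sum(pareto_dominates([a[i] for i in f], [b[i] for i in f])
                 for f in fragments)
    wins_b = sum(pareto_dominates([b[i] for i in f], [a[i] for i in f])
                 for f in fragments)
    return wins_a > wins_b
```

The appeal for many-objective problems is that while full-vector Pareto dominance almost never discriminates between alternatives in high dimensions, small fragments (say, "weather objectives" vs. "fuel objectives") still produce decisive comparisons without asking the pilot for weights.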
Procedia PDF Downloads 91
877 Heterogeneous-Resolution and Multi-Source Terrain Builder for CesiumJS WebGL Virtual Globe
Authors: Umberto Di Staso, Marco Soave, Alessio Giori, Federico Prandi, Raffaele De Amicis
Abstract:
The increasing availability of information about earth surface elevation (Digital Elevation Models, DEMs) generated from different sources (remote sensing, aerial images, Lidar) raises the question of how to integrate this huge amount of data and make it available to the widest possible audience. In order to exploit the potential of 3D elevation representation, the quality of data management plays a fundamental role. Due to high acquisition costs and the huge amount of generated data, high-resolution terrain surveys tend to be small or medium sized and available only for limited portions of the earth. Hence the need to merge large-scale height maps, which are typically available for free at the worldwide level, with very specific high-resolution datasets. On the other hand, the third dimension improves the user experience and the quality of data representation, unlocking new possibilities in data analysis for civil protection, real estate, urban planning, environment monitoring, etc. Open-source 3D virtual globes, which are a trending topic in geovisual analytics, aim at improving the visualization of geographical data provided by standard web services or in proprietary formats. Typically, however, 3D virtual globes do not offer an open-source tool that allows the generation of a terrain elevation data structure starting from heterogeneous-resolution terrain datasets. This paper describes a technological solution aimed at setting up a so-called “Terrain Builder”. This tool is able to merge heterogeneous-resolution datasets and to provide a multi-resolution worldwide terrain service fully compatible with CesiumJS and therefore accessible via the web using a traditional browser without any additional plug-in.
Keywords: Terrain Builder, WebGL, Virtual Globe, CesiumJS, Tiled Map Service, TMS, Height-Map, Regular Grid, Geovisual Analytics, DTM
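A terrain builder of this kind has to address every height-map tile within a multi-resolution tiling scheme. A minimal sketch of such tile indexing, simplified from the geographic (EPSG:4326) TMS-style scheme that CesiumJS terrain layers use (two root tiles at level 0, doubling each level; the function name and clamping are ours):

```python
def tms_tile(lon, lat, level):
    """Tile indices for a geographic (EPSG:4326) tiling scheme of the
    kind used by Cesium terrain: a 2x1 grid of root tiles at level 0,
    with the grid doubling in both directions at each level."""
    tiles_x = 2 * (2 ** level)
    tiles_y = 2 ** level
    x = int((lon + 180.0) / 360.0 * tiles_x)
    y = int((lat + 90.0) / 180.0 * tiles_y)
    # clamp the antimeridian / pole edge cases into the last tile
    return min(x, tiles_x - 1), min(y, tiles_y - 1)
```

Merging heterogeneous sources then amounts to deciding, per tile and per level, which dataset supplies the height samples, falling back to the coarse worldwide DEM where no high-resolution survey covers the tile.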
Procedia PDF Downloads 426
876 Artificial Intelligence-Generated Previews of Hyaluronic Acid-Based Treatments
Authors: Ciro Cursio, Giulia Cursio, Pio Luigi Cursio, Luigi Cursio
Abstract:
Communication between practitioner and patient is of the utmost importance in aesthetic medicine: as of today, images of previous treatments are the most common tool used by doctors to describe and anticipate future results for their patients. However, using photos of other people often reduces the engagement of the prospective patient and is further limited by the number and quality of pictures available to the practitioner. Pre-existing work solves this issue in two ways: 3D scanning of the area with manual editing of the 3D model by the doctor, or automatic prediction of the treatment by warping the image with hand-written parameters. The first approach requires the manual intervention of the doctor, while the second generates results that are not always realistic. Thus, in one case there is significant manual work required of the doctor, and in the other case the prediction looks artificial. We propose an AI-based algorithm that autonomously generates a realistic prediction of treatment results. For the purpose of this study, we focus on hyaluronic acid treatments in the facial area. Our approach takes into account the individual characteristics of each face, and furthermore, the prediction system allows the patient to decide which area of the face to modify. We show that the predictions generated by our system are realistic: first, the quality of the generated images is on par with real images; second, the prediction matches the actual results obtained after the treatment is completed. In conclusion, the proposed approach provides a valid tool for doctors to show patients what they will look like before deciding on the treatment.
Keywords: prediction, hyaluronic acid, treatment, artificial intelligence
Procedia PDF Downloads 114
875 Evaluation of Physical Parameters and in-Vitro and in-Vivo Antidiabetic Activity of a Selected Combined Medicinal Plant Extracts Mixture
Authors: S. N. T. I. Sampath, J. M. S. Jayasinghe, A. P. Attanayake, V. Karunaratne
Abstract:
Diabetes mellitus is one of the major public health problems throughout the world today, with increasing incidence and associated mortality. Insufficient regulation of the blood glucose level can have serious effects on health, and it is necessary to identify new therapeutics that are able to reduce the hyperglycaemic condition in the human body. Even though synthetic antidiabetic drugs are effective in controlling diabetes mellitus, considerable side effects have been reported. Thus, there is an increasing demand for new natural products having high antidiabetic activity with fewer side effects. The purposes of the present study were to evaluate different physical parameters and the in-vitro and in-vivo antidiabetic potential of a selected combined medicinal plant extracts mixture composed of leaves of Murraya koenigii, cloves of Allium sativum, fruits of Garcinia queasita and seeds of Piper nigrum. The selected plant parts were mixed and ground together and extracted sequentially into hexane, ethyl acetate and methanol. Solvents were evaporated, and the extracts were further dried by freeze-drying to obtain a fine powder of each extract. Various physical parameters such as moisture, total ash, acid insoluble ash and water soluble ash were evaluated using standard test procedures. The in-vitro antidiabetic activity of the combined plant extracts mixture was screened using enzyme assays, namely the α-amylase inhibition assay and the α-glucosidase inhibition assay. The acute anti-hyperglycaemic activity was assessed with an oral glucose tolerance test in streptozotocin-induced diabetic Wistar rats to determine the in-vivo antidiabetic activity of the combined plant extracts mixture, evaluated through total area under the oral glucose tolerance curve (TAUC) values.
The percentages of moisture content, total ash content, acid insoluble ash content and water soluble ash content ranged over 7.6-17.8, 8.1-11.78, 0.019-0.134 and 6.2-9.2 respectively for the plant extracts, and those values were below the standard values except for the methanol extract. The hexane and ethyl acetate extracts exhibited higher α-amylase (IC50 = 25.7 ±0.6; 27.1 ±1.2 ppm) and α-glucosidase (IC50 = 22.4 ±0.1; 33.7 ±0.2 ppm) inhibitory activities than the methanol extract (IC50 = 360.2 ±0.6; 179.6 ±0.9 ppm) when compared with the acarbose positive control (IC50 = 5.7 ±0.4; 17.1 ±0.6 ppm). The TAUC values for the hexane, ethyl acetate, and methanol extract and glibenclamide (positive control) treated rats were 8.01 ±0.66, 8.05 ±1.07, 8.40 ±0.50 and 5.87 ±0.93 mmol/L.h respectively, whereas in diabetic control rats the TAUC value was 13.22 ±1.07 mmol/L.h. Administration of the plant extracts significantly suppressed (p<0.05) the rise in plasma blood glucose levels compared to control rats, though less so than glibenclamide. The results obtained from the in-vivo and in-vitro antidiabetic study showed that the hexane and ethyl acetate extracts of the selected combined plant mixture may be considered a potential source from which to isolate natural antidiabetic agents, and the physical parameters of the hexane and ethyl acetate extracts will be helpful in developing an antidiabetic drug with further standardized properties.
Keywords: diabetes mellitus, in-vitro antidiabetic assays, medicinal plants, standardization
Procedia PDF Downloads 131
874 Quantum Statistical Machine Learning and Quantum Time Series
Authors: Omar Alzeley, Sergey Utev
Abstract:
Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of optimization, and this optimization is central to learning theory. One approach to complex systems, where the dynamics of the system are inferred by a statistical analysis of the fluctuations in time of some associated observable, is time series analysis. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalization of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool to detect deterministic chaos, other approaches have emerged; the quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model which was applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyse the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigations. The proposed model has been examined using real and simulated data. We establish the relation between quantum statistical machine learning and quantum time series via random matrix theory. It is interesting to note that the primary focus of the application of QTS in the field of quantum chaos was to find a model that explains chaotic behaviour; perhaps this model will reveal further insight into quantum chaos.
Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series
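The Kalman filter mentioned above can be shown in its simplest scalar form (a generic textbook filter for a random-walk state observed in noise, not the paper's QTS estimator; all names and parameters are our own):

```python
def kalman_filter_1d(observations, process_var, obs_var, x0=0.0, p0=1.0):
    """Scalar Kalman filter: at each step, predict (inflate the state
    variance by the process noise), compute the gain, and correct the
    state estimate toward the new observation."""
    x, p = x0, p0
    estimates = []
    for z in observations:
        p = p + process_var           # predict: uncertainty grows
        k = p / (p + obs_var)         # Kalman gain in [0, 1]
        x = x + k * (z - x)           # correct toward the observation
        p = (1.0 - k) * p             # updated (shrunk) uncertainty
        estimates.append(x)
    return estimates
```

With zero process noise the filter reduces to a prior-weighted running mean, so repeated observations of a constant signal pull the estimate toward that constant; state-space parameter estimation for an AR model nests this same predict/correct loop inside a likelihood evaluation.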
Procedia PDF Downloads 469
873 Rising Levels of Greenhouse Gases: Implication for Global Warming in Anambra State South Eastern Nigeria
Authors: Chikwelu Edward Emenike, Ogbuagu Uchenna Fredrick
Abstract:
About 34% of the solar radiant energy reaching the earth is immediately reflected back to space as incoming radiation by clouds, chemicals and dust in the atmosphere and by the earth’s surface. Most of the remaining 66% warms the atmosphere and land, and most of the incoming solar radiation not reflected away is degraded into low-quality heat that flows back into space. The rate at which this energy returns to space as low-quality heat is affected by the presence of molecules of greenhouse gases. Gaseous emissions were measured with the aid of a Growen gas analyzer with a digital readout. Measurements of eight parameters were taken at twelve selected sample locations in two different seasons within two months. The ambient air quality investigation in Anambra State yielded the overall mean concentrations of gaseous emissions at the twelve (12) locations. The mean gaseous emissions were NO2=0.66ppm, SO2=0.30ppm, CO=43.93ppm, H2S=2.17ppm, CH4=1.27ppm, CFC=1.59ppb, CO2=316.33ppm, N2O=302.67ppb and O3=0.37ppm. These values do not conform to the National Ambient Air Quality Standard (NAAQS) and thus contribute significantly to global warming. Because some of these gaseous emissions (SO2, NO2) are oxidizing agents, they act as irritants that damage delicate tissues in the eyes and respiratory passages. They can impair lung function and trigger cardiovascular problems as the heart tries to compensate for lack of oxygen by pumping faster and harder. The major sources of air pollution are transportation, industrial processes, stationary fuel combustion and solid waste disposal; thus much is yet to be done in a developing country like Nigeria. Air pollution control using pollution-control equipment to reduce the major conventional pollutants, relocating people who live very close to dumpsites, and processing and treatment of gases to produce electricity, heat, fuel and various chemical components should be encouraged.
Keywords: ambient air, atmosphere, greenhouse gases, Anambra State
Procedia PDF Downloads 433
872 Thulium Laser Vaporisation and Enucleation of Prostate in Patients on Anticoagulants and Antiplatelet Agents
Authors: Abdul Fatah, Naveenchandra Acharya, Vamshi Krishna, T. Shivaprasad, Ramesh Ramayya
Abstract:
Background: A significant number of patients with bladder outlet obstruction due to BPH are on anti-platelets and anticoagulants. Prostate surgery in this group of patients, whether TURP or open prostatectomy, is associated with an increased risk of bleeding complications requiring transfusions, packing of the prostatic fossa, or ligation or embolization of the internal iliac arteries, while withholding antiplatelets and anticoagulants may be associated with cardiac and other complications. The efficacy of the thulium laser in this group of patients was evaluated in terms of peri-operative, post-operative and delayed bleeding complications as well as cardiac events in the peri-operative and immediate post-operative period. Methods: 217 patients with a mean age of 68.8 years were enrolled between March 2009 and March 2013 and treated for BPH with ThuLEP. Every patient was evaluated at baseline according to: digital rectal examination (DRE), prostate volume, post-void residual volume (PVR), International Prostate Symptom Score (I-PSS), PSA values, urine analysis and urine culture, and uroflowmetry. The post-operative complications noted comprised drop in haemoglobin level, transfusion rate, post-operative cardiac events within a period of 30 days, delayed haematuria, and events such as deep vein thrombosis and pulmonary embolism. Results: Our data showed a better post-operative outcome in terms of post-operative bleeding requiring intervention, 7 (3.2%); transfusion rate, 4 (1.8%); cardiac events within a period of 30 days, 4 (1.8%); and delayed haematuria within 6 months, 2 (0.9%), compared with other series of prostatectomies. Conclusion: Thulium laser prostatectomy is a safe and effective option for patients with cardiac comorbidities and those who are on antiplatelet agents and anticoagulants. The complication rate is lower than in the larger series reported with open and transurethral prostatectomies.
Keywords: thulium laser, prostatectomy, antiplatelet agents, bleeding
Procedia PDF Downloads 393
871 Implementation of a Monostatic Microwave Imaging System using a UWB Vivaldi Antenna
Authors: Babatunde Olatujoye, Binbin Yang
Abstract:
Microwave imaging is a portable, noninvasive, and non-ionizing imaging technique that employs low-power microwave signals to reveal objects in the microwave frequency range. This technique has immense potential for adoption in commercial and scientific applications such as security scanning, material characterization, and nondestructive testing. This work presents a monostatic microwave imaging setup using an ultra-wideband (UWB), low-cost, miniaturized Vivaldi antenna with a bandwidth of 1–6 GHz. The backscattered signals (S-parameters) of the Vivaldi antenna used for scanning targets were measured in the lab using a VNA. An automated two-dimensional (2-D) scanner was employed to move the transceiver in two dimensions and collect the measured scattering data from different positions. The targets consist of four metallic objects, each with a distinct shape. A similar setup was also simulated in Ansys HFSS. A high-resolution Back Propagation Algorithm (BPA) was applied to both the simulated and experimental backscattered signals. The BPA utilizes the phase and amplitude information recorded over a two-dimensional aperture of 50 cm × 50 cm with a discrete step size of 2 cm to reconstruct a focused image of the targets. The adoption of the BPA was demonstrated by coherently resolving and reconstructing reflection signals from conventional time-of-flight profiles. For both the simulation and experimental data, the BPA accurately reconstructed a high-resolution 2D image of the targets in terms of shape and location. An improvement of the BPA, in terms of target resolution, was achieved by applying a filtering method in the frequency domain.
Keywords: back propagation, microwave imaging, monostatic, Vivaldi antenna, ultra wideband
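At heart, back propagation over time-of-flight profiles is delay-and-sum focusing: every image pixel accumulates, from each scan position, the echo sample at that pixel's round-trip delay, so true scatterers add coherently while clutter does not. A toy monostatic, time-domain version (our own simplification with made-up units, not the authors' implementation) might look like:

```python
import math

def back_project(positions, echoes, grid, dt, c=1.0):
    """Delay-and-sum back-projection sketch for a monostatic scan:
    `positions` are (x, y) antenna locations, `echoes` the recorded
    time series (one per position, sample spacing dt), `grid` the
    (x, y) pixels to reconstruct, c the propagation speed."""
    image = []
    for px, py in grid:
        total = 0.0
        for (ax, ay), sig in zip(positions, echoes):
            delay = 2.0 * math.hypot(px - ax, py - ay) / c  # round trip
            idx = int(round(delay / dt))
            if idx < len(sig):
                total += sig[idx]
        image.append(total)
    return image
```

A pixel at a real scatterer collects the echo peak from every antenna position and sums to a large value; elsewhere, the delays point at near-zero samples. The practical BPA works on wideband S-parameter data, but the focusing principle is the same.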
Procedia PDF Downloads 19
870 Regional Problems of Electronic Governance in Autonomous Republic of Adjara
Authors: Manvelidze irakli, Iashvili Genadi
Abstract:
Research has shown that public institutions in the Autonomous Republic of Ajara try their best to make their official electronic resources (web pages, social websites) more informative and to improve them. Some public institutions offer interesting electronic services and initiatives to the public, although these are seldom used in the communication process. Statistical analysis of the use of the web pages and social websites of public institutions, for example their Facebook pages, shows a lack of activity. One reason could be that public institutions give people little possibility of interaction on official web pages. A second reason could be that these web pages are little known to the public, and a third that the heads of these institutions lack awareness of the necessity of strengthening citizens’ involvement. In order to increase people’s involvement in this process, it is necessary to have at least 23 e-services on one web page. The research has shown that 11 of the 16 public institutions have only 5 services, namely contact, social networks and hotline. Besides introducing innovative services, government institutions should evaluate them and make them popular and easily accessible to the public. It would be easier to solve this problem if public institutions had a concrete strategic public relations plan covering the maximum usage of electronic services in interaction with citizens. At the moment, only one governmental body has a functioning public relations action plan. As a result of the research, organizational, social, methodological and technical problems have been revealed. It should be considered that there are many feedback possibilities, such as forums, RSS, blogs, wikis, Twitter, social networks, etc.; usage of only one to three of such instruments indicates that there is no strategy of regional electronic governance.
It is necessary to develop more feedback mechanisms to increase electronic interaction and discussion, and to introduce a service for online petitions. It is also important to reduce the so-called “digital inequality” and increase internet access for the public; state action should reduce such problems. If these shortcomings are addressed, the role of electronic interaction in democratic processes will increase.
Keywords: e-Government, electronic services, information technology, regional government
Procedia PDF Downloads 309
869 Drought Risk Analysis Using Neural Networks for Agri-Businesses and Projects in Lejweleputswa District Municipality, South Africa
Authors: Bernard Moeketsi Hlalele
Abstract:
Drought is a complicated natural phenomenon that creates significant economic, social, and environmental problems. Analysis of paleoclimatic data indicates that severe and extended droughts are an inevitable part of the natural climatic cycle. This study characterised drought in Lejweleputswa using the Standardised Precipitation Index (SPI) to quantify it and neural networks (NN) to predict it. Monthly precipitation data in a 37-year-long time series were obtained from the online NASA database. Prior to the final analysis, this dataset was checked for outliers using SPSS; outliers were removed and replaced using the Expectation Maximization algorithm in SPSS. This was followed by both homogeneity and stationarity tests to ensure non-spurious results. A non-parametric Mann–Kendall test was used to detect monotonic trends present in the dataset. The two temporal scales SPI-3 and SPI-12, corresponding to agricultural and hydrological drought events, showed statistically decreasing trends with p-values of 0.0006 and 4.9 x 10⁻⁷, respectively. The study area has been plagued by severe drought events on SPI-3, while SPI-12 showed approximately a 20-year cycle. The study concluded the analyses with a seasonal analysis that showed no significant trend patterns, and as such NN was used to predict possible SPI-3 values for the last season of 2018/2019 and four seasons of 2020. The predicted drought intensities ranged from mild to extreme drought events to come. It is therefore recommended that farmers, agri-business owners, and other relevant stakeholders resort to drought-resistant crops as a means of adaptation.
Keywords: drought, risk, neural networks, agri-businesses, project, Lejweleputswa
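An SPI-n value is a standardised n-month precipitation aggregate. The sketch below shows the aggregation-and-standardisation skeleton only: the operational SPI first fits a gamma distribution to the sums and maps the cumulative probabilities onto the standard normal, whereas this stdlib-only simplification uses a plain z-score (our own shortcut for illustration).

```python
def spi_zscore(precip, scale=3):
    """Simplified SPI sketch: aggregate monthly precipitation over
    `scale`-month rolling windows, then standardise the sums to zero
    mean and unit variance. (The full SPI fits a gamma distribution
    before mapping to the standard normal.)"""
    sums = [sum(precip[i - scale + 1:i + 1])
            for i in range(scale - 1, len(precip))]
    mean = sum(sums) / len(sums)
    std = (sum((s - mean) ** 2 for s in sums) / len(sums)) ** 0.5
    return [(s - mean) / std for s in sums]
```

With `scale=3` the output tracks agricultural drought (soil-moisture timescale), while `scale=12` smooths to the hydrological timescale; values below about -1.5 are conventionally read as severe drought.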
Procedia PDF Downloads 126
868 Secure Automatic Key SMS Encryption Scheme Using Hybrid Cryptosystem: An Approach for One Time Password Security Enhancement
Authors: Pratama R. Yunia, Firmansyah, I., Ariani, Ulfa R. Maharani, Fikri M. Al
Abstract:
Nowadays, notwithstanding that the role of SMS as a means of communication has been largely replaced by online applications such as WhatsApp, Telegram, and others, the fact that SMS is still used for certain important communication needs is indisputable. Among these is the sending of one time passwords (OTP) as an authentication medium for various online applications, ranging from chatting and shopping to online banking. However, the use of SMS does not by itself guarantee the security of transmitted messages. As a matter of fact, messages transmitted between base transceiver stations (BTS) are still in the form of plaintext, making them extremely vulnerable to eavesdropping, especially if the message is confidential, as with an OTP. One solution to this problem is to use an SMS application which provides security services for each transmitted message. Responding to this problem, in this study, an automatic key SMS encryption scheme was designed as a means of securing SMS communication. The proposed scheme allows SMS sending that is automatically encrypted with constantly changing keys (automatic key update), automatic key exchange, and automatic key generation. As the security method, the proposed scheme applies cryptographic techniques with a hybrid cryptosystem mechanism. To prove the proposed scheme, a client-to-client SMS encryption application was developed on the Java platform with AES-256 as the encryption algorithm, RSA-768 as the public and private key generator, and SHA-256 as the message hashing function. The result of this study is a secure automatic key SMS encryption scheme using a hybrid cryptosystem which can guarantee the security of every transmitted message, and so become a reliable solution for sending confidential messages through SMS, although it still has weaknesses in terms of processing time.
Keywords: encryption scheme, hybrid cryptosystem, one time password, SMS security
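The hybrid-cryptosystem shape — a fast symmetric cipher for the message body, with the session key wrapped by public-key encryption — can be shown with a deliberately toy stand-in. The paper uses AES-256 and RSA-768; the sketch below substitutes an XOR stream and textbook RSA with tiny primes purely to make the structure visible, and is NOT secure:

```python
def hybrid_encrypt(message, session_key, n, e):
    """Toy hybrid scheme (illustration only): the message bytes are
    stream-encrypted with the session key by XOR, and each session-key
    byte is wrapped with textbook RSA under the public key (n, e)."""
    body = bytes(b ^ session_key[i % len(session_key)]
                 for i, b in enumerate(message))
    wrapped_key = [pow(k, e, n) for k in session_key]
    return body, wrapped_key

def hybrid_decrypt(body, wrapped_key, n, d):
    """Unwrap the session key with the private exponent d, then undo
    the XOR stream to recover the plaintext."""
    key = bytes(pow(c, d, n) for c in wrapped_key)
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(body))
```

The "automatic key" property of the proposed scheme corresponds to generating a fresh `session_key` for every SMS, so no symmetric key is ever reused; only the RSA key pair is long-lived.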
Procedia PDF Downloads 128
867 Assessing the Impacts of Riparian Land Use on Gully Development and Sediment Load: A Case Study of Nzhelele River Valley, Limpopo Province, South Africa
Authors: B. Mavhuru, N. S. Nethengwe
Abstract:
Human activities leading to land degradation have triggered several environmental problems, especially in underdeveloped rural areas. The main aim of this study is to analyze the contribution of different land uses to gully development and sediment load in the Nzhelele River Valley in the Limpopo Province. Data were collected using different methods such as observation, field data techniques and experiments. Satellite digital images, topographic maps, aerial photographs and the sediment load static model also assisted in determining how land use affects gully development and sediment load. For data analysis, the researcher used the following methods: analysis of variance (ANOVA), descriptive statistics, the Pearson correlation coefficient and statistical correlation methods. The results of the research illustrate that intensive land use creates negative changes, especially in areas that are highly fragile and vulnerable. A distinct land use change was observed within the settlement area (9.6%) within a period of 5 years. A high correlation between soil organic matter and soil moisture (R=0.96) was observed, along with a significant variation (p ≤ 0.6) between soil organic matter and soil moisture. A very significant variation (p ≤ 0.003) was observed in bulk density, and extremely significant variations (p ≤ 0.0001) were observed in organic matter and soil particle size. Sand mining and agricultural activities have contributed significantly to the amount of sediment load in the Nzhelele River: a high and significant share of the total suspended sediment (55.3%) and of the bed load (53.8%) was observed within the agricultural area. The link between gully development and the various land use activities determines the amount of sediment load.
These results are consistent with previous research and suggest that land use activities are likely to exacerbate the development of gullies and sediment load in the Nzhelele River Valley.
Keywords: drainage basin, geomorphological processes, gully development, land degradation, riparian land use, sediment load
Procedia PDF Downloads 307