Search results for: classification algorithms
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3819

969 Digital Joint Equivalent Channel Hybrid Precoding for Millimeter-Wave Massive Multiple Input Multiple Output Systems

Authors: Linyu Wang, Mingjun Zhu, Jianhong Xiang, Hanyu Jiang

Abstract:

Aiming at the problem that the spectral efficiency of hybrid precoding (HP) is too low in current millimeter wave (mmWave) massive multiple input multiple output (MIMO) systems, this paper proposes a digital joint equivalent channel hybrid precoding algorithm based on iteration of the digital encoding matrix. First, the objective function is expanded to obtain the relation equation, and the pseudo-inverse iterative function of the analog encoder is derived by using the pseudo-inverse method, which solves the problem of the greatly increased amount of computation caused by the rank deficiency of the digital encoding matrix and reduces the overall complexity of hybrid precoding. Secondly, the analog coding matrix and the millimeter-wave sparse channel matrix are combined into an equivalent channel, which is then subjected to Singular Value Decomposition (SVD) to obtain the digital coding matrix; the derived pseudo-inverse iterative function is then used to iteratively regenerate the analog encoding matrix. The simulation results show that the proposed algorithm improves the system spectral efficiency by 10~20% compared with other algorithms, and the stability is also improved.
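
As a rough illustration of the equivalent-channel idea described above, the sketch below (a minimal NumPy example, not the authors' code; the channel model, matrix sizes, and normalisation are assumptions for illustration) forms an equivalent channel from a fixed analog precoder and a sparse channel, then takes its SVD to obtain a digital precoder.

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, Nrf, Ns = 64, 4, 2          # transmit antennas, RF chains, data streams (assumed sizes)

# Assumed toy mmWave channel: a few planar-wave paths (not the paper's exact model)
def steering(n, angle):
    return np.exp(1j * np.pi * np.arange(n) * np.sin(angle)) / np.sqrt(n)

H = sum(rng.standard_normal() * np.outer(np.ones(4), steering(Nt, a))
        for a in rng.uniform(-np.pi / 2, np.pi / 2, size=5))   # 4 receive antennas

# Constant-modulus analog precoder (phase-only entries)
F_rf = np.exp(1j * rng.uniform(0, 2 * np.pi, size=(Nt, Nrf))) / np.sqrt(Nt)

# Equivalent channel = channel times analog precoder, then SVD for the digital part
H_eq = H @ F_rf
U, S, Vh = np.linalg.svd(H_eq)
F_bb = Vh.conj().T[:, :Ns]                                  # digital precoder from right singular vectors
F_bb *= np.sqrt(Ns) / np.linalg.norm(F_rf @ F_bb, 'fro')    # power normalisation

print("equivalent-channel singular values:", np.round(S, 3))
```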

Keywords: mmWave, massive MIMO, hybrid precoding, singular value decomposition, equivalent channel

Procedia PDF Downloads 86
968 Glaucoma Detection in Retinal Tomography Using the Vision Transformer

Authors: Sushish Baral, Pratibha Joshi, Yaman Maharjan

Abstract:

Glaucoma is a chronic eye condition that causes irreversible vision loss. Because it can be asymptomatic, early detection and treatment are critical to prevent vision loss. Multiple deep learning algorithms have been used for the identification of glaucoma. Transformer-based architectures, which use the self-attention mechanism to encode long-range dependencies and acquire highly expressive representations, have recently become popular. Convolutional architectures, on the other hand, lack knowledge of long-range dependencies in the image due to their intrinsic inductive biases. These observations motivate this work to examine transformer-based solutions and investigate the viability of adopting transformer-based network designs for glaucoma detection. Developing a viable algorithm to assess the severity of glaucoma from retinal fundus images of the optic nerve head requires a large number of well-curated images. Initially, data are generated by augmenting ocular pictures. The ocular images are then pre-processed to make them ready for further processing. The system is trained using the pre-processed images, and it classifies the input images as normal or glaucoma based on the features retrieved during training. The Vision Transformer (ViT) architecture is well suited to this situation, as its self-attention mechanism supports structural modeling. Extensive experiments are run on a common dataset, and the results are thoroughly validated and visualized.
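
A training setup of the kind described could be sketched as follows (an assumed PyTorch/torchvision example, not the authors' code: the backbone choice, pre-processing values, layer sizes, and the dummy batch are placeholders for illustration).

```python
import torch
import torch.nn as nn
from torchvision import models, transforms

# Vision Transformer backbone with its classification head replaced for two classes:
# normal vs. glaucoma (pretrained weights could be loaded instead of weights=None).
model = models.vit_b_16(weights=None)
model.heads.head = nn.Linear(model.heads.head.in_features, 2)

# Typical fundus-image pre-processing assumed here; applied to each PIL image before
# batching (not used on the dummy tensor below).
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# One illustrative training step on a dummy batch standing in for augmented fundus images.
images = torch.rand(4, 3, 224, 224)
labels = torch.tensor([0, 1, 0, 1])        # 0 = normal, 1 = glaucoma
logits = model(images)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```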

Keywords: glaucoma, vision transformer, convolutional architectures, retinal fundus images, self-attention, deep learning

Procedia PDF Downloads 186
967 PaSA: A Dataset for Patent Sentiment Analysis to Highlight Patent Paragraphs

Authors: Renukswamy Chikkamath, Vishvapalsinhji Ramsinh Parmar, Christoph Hewel, Markus Endres

Abstract:

Given a patent document, identifying distinct semantic annotations is an interesting research problem. Text annotation helps patent practitioners such as examiners and patent attorneys to quickly identify the key arguments of any invention, thus providing a timely marking of a patent text. In manual patent analysis, it is common practice to recognise the semantic information by marking paragraphs in order to attain better readability. This semantic annotation process is laborious and time-consuming. To alleviate this problem, we propose a dataset to train machine learning algorithms to automate the highlighting process. The contributions of this work are: i) we developed a multi-class dataset of 150k samples by traversing USPTO patents over a decade, ii) we articulated statistics and distributions of the data using exploratory data analysis, iii) baseline machine learning models were developed that use the dataset to address the patent paragraph highlighting task, and iv) a future path is provided to extend this work using deep learning and domain-specific pre-trained language models to develop a highlighting tool. This work assists patent practitioners in highlighting semantic information automatically and aids in creating a sustainable and efficient patent analysis using the capabilities of machine learning.
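
A minimal baseline of the kind mentioned in contribution iii) could look like the sketch below (an assumed TF-IDF plus logistic-regression pipeline in scikit-learn; the example paragraphs and labels are placeholders, not drawn from the PaSA dataset).

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Placeholder patent paragraphs and semantic labels (the real dataset has 150k USPTO samples).
paragraphs = [
    "The invention relates to a rechargeable battery with improved thermal stability.",
    "In prior art systems, overheating of the cell frequently degrades capacity.",
    "Claim 1: an electrode assembly comprising a separator coated with ceramic particles.",
]
labels = ["summary", "prior_art", "claim"]

# Baseline multi-class classifier: bag-of-words TF-IDF features fed to logistic regression.
baseline = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), min_df=1),
    LogisticRegression(max_iter=1000),
)
baseline.fit(paragraphs, labels)

print(baseline.predict(["The cell of claim 1 wherein the separator is polymeric."]))
```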

Keywords: machine learning, patents, patent sentiment analysis, patent information retrieval

Procedia PDF Downloads 84
966 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm

Authors: Anuradha Chug, Sunali Gandhi

Abstract:

Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Since insufficient resources are available to re-execute all the test cases in a time-constrained environment, efforts are ongoing to generate test data automatically, without human effort. Many search-based techniques have been proposed to generate efficient, effective, and optimized test data so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural foraging behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied that can effectively filter out redundant test data. As many as 50 Java programs are used to check the effectiveness of the proposed test data generation, and it has been found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code for testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC), and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage with a minimum number of test cases.
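
The core loop of a bat algorithm of the kind used above can be sketched as follows (a generic, assumed NumPy implementation minimizing an arbitrary cost function that stands in for the test-suite objective; parameter values are illustrative, not the study's).

```python
import numpy as np

def bat_algorithm(cost, dim, n_bats=20, n_iter=200, fmin=0.0, fmax=2.0,
                  loudness=0.9, pulse_rate=0.5, lower=-5.0, upper=5.0, seed=0):
    """Generic bat algorithm; `cost` maps a candidate vector to a scalar to minimise."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, size=(n_bats, dim))     # bat positions (candidate solutions)
    v = np.zeros((n_bats, dim))                           # bat velocities
    fitness = np.apply_along_axis(cost, 1, x)
    best = x[np.argmin(fitness)].copy()

    for _ in range(n_iter):
        for i in range(n_bats):
            freq = fmin + (fmax - fmin) * rng.random()    # random frequency per bat
            v[i] += (x[i] - best) * freq
            candidate = np.clip(x[i] + v[i], lower, upper)
            if rng.random() > pulse_rate:                 # local search around the best bat
                candidate = np.clip(best + 0.01 * rng.standard_normal(dim), lower, upper)
            f_new = cost(candidate)
            if f_new < fitness[i] and rng.random() < loudness:
                x[i], fitness[i] = candidate, f_new
                if f_new < cost(best):
                    best = candidate.copy()
    return best, cost(best)

# Stand-in objective: e.g. a weighted distance from a coverage target per test-case weight vector.
best, value = bat_algorithm(lambda z: np.sum(z ** 2), dim=5)
print(best.round(3), value)
```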

Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm

Procedia PDF Downloads 372
965 Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform

Authors: S. Hutasavi, D. Chen

Abstract:

The built-up area is a significant proxy to measure regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide the accessibility and computational power for those countries to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery based on GEE facilities. Normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), the Built-up Index (BUI), and the Modified Built-up Index (MBUI). These indices were applied to identify built-up areas in the EEC. The results show that MBUI performs better than BUI and NDBI, with the highest accuracy of 0.85 and a Kappa of 0.82. Moreover, the overall classification accuracy improved from 79% to 90%, and the error in total built-up area decreased from 29% to 0.7%, after incorporating nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB). The results suggest that MBUI combined with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further study of the socioeconomic impacts of regional development policy over the EEC region.
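
For reference, the standard NDBI and a simple BUI can be computed from Landsat 8 surface reflectance bands as sketched below (a NumPy illustration with the usual band definitions; the paper's MBUI variant, the thresholding scheme, and the GEE-specific code are not reproduced here).

```python
import numpy as np

def ndbi(swir1, nir):
    """Normalized Difference Built-up Index: (SWIR1 - NIR) / (SWIR1 + NIR)."""
    return (swir1 - nir) / (swir1 + nir + 1e-10)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-10)

def bui(swir1, nir, red):
    """A common Built-up Index formulation: NDBI minus NDVI."""
    return ndbi(swir1, nir) - ndvi(nir, red)

# Dummy 3x3 surface-reflectance patches standing in for Landsat 8 bands B4 (red), B5 (NIR), B6 (SWIR1).
red   = np.array([[0.10, 0.12, 0.11], [0.20, 0.22, 0.21], [0.15, 0.16, 0.14]])
nir   = np.array([[0.40, 0.42, 0.41], [0.25, 0.26, 0.24], [0.30, 0.31, 0.29]])
swir1 = np.array([[0.20, 0.22, 0.21], [0.35, 0.36, 0.34], [0.28, 0.29, 0.27]])

built_up_mask = bui(swir1, nir, red) > 0     # simple illustrative threshold; the study tunes this per index
print(built_up_mask)
```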

Keywords: built-up area extraction, google earth engine, adaptive thresholding method, rapid mapping

Procedia PDF Downloads 118
964 Aerodynamic Modelling of Unmanned Aerial System through Computational Fluid Dynamics: Application to the UAS-S45 Balaam

Authors: Maxime A. J. Kuitche, Ruxandra M. Botez, Arthur Guillemin

Abstract:

As Unmanned Aerial Systems have found diverse uses in both military and civil aviation, interest in obtaining accurate aerodynamic models has grown enormously. Recent modeling techniques are procedures using optimization algorithms and statistics that require many flight tests and are therefore extremely demanding in terms of cost. This paper presents a procedure to estimate the aerodynamic behavior of an unmanned aerial system numerically using computational fluid dynamics analysis. The study was performed using an unstructured mesh obtained from a grid convergence analysis at a Mach number of 0.14 and an angle of attack of 0°. The flow around the aircraft was described using a standard k-ω turbulence model, and the Reynolds-Averaged Navier-Stokes (RANS) equations were solved using the ANSYS FLUENT software. The method was applied to the UAS-S45, designed and manufactured by Hydra Technologies in Mexico. The lift, drag, and pitching moment coefficients were obtained at different angles of attack for several flight conditions defined in terms of altitudes and Mach numbers. The results obtained from the computational fluid dynamics analysis were compared with results obtained using the DATCOM semi-empirical procedure. This comparison indicated that our approach is highly accurate and that the aerodynamic model obtained could be useful for estimating the flight dynamics of the UAS-S45.
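
The non-dimensional coefficients mentioned above follow from the integrated CFD forces and moments in the usual way; the sketch below shows that conversion (standard definitions, with assumed reference values that are not the UAS-S45's actual geometry or CFD output).

```python
def aero_coefficients(lift, drag, pitch_moment, rho, v, s_ref, c_ref):
    """Convert dimensional CFD forces/moments to lift, drag and pitching-moment coefficients."""
    q = 0.5 * rho * v ** 2                 # dynamic pressure
    cl = lift / (q * s_ref)
    cd = drag / (q * s_ref)
    cm = pitch_moment / (q * s_ref * c_ref)
    return cl, cd, cm

# Illustrative numbers only (not the UAS-S45 reference geometry or actual CFD results).
rho, v = 1.112, 47.6                       # assumed air density [kg/m^3] and speed for Mach ~0.14
s_ref, c_ref = 2.0, 0.6                    # assumed wing area [m^2] and mean chord [m]
for alpha, (L, D, M) in {0: (950.0, 55.0, -20.0), 4: (1500.0, 80.0, -35.0)}.items():
    cl, cd, cm = aero_coefficients(L, D, M, rho, v, s_ref, c_ref)
    print(f"alpha={alpha} deg: CL={cl:.3f}, CD={cd:.3f}, Cm={cm:.3f}")
```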

Keywords: aerodynamic modelling, CFD Analysis, ANSYS FLUENT, UAS-S45

Procedia PDF Downloads 370
963 Voice Liveness Detection Using Kolmogorov Arnold Networks

Authors: Arth J. Shah, Madhu R. Kamble

Abstract:

Voice biometric liveness detection aims to certify that the voice data presented in an authentication process is genuine and not a recording or synthetic voice. With the rise of deepfakes and other similarly sophisticated spoofing generation techniques, it is becoming challenging to ensure that the person on the other end is a live speaker. A Voice Liveness Detection (VLD) system is a group of security measures that detect and prevent voice spoofing attacks. Motivated by the recent development of the Kolmogorov-Arnold Network (KAN), which is based on the Kolmogorov-Arnold representation theorem, we propose KAN for the VLD task. To date, multilayer perceptron (MLP) based classifiers have been used for such classification tasks. We aim not only to capture the compositional structure of the model but also to optimize the values of the univariate functions. This study presents both a mathematical and an experimental analysis of KAN for VLD tasks, thereby opening a new perspective for scientists working on speech and signal processing tasks. The study combines traditional signal processing with new deep learning models, which proves to be a better combination for VLD tasks. The experiments are performed on the POCO and ASVspoof 2017 V2 databases. We used Constant-Q transform (CQT), Mel, and short-time Fourier transform (STFT) based front-end features and used CNN, BiLSTM, and KAN as back-end classifiers. The best accuracy is 91.26% on the POCO database using STFT features with the KAN classifier. On the ASVspoof 2017 V2 database, the lowest EER obtained was 26.42%, using CQT features and KAN as the classifier.
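
As an illustration of the front-end/back-end split described above, the sketch below extracts a log-magnitude STFT feature vector from a waveform with SciPy and feeds it to a small scikit-learn MLP standing in for the back-end classifier (the KAN itself is not reproduced here; signal parameters, synthetic utterances, and labels are assumptions).

```python
import numpy as np
from scipy.signal import stft
from sklearn.neural_network import MLPClassifier

def stft_features(waveform, fs=16000, n_fft=512):
    """Log-magnitude STFT front-end, pooled over time to a fixed-length feature vector."""
    _, _, Z = stft(waveform, fs=fs, nperseg=n_fft)
    log_mag = np.log1p(np.abs(Z))                      # (freq bins, frames)
    return np.concatenate([log_mag.mean(axis=1), log_mag.std(axis=1)])

# Synthetic stand-ins for live vs. spoofed utterances (1 second each at 16 kHz).
rng = np.random.default_rng(0)
t = np.arange(16000) / 16000
live  = [np.sin(2 * np.pi * 220 * t) + 0.05 * rng.standard_normal(t.size) for _ in range(8)]
spoof = [np.sin(2 * np.pi * 220 * t) * np.exp(-3 * t) for _ in range(8)]

X = np.array([stft_features(w) for w in live + spoof])
y = np.array([1] * 8 + [0] * 8)                        # 1 = live, 0 = spoofed

clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```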

Keywords: Kolmogorov Arnold networks, multilayer perceptron, pop noise, voice liveness detection

Procedia PDF Downloads 33
962 Hand Gesture Recognition for Sign Language: A New Higher Order Fuzzy HMM Approach

Authors: Saad M. Darwish, Magda M. Madbouly, Murad B. Khorsheed

Abstract:

Sign languages (SL) are the most accomplished forms of gestural communication. Their automatic analysis is therefore a real challenge, one that is closely tied to their lexical and syntactic levels of organization. Hidden Markov models (HMMs) have been used prominently and successfully in speech recognition and, more recently, in handwriting recognition. Consequently, they seem ideal for the visual recognition of complex, structured hand gestures such as those found in sign language. In this paper, several results concerning static hand gesture recognition using an algorithm based on the Type-2 Fuzzy HMM (T2FHMM) are presented. The features used as observables in the training as well as in the recognition phase are based on Singular Value Decomposition (SVD). SVD is an extension of eigendecomposition to non-square matrices and is used here to reduce multi-attribute hand gesture data to feature vectors; it optimally exposes the geometric structure of a matrix. In our approach, we replace the basic HMM arithmetic operators with adequate Type-2 fuzzy operators that permit us to relax the additive constraint of probability measures. T2FHMMs are therefore able to handle both the random and the fuzzy uncertainties that exist universally in sequential data. Experimental results show that T2FHMMs can effectively handle noise and dialect uncertainties in hand signals and give better classification performance than classical HMMs. The recognition rate of the proposed system is 100% for uniform hand images and 86.21% for cluttered hand images.
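
The SVD feature extraction step described above can be sketched as follows (an assumed NumPy example that reduces a grayscale hand image to a short vector of leading singular values; the image, the vector length, and the normalisation are placeholders, not the paper's exact pipeline).

```python
import numpy as np

def svd_feature_vector(gray_image, k=10):
    """Reduce a grayscale hand image to its k largest singular values (the HMM observables)."""
    singular_values = np.linalg.svd(gray_image.astype(float), compute_uv=False)
    feats = singular_values[:k]
    return feats / (np.linalg.norm(feats) + 1e-12)      # scale-invariant observation vector

# Dummy 64x64 "hand image" standing in for a segmented frame.
rng = np.random.default_rng(1)
frame = rng.random((64, 64))
observation = svd_feature_vector(frame)
print(observation.round(4))                             # fed to the (type-2 fuzzy) HMM per frame
```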

Keywords: hand gesture recognition, hand detection, type-2 fuzzy logic, hidden Markov Model

Procedia PDF Downloads 456
961 Constructions of Linear and Robust Codes Based on Wavelet Decompositions

Authors: Alla Levina, Sergey Taranov

Abstract:

The classical approach to providing noise immunity and integrity for information processed in computing devices and communication channels is to use linear codes. Linear codes have fast and efficient encoding and decoding algorithms, but they concentrate their detection and correction abilities on certain error configurations. Robust codes, by contrast, can protect against any configuration of errors with a predetermined probability. This is accomplished by the use of perfect nonlinear and almost perfect nonlinear functions to calculate the code redundancy. The paper presents an error-correcting coding scheme using the biorthogonal wavelet transform. Wavelet transforms are applied in various fields of science; some of their applications are cleaning a signal from noise, data compression, and spectral analysis of signal components. The article suggests methods for constructing linear codes based on wavelet decomposition. For the developed constructions, we build generator and check matrices that contain the scaling function coefficients of the wavelet. Based on the linear wavelet codes, we develop robust codes that provide uniform protection against all errors. We propose two constructions of robust codes: the first class is based on the multiplicative inverse in a finite field, and in the second construction the redundancy part is the cube of the information part. The paper also investigates the characteristics of the proposed robust and linear codes.
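
To illustrate the second robust construction mentioned above (redundancy equal to the cube of the information part), a toy encoder and checker over a small prime field are sketched below; the paper works over finite fields matched to the wavelet codes, so the field choice and word sizes here are assumptions for illustration only.

```python
P = 257  # small prime field GF(257) assumed for illustration (the paper's field may differ)

def encode(info: int) -> tuple[int, int]:
    """Robust codeword: (information part, redundancy = information cubed in the field)."""
    return info % P, pow(info, 3, P)

def check(word: tuple[int, int]) -> bool:
    """An error is detected whenever the redundancy no longer equals the cube of the info part."""
    info, red = word
    return pow(info, 3, P) == red % P

info = 123
codeword = encode(info)
corrupted = (codeword[0] ^ 0b0100, codeword[1])   # flip a bit in the information part
print(check(codeword), check(corrupted))          # True, False (this particular error is caught)
```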

Keywords: robust code, linear code, wavelet decomposition, scaling function, error masking probability

Procedia PDF Downloads 485
960 Unsupervised Feature Learning by Pre-Route Simulation of Auto-Encoder Behavior Model

Authors: Youngjae Jin, Daeshik Kim

Abstract:

This paper describes cycle-accurate simulation results for the weight values learned by an auto-encoder behavior model, in terms of pre-route simulation. Given these results, we visualized the first-layer representations with natural images. Many common deep learning threads have focused on learning high-level abstractions of unlabeled raw data by unsupervised feature learning. However, when handling such huge amounts of data, the computational complexity and run time of the learning methods have limited advanced research. These limitations come from the fact that the algorithms were run only on single-core CPUs. For this reason, parallel hardware such as FPGAs has been seen as a possible solution to overcome these limitations. We adopted a ready-made auto-encoder and simulated it to design a behavior model in Verilog HDL before designing the hardware. With the auto-encoder behavior model pre-route simulation, we obtained cycle-accurate results for the parameters of each hidden layer using ModelSim. Cycle-accurate results are a very important factor in designing parallel digital hardware. Finally, this paper shows an appropriate operation of behavior-model-based pre-route simulation. Moreover, we visualized the learned latent representations of the first hidden layer with the Kyoto natural image dataset.
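
A software reference of the kind such a behavior model would mirror can be sketched as below (a minimal NumPy auto-encoder forward/backward pass on dummy image patches; the layer sizes, activation, and update rule are assumptions, not the paper's exact network).

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 64, 16                       # e.g. 8x8 image patches, 16 hidden units (assumed)
W1 = 0.1 * rng.standard_normal((n_hidden, n_in))
W2 = 0.1 * rng.standard_normal((n_in, n_hidden))
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def step(x, lr=0.1):
    """One forward/backward pass of a small auto-encoder on a batch of patches."""
    h = sigmoid(W1 @ x)                       # encoder (first hidden layer representation)
    x_hat = sigmoid(W2 @ h)                   # decoder (reconstruction)
    err = x_hat - x
    # gradients of the squared reconstruction error through the sigmoids
    d_out = err * x_hat * (1 - x_hat)
    d_hid = (W2.T @ d_out) * h * (1 - h)
    W2[...] -= lr * d_out @ h.T / x.shape[1]
    W1[...] -= lr * d_hid @ x.T / x.shape[1]
    return float(np.mean(err ** 2))

patches = rng.random((n_in, 256))             # 256 random patches standing in for natural images
for epoch in range(50):
    mse = step(patches)
print("final reconstruction MSE:", round(mse, 4))
# Rows of W1 are the first-layer "filters" one would visualize, as in the paper.
```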

Keywords: auto-encoder, behavior model simulation, digital hardware design, pre-route simulation, unsupervised feature learning

Procedia PDF Downloads 439
959 Comparative Study on the Effect of Compaction Energy and Moisture Content on the Strength Properties of Lateritic Soil

Authors: Ahmad Idris, O.A. Uche, Ado Y Abdulfatah

Abstract:

Lateritic soils are found in abundance and are the most common type of soil used in the construction of roads and embankments in Nigeria. The strength properties of these soils depend on the amount of compaction applied and the amount of water available in the soil at the time of compaction. In this study, the influence of the compactive effort and of the amount of water in the soil on the shear strength properties of a lateritic soil was investigated. A lateritic soil sample was collected from an existing borrow pit in Kano, Nigeria; its basic characteristics were determined, and the soil was classified according to the AASHTO classification method. The soil was then compacted under various compactive efforts and over a wide range of moisture contents. The maximum dry density (MDD) and optimum moisture content (OMC) at each compactive effort were determined. An unconfined undrained triaxial test was carried out to determine the shear strength properties of the soil under various conditions of moisture and energy. Preliminary results indicated that the soil is an A-7-5 soil. The final results show that, as the compaction energy is increased, both the cohesion and the friction angle increase, irrespective of the moisture content used in the compaction. However, when the amount of water in the soil is increased and the compaction effort is kept constant, only the cohesion of the soil increases, while the friction angle shows no clear pattern of variation. It was also found that the highest values of cohesion and friction angle were obtained when the soil was compacted at the highest energy and at the OMC.

Keywords: laterite, OMC, compaction energy, moisture content

Procedia PDF Downloads 399
958 Preliminary Analysis on Land Use-Land Cover Assessment of Post-Earthquake Geohazard: A Case Study in Kundasang, Sabah

Authors: Nur Afiqah Mohd Kamal, Khamarrul Azahari Razak

Abstract:

The earthquake aftermath has become a major concern, especially in high-seismicity regions. In Kundasang, Sabah, the earthquake on 5th June 2015 resulted in several catastrophes: landslides, rockfalls, mudflows, and major slope failures, in addition to the series of aftershocks. The consequences of an earthquake generate and induce episodic disasters that are not only life-threatening but also affect infrastructure and economic development. Therefore, investigating the change in land use and land cover (LULC) caused by post-earthquake geohazards is essential for identifying the extent of their disastrous effects on development in Kundasang. With the advancement of remote sensing technology, post-earthquake geohazards (landslides, mudflows, rockfalls, debris flows) can be assessed by employing object-based image analysis to investigate LULC change, covering settlements, public infrastructure, and vegetation cover. This paper therefore discusses preliminary results on the post-earthquake geohazard distribution in Kundasang and evaluates the effect of LULC classification on the occurrence of geohazard events. The results of this preliminary analysis will provide an overview for determining the extent of geohazard impact on LULC. This research also provides beneficial input to the local authority in Kundasang about the risk of future structural development in geohazard areas.

Keywords: geohazard, land use land cover, object-based image analysis, remote sensing

Procedia PDF Downloads 240
957 Development of PM2.5 Forecasting System in Seoul, South Korea Using Chemical Transport Modeling and ConvLSTM-DNN

Authors: Ji-Seok Koo, Hee-Yong Kwon, Hui-Young Yun, Kyung-Hui Wang, Youn-Seo Koo

Abstract:

This paper presents a forecasting system for PM2.5 levels in Seoul, South Korea, leveraging a combination of chemical transport modeling and ConvLSTM-DNN machine learning technology. Exposure to PM2.5 has known detrimental impacts on public health, making its prediction crucial for establishing preventive measures. Existing forecasting models, such as the Community Multiscale Air Quality (CMAQ) and Weather Research and Forecasting (WRF) models, are hindered by their reliance on uncertain input data, such as anthropogenic emissions and meteorological patterns, as well as certain intrinsic model limitations. The system we have developed specifically addresses these issues by integrating machine learning and using carefully selected input features that account for local and distant sources of PM2.5. In South Korea, the PM2.5 concentration is greatly influenced by both local emissions and long-range transport from China, and our model effectively captures these spatial and temporal dynamics. Our PM2.5 prediction system combines the strengths of the advanced hybrid machine learning algorithms ConvLSTM and DNN to improve upon the limitations of the traditional CMAQ model. Data used in the system include forecast information from the CMAQ and WRF models, along with actual PM2.5 concentrations and weather variables from monitoring stations in China and South Korea. The system was implemented specifically for Seoul's PM2.5 forecasting.
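
A hybrid ConvLSTM-DNN of the kind described could be laid out roughly as below (an assumed Keras sketch in which gridded CMAQ/WRF fields enter a ConvLSTM branch and station-level scalars enter a dense branch; all shapes and layer sizes are illustrative, not the authors' architecture).

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

# Assumed input shapes: 24 past hours of 32x32 gridded fields with 3 channels
# (e.g. CMAQ PM2.5 and wind components), plus 10 station-level scalar features.
grid_in = layers.Input(shape=(24, 32, 32, 3), name="gridded_fields")
scalar_in = layers.Input(shape=(10,), name="station_features")

x = layers.ConvLSTM2D(16, kernel_size=3, padding="same", return_sequences=False)(grid_in)
x = layers.GlobalAveragePooling2D()(x)            # collapse the spatial map to a vector

y = layers.Dense(32, activation="relu")(scalar_in)

z = layers.Concatenate()([x, y])
z = layers.Dense(64, activation="relu")(z)
out = layers.Dense(1, name="pm25_next_day")(z)    # predicted PM2.5 concentration

model = Model([grid_in, scalar_in], out)
model.compile(optimizer="adam", loss="mse")
model.summary()
```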

Keywords: PM2.5 forecast, machine learning, convLSTM, DNN

Procedia PDF Downloads 52
956 The Formulation of R&D Strategy for Biofuel Technology: A Case Study of the Aviation Industry in Iran

Authors: Maryam Amiri, Ali Rajabzade, Gholam Reza Goudarzi, Reza Heidari

Abstract:

The growth of technology and environmental change is so fast that companies and industries show a strong tendency toward R&D activities in order to participate actively in the market and achieve competitive advantage. The aviation industry and its subdivisions involve high-level technology and play a special role in the economic and social development of countries. Thus, in order to acquire new technologies and compete with the aviation industries of other countries, capability in R&D is required. An appropriate R&D strategy helps ensure that state-of-the-art technologies can be acquired. Biofuel technology is one of the newest technologies and has attracted worldwide attention in the aviation industry. The purpose of this research was to formulate an R&D strategy for biofuel technology in the aviation industry of Iran. After reviewing the theoretical foundations of R&D methods and strategies, we classified R&D strategies into four main categories: internal R&D, collaborative R&D, outsourced R&D, and in-house R&D. Following this review, a model for formulating an R&D strategy aimed at developing biofuel technology in the aviation industry in Iran was offered. With regard to the requirements and characteristics of the industry and the technology, the model presents an integrated approach to R&D. Based on decision-making techniques and the analysis of structured expert opinion, four R&D strategies for different scenarios, aimed at developing biofuel technology in the aviation industry in Iran, were recommended. In this research, based on the common features of the R&D implementation process, a logical classification of these methods is presented as R&D strategies. The R&D strategies and their characteristics were then developed according to the experts. Finally, we introduce a model that considers the role of the aviation industry and biofuel technology in R&D strategies and formulate a specific R&D strategy for the various conditions and scenarios of the aviation industry.

Keywords: aviation industry, biofuel technology, R&D, R&D strategy

Procedia PDF Downloads 572
955 Simulation of Glass Breakage Using Voronoi Random Field Tessellations

Authors: Michael A. Kraus, Navid Pourmoghaddam, Martin Botz, Jens Schneider, Geralt Siebert

Abstract:

Fragmentation analysis of tempered glass gives insight into the quality of the tempering process and also defines a certain degree of safety. Different standards, such as the European EN 12150-1 or the American ASTM C 1048/CPSC 16 CFR 1201, define a minimum number of fragments required for soda-lime safety glass on the basis of fragmentation test results for classification. This work presents an approach to glass breakage pattern prediction using a Voronoi tessellation over random fields. The random Voronoi tessellation is trained with and validated against data from several breakage patterns. Fragments in observation areas of 50 mm x 50 mm were used for training and validation. All glass specimens used in this study were commercially available soda-lime glasses at three thickness levels: 4 mm, 8 mm, and 12 mm. The results of this work form a Bayesian framework for the training and prediction of breakage patterns of tempered soda-lime glass using a Voronoi random field tessellation. Uncertainties occurring in this process can be well quantified, and several statistical measures of the pattern can be preserved with this method. Within this work it was found that different random fields as the basis for the Voronoi tessellation lead to differently well-fitted statistical properties of the glass breakage patterns. As the methodology is derived and kept general, the framework could also be applied to other random tessellations and to crack pattern modelling purposes.
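
The basic building block, a Voronoi tessellation generated from random seed points over an observation window, can be sketched as below (a SciPy example with a homogeneous seed pattern; the paper instead draws the seeds from trained random fields, which is not reproduced here, and the fragment count is an assumption).

```python
import numpy as np
from scipy.spatial import Voronoi

# Random seed points in a 50 mm x 50 mm observation window (homogeneous here;
# the paper modulates the seed intensity with a random field).
rng = np.random.default_rng(42)
n_fragments = 60                                # roughly the expected fragment count (assumed)
seeds = rng.uniform(0.0, 50.0, size=(n_fragments, 2))

vor = Voronoi(seeds)

# Each finite Voronoi region stands for one glass fragment; report a simple statistic.
finite_regions = [r for r in vor.regions if r and -1 not in r]
print(f"{len(finite_regions)} finite fragments out of {n_fragments} seeds")
print("first region vertex coordinates:\n", np.round(vor.vertices[finite_regions[0]], 2))
```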

Keywords: glass breakage prediction, Voronoi random field tessellation, fragmentation analysis, Bayesian parameter identification

Procedia PDF Downloads 157
954 Developing Reading Methods of Industrial Education Students at King Mongkut’s Institute of Technology Ladkrabang

Authors: Rattana Sangchan, Pattaraporn Thampradit

Abstract:

Teaching students to use a variety of reading methods to develop their reading is essential for Thai university students. However, there have not been many studies concerned with the reading development methods used by Thai students in the field of industrial education. Therefore, this study was carried out not only to investigate the reading development methods of Industrial Education students at King Mongkut's Institute of Technology Ladkrabang, but also to determine whether these methods differ among students' reading abilities and between genders. The research instrument used in collecting the data consisted of fourteen statements covering metacognitive strategies, cognitive strategies, and social/affective strategies. The results of this study revealed that students could develop their reading methods at a moderate level (mean = 3.13). Furthermore, students with high reading ability used reading development methods at different levels from students with mid reading ability. In addition, high reading ability students could use either metacognitive or cognitive reading methods to develop their reading much better than mid reading ability students. Interestingly, male students could develop their reading methods at a high level, while female students could develop their reading methods only at a moderate level. Male students could also use either metacognitive or cognitive reading methods to develop their reading much better than female students. Thus, the results of this study indicate that most students need to apply many more reading strategies to develop their reading. Suggestions are also provided on how to motivate and train students to apply more appropriate and effective reading strategies to better comprehend their reading.

Keywords: developing reading methods, industrial education, reading abilities, reading method classification

Procedia PDF Downloads 283
953 Nanoparticle-Based Histidine-Rich Protein-2 Assay for the Detection of the Malaria Parasite Plasmodium Falciparum

Authors: Yagahira E. Castro-Sesquen, Chloe Kim, Robert H. Gilman, David J. Sullivan, Peter C. Searson

Abstract:

Diagnosis of severe malaria is particularly important in highly endemic regions, since most patients are positive for parasitemia and treatment differs from that for non-severe malaria. Diagnosis can be challenging due to the prevalence of diseases with similar symptoms. Accurate diagnosis is increasingly important to avoid overprescribing antimalarial drugs, minimize drug resistance, and minimize costs. A nanoparticle-based assay for the detection and quantification of Plasmodium falciparum histidine-rich protein 2 (HRP2) in urine and serum is reported. The assay uses magnetic beads conjugated with anti-HRP2 antibody for protein capture and concentration, and antibody-conjugated quantum dots for optical detection. Western blot analysis demonstrated that the magnetic beads allow a 20-fold concentration of HRP2 protein in urine. The concentration effect was achieved because a large volume of urine can be incubated with the beads, and magnetic separation can be performed in minutes to isolate the beads carrying HRP2 protein. Magnetic beads and Quantum Dot 525 conjugated to anti-HRP2 antibodies allow the detection of low concentrations of HRP2 protein (0.5 ng mL-1) and quantification in the range of 33 to 2,000 ng mL-1, corresponding to the range associated with non-severe to severe malaria. This assay can be easily adapted to a non-invasive point-of-care test for the classification of severe malaria.

Keywords: HRP2 protein, malaria, magnetic beads, Quantum dots

Procedia PDF Downloads 328
952 Bridge Health Monitoring: A Review

Authors: Mohammad Bakhshandeh

Abstract:

Structural Health Monitoring (SHM) is a crucial and necessary practice that plays a vital role in ensuring the safety and integrity of critical structures, and in particular, bridges. The continuous monitoring of bridges for signs of damage or degradation through Bridge Health Monitoring (BHM) enables early detection of potential problems, allowing for prompt corrective action to be taken before significant damage occurs. Although all monitoring techniques aim to provide accurate and decisive information regarding the remaining useful life, safety, integrity, and serviceability of bridges, understanding the development and propagation of damage is vital for maintaining uninterrupted bridge operation. Over the years, extensive research has been conducted on BHM methods, and experts in the field have increasingly adopted new methodologies. In this article, we provide a comprehensive exploration of the various BHM approaches, including sensor-based, non-destructive testing (NDT), model-based, and artificial intelligence (AI)-based methods. We also discuss the challenges associated with BHM, including sensor placement and data acquisition, data analysis and interpretation, cost and complexity, and environmental effects, through an extensive review of relevant literature and research studies. Additionally, we examine potential solutions to these challenges and propose future research ideas to address critical gaps in BHM.

Keywords: structural health monitoring (SHM), bridge health monitoring (BHM), sensor-based methods, machine learning algorithms, model-based techniques, sensor placement, data acquisition, data analysis

Procedia PDF Downloads 85
951 Convolutional Neural Networks versus Radiomic Analysis for Classification of Breast Mammogram

Authors: Mehwish Asghar

Abstract:

Breast cancer (BC) is a common type of cancer among women. Its screening is usually performed using different imaging modalities such as magnetic resonance imaging, mammography, X-ray, and CT. Among these modalities, the mammogram is considered a powerful tool for the diagnosis and screening of breast cancer. Sophisticated machine learning approaches have shown promising results in complementing human diagnosis. Generally, machine learning methods can be divided into two major classes: one is radiomics analysis (RA), where image features are extracted manually, and the other is the convolutional neural network (CNN), in which the computer learns to recognize image features on its own. This research aims to improve the rate of early detection, and thus reduce the mortality caused by breast cancer, through the latest advancements in computer science in general and machine learning in particular. It also aims to ease the burden on doctors by improving and automating the process of breast cancer detection. The work is a comparative analysis of different techniques and models for detecting and classifying breast cancer, and its main goal is to provide a detailed view of the results and performance of these techniques. The purpose of this paper is to explore the potential of a convolutional neural network (CNN) both as a feature extractor and as a classifier. A radiomics module is also added so that its results can be compared with those of the deep learning techniques.
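
To make the radiomics branch concrete, the sketch below extracts a few classic handcrafted texture features (GLCM statistics via scikit-image) from a grayscale patch, which could then be fed to a conventional classifier; the patch and the feature set are illustrative assumptions, in contrast to the CNN branch, which learns features directly from the pixels.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch_uint8):
    """Handcrafted radiomic-style texture features from a grayscale mammogram patch."""
    glcm = graycomatrix(patch_uint8, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    return {prop: float(graycoprops(glcm, prop).mean())
            for prop in ("contrast", "homogeneity", "energy", "correlation")}

# Dummy 128x128 patch standing in for a region of interest cropped from a mammogram.
rng = np.random.default_rng(0)
patch = (rng.random((128, 128)) * 255).astype(np.uint8)
print(glcm_features(patch))
# These feature vectors would train a classical model (e.g. SVM or random forest),
# whereas the CNN branch consumes the raw pixels end-to-end.
```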

Keywords: breast cancer (BC), machine learning (ML), convolutional neural network (CNN), radiomics, magnetic resonance imaging, artificial intelligence

Procedia PDF Downloads 219
950 A Preliminary Literature Review of Digital Transformation Case Studies

Authors: Vesna Bosilj Vukšić, Lucija Ivančić, Dalia Suša Vugec

Abstract:

While struggling to succeed in today's complex market environment and to provide better customer experience and services, enterprises embrace digital transformation as a means of reaching competitiveness and fostering value creation. A digital transformation process consists of information technology implementation projects as well as organizational factors such as top management support, a digital transformation strategy, and organizational changes. However, to the best of our knowledge, there is little evidence about digital transformation endeavors in organizations and how they perceive them: is digital transformation only about adopting digital technologies, or is a true organizational shift needed? In order to address this issue, and as the first step in our research project, a literature review was conducted. The analysis included case study papers from the Scopus and Web of Science databases. The following attributes are considered for the classification and analysis of the papers: time component; country of case origin; case industry; and digital transformation concept comprehension, i.e., focus. The research showed that organizations, public as well as private ones, are aware of the necessity for change and undertake digital transformation projects. The changes concerning digital transformation affect both manufacturing and service-based industries. Furthermore, we discovered that organizations understand that, besides the implementation of technologies, organizational changes must also be adopted. However, with only 29 relevant papers identified, the research positions digital transformation as an unexplored and emerging phenomenon in information systems research. The scarcity of evidence-based papers calls for further examination of this topic through cases from practice.

Keywords: digital strategy, digital technologies, digital transformation, literature review

Procedia PDF Downloads 212
949 Design and Optimization of Open Loop Supply Chain Distribution Network Using Hybrid K-Means Cluster Based Heuristic Algorithm

Authors: P. Suresh, K. Gunasekaran, R. Thanigaivelan

Abstract:

Radio frequency identification (RFID) technology has been attracting considerable attention with the expectation of improved supply chain visibility for consumer goods, apparel, and pharmaceutical manufacturers, as well as retailers and government procurement agencies. It is also expected to improve the consumer shopping experience by making it more likely that the products they want to purchase are available. Recent announcements from some key retailers have brought interest in RFID to the forefront. A modified K-means cluster based heuristic approach, a hybrid Genetic Algorithm (GA) - Simulated Annealing (SA) approach, a hybrid K-means cluster based heuristic-GA, and a hybrid K-means cluster based heuristic-GA-SA are proposed for the open loop supply chain network problem. The study incorporated a uniform crossover operator and a combined crossover operator in the GAs for solving the open loop supply chain distribution network problem. The algorithms are tested on 50 randomly generated data sets and compared with each other. The results of the numerical experiments show that the hybrid K-means cluster based heuristic-GA-SA, when tested on the 50 randomly generated data sets, shows superior performance to the other methods for solving the open loop supply chain distribution network problem.
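
The K-means clustering step that seeds such heuristics can be sketched as below (a scikit-learn example that groups randomly generated customer coordinates into candidate distribution clusters; the coordinates, cluster count, and demand weighting are assumptions, and the GA/SA refinement stage is not shown).

```python
import numpy as np
from sklearn.cluster import KMeans

# Randomly generated customer locations and demands, standing in for one test data set.
rng = np.random.default_rng(7)
customers = rng.uniform(0, 100, size=(200, 2))      # (x, y) coordinates in arbitrary units
demand = rng.integers(1, 20, size=200)

# Cluster customers into candidate service regions; each centroid is a candidate depot site.
k = 5
km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(customers, sample_weight=demand)

for c in range(k):
    members = customers[km.labels_ == c]
    print(f"cluster {c}: depot ~ {km.cluster_centers_[c].round(1)}, "
          f"{len(members)} customers, demand {int(demand[km.labels_ == c].sum())}")
# A GA / simulated-annealing stage would then refine assignments and routes within clusters.
```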

Keywords: RFID, supply chain distribution network, open loop supply chain, genetic algorithm, simulated annealing

Procedia PDF Downloads 161
948 Predicting the Compressive Strength of Geopolymer Concrete Using Machine Learning Algorithms: Impact of Chemical Composition and Curing Conditions

Authors: Aya Belal, Ahmed Maher Eltair, Maggie Ahmed Mashaly

Abstract:

Geopolymer concrete is gaining recognition as a sustainable alternative to conventional Portland cement concrete due to its environmentally friendly nature, which is a key goal of smart city initiatives. It has demonstrated its potential as a reliable material for the design of structural elements. However, the production of geopolymer concrete is hindered by batch-to-batch variations, which presents a significant challenge to its widespread adoption. Machine learning has had a profound impact on various fields by enabling models to learn from large datasets and predict outputs accurately. This paper proposes an integration of the current drive towards artificial intelligence with the composition of geopolymer mixtures to predict their mechanical properties. The study employs Python software to develop a machine learning model, specifically decision trees. The research uses the oxide percentages and the chemical composition of the alkali solution, along with the curing conditions, as the independent input parameters, irrespective of the waste products used in the mixture, and the compressive strength of the mix as the output parameter. The results showed 90% agreement between the predicted and actual values, with the ratio of the sodium silicate to the sodium hydroxide solution being the dominant parameter in the mixture.
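
A decision-tree regressor of the kind described can be set up as below (a scikit-learn sketch with made-up mix-design rows; the columns mirror the input parameters mentioned in the abstract, but the numbers are placeholders, not the study's dataset).

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

# Columns (assumed): SiO2 %, Al2O3 %, CaO %, NaOH molarity, Na2SiO3/NaOH ratio,
# curing temperature [deg C], curing time [h]. Rows are illustrative placeholders.
X = np.array([
    [55.0, 25.0, 5.0, 12.0, 2.5, 60.0, 24.0],
    [50.0, 28.0, 8.0, 10.0, 2.0, 80.0, 24.0],
    [60.0, 22.0, 4.0, 14.0, 3.0, 25.0, 48.0],
    [52.0, 26.0, 6.0, 12.0, 1.5, 60.0, 48.0],
    [58.0, 24.0, 5.0, 16.0, 2.5, 80.0, 24.0],
])
y = np.array([42.0, 38.5, 30.2, 35.7, 47.1])     # compressive strength [MPa], placeholders

model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

# Predict the strength of a new mix and inspect which inputs the tree relies on most.
new_mix = [[56.0, 25.0, 5.5, 12.0, 2.8, 60.0, 24.0]]
print("predicted strength [MPa]:", model.predict(new_mix)[0].round(1))
print("feature importances:", model.feature_importances_.round(2))
```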

Keywords: decision trees, geopolymer concrete, machine learning, smart cities, sustainability

Procedia PDF Downloads 79
947 Client Hacked Server

Authors: Bagul Abhijeet

Abstract:

Background: The client-server model is the backbone of today's internet communication, in which a normal user cannot have control over a particular website or server. Using the same processing model, however, one can gain unauthorized access to a particular server. In this paper, we discuss an application scenario of hacking a simple website or server, consisting of an unauthorized way to access the server database. The application autonomously takes direct access to a simple website or server and retrieves the essential information maintained by the administrator. In this system, the IP address of the server is given as input to retrieve the user-id and password of the server. This breaks the administrative security of the server and acquires control of the server database, whereas a virus helps to bypass server security by crashing the whole server. Objective: To control malicious attacks, protect government websites, and identify the illegal activities of hackers. Results: After implementing different hacking as well as non-hacking techniques, this system hacks simple websites with normal security credentials. It provides access to the server database and allows an attacker to perform database operations from a client machine. The experimental results of this application on different servers are satisfactory. Conclusion: In this paper, we have presented a view of hacking a server using hacking as well as non-hacking methods. These algorithms and methods provide an efficient way to access a server database. Exposing weaknesses in network security in this way allows new and better security frameworks to be introduced. The term "hacking" should not be considered only in terms of illegal activities; it should also be used to strengthen our global network.

Keywords: hacking, vulnerabilities, dummy request, virus, server monitoring

Procedia PDF Downloads 249
946 Landscape Classification in North of Jordan by Integrated Approach of Remote Sensing and Geographic Information Systems

Authors: Taleb Odeh, Nizar Abu-Jaber, Nour Khries

Abstract:

The southern part of the Wadi Al Yarmouk catchment area covers the north of Jordan. It lies within latitudes 32° 20' to 32° 45' N and longitudes 35° 42' to 36° 23' E and has an area of about 1426 km2. It has high-relief topography, with elevations varying between 50 and 1100 meters above sea level. The variations in topography cause different landform units, climatic zones, land covers, and plant species, and as a result, different landscape units exist in the region. Spatial planning is a major challenge in such a vital area for Jordan and cannot be achieved without determining the landscape units. An integrated approach of remote sensing and geographic information systems (GIS) is an effective tool for investigating and mapping the landscape units of such a complicated area. Remote sensing has the capability to collect land surface data over large landscape areas, accurately and at different time periods. GIS can store these land surface data, analyze them spatially, and present them in the form of professional maps. We generated geo-land-surface data that include land cover, rock units, soil units, plant species, and a digital elevation model using an ASTER image and Google Earth, while the spatial analysis of the geo-data was done with ArcGIS 10.2 software. We found that there are twenty-two different landscape units in the study area, which have to be considered in any spatial planning in order to avoid environmental problems.

Keywords: landscape, spatial planning, GIS, spatial analysis, remote sensing

Procedia PDF Downloads 522
945 Computer Countenanced Diagnosis of Skin Nodule Detection and Histogram Augmentation: Extracting System for Skin Cancer

Authors: S. Zith Dey Babu, S. Kour, S. Verma, C. Verma, V. Pathania, A. Agrawal, V. Chaudhary, A. Manoj Puthur, R. Goyal, A. Pal, T. Danti Dey, A. Kumar, K. Wadhwa, O. Ved

Abstract:

Background: Skin cancer has become a pressing topic in medical science, and its spread is seriously affecting health and well-being worldwide. Methods: The extracted image of a skin tumor cannot be used directly for diagnosis, as the stored image contains disturbances. The proposed approach first locates the region of interest in the extracted skin image, and image partitioning (segmentation) models are presented to remove the disturbances in the picture. Results: After partitioning is complete, feature extraction is performed using a genetic algorithm (GA), and finally, classification is performed between the training and test data to evaluate the images and help doctors make the right prediction. To improve on the existing system, we set our objectives accordingly: the efficiency of the natural selection process and histogram enhancement are essential in that respect, and the GA is applied to reduce the false-positive rate. Conclusions: The objective of this work is to improve effectiveness, and the GA accomplishes this by bringing down the false-positive rate. The work combines deep learning and medical image processing, which provides superior accuracy, and the processing pipeline supports reusability without errors.

Keywords: computer-aided system, detection, image segmentation, morphology

Procedia PDF Downloads 144
944 Investigation of Cold Atmospheric Plasma Exposure Protocol on Wound Healing in Diabetic Foot Ulcer

Authors: P. Akbartehrani, M. Khaledi Pour, M. Amini, M. Khani, M. Mohajeri Tehrani, E. Ghasemi, P. Charipoor, B. Shokri

Abstract:

A common problem among diabetic patients is foot ulcers, which are chronic and require specialized treatment. Previous studies show that cold atmospheric plasma (CAP) has beneficial effects on wound healing and infection. Nevertheless, a comparison of different CAP exposure protocols for diabetic ulcer wound healing remained to be studied. This study aims to determine the effect of two different exposure protocols on wound healing in diabetic ulcers. A prospective, randomized clinical trial was conducted at two clinics. Diabetic patients with foot ulcers of Wagner grades 1 and 2 were divided into two study groups. One group was treated by the first protocol, in which wounds were treated with an argon-generated cold atmospheric plasma jet once a week for five consecutive weeks. The other group was treated by the second protocol, in which wounds were treated every three days for five consecutive weeks. The wounds were treated for 40 seconds per cubic centimeter, with the nozzle tip moved, non-localized, 1 cm above the wounds. A patient with one or more wounds could participate in different groups, as the wounds were randomized separately, which allowed a participant to be treated several times during the study. The study's main findings were the differences between the two protocols in the reduction of wound size and microbial load and in healing speed. The study concludes that CAP therapy following the second protocol yields faster healing and greater reductions in the wound size and microbial load of foot ulcers in diabetic patients.

Keywords: wound healing, diabetic ulcers, cold atmospheric plasma, cold argon jet

Procedia PDF Downloads 212
943 Factors Associated with Weight Loss Maintenance after an Intervention Program

Authors: Filipa Cortez, Vanessa Pereira

Abstract:

Introduction: The main challenge of obesity treatment is long-term weight loss maintenance. The 3 Phases Method is a weight loss program that combines a low-carbohydrate, moderately high-protein diet, food supplements, and a weekly one-to-one consultation with a certified nutritionist. Sustained weight control is the ultimate goal of phase 3. The success criterion was a minimum loss of 10% of initial weight and its maintenance after 12 months. Objective: The aim of this study was to identify factors associated with successful weight loss maintenance 12 months after the end of the 3 Phases Method. Methods: The study included 199 subjects who achieved their weight loss goal (phase 3). Weight and body mass index (BMI) were obtained at baseline and every week until the end of the program. Therapeutic adherence was measured weekly on a Likert scale from 1 to 5. Subjects were considered compliant with the nutritional recommendations and supplementation when their score was ≥ 4. Twelve months after the end of the method, the current weight and the number of previous weight loss attempts were collected by telephone interview. Statistical significance was assumed at p-values < 0.05, and statistical analyses were performed using SPSS software v.21. Results: 65.3% of subjects met the success criterion. The factors that significantly predicted weight loss maintenance were a greater percentage weight loss during the weight loss intervention (OR = 1.44) and a higher number of consultations in phase 3 (OR = 1.10). Conclusion: These findings suggest that the percentage weight loss during the weight loss intervention and the number of consultations in phase 3 may facilitate maintenance of weight loss after the 3 Phases Method.

Keywords: obesity, weight maintenance, low-carbohydrate diet, dietary supplements

Procedia PDF Downloads 146
942 A Neuropsychological Investigation of the Relationship between Anxiety Levels and Loss of Inhibitory Cognitive Control in Ageing and Dementia

Authors: Nasreen Basoudan, Andrea Tales, Frederic Boy

Abstract:

Non-clinical anxiety may comprise state anxiety, temporarily experienced anxiety related to a specific situation, and trait anxiety, a longer-lasting response or a general disposition to anxiety. While temporary and occasional anxiety, whether as a mood state or a personality dimension, is normal, non-clinical anxiety may influence many more components of information processing than previously recognized. In ageing and dementia-related research, disease characterization now involves attempts to understand a much wider range of brain functions, such as loss of inhibitory control, rather than the more common focus on memory and cognition. However, many studies have tended to include individuals with clinical anxiety disorders while excluding persons with lower levels of state or trait anxiety. Loss of inhibitory cognitive control can lead to behaviors such as aggression, reduced sensitivity to others, and sociopathic thoughts and actions. Anxiety has also been linked to inhibitory control, with research suggesting that people with anxiety are less capable of inhibiting their emotions than the average person. This study investigates the relationship between anxiety and loss of inhibitory control in younger and older adults, using a variety of questionnaires and computer-based tests. Based on the premise that, irrespective of classification, anxiety is associated with a wide range of physical, affective, and cognitive responses, this study explores evidence of the potential influence of anxiety per se on loss of inhibitory control, in order to contribute to discussion and the appropriate consideration of anxiety-related factors in methodological practice.

Keywords: anxiety, ageing, dementia, inhibitory control

Procedia PDF Downloads 236
941 Optimization of Economic Order Quantity of Multi-Item Inventory Control Problem through Nonlinear Programming Technique

Authors: Prabha Rohatgi

Abstract:

To obtain efficient control over the huge inventory of drugs in the pharmacy department of any hospital, the medicines are generally categorized first on the basis of their cost, using ABC (Always Better Control) analysis, and then on the basis of their criticality, using VED (Vital, Essential, Desirable) analysis, for prioritization. About one-third of the annual expenditure of a hospital is spent on medicines. To minimize the inventory investment, hospital management may wish to keep the medicines inventory low, as medicines are perishable items. The main aim of every hospital is to provide better services to patients under limited resources. To achieve a satisfactory level of health care services for outdoor patients, a hospital has to keep an eye on the wastage of medicines, because the expiry of medicines causes a great loss of money from a budget that is limited and allocated for a particular period of time. The objective of this study is to identify the categories of medicines requiring intensive managerial control. In this paper, to minimize the total inventory cost and the cost associated with the wastage of money due to the expiry of medicines, an inventory control model is used as an estimation tool, and a nonlinear programming technique is then applied under a limited budget and a fixed number of orders to be placed in a limited time period. Numerical computations are given, and it is shown that by using scientific methods in hospital services, inventory can be managed more effectively under limited resources and better health care services can be provided. The secondary data were collected from a hospital to give empirical evidence.
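
For reference, the classical economic order quantity and a budget-constrained multi-item variant can be sketched as below (the standard EOQ formula plus a SciPy optimization over assumed demand, ordering-cost, holding-cost, and unit-price figures; this is a generic illustration, not the hospital's data or the paper's exact model).

```python
import numpy as np
from scipy.optimize import minimize

# Assumed annual demand D, ordering cost S, holding cost per unit H, and unit price p
# for three drug categories (placeholder figures, not the study's data).
D = np.array([1200.0, 800.0, 300.0])
S = np.array([50.0, 40.0, 60.0])
H = np.array([2.0, 1.5, 3.0])
p = np.array([10.0, 25.0, 80.0])
budget = 6000.0                                   # limit on average inventory investment

eoq_unconstrained = np.sqrt(2 * D * S / H)        # classical EOQ = sqrt(2DS/H) per item

def total_cost(q):
    """Annual ordering cost plus holding cost across all items."""
    return float(np.sum(D * S / q + H * q / 2))

constraints = [{"type": "ineq", "fun": lambda q: budget - np.sum(p * q / 2)}]
result = minimize(total_cost, x0=eoq_unconstrained, method="SLSQP",
                  bounds=[(1.0, None)] * 3, constraints=constraints)

print("unconstrained EOQ:", eoq_unconstrained.round(1))
print("budget-constrained order quantities:", result.x.round(1))
print("total cost:", round(result.fun, 1))
```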

Keywords: ABC-VED inventory classification, multi item inventory problem, nonlinear programming technique, optimization of EOQ

Procedia PDF Downloads 251
940 Feature Based Unsupervised Intrusion Detection

Authors: Deeman Yousif Mahmood, Mohammed Abdullah Hussein

Abstract:

The goal of a network-based intrusion detection system is to classify the activities in network traffic into two major categories: normal and attack (intrusive) activities. Nowadays, data mining and machine learning play an important role in many sciences, including intrusion detection systems (IDS), using both supervised and unsupervised techniques. One of the essential steps of data mining is feature selection, which helps improve the efficiency, performance, and prediction rate of the proposed approach. This paper applies the unsupervised k-means clustering algorithm with information gain (IG) for feature selection and reduction to build a network intrusion detection system. For our experimental analysis, we have used the new NSL-KDD dataset, which is a modified version of the KDDCup 1999 intrusion detection benchmark dataset. With a split of 60.0% for the training set and the remainder for the testing set, a two-class classification (normal, attack) has been implemented. The Weka framework, a Java-based open-source software suite consisting of a collection of machine learning algorithms for data mining tasks, has been used in the testing process. The experimental results show that the proposed approach is very accurate, with a low false positive rate and a high true positive rate, and it takes less learning time in comparison with using the full feature set of the dataset with the same algorithm.
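
The two stages described, information-gain-based feature selection followed by unsupervised k-means clustering into normal vs. attack groups, can be sketched as below (scikit-learn stand-ins on a synthetic dataset; the original study used Weka and the NSL-KDD data, which are not reproduced here, and mutual information is used as a close relative of information gain).

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import mutual_info_classif
from sklearn.cluster import KMeans

# Synthetic stand-in for NSL-KDD: 40 features, two classes (0 = normal, 1 = attack).
X, y = make_classification(n_samples=2000, n_features=40, n_informative=8,
                           n_classes=2, random_state=0)

# Stage 1: rank features by information gain (mutual information) and keep the top 10.
ig = mutual_info_classif(X, y, random_state=0)
top = np.argsort(ig)[::-1][:10]
X_reduced = X[:, top]

# Stage 2: unsupervised k-means with two clusters; map each cluster to its majority label.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X_reduced)
cluster_to_label = {c: int(np.round(y[km.labels_ == c].mean())) for c in (0, 1)}
pred = np.array([cluster_to_label[c] for c in km.labels_])

print("selected feature indices:", top)
print("clustering accuracy vs. true labels:", round((pred == y).mean(), 3))
```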

Keywords: information gain (IG), intrusion detection system (IDS), k-means clustering, Weka

Procedia PDF Downloads 292