Search results for: automatic processing


2055 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format

Authors: Maryam Fallahpoor, Biswajeet Pradhan

Abstract:

Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-Net-based lung segmentation module was applied to obtain 3D segmented lung regions. The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area Under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
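
A minimal sketch of the pre-processing pipeline described above (isotropic resampling, HU clipping, cropping to a uniform shape, normalization and zero-centering), assuming the DICOM series has already been loaded as a NumPy volume; function names, axis order and the center-crop strategy are illustrative, not the authors' exact implementation:

```python
import numpy as np
from scipy.ndimage import zoom

def preprocess_ct(volume, spacing, target_spacing=(1.0, 1.0, 1.0),
                  target_shape=(60, 128, 128), hu_window=(-1000, 400)):
    """Resample a CT volume to 1 mm isotropic voxels, clip HU, normalize and zero-center."""
    # Resample to 1 mm x 1 mm x 1 mm voxels (linear interpolation)
    factors = [s / t for s, t in zip(spacing, target_spacing)]
    volume = zoom(volume, factors, order=1)
    # Confine intensities to the (-1000, 400) HU window
    volume = np.clip(volume, hu_window[0], hu_window[1])
    # Crop to the uniform 60 x 128 x 128 shape (center crop shown for brevity; padding omitted)
    starts = [max(0, (v - t) // 2) for v, t in zip(volume.shape, target_shape)]
    volume = volume[starts[0]:starts[0] + target_shape[0],
                    starts[1]:starts[1] + target_shape[1],
                    starts[2]:starts[2] + target_shape[2]]
    # Normalize to [0, 1], then zero-center
    volume = (volume - hu_window[0]) / (hu_window[1] - hu_window[0])
    return volume - volume.mean()
```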

Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format

Procedia PDF Downloads 88
2054 Improving the Security of Internet of Things Using Encryption Algorithms

Authors: Amirhossein Safi

Abstract:

The Internet of Things (IoT) is an advanced information technology that has drawn societies’ attention. Sensors and actuators are usually recognized as the smart devices of our environment. At the same time, IoT security raises new issues. Internet connectivity and the possibility of interacting with smart devices cause those devices to become more involved in human life. Therefore, safety is a fundamental requirement in designing IoT. IoT has three remarkable features: overall perception, reliable transmission, and intelligent processing. Because of the span of IoT, the security of conveyed data is an essential factor for system security. Hybrid encryption is a new model that can be used in IoT. This type of encryption provides strong security with low computation. In this paper, we propose a hybrid encryption algorithm designed to reduce safety risks, increase encryption speed, and lower computational complexity. The purpose of this hybrid algorithm is to provide information integrity, confidentiality, and non-repudiation in data exchange for IoT. Finally, the suggested encryption algorithm was simulated in MATLAB, and its speed and safety efficiency were evaluated in comparison with a conventional encryption algorithm.
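
The abstract does not specify which ciphers the hybrid scheme combines; the sketch below only illustrates the general hybrid pattern (a fast symmetric cipher for the payload, a slow asymmetric cipher for the session key) using AES-GCM and RSA-OAEP from the Python cryptography library. The choice of primitives and key sizes is an assumption for illustration, not the author's algorithm:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's asymmetric key pair (generated once, e.g. on the gateway)
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def hybrid_encrypt(plaintext: bytes):
    # Fast symmetric encryption of the sensor payload (confidentiality + integrity)
    session_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
    # Slow asymmetric encryption of the small session key only
    wrapped_key = public_key.encrypt(session_key, oaep)
    return wrapped_key, nonce, ciphertext

def hybrid_decrypt(wrapped_key, nonce, ciphertext):
    session_key = private_key.decrypt(wrapped_key, oaep)
    return AESGCM(session_key).decrypt(nonce, ciphertext, None)

wrapped, nonce, ct = hybrid_encrypt(b"temperature=21.4;door=closed")
print(hybrid_decrypt(wrapped, nonce, ct))
```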

Keywords: internet of things, security, hybrid algorithm, privacy

Procedia PDF Downloads 467
2053 Wireless Based System for Continuous Electrocardiography Monitoring during Surgery

Authors: K. Bensafia, A. Mansour, G. Le Maillot, B. Clement, O. Reynet, P. Ariès, S. Haddab

Abstract:

This paper presents a system designed for the wireless acquisition and recording of electrocardiogram (ECG) signals and for monitoring heart health during surgery. This wireless recording system allows us to visualize and monitor the state of the heart during surgery, even if the patient is moved from the operating theater to the post-anesthesia care unit. The acquired signal is transmitted via a Bluetooth unit to a PC, where the data are displayed, stored, and processed. To test the reliability of our system, a comparison is made between ECG signals processed by a conventional ECG monitoring system (Datex-Ohmeda) and by our wireless system. The comparison is based on the shape of the ECG signal, the duration of the QRS complex, the P and T waves, as well as the position of the ST segments with respect to the isoelectric line. The proposed system is presented and discussed. The results confirm that the use of Bluetooth during surgery does not affect the other devices used, and vice versa. Pre- and post-processing steps are briefly discussed. Experimental results are also provided.

Keywords: electrocardiography, monitoring, surgery, wireless system

Procedia PDF Downloads 370
2052 Overview of Resources and Tools to Bridge Language Barriers Provided by the European Union

Authors: Barbara Heinisch, Mikael Snaprud

Abstract:

A common, well-understood language is crucial in critical situations like landing a plane. For e-Government solutions, a clear and common language is needed to allow users to successfully complete transactions online. Misunderstandings here may not risk a safe landing but can cause delays, resubmissions and drive up costs. The same holds true for higher education, where misunderstandings can also arise from inconsistent use of terminology. Thus, language barriers are a societal challenge that needs to be tackled. The major means of bridging language barriers is translation. However, achieving high-quality translation and making texts understandable and accessible require certain framework conditions. Therefore, the EU and individual projects take (strategic) actions. These actions include the identification, collection, processing, re-use and development of language resources. These language resources may be used for the development of machine translation systems and the provision of (public) services, including higher education. This paper outlines some of the existing resources and indicates directions for further development to increase the quality and usage of these resources.

Keywords: language resources, machine translation, terminology, translation

Procedia PDF Downloads 319
2051 Could Intra-Individual Variability in Attention Be an Indicator of Senior Adults at Risk of Cognitive Decline? Evidence from the Attention Network Test (ANT)

Authors: Hanna Lu, Sandra S. M. Chan, Linda C. W. Lam

Abstract:

Objectives: Intra-individual variability (IIV) has been considered a biomarker of healthy ageing. However, the composite role of IIV in attention, as an early indicator of neurocognitive disorders, warrants further exploration. This study aims to investigate IIV, as well as its relationship with attention network functions, in adults with neurocognitive disorders (NCD). Methods: 36 adults with NCD due to Alzheimer’s disease (NCD-AD), 31 adults with NCD due to vascular disease (NCD-vascular), and 137 healthy controls were recruited. Intraindividual standard deviations (iSD) and the intraindividual coefficient of variation of reaction time (ICV-RT) were used to evaluate IIV. Results: NCD groups showed greater IIV (iSD: F = 11.803, p < 0.001; ICV-RT: F = 9.07, p < 0.001). In ROC analyses, the indices of IIV could differentiate NCD-AD (iSD: AUC = 0.687, p = 0.001; ICV-RT: AUC = 0.677, p = 0.001) and NCD-vascular (iSD: AUC = 0.631, p = 0.023; ICV-RT: AUC = 0.615, p = 0.045) from healthy controls. Moreover, processing speed could distinguish NCD-AD from NCD-vascular (AUC = 0.647, p = 0.040). Discussion: Intra-individual variability in attention provides a stable measure of cognitive performance and seems to help distinguish senior adults with different cognitive statuses.
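
As described, iSD is the within-person standard deviation of reaction times across trials, and ICV-RT is that standard deviation divided by the mean reaction time. A minimal sketch of the two indices and a ROC comparison; the reaction times, group labels and thresholds below are illustrative only:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

def intra_individual_variability(rts):
    """IIV indices for one participant's reaction times (ms) across ANT trials."""
    rts = np.asarray(rts, dtype=float)
    isd = rts.std(ddof=1)        # intraindividual standard deviation (iSD)
    icv_rt = isd / rts.mean()    # intraindividual coefficient of variation (ICV-RT)
    return isd, icv_rt

# Illustrative group comparison: 1 = NCD-AD, 0 = healthy control
icv_values = np.array([0.31, 0.28, 0.19, 0.22, 0.35, 0.17])
labels = np.array([1, 1, 0, 0, 1, 0])
print("AUC for ICV-RT:", roc_auc_score(labels, icv_values))
```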

Keywords: intra-individual variability, attention network, neurocognitive disorders, ageing

Procedia PDF Downloads 475
2050 Residual Modulus of Elasticity of Self-Compacting Concrete Incorporating Unprocessed Waste Fly Ash after Exposure to Elevated Temperature

Authors: Mohammed Abed, Rita Nemes, Salem Nehme

Abstract:

The present study experimentally investigated the impact of incorporating unprocessed waste fly ash (UWFA) on the residual mechanical properties of self-compacting concrete (SCC) after exposure to elevated temperature. Three SCC mixtures were produced by replacing 0%, 15% and 30% of the cement mass with UWFA. In general, the fire resistance of SCC was enhanced by replacing up to 15% of the cement with UWFA, especially in the case of the residual modulus of elasticity, which is considered more sensitive than other mechanical properties at elevated temperature. A strong linear relationship was observed between the residual flexural strength and the modulus of elasticity, both of which are significantly affected by the appearance and propagation of cracks as a result of elevated temperature. More sustainable products can be produced by incorporating unprocessed waste powder materials into concrete, as waste materials, CO2 emissions, and the energy needed for processing are all reduced.

Keywords: self-compacting high-performance concrete, unprocessed waste fly ash, fire resistance, residual modulus of elasticity

Procedia PDF Downloads 135
2049 Characterization and Degradation Analysis of Tapioca Starch Based Biofilms

Authors: R. R. Ali, W. A. W. A. Rahman, R. M. Kasmani, H. Hasbullah, N. Ibrahim, A. N. Sadikin, U. A. Asli

Abstract:

In this study, tapioca starch, which acts as a natural polymer, was added to the blend in order to produce a biodegradable product. Low-density polyethylene (LDPE) and tapioca starch blends were prepared by extrusion, and the test samples by injection moulding. Ethylene vinyl acetate (EVA) was added as a compatibilizer and glycerol as a processing aid. The blends were characterized by melt flow index (MFI), Fourier transform infrared (FTIR) spectroscopy, and the effects of water absorption on the samples. As the starch content increased, the MFI of the blend decreased. Tensile testing showed that the tensile strength and elongation at break decreased while the modulus increased as the starch content increased. For biodegradation, a soil burial test was conducted, and the loss in weight was studied as the starch content increased. Morphology studies were conducted to show the distribution of starch within the LDPE.

Keywords: biopolymers, degradable polymers, starch based polyethylene, injection moulding

Procedia PDF Downloads 286
2048 Efficient Layout-Aware Pretraining for Multimodal Form Understanding

Authors: Armineh Nourbakhsh, Sameena Shah, Carolyn Rose

Abstract:

Layout-aware language models have been used to create multimodal representations for documents in image form, achieving relatively high accuracy in document understanding tasks. However, the large number of parameters in the resulting models makes building and using them prohibitive without access to high-performing processing units with large memory capacity. We propose an alternative approach that can create efficient representations without the need for a neural visual backbone. This leads to an 80% reduction in the number of parameters compared to the smallest SOTA model, widely expanding applicability. In addition, our layout embeddings are pre-trained on spatial and visual cues alone and only fused with text embeddings in downstream tasks, which can facilitate applicability to low-resource or multi-lingual domains. Despite using 2.5% of the training data, we show competitive performance on two form understanding tasks: semantic labeling and link prediction.

Keywords: layout understanding, form understanding, multimodal document understanding, bias-augmented attention

Procedia PDF Downloads 148
2047 Structural Characterization and Hot Deformation Behaviour of Al3Ni2/Al3Ni In-Situ Core-Shell Intermetallic in Al-4Cu-Ni Composite

Authors: Ganesh V., Asit Kumar Khanra

Abstract:

An in-situ powder metallurgy technique was employed to create Ni-Al3Ni/Al3Ni2 core-shell-shaped aluminum-based intermetallic-reinforced composites. The impact of Ni addition on the phase composition, microstructure, and mechanical characteristics of Al-4Cu-xNi (x = 0, 2, 4, 6, 8, 10 wt.%) at various sintering temperatures was investigated. Microstructure evolution was examined extensively using X-ray diffraction (XRD), scanning electron microscopy with energy-dispersive X-ray spectroscopy (SEM-EDX), and transmission electron microscopy (TEM). Under the initial sintering conditions, the formation of "single core-shell" structures was observed, consisting of a Ni core with Al3Ni2 intermetallic, whereas samples sintered at 620°C exhibited both "single core-shell" and "double core-shell" structures containing Al3Ni2 and Al3Ni intermetallics formed between the Al matrix and the Ni reinforcements. The composite achieved a high compressive yield strength of 198.13 MPa and an ultimate strength of 410.68 MPa, with 24% total elongation, for the sample containing 10 wt.% Ni. Additionally, there was a substantial increase in hardness, reaching 124.21 HV, which is 2.4 times higher than that of the base aluminum. Nanoindentation studies showed hardness values of 1.54, 4.65, 21.01, 13.16, 5.52, 6.27, and 8.39 GPa corresponding to the α-Al matrix, Ni, Al3Ni2, the Ni/Al3Ni2 interface, Al3Ni, and their respective interfaces. Even at 200°C, the composite retained 54% of its room-temperature strength (90.51 MPa). To investigate the deformation behavior of the composite, experiments were conducted at deformation temperatures ranging from 300°C to 500°C and strain rates from 0.0001 s⁻¹ to 0.1 s⁻¹. A sine-hyperbolic constitutive equation was developed to characterize the flow stress of the composite, which exhibited a hot deformation activation energy of 231.44 kJ/mol, significantly higher than that for self-diffusion in pure aluminum. The formation of Al2Cu intermetallics at grain boundaries and of Al3Ni2/Al3Ni within the matrix hindered dislocation movement, leading to an increase in activation energy, which might adversely affect high-temperature applications. Two models, a strain-compensated Arrhenius model and an artificial neural network (ANN) model, were developed to predict the composite's flow behavior. The ANN model outperformed the strain-compensated Arrhenius model, with a lower average absolute relative error of 2.266%, a smaller root mean square error of 1.2488 MPa, and a higher correlation coefficient of 0.9997. Processing maps revealed that the optimal hot working conditions for the composite lie in the temperature range of 420-500°C and at strain rates between 0.0001 s⁻¹ and 0.001 s⁻¹. The changes in the composite microstructure were successfully correlated with the theory of processing maps, considering temperature and strain rate conditions. The uneven distribution in the shape and size of the core-shell/Al3Ni intermetallic compounds influenced the flow stress curves, leading to dynamic recrystallization (DRX), followed by partial dynamic recovery (DRV), and ultimately strain hardening. This composite material shows promise for applications in the automobile and aerospace industries.
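
The sine-hyperbolic constitutive equation referred to above is conventionally written in the Arrhenius/Zener-Hollomon form shown below; this is the standard textbook form rather than the authors' fitted expression, with Q = 231.44 kJ/mol being the activation energy they report:

```latex
\dot{\varepsilon} = A\left[\sinh(\alpha\sigma)\right]^{n}\exp\!\left(-\frac{Q}{RT}\right),
\qquad
Z = \dot{\varepsilon}\,\exp\!\left(\frac{Q}{RT}\right) = A\left[\sinh(\alpha\sigma)\right]^{n}
```

Here ε̇ is the strain rate, σ the flow stress, T the absolute deformation temperature, R the universal gas constant, Q the hot deformation activation energy, Z the Zener-Hollomon parameter, and A, α, n are material constants obtained by regression (strain-compensated in the Arrhenius model).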

Keywords: core-shell structure, hot deformation, intermetallic compounds, powder metallurgy

Procedia PDF Downloads 20
2046 Monocular 3D Person Tracking via Demographic Classification and Projective Image Processing

Authors: McClain Thiel

Abstract:

Object detection and localization has historically required two or more sensors due to the loss of information from 3D to 2D space. However, most surveillance systems currently in use in the real world have only one sensor per location. Generally, this consists of a single low-resolution camera positioned above the area under observation (mall, jewelry store, traffic camera). This is not sufficient for robust 3D tracking for applications such as security or, of more recent relevance, contact tracing. This paper proposes a lightweight system for 3D person tracking that requires no additional hardware, based on compressed object detection convolutional nets, facial landmark detection, and projective geometry. The approach classifies the target into a demographic category, makes assumptions about the relative locations of facial landmarks from the demographic information, and then uses simple projective geometry and known constants to find the target's location in 3D space. Preliminary testing, although still limited, suggests reasonable success in 3D tracking under ideal conditions.
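
A minimal sketch of the projective-geometry step: given the camera focal length in pixels and a demographic-informed assumption about the real-world distance between two facial landmarks (e.g., inter-pupillary distance), depth follows from the pinhole camera model. All constants below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def estimate_depth(landmark_a_px, landmark_b_px, assumed_real_width_m, focal_length_px):
    """Pinhole model: Z = f * X_real / x_pixels."""
    pixel_width = np.linalg.norm(np.asarray(landmark_a_px, float) -
                                 np.asarray(landmark_b_px, float))
    return focal_length_px * assumed_real_width_m / pixel_width

# Example: inter-pupillary distance assumed ~63 mm for the predicted demographic group
depth_m = estimate_depth((310, 240), (370, 242),
                         assumed_real_width_m=0.063, focal_length_px=900.0)
print(f"Estimated distance from camera: {depth_m:.2f} m")
```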

Keywords: monocular distancing, computer vision, facial analysis, 3D localization

Procedia PDF Downloads 139
2045 Using a Robot Companion to Detect and Visualize the Indicators of Dementia Progression and Quality of Life of People Aged 65 and Older

Authors: Jeoffrey Oostrom, Robbert James Schlingmann, Hani Alers

Abstract:

This document presents research into the indicators of dementia progression, the automation of quality of life assessments, and their visualization. To do this, the Smart Teddy project was initiated to build a smart companion that both monitors the senior citizen and processes the captured data into an insightful dashboard. With around 50 million diagnoses worldwide, dementia proves again and again to be a heavy strain on the lives of many individuals, their relatives, and society as a whole. In 2015, it was estimated that dementia care cost 818 billion U.S. dollars globally. The Smart Teddy project aims to take away a portion of the burden from caregivers by automating the collection of certain data, such as movement, geolocation, and sound levels. This paper shows that the Smart Teddy has the potential to become a useful tool for caregivers, but it is not a complete solution. The Smart Teddy still faces some problems in terms of emotional privacy, but its non-intrusive nature, as well as its diversity in usability, can make up for this.

Keywords: dementia care, medical data visualization, quality of life, smart companion

Procedia PDF Downloads 139
2044 Development of a Wind Resource Assessment Framework Using Weather Research and Forecasting (WRF) Model, Python Scripting and Geographic Information Systems

Authors: Jerome T. Tolentino, Ma. Victoria Rejuso, Jara Kaye Villanueva, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang

Abstract:

Wind energy is rapidly emerging as a primary source of electricity in the Philippines, although developing an accurate wind resource model is difficult. In this study, the Weather Research and Forecasting (WRF) Model, an open-source mesoscale Numerical Weather Prediction (NWP) model, was used to produce a 1-year atmospheric simulation at 4 km resolution over the Ilocos Region of the Philippines. The annual mean wind speed data were extracted from the WRF output (netCDF) using a Python-based graphical user interface. Lastly, the wind resource assessment was produced using GIS software. Results of the study showed that Python scripts are more flexible than other post-processing tools for dealing with netCDF files. Using the WRF Model, Python, and Geographic Information Systems, a reliable wind resource map was produced.
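
A minimal sketch of the Python extraction step: computing an annual mean 10 m wind speed field from WRF output with the netCDF4 and numpy packages, then writing a grid that GIS software can import. The variable names U10/V10 follow WRF output conventions; the file name, the use of 10 m winds (rather than hub-height winds), and the plain-text export are illustrative assumptions:

```python
import numpy as np
from netCDF4 import Dataset

# Open a year of WRF output (file name is illustrative)
nc = Dataset("wrfout_d02_2015.nc")
u10 = nc.variables["U10"][:]   # 10 m zonal wind, shape (time, y, x)
v10 = nc.variables["V10"][:]   # 10 m meridional wind, same shape

# Wind speed at every time step, then the annual mean per grid cell
speed = np.sqrt(u10 ** 2 + v10 ** 2)
annual_mean = speed.mean(axis=0)

# Export a simple text grid for import into GIS software
np.savetxt("annual_mean_wind_speed.txt", annual_mean)
nc.close()
```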

Keywords: wind resource assessment, weather research and forecasting (WRF) model, python, GIS software

Procedia PDF Downloads 442
2043 Simulation Study of the Microwave Heating of the Hematite and Coal Mixture

Authors: Prasenjit Singha, Sunil Yadav, Soumya Ranjan Mohantry, Ajay Kumar Shukla

Abstract:

The temperature distribution in hematite ore mixed with 7.5% coal was predicted by solving a 1-D heat conduction equation using an implicit finite difference approach. A square slab of 20 cm x 20 cm was considered, with the coal assumed to be uniformly mixed with the hematite ore. The equations were solved using MATLAB 2018a. Convective and radiative boundary conditions for heat transfer in this 1-D slab are also considered. The temperature distribution inside the hematite slab was obtained by considering the microwave heating time, thermal conductivity, heat capacity, carbon percentage, sample dimensions, and many other factors such as penetration depth, permittivity, and permeability of the coal and hematite ore mixture. The resulting temperature profile can be used as a guiding tool for optimizing the microwave-assisted carbothermal reduction process. The model was extended to other slab dimensions as well, viz., 1 cm x 1 cm, 5 cm x 5 cm, 10 cm x 10 cm, and 20 cm x 20 cm. The model predictions are in good agreement with experimental results.
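
The authors solved the conduction equation in MATLAB; the sketch below shows the same kind of backward-Euler (implicit) finite-difference scheme in Python to make the discretization concrete. The material constants, the volumetric microwave heating term, and the fixed-temperature boundaries are illustrative assumptions, not the paper's values or its exact convective/radiative boundary treatment:

```python
import numpy as np
from scipy.linalg import solve_banded

# Illustrative properties of a hematite-coal mixture (not the paper's values)
k, rho, cp = 1.5, 4500.0, 800.0            # W/m.K, kg/m^3, J/kg.K
alpha = k / (rho * cp)                     # thermal diffusivity (m^2/s)
L, nx, dt, nsteps = 0.20, 101, 1.0, 600    # 20 cm slab, grid points, time step (s), steps
dx = L / (nx - 1)
r = alpha * dt / dx ** 2

T = np.full(nx, 300.0)                     # initial temperature (K)
q_mw = 2.0e5                               # illustrative volumetric microwave heating (W/m^3)

# Tridiagonal matrix of the implicit scheme: -r*T[i-1] + (1+2r)*T[i] - r*T[i+1] = rhs[i]
ab = np.zeros((3, nx))
ab[0, 1:] = -r                             # superdiagonal
ab[1, :] = 1 + 2 * r                       # main diagonal
ab[2, :-1] = -r                            # subdiagonal
ab[1, 0] = ab[1, -1] = 1.0                 # fixed-temperature boundary rows (for simplicity)
ab[0, 1] = ab[2, -2] = 0.0

for _ in range(nsteps):
    rhs = T + dt * q_mw / (rho * cp)       # previous field plus heat source
    rhs[0], rhs[-1] = 300.0, 300.0         # boundary values
    T = solve_banded((1, 1), ab, rhs)      # unconditionally stable implicit update

print("Peak temperature after heating:", T.max(), "K")
```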

Keywords: hematite ore, coal, microwave processing, heat transfer, implicit method, temperature distribution

Procedia PDF Downloads 169
2042 Early Detection of Lymphedema in Post-Surgery Oncology Patients

Authors: Sneha Noble, Rahul Krishnan, Uma G., D. K. Vijaykumar

Abstract:

Breast cancer-related lymphedema is a major problem that affects many women. Lymphedema is the swelling that generally occurs in the arms or legs, caused by the removal of or damage to lymph nodes as a part of cancer treatment. Treating it at the earliest possible stage is the best way to manage the condition and prevent it from leading to pain, recurrent infection, reduced mobility, and impaired function. This project therefore focuses on multi-modal approaches to identify the risk of lymphedema in post-surgical oncology patients and prevent it at the earliest stage. The Kinect IR sensor is utilized to capture images of the body, and after image processing, the region of interest is obtained. Voxelization then provides limb volume measurements in the pre-operative and post-operative periods. A mathematical model is formulated to help compare these values. Clinical and pathological data of patients will be investigated to assess the factors responsible for the development of lymphedema and its risks.
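
A minimal sketch of the voxelization step: the segmented limb region is converted into a binary voxel grid, its volume is the voxel count times the voxel volume, and the pre- versus post-operative change is the quantity of interest. The voxel size and the placeholder segmentations are illustrative assumptions:

```python
import numpy as np

def limb_volume_ml(mask: np.ndarray, voxel_size_mm=(2.0, 2.0, 2.0)) -> float:
    """Volume of a binary limb segmentation in millilitres."""
    voxel_volume_mm3 = float(np.prod(voxel_size_mm))
    return mask.sum() * voxel_volume_mm3 / 1000.0   # mm^3 -> mL

# Illustrative pre- vs post-operative comparison (placeholder segmentations)
pre_op = np.random.rand(200, 120, 120) > 0.60
post_op = np.random.rand(200, 120, 120) > 0.55
change_pct = 100.0 * (limb_volume_ml(post_op) - limb_volume_ml(pre_op)) / limb_volume_ml(pre_op)
print(f"Limb volume change: {change_pct:.1f}%")
```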

Keywords: Kinect IR sensor, Lymphedema, voxelization, lymph nodes

Procedia PDF Downloads 138
2041 Getting to Know the Types of Asphalt, Its Manufacturing and Processing Methods and Its Application in Road Construction

Authors: Hamid Fallah

Abstract:

Asphalt is generally a mixture of continuously graded stone materials and a binder, which is usually bitumen. Asphalt is made in different forms according to its use. The most familiar type of asphalt is hot asphalt, or hot asphalt concrete. Stone materials usually make up more than 90% of the asphalt mixture; therefore, stone materials have a significant impact on the quality of the resulting asphalt. According to the method of application and mixing, asphalt is divided into three categories: hot asphalt, protective asphalt, and cold asphalt. Cold mix asphalt is a mixture of stone materials and cutback bitumen or bitumen emulsion whose raw materials are mixed at ambient temperature. In some types of cold asphalt, the bitumen may be heated as necessary, but the other materials are mixed with the bitumen without heating. Protective asphalts are used to make the roadbed impermeable, to increase its abrasion and sliding resistance, and also to temporarily improve existing asphalt and concrete surfaces. This type of paving is very economical compared to hot asphalt due to the speed and ease of implementation and the limited need for asphalt machinery and equipment. The present article, prepared as a descriptive library study, introduces asphalt, its types, its characteristics, and its applications.

Keywords: asphalt, type of asphalt, asphalt concrete, sulfur concrete, bitumen in asphalt, sulfur, stone materials

Procedia PDF Downloads 69
2040 Diversity in Finance Literature Revealed through the Lens of Machine Learning: A Topic Modeling Approach on Academic Papers

Authors: Oumaima Lahmar

Abstract:

This paper aims to define a structured topography of finance research for scholars seeking to navigate the body of knowledge in their exploration of finance phenomena. To make sense of the body of knowledge in finance, a probabilistic topic modeling approach is applied to 6000 abstracts of academic articles published in three top finance journals between 1976 and 2020. This approach combines machine learning techniques and natural language processing to statistically identify the connections between research articles and their shared topics, each described by relevant keywords. The topic modeling analysis reveals 35 coherent topics that depict the finance literature well and provide a comprehensive structure for the ongoing research themes. Comparing the extracted topics to the Journal of Economic Literature (JEL) classification system, a significant similarity was highlighted between the characterizing keywords. On the other hand, we identify other topics that do not match the JEL classification despite being relevant to the finance literature.
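
The abstract does not name the specific probabilistic topic model; the sketch below assumes Latent Dirichlet Allocation via the gensim library, with 35 topics as reported in the paper, applied to a tiny illustrative corpus of abstracts in place of the 6000 used by the author:

```python
from gensim import corpora
from gensim.models import LdaModel
from gensim.parsing.preprocessing import preprocess_string

abstracts = [
    "We study asset pricing anomalies and the cross-section of expected stock returns.",
    "Corporate governance, board structure and firm value in publicly listed companies.",
]  # in the paper: ~6000 abstracts from three top finance journals (1976-2020)

texts = [preprocess_string(a) for a in abstracts]      # tokenize, remove stopwords, stem
dictionary = corpora.Dictionary(texts)
corpus = [dictionary.doc2bow(t) for t in texts]

lda = LdaModel(corpus=corpus, id2word=dictionary, num_topics=35,
               passes=10, random_state=42)

# Inspect a few topics and the model's perplexity on the corpus
for topic_id, keywords in lda.print_topics(num_topics=5, num_words=8):
    print(topic_id, keywords)
print("Log perplexity:", lda.log_perplexity(corpus))
```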

Keywords: finance literature, textual analysis, topic modeling, perplexity

Procedia PDF Downloads 170
2039 Artificial Intelligence for Safety Related Aviation Incident and Accident Investigation Scenarios

Authors: Bernabeo R. Alberto

Abstract:

With the tremendous improvements in the processing power of computers, artificial intelligence will increasingly be used in aviation, making autonomous flight, preventive maintenance, ATM (Air Traffic Management) optimization, and the training of pilots, cabin crew, ground staff, and airport staff possible in a cheaper, less time-consuming, and less polluting way. Through the use of artificial intelligence, we foresee an interviewing scenario in which the interviewee interacts with the artificial intelligence tool, which contextualizes the character and the necessary information in a way that aligns reasonably with the character and the scenario. We are creating simulated scenarios connected with either an aviation incident or an accident, integrating artificial intelligence and augmented reality tools, to also enhance the training of future accident/incident investigators. The project's goal is to improve the learning and teaching scenario through academic and professional expertise in aviation and in the field of artificial intelligence. We thus intend to contribute to the needed innovation capacity, skills development, and training and management of artificial intelligence, supported by appropriate regulations and attention to ethical problems.

Keywords: artificial intelligence, aviation accident, aviation incident, risk, safety

Procedia PDF Downloads 22
2038 A Review on the Adoption and Acculturation of Digital Technologies among Farmers of Haryana State

Authors: Manisha Ohlan, Manju Dahiya

Abstract:

The present study was conducted in the Karnal, Rohtak, and Jhajjar districts of Haryana state, covering 360 respondents. Results showed that 42.78 percent of the respondents had above-average knowledge at the preparation stage, 48.33 percent had high knowledge at the production stage, and 37.22 percent had average knowledge at the processing stage regarding the usage of digital technologies. Nearly half of the respondents (47.50%) agreed with the usage of digital technologies, followed by those who strongly agreed (19.45%) and those who strongly disagreed (14.45%). A significant and positive relationship was found between the independent variables and knowledge of digital technologies at the 5 percent level of significance; therefore, the null hypothesis cannot be rejected. All the dependent variables, including knowledge and attitude, showed a positive relationship, with z values between -1.96 and +1.96 at the 5 percent level of significance; the data therefore fall within the acceptance region, which is why the null hypothesis is accepted.

Keywords: knowledge, attitude, digital technologies, significant, positive relationship

Procedia PDF Downloads 94
2037 Design of a Controlled BHJ Solar Cell Using Modified Organic Vapor Spray Deposition Technique

Authors: F. Stephen Joe, V. Sathya Narayanan, V. R. Sanal Kumar

Abstract:

A comprehensive review of the literature on photovoltaic cells has been carried out to explore better options for cost-efficient technologies for future solar cell applications. The review reveals that bulk heterojunction (BHJ) polymer solar cells offer special opportunities as renewable energy resources. It is evident from previous studies that a device fabricated with a TiOx layer shows better power conversion efficiency than a device without a TiOx layer. In this paper, the authors design a controlled BHJ solar cell using a modified organic vapor spray deposition technique facilitated by a vertically moving gun, named the 'Stephen Joe Technique', to obtain a desirable surface pattern on the substrate and thereby improve its efficiency for industrial applications. We conclude that efficient processing and interface engineering of these solar cells could increase the efficiency to 5-10%.

Keywords: BHJ polymer solar cell, photovoltaic cell, solar cell, Stephen Joe technique

Procedia PDF Downloads 543
2036 Geographic Information System for Simulating Air Traffic By Applying Different Multi-Radar Positioning Techniques

Authors: Amara Rafik, Mostefa Belhadj Aissa

Abstract:

Radar data is one of the many data sources used by Air Traffic Management (ATM) systems. These data come from air navigation radar antennas, which intercept signals emitted by the various aircraft crossing the controlled airspace, calculate the positions of these aircraft, and retransmit them to the ATM system. For greater reliability, these radars are positioned in such a way that their coverage areas overlap, so an aircraft will be detected by at least one of them. However, the position coordinates of the same aircraft sent by these different radars are not necessarily identical. Therefore, the ATM system must calculate a single position (the radar track), which is ultimately sent to the control position and displayed on the air traffic controller's monitor. There are several techniques for calculating the radar track. Furthermore, the geographical nature of the problem requires the use of a Geographic Information System (GIS), i.e., a geographical database combined with geographical processing. The objective of this work is to propose a GIS for traffic simulation that reconstructs the evolution of aircraft positions over time from a multi-source radar data set by applying these different techniques.

Keywords: ATM, GIS, radar data, simulation

Procedia PDF Downloads 118
2035 Enhancing Code Security with AI-Powered Vulnerability Detection

Authors: Zzibu Mark Brian

Abstract:

As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.

Keywords: AI, machine learning, code security

Procedia PDF Downloads 36
2034 Genodata: The Human Genome Variation Using BigData

Authors: Surabhi Maiti, Prajakta Tamhankar, Prachi Uttam Mehta

Abstract:

Since the completion of the Human Genome Project, there has been an unparalleled escalation in the sequencing of genomic data. This project was a major leap in the field of medical research, especially in genomics, and it drew on a concept called Big Data, which was earlier used extensively to generate value for business. Big Data involves data sets generally in the form of files of terabytes, petabytes, or exabytes in size; such data sets were traditionally used and managed using spreadsheets and RDBMSs. The voluminous data made the process tedious and time-consuming, and hence a stronger framework called Hadoop was introduced in the field of genetic sciences to make data processing faster and more efficient. This paper focuses on using Spark, which is gaining momentum with the advancement of Big Data technologies. Cloud storage is an effective medium for storing the large data sets generated from genetic research and the resultant sets produced by Spark analysis.
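
A minimal sketch of the Spark side of such a pipeline: loading variant records into a DataFrame and aggregating them in parallel across a cluster. The file paths, column names, and the tab-separated export format are illustrative assumptions, not details from the project:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("genodata-variant-counts")
         .getOrCreate())

# Illustrative: tab-separated variant calls exported from VCF files
variants = (spark.read
            .option("sep", "\t")
            .option("header", True)
            .csv("data/variants/*.tsv"))

# Count variants per chromosome and per variant type, in parallel
summary = (variants
           .groupBy("chrom", "variant_type")
           .agg(F.count("*").alias("n_variants"))
           .orderBy("chrom"))

summary.write.mode("overwrite").parquet("output/variant_summary")
spark.stop()
```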

Keywords: human genome project, Bigdata, genomic data, SPARK, cloud storage, Hadoop

Procedia PDF Downloads 259
2033 Email Phishing Detection Using Natural Language Processing and Convolutional Neural Network

Authors: M. Hilani, B. Nassih

Abstract:

Phishing is one of the oldest and best-known scams on the Internet. It can be defined as any type of telecommunications fraud that uses social engineering tricks to obtain confidential data from its victims; it is a cybercrime aimed at stealing sensitive information. Phishing is generally carried out via email, with scammers impersonating large companies or other trusted entities to encourage victims to voluntarily provide information such as login credentials or, worse yet, credit card numbers. The COVID-19 theme has been used by cybercriminals in multiple malicious campaigns, including phishing. In this environment, message filtering solutions have become essential to protect devices that are now used outside the secure perimeter. Despite constantly updated methods to counter these cyberattacks, current results remain insufficient. Many researchers are looking for optimal solutions to filter phishing emails, but good results are still needed. In this work, we concentrated on the problem of detecting phishing emails using the different steps of NLP preprocessing, and we proposed and trained a model using a one-dimensional CNN. Our results show that the model obtained an accuracy of 99.99%, which demonstrates the effectiveness of the approach.
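
The abstract does not give the exact architecture or preprocessing steps; the sketch below shows a representative pipeline of this kind in Keras, combining text vectorization with a one-dimensional CNN classifier. The sample emails, vocabulary size, sequence length and layer sizes are illustrative assumptions:

```python
import numpy as np
from tensorflow.keras import layers, models

emails = ["your account has been suspended click here to verify",
          "meeting moved to 3pm see the attached agenda"]       # illustrative samples
labels = np.array([1, 0])                                       # 1 = phishing, 0 = legitimate

# NLP preprocessing: standardize, tokenize and map words to integer ids of fixed length
vectorizer = layers.TextVectorization(max_tokens=20000, output_sequence_length=200)
vectorizer.adapt(emails)
X = vectorizer(np.array(emails))

# One-dimensional CNN classifier over the token sequence
model = models.Sequential([
    layers.Embedding(input_dim=20000, output_dim=64),
    layers.Conv1D(128, kernel_size=5, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(64, activation="relu"),
    layers.Dropout(0.5),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, labels, epochs=3, batch_size=2)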

Keywords: phishing, e-mail, NLP preprocessing, CNN, e-mail filtering

Procedia PDF Downloads 126
2032 The Grammatical Dictionary Compiler: A System for Kartvelian Languages

Authors: Liana Lortkipanidze, Nino Amirezashvili, Nino Javashvili

Abstract:

The purpose of a grammatical dictionary is to provide information on the morphological and syntactic characteristics of the basic word in the dictionary entry. Electronic grammatical dictionaries are used as a tool for automated morphological analysis in text processing. The Georgian Grammatical Dictionary should contain grammatical information for each word: part of speech, type of declension/conjugation, grammatical forms of the word (paradigm), and alternative variants of the basic word/lemma. In this paper, we present a system for compiling the Georgian Grammatical Dictionary automatically. We propose dictionary-based methods for extending grammatical lexicons. The input lexicon contains only a small number of words with identical grammatical features. The extension is based on similarity measures between features of words; more precisely, we add to the extended lexicon those words that are similar to words already in the grammatical dictionary. Our dictionaries are corpus-based, and for their compilation we introduce a method for the lemmatization of unknown words, i.e., words for which neither the full form nor the lemma is in the grammatical dictionary.

Keywords: acquisition of lexicon, Georgian grammatical dictionary, lemmatization rules, morphological processor

Procedia PDF Downloads 148
2031 Affects Associations Analysis in Emergency Situations

Authors: Joanna Grzybowska, Magdalena Igras, Mariusz Ziółko

Abstract:

Association rule learning is an approach for discovering interesting relationships in large databases. The analysis of relations that are invisible at first glance is a source of new knowledge which can subsequently be used for prediction. We used this data mining technique (an automatic and objective method) to learn about interesting affect associations in a corpus of emergency phone calls. We also made an attempt to match the revealed rules with their possible situational context. The corpus was collected and subjectively annotated by two researchers. Each of the 3306 recordings contains information on emotion: (1) type (sadness, weariness, anxiety, surprise, stress, anger, frustration, calm, relief, compassion, contentment, amusement, joy), (2) valence (negative, neutral, or positive), and (3) intensity (low, typical, alternating, high). Additional information that provides a clue to the speaker's emotional state was also annotated: speech rate (slow, normal, fast), characteristic vocabulary (filled pauses, repeated words), and conversation style (normal, chaotic). Exponentially many rules can be extracted from a set of items (an item being a single, previously annotated piece of information). To generate rules in the form of an implication X → Y (where X and Y are frequent k-itemsets), the Apriori algorithm was used, as it avoids performing needless computations. Then, two basic measures (support and confidence) and several additional symmetric and asymmetric objective measures (e.g., Laplace, conviction, interest factor, cosine, correlation coefficient) were calculated for each rule. Each applied interestingness measure revealed different rules, and we selected some top rules for each measure. Owing to the specificity of the corpus (emergency situations), most of the strong rules contain only negative emotions, though there are also strong rules including neutral or even positive emotions. Three examples of the strongest rules are: {sadness} → {anxiety}; {sadness, weariness, stress, frustration} → {anger}; {compassion} → {sadness}. Association rule learning revealed the strongest configurations of affects (as well as configurations of affects with affect-related information) in our emergency phone call corpus. The acquired knowledge can be used for prediction, to complete the emotional profile of a new caller. Furthermore, analysis of the possible contexts related to a rule may be a clue to the situation a caller is in.
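
A minimal sketch of the rule-mining step using the mlxtend implementation of Apriori: each recording becomes a transaction of annotated items, frequent itemsets are found above a support threshold, and rules X → Y are then filtered by confidence, with lift and conviction among the extra measures. The items and thresholds below are illustrative, not the corpus values:

```python
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori, association_rules

# Each recording is a transaction of annotated items (illustrative subset)
transactions = [
    ["sadness", "anxiety", "fast_speech", "negative"],
    ["sadness", "weariness", "stress", "frustration", "anger"],
    ["compassion", "sadness", "normal_speech"],
    ["calm", "relief", "positive"],
]

# One-hot encode the transactions for mlxtend
te = TransactionEncoder()
onehot = pd.DataFrame(te.fit_transform(transactions), columns=te.columns_)

frequent = apriori(onehot, min_support=0.25, use_colnames=True)
rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
print(rules[["antecedents", "consequents", "support", "confidence", "lift", "conviction"]])
```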

Keywords: data mining, emergency phone calls, emotional profiles, rules

Procedia PDF Downloads 408
2030 Discourse Analysis: Where Cognition Meets Communication

Authors: Iryna Biskub

Abstract:

The interdisciplinary approach to modern linguistic studies is exemplified by the merging of various research methods, which sometimes causes complications related to the verification of research results. This methodological confusion can be resolved by creating new techniques of linguistic analysis that combine several scientific paradigms. Modern linguistics has developed highly productive and efficient methods for the investigation of cognitive and communicative phenomena, of which language is the central issue. In the field of discourse studies, one of the best examples of such research methods is Critical Discourse Analysis (CDA). CDA can be viewed both as a method of investigation and as a critical multidisciplinary perspective. In CDA, the position of the scholar is crucial from the point of view of exemplifying his or her social and political convictions. The generally accepted approach to obtaining scientifically reliable results is to use a special, well-defined scientific method for researching particular types of language phenomena: cognitive methods applied to the exploration of cognitive aspects of language, whereas communicative methods are thought to be relevant only for the investigation of the communicative nature of language. In recent decades, discourse as a sociocultural phenomenon has been the focus of careful linguistic research. The very concept of discourse represents an integral unity of cognitive and communicative aspects of human verbal activity. Since a human being is never able to discriminate between the cognitive and communicative planes of discourse communication, it does not make much sense to apply cognitive and communicative methods of research in isolation. It is possible to modify the classical CDA procedure by mapping human cognitive procedures onto the strategic communicative planning of discourse communication. The analysis of the electronic petition 'Block Donald J Trump from UK entry. The signatories believe Donald J Trump should be banned from UK entry' (584,459 signatures) and the parliamentary debates on it has demonstrated the ability to map cognitive and communicative levels in the following way: the strategy of discourse modeling (communicative level) overlaps with the extraction of semantic macrostructures (cognitive level); the strategy of discourse management overlaps with the analysis of local meanings in discourse communication; and the strategy of cognitive monitoring of the discourse overlaps with the formation of attitudes and ideologies at the cognitive level. Thus, the experimental data have shown that it is possible to develop a new complex methodology of discourse analysis, where cognition would meet communication, both metaphorically and literally. The same approach may prove productive for the creation of computational models of human-computer interaction, where the automatic generation of a particular type of discourse could be based on the rules of strategic planning involving the cognitive models of CDA.

Keywords: cognition, communication, discourse, strategy

Procedia PDF Downloads 254
2029 Small Text Extraction from Documents and Chart Images

Authors: Rominkumar Busa, Shahira K. C., Lijiya A.

Abstract:

Text recognition is an important area in computer vision which deals with detecting and recognising text in an image. Optical Character Recognition (OCR) is a mature area these days, with very good text recognition accuracy. However, when the same OCR methods are applied to text with small font sizes, such as the text data of chart images, the recognition rate is less than 30%. This work aims to extract small text in images using a deep learning model, a CRNN with CTC loss. The text recognition accuracy is found to improve by applying image enhancement through super resolution prior to the CRNN model. We also observe that the text recognition rate increases by a further 18% by applying the proposed method, which involves super resolution and character segmentation followed by a CRNN with CTC loss. The efficiency of the proposed method indicates that further pre-processing of chart image text and other small text images will improve the accuracy further, thereby helping text extraction from chart images.
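
A minimal sketch of the recognition head: a convolutional feature extractor feeding a recurrent layer, trained with CTC loss, written in PyTorch. The layer sizes, image shapes, alphabet size and training targets are illustrative assumptions, not the paper's exact model:

```python
import torch
import torch.nn as nn

class CRNN(nn.Module):
    def __init__(self, num_classes):
        super().__init__()
        self.cnn = nn.Sequential(                         # grayscale text crop -> feature map
            nn.Conv2d(1, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2, 2),
        )
        self.rnn = nn.LSTM(128 * 8, 256, bidirectional=True, batch_first=True)
        self.fc = nn.Linear(512, num_classes)             # num_classes includes the CTC blank

    def forward(self, x):                                 # x: (batch, 1, 32, W)
        f = self.cnn(x)                                   # (batch, 128, 8, W/4)
        f = f.permute(0, 3, 1, 2).flatten(2)              # one time step per image column
        out, _ = self.rnn(f)
        return self.fc(out).log_softmax(2)                # (batch, T, num_classes)

model = CRNN(num_classes=37)                              # e.g. 36 characters + blank
images = torch.randn(4, 1, 32, 128)                       # illustrative batch of small-text crops
log_probs = model(images).permute(1, 0, 2)                # CTCLoss expects (T, batch, classes)

targets = torch.randint(1, 37, (4, 6))                    # illustrative label sequences
ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets,
           input_lengths=torch.full((4,), log_probs.size(0), dtype=torch.long),
           target_lengths=torch.full((4,), 6, dtype=torch.long))
loss.backward()
```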

Keywords: small text extraction, OCR, scene text recognition, CRNN

Procedia PDF Downloads 125
2028 Design of Wireless Readout System for Resonant Gas Sensors

Authors: S. Mohamed Rabeek, Mi Kyoung Park, M. Annamalai Arasu

Abstract:

This paper presents the design of a wireless readout system for tracking the frequency shift of a polymer-coated piezoelectric microelectromechanical resonator due to gas absorption. The measured frequency shift indicates the percentage of the particular gas the sensor is exposed to. It is measured using an oscillator and an FPGA-based frequency counter, with the resonator employed as the frequency-determining element of the oscillator. The system consists of a Gas Sensing Wireless Readout (GSWR) and a USB Wireless Transceiver (UWT). The GSWR consists of an oscillator based on a trans-impedance sustaining amplifier, an FPGA-based frequency readout, a sub-1 GHz wireless transceiver, and a microcontroller. The UWT can be plugged into the computer via a USB port and functions as a wireless module to transfer gas sensor data from the GSWR to the computer. A GUI program running on the computer periodically polls for sensor data over the UWT-GSWR wireless link; the response from the GSWR is logged in a file for post-processing as well as displayed on screen.

Keywords: gas sensor, GSWR, micromechanical system, UWT, volatile emissions

Procedia PDF Downloads 483
2027 Multifunctional Nanofiber Based Aerogels: Bridging Electrospinning with Aerogel Fabrication

Authors: Tahira Pirzada, Zahra Ashrafi, Saad Khan

Abstract:

We present a facile and sustainable solid templating approach to fabricate highly porous, flexible and superhydrophobic aerogels of composite nanofibers of cellulose diacetate and silica, produced through sol-gel electrospinning. Scanning electron microscopy, contact angle measurement, and attenuated total reflection-Fourier transform infrared spectrometry are used to understand the structural features of the resultant aerogels, while thermogravimetric analysis and differential scanning calorimetry demonstrate their thermal stability. These aerogels exhibit a self-supporting three-dimensional network abundant in large secondary pores surrounded by primary pores, resulting in a highly porous structure. Thermal crosslinking of the aerogels further stabilizes their structure and flexibility without compromising porosity. The ease of processing, thermal stability, high porosity and oleophilic nature of these aerogels make them promising candidates for a wide variety of applications, including acoustic and thermal insulation and oil-water separation.

Keywords: hybrid aerogels, sol-gel electrospinning, oil-water separation, nanofibers

Procedia PDF Downloads 158
2026 Semantic Textual Similarity on Contracts: Exploring Multiple Negative Ranking Losses for Sentence Transformers

Authors: Yogendra Sisodia

Abstract:

Researchers are becoming more interested in extracting useful information from legal documents thanks to the development of large-scale language models in natural language processing (NLP), and deep learning has accelerated the creation of powerful text mining models. Legal fields like contracts benefit greatly from semantic text search since it makes it quick and easy to find related clauses. Once sentence embeddings have been collected, it is relatively simple to locate sentences with a comparable meaning throughout the entire legal corpus. The author of this research investigated two pre-trained language models for this task, MiniLM and RoBERTa, and further fine-tuned them on legal contracts. The author used the Multiple Negative Ranking Loss for the creation of the sentence transformers. The fine-tuned language models and sentence transformers showed promising results.
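
A minimal sketch of fine-tuning a sentence transformer with the multiple negatives ranking loss as implemented in the sentence-transformers library (where every other pair in a batch serves as an in-batch negative), followed by a semantic search over clauses. The base model name, the contract-clause pairs, and the hyperparameters are illustrative assumptions, not the author's setup:

```python
from torch.utils.data import DataLoader
from sentence_transformers import SentenceTransformer, InputExample, losses, util

model = SentenceTransformer("all-MiniLM-L6-v2")   # MiniLM baseline; a RoBERTa model is analogous

# Positive pairs of semantically equivalent contract clauses (illustrative)
train_examples = [
    InputExample(texts=["The agreement may be terminated with 30 days written notice.",
                        "Either party can end this contract by giving one month's notice in writing."]),
    InputExample(texts=["The supplier shall indemnify the buyer against third-party claims.",
                        "Supplier agrees to hold the purchaser harmless from claims by third parties."]),
]
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)

# In-batch negatives: every other pair in the batch serves as a negative
train_loss = losses.MultipleNegativesRankingLoss(model)
model.fit(train_objectives=[(train_dataloader, train_loss)], epochs=1, warmup_steps=10)

# Semantic search over a (tiny) clause corpus with the fine-tuned embeddings
corpus_emb = model.encode(["Termination requires prior written notice.",
                           "Payment is due within 45 days of invoice."], convert_to_tensor=True)
query_emb = model.encode("How can the contract be terminated?", convert_to_tensor=True)
print(util.semantic_search(query_emb, corpus_emb, top_k=1))
```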

Keywords: legal contracts, multiple negative ranking loss, natural language inference, sentence transformers, semantic textual similarity

Procedia PDF Downloads 108