Search results for: geological database
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2038

1708 Relationship Between Health Coverage and Emergency Disease Burden

Authors: Karim Hajjar, Luis Lillo, Diego Martinez, Manuel Hermosilla, Nicholas Risko

Abstract:

Objectives: This study examines the relationship between universal health coverage (UHC) and the burden of emergency diseases at a global level. Methods: Data on Disability-Adjusted Life Years (DALYs) from emergency conditions were extracted from the Institute for Health Metrics and Evaluation (IHME) database for the years 2015 and 2019. Data on UHC, measured using two variables, 1) coverage of essential health services and 2) proportion of the population spending more than 10% of household income on out-of-pocket health care expenditure, were extracted from the World Bank Database for years preceding our outcome of interest. Linear regression was performed, analyzing the effect of the UHC variables on the DALYs of emergency diseases while controlling for other variables. Results: A total of 133 countries were included. 44.4% of the analyzed countries had a coverage of essential health services index of at least 70/100, and 35.3% had at least 10% of their population spending more than 10% of their household income on healthcare. For every point increase in the coverage of essential health services index, there was a 13-point reduction in DALYs of emergency medical diseases (95% CI -16, -11). Conversely, for every percent decrease in the population with large household expenditure on healthcare, there was a 0.48 increase in DALYs of emergency medical diseases (95% CI -5.6, 4.7). Conclusions: After adjusting for multiple variables, an increase in coverage of essential health services was significantly associated with improvement in DALYs for emergency conditions. There was, however, no association between catastrophic health expenditure and DALYs.
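
The core analysis is an ordinary least-squares regression of emergency-condition DALYs on the two UHC measures with covariates. The sketch below illustrates that setup on toy country-level data; all column names and values are invented for illustration and are not taken from the study.

```python
# Hypothetical sketch of the country-level regression described above:
# emergency-condition DALYs regressed on two UHC measures, controlling
# for a covariate such as GDP per capita. Toy data, illustrative names.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "daly_emergency": [5200, 3100, 4800, 2500, 6100, 2900],
    "uhc_service_index": [55, 78, 60, 82, 48, 75],   # 0-100 coverage index
    "catastrophic_oop_pct": [18, 6, 14, 4, 22, 9],   # % spending >10% of income
    "gdp_per_capita": [2100, 14500, 3800, 22000, 1500, 9800],
})

model = smf.ols(
    "daly_emergency ~ uhc_service_index + catastrophic_oop_pct + gdp_per_capita",
    data=df,
).fit()
print(model.params)        # point estimates
print(model.conf_int())    # 95% confidence intervals
```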

Keywords: emergency medicine, universal healthcare, global health, health economics

Procedia PDF Downloads 79
1707 Improve B-Tree Index’s Performance Using Lock-Free Hash Table

Authors: Zhanfeng Ma, Zhiping Xiong, Hu Yin, Zhengwei She, Aditya P. Gurajada, Tianlun Chen, Ying Li

Abstract:

Many RDBMS vendors use a B-tree index to achieve high performance for point queries and range queries, and some of them also employ a hash index to further enhance performance, as a hash table is more efficient for point queries. However, there are extra overheads in maintaining a separate hash index: for example, the hash mapping for all data records must always be maintained, which consumes more memory, and locking, logging and other mechanisms are needed to guarantee ACID, which affects the concurrency and scalability of the system. To relieve these overheads, the Hash Cached B-tree (HCB) index is proposed in this paper, consisting of a standard disk-based B-tree index and an additional in-memory lock-free hash table. Initially, only the B-tree index is constructed for all data records; the hash table is built on the fly based on the runtime workload, and only data records accessed by point queries are indexed in the hash table, which helps reduce the memory footprint. Changes to the hash table are made using compare-and-swap (CAS) without locking or logging, which improves concurrency and avoids contention. The hash table is also optimized to be cache conscious. The HCB index is implemented in the SAP ASE database; compared with the standard B-tree index, early experiments and customer adoptions show significant performance improvement. This paper provides an overview of the design of the HCB index and reports the experimental results.
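
To make the control flow concrete, here is a minimal single-threaded sketch of the caching idea: point lookups consult an in-memory hash table first and populate it from B-tree lookups on misses. The real HCB structure is concurrent and updates its table via CAS; everything below (the bisect-based stand-in for the disk B-tree, the plain dict standing in for the lock-free table) is an illustrative assumption, not the paper's implementation.

```python
# Illustrative single-threaded sketch of a hash-cached B-tree lookup.
# The real HCB index uses a lock-free, CAS-updated hash table; an ordinary
# dict stands in for it here to show the control flow only.
import bisect

class HashCachedBTree:
    """Toy model: point lookups consult an in-memory hash cache first;
    misses go to the (simulated) B-tree and populate the cache on the fly."""

    def __init__(self, items):
        pairs = sorted(items)
        self._keys = [k for k, _ in pairs]      # simulated B-tree leaf level
        self._vals = [v for _, v in pairs]
        self._hash_cache = {}                   # real HCB: lock-free, CAS-updated

    def point_query(self, key):
        if key in self._hash_cache:             # fast path
            return self._hash_cache[key]
        i = bisect.bisect_left(self._keys, key) # simulated B-tree descent
        if i < len(self._keys) and self._keys[i] == key:
            value = self._vals[i]
            self._hash_cache[key] = value       # cache built from the workload
            return value
        return None

idx = HashCachedBTree([(3, "c"), (1, "a"), (2, "b")])
print(idx.point_query(2))   # miss -> B-tree -> cached
print(idx.point_query(2))   # hit in hash cache
```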

Keywords: B-tree, compare-and-swap, lock-free hash table, point queries, range queries, SAP ASE database

Procedia PDF Downloads 268
1706 A Comparative Study of the Impact of Membership in International Climate Change Treaties and the Environmental Kuznets Curve (EKC) in Line with Sustainable Development Theories

Authors: Mojtaba Taheri, Saied Reza Ameli

Abstract:

In this research, we calculate the effect of membership in international climate change treaties for 20 developed countries, selected on the basis of the Human Development Index (HDI), and compare this effect with the pollutant-reduction process described by the Environmental Kuznets Curve (EKC) theory. For this purpose, data on real GDP per capita at constant 2010 prices are taken from the World Development Indicators (WDI) database. Ecological Footprint (ECOFP), the amount of biologically productive land needed to meet human needs and absorb carbon dioxide emissions, is measured in global hectares (gha) and retrieved from the Global Ecological Footprint (2021) database. We proceed step by step through several series of targeted statistical regressions, examining the effects of different control variables. Energy Consumption Structure (ECS) is counted as the share of fossil fuel consumption in total energy consumption and is extracted from the United States Energy Information Administration (EIA) (2021) database. Energy Production (EP) refers to the total production of primary energy by all energy-producing enterprises in one country at a specific time; it is a comprehensive indicator of a country's energy production capacity, and its data, like the Energy Consumption Structure, come from the EIA (2021). Financial development (FND) is defined as the ratio of private credit to GDP, and to some extent based on stock market value, also as a ratio to GDP, and is taken from the WDI (2021). Trade Openness (TRD) is the sum of exports and imports of goods and services measured as a share of GDP, from the WDI (2021). Urbanization (URB) is defined as the share of the urban population in the total population, also from the WDI (2021). The descriptive statistics of all the investigated variables are presented in the results section. Among the theories of sustainable development considered, the Environmental Kuznets Curve (EKC) is the more significant over the period of study. We use more than fourteen targeted statistical regressions to isolate the net effects of each approach and examine the results.
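
For readers unfamiliar with the EKC specification, the standard test regresses an environmental indicator on log income and its square: an inverted U (positive linear term, negative quadratic term) with a turning point inside the observed income range is the EKC signature. The sketch below runs that test on synthetic data; it is illustrative only and not the paper's panel specification.

```python
# Illustrative EKC test: an environmental indicator (e.g. ecological
# footprint) vs. log GDP per capita and its square. An inverted U
# (b1 > 0, b2 < 0) with an in-sample turning point is the EKC signature.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
log_gdp = rng.uniform(7, 11, 200)                      # log GDP per capita
footprint = 2 + 3.0 * log_gdp - 0.17 * log_gdp**2 + rng.normal(0, 0.2, 200)

X = sm.add_constant(np.column_stack([log_gdp, log_gdp**2]))
fit = sm.OLS(footprint, X).fit()
b0, b1, b2 = fit.params
print(fit.summary().tables[1])
if b2 < 0:
    print("turning point at log GDP =", -b1 / (2 * b2))
```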

Keywords: climate change, globalization, environmental economics, sustainable development, international climate treaty

Procedia PDF Downloads 51
1705 A Robust Spatial Feature Extraction Method for Facial Expression Recognition

Authors: H. G. C. P. Dinesh, G. Tharshini, M. P. B. Ekanayake, G. M. R. I. Godaliyadda

Abstract:

This paper presents a new spatial feature extraction method based on principal component analysis (PCA) and Fisher Discriminant Analysis (FDA) for facial expression recognition. It not only extracts reliable features for classification but also reduces the dimensionality of the feature space of pattern samples. In this method, each grayscale image is first considered in its entirety as the measurement matrix. Then, principal components (PCs) of the row vectors of this matrix, and the variance of these row vectors along the PCs, are estimated; this ensures the preservation of the spatial information of the facial image. Afterwards, by incorporating the spectral information of the eigen-filters derived from the PCs, a feature vector is constructed for a given image. Finally, FDA is used to define a set of basis vectors in a reduced-dimension subspace such that optimal clustering is achieved. FDA defines an inter-class scatter matrix and an intra-class scatter matrix to enhance the compactness of each cluster while maximizing the distance between cluster marginal points. To match the test image with the training set, a cosine-similarity-based Bayesian classification was used. The proposed method was tested on the Cohn-Kanade database and the JAFFE database. It was observed that the proposed method, which incorporates spatial information to construct an optimal feature space, outperforms the standard PCA- and FDA-based methods.
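
A compact way to see the pipeline (projection for dimensionality reduction, Fisher discriminant for class separation, cosine similarity for matching) is the following sketch on synthetic "images". It substitutes scikit-learn's PCA and LinearDiscriminantAnalysis plus a nearest-neighbor cosine match for the paper's specific eigen-filter construction and Bayesian classifier, so treat it as an approximation of the idea, not the authors' method.

```python
# Minimal PCA -> Fisher discriminant -> cosine-matching sketch on
# synthetic data. Real work would use Cohn-Kanade / JAFFE faces.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics.pairwise import cosine_similarity

rng = np.random.default_rng(1)
X_train = rng.normal(size=(60, 32 * 32))   # 60 flattened grayscale "images"
y_train = np.repeat(np.arange(6), 10)      # 6 expression classes
X_test = X_train[::12] + rng.normal(scale=0.1, size=(5, 32 * 32))

pca = PCA(n_components=20).fit(X_train)
lda = LinearDiscriminantAnalysis().fit(pca.transform(X_train), y_train)

train_f = lda.transform(pca.transform(X_train))
test_f = lda.transform(pca.transform(X_test))

# nearest training sample by cosine similarity decides the label
sims = cosine_similarity(test_f, train_f)
print(y_train[sims.argmax(axis=1)])        # expected: [0 1 2 3 4]
```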

Keywords: facial expression recognition, principal component analysis (PCA), fisher discriminant analysis (FDA), eigen-filter, cosine similarity, bayesian classifier, f-measure

Procedia PDF Downloads 410
1704 Probabilistic Slope Stability Analysis of Excavation Induced Landslides Using Hermite Polynomial Chaos

Authors: Schadrack Mwizerwa

Abstract:

The characterization and prediction of landslides are crucial for assessing geological hazards and mitigating risks to infrastructure and communities. This research develops a probabilistic framework for analyzing excavation-induced landslides. The study uses Hermite polynomial chaos, a non-stationary random process representation, to analyze the stability of a slope and characterize the failure probability of a real landslide induced by highway construction excavation. The correlation within the data is captured using Karhunen-Loève (KL) expansion theory, and the finite element method is used to analyze the slope's stability. The research contributes to the field of landslide characterization by employing advanced random field approaches, providing valuable insights into the complex nature of landslide behavior and the effectiveness of advanced probabilistic models for risk assessment and management. Data collected from the Baiyuzui landslide, induced by highway construction, are used as an illustrative example. The findings highlight the importance of considering the probabilistic nature of landslides and provide valuable insights into the complex behavior of such hazards.
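
The Karhunen-Loève expansion is the step that turns a spatially correlated soil property into a finite set of independent random variables. Below is a minimal numerical sketch of a truncated KL expansion of a 1D Gaussian field with exponential covariance; the domain length, correlation length, and truncation order are illustrative choices, not values from the study.

```python
# Truncated Karhunen-Loeve expansion of a 1D Gaussian random field with
# exponential covariance: the building block used to randomize soil
# properties in this kind of probabilistic slope analysis.
import numpy as np

n, L, corr_len = 200, 50.0, 10.0
x = np.linspace(0, L, n)
cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

# eigen-decomposition of the covariance gives the KL basis
vals, vecs = np.linalg.eigh(cov)
order = np.argsort(vals)[::-1]
vals, vecs = vals[order], vecs[:, order]

m = 20                                            # truncation order
xi = np.random.default_rng(2).normal(size=m)      # independent std normals
field = (vecs[:, :m] * np.sqrt(vals[:m])) @ xi    # one field realization
print(field.shape, field.std())
```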

Keywords: Hermite polynomial chaos, Karhunen-Loeve, slope stability, probabilistic analysis

Procedia PDF Downloads 56
1703 Identification of Deep Landslide on Erzurum-Turkey Highway by Geotechnical and Geophysical Methods and its Prevention

Authors: Neşe Işık, Şenol Altıok, Galip Devrim Eryılmaz, Aydın Durukan, Hasan Özgür Daş

Abstract:

In this study, an active landslide zone affecting the road alignment on the Tortum-Uzundere (Erzurum/Turkey) highway was investigated. The landslide movement has caused problems in the existing road pavement, creating both safety issues and reduced driving comfort in the operation of the road. In order to model the landslide, drilling, geophysical and inclinometer studies were carried out in the field within the scope of the ground investigation, and laboratory tests were carried out on soil and rock samples obtained from the borings. When the drilling and geophysical studies were evaluated together, it was determined that the study area has a complex geological structure. In addition, the inclinometer results revealed the direction and speed of movement of the landslide mass. To create an idealized geological profile, all field and laboratory studies were evaluated together, and the sliding surface of the landslide was then determined by the back-analysis method. According to the findings, the landslide was massive in extent and the movement occurred along a deep sliding surface. As a result of the numerical analyses, it was concluded that slope angle reduction is the most economical and environmentally friendly method for controlling the landslide mass.

Keywords: landslide, geotechnical methods, geophysics, monitoring, highway

Procedia PDF Downloads 56
1702 A Framework for Secure Information Flow Analysis in Web Applications

Authors: Ralph Adaimy, Wassim El-Hajj, Ghassen Ben Brahim, Hazem Hajj, Haidar Safa

Abstract:

Huge amounts of data and personal information are sent to and retrieved from web applications on a daily basis. Every application has its own confidentiality and integrity policies. Violating these policies can have a broad negative impact on the involved company’s financial status, while enforcing them is very hard even for developers with a good security background. In this paper, we propose a framework that enforces security-by-construction in web applications. Minimal developer effort is required, in the sense that the developer only needs to annotate database attributes with a security class. The web application code is then converted into an intermediary representation, called the Extended Program Dependence Graph (EPDG). Using the EPDG, the provided annotations are propagated to the application code and run against generic security enforcement rules that were carefully designed to detect insecure information flows as early as they occur. As a result, any violation of the data’s confidentiality or integrity policies is reported. As a proof of concept, two PHP web applications, Hotel Reservation and Auction, were used for testing and validation. The proposed system was able to catch all the existing insecure information flows at their source. Moreover, to highlight the simplicity of the suggested approach compared with existing approaches, two professional web developers assessed the annotation tasks needed in the presented case studies and provided very positive feedback on the simplicity of the annotation task.
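
The essence of the approach is label propagation over a dependence graph: security classes attached to database attributes flow along value dependencies, and a flow of a higher class into a lower-class sink is a violation. The toy sketch below shows that propagation on a hand-built graph; the node names, the two-level lattice, and the dict-based graph are all illustrative simplifications of the EPDG, not the paper's data structures.

```python
# Toy label propagation over a simplified dependence graph: annotations on
# database attributes flow forward; a secret value reaching a public sink
# is reported as an insecure flow.
from collections import deque

LEVEL = {"public": 0, "secret": 1}

edges = {                            # value flows src -> dst
    "db.creditcard": ["tmp"],        # annotated secret attribute
    "db.roomtype":  ["page_out"],    # annotated public attribute
    "tmp":          ["page_out"],    # secret reaches a public sink
}
labels = {"db.creditcard": "secret", "db.roomtype": "public"}
sink_class = {"page_out": "public"}

# propagate the highest label reaching each node (forward closure)
reach = dict(labels)
queue = deque(labels)
while queue:
    node = queue.popleft()
    for nxt in edges.get(node, []):
        if LEVEL[reach.get(nxt, "public")] < LEVEL[reach[node]]:
            reach[nxt] = reach[node]
            queue.append(nxt)

for sink, cls in sink_class.items():
    if LEVEL[reach.get(sink, "public")] > LEVEL[cls]:
        print(f"insecure flow into {sink}: carries {reach[sink]} data")
```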

Keywords: web applications security, secure information flow, program dependence graph, database annotation

Procedia PDF Downloads 451
1701 TARF: Web Toolkit for Annotating RNA-Related Genomic Features

Authors: Jialin Ma, Jia Meng

Abstract:

Genomic features, i.e., genome-based coordinates, are commonly used for the representation of biological features such as genes, RNA transcripts and transcription factor binding sites. For the analysis of RNA-related genomic features, such as RNA modification sites, a common task is to correlate these features with transcript components (5'UTR, CDS, 3'UTR) to explore their distribution characteristics in terms of transcriptomic coordinates, e.g., to examine whether a specific type of biological feature is enriched near transcription start sites. Existing approaches for performing these tasks involve the manipulation of a gene database, conversion from genome-based to transcript-based coordinates, and visualization methods capable of showing RNA transcript components and the distribution of the features. These steps are complicated and time consuming, especially for researchers who are not familiar with the relevant tools. To overcome this obstacle, we developed a dedicated web app, TARF, a web toolkit for annotating RNA-related genomic features. TARF provides a web-based way to easily annotate and visualize RNA-related genomic features. Once a user has uploaded the features in BED format and specified a built-in transcript database or uploaded a customized gene database in GTF format, the tool fulfills three main functions. First, it adds annotation on gene and RNA transcript components: for every feature provided by the user, overlaps with RNA transcript components are identified, and the information is combined in one table available for copying and download; summary statistics about ambiguous assignments are also reported. Second, the tool provides a convenient visualization of the features at the single gene/transcript level: for a selected gene, it shows the features with the gene model in a genome-based view, and also maps the features to transcript-based coordinates to show the distribution along a single spliced RNA transcript. Third, a global transcriptomic view of the genomic features is generated using the Guitar R/Bioconductor package: the distribution of features on RNA transcripts is normalized with respect to RNA transcript landmarks, and the enrichment of the features on different RNA transcript components is demonstrated. We tested the newly developed TARF toolkit with three different types of genomic features related to chromatin H3K4me3, RNA N6-methyladenosine (m6A) and RNA 5-methylcytosine (m5C), obtained from ChIP-Seq, MeRIP-Seq and RNA BS-Seq data, respectively. TARF successfully revealed their respective distribution characteristics, i.e., H3K4me3, m6A and m5C are enriched near transcription start sites, stop codons and 5’UTRs, respectively. Overall, TARF is a useful web toolkit for the annotation and visualization of RNA-related genomic features, and should help simplify the analysis of various RNA-related genomic features, especially those related to RNA modifications.
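
The first function boils down to interval intersection between user features and transcript components. The toy sketch below shows that core step on one transcript; production tools would parse GTF and use interval trees, and all coordinates here are invented.

```python
# Toy version of the core annotation step: intersect user features
# (BED-style intervals) with transcript components and label each feature.
components = [  # (start, end, label) on one transcript's genomic span
    (0, 200, "5UTR"), (200, 1400, "CDS"), (1400, 2000, "3UTR"),
]
features = [(50, 60), (180, 220), (1500, 1510)]  # e.g. m6A peaks

def annotate(start, end):
    hits = [lab for s, e, lab in components if start < e and end > s]
    return hits or ["intergenic"]

for f in features:
    print(f, "->", annotate(*f))
# (180, 220) overlaps both 5UTR and CDS: an "ambiguous assignment"
# that the toolkit summarizes separately.
```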

Keywords: RNA-related genomic features, annotation, visualization, web server

Procedia PDF Downloads 192
1700 Iris Cancer Detection System Using Image Processing and Neural Classifier

Authors: Abdulkader Helwan

Abstract:

Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks help in deciding whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs to a neural network that is capable of deciding whether the eye is cancerous or not, through experience acquired over many training iterations on different normal and abnormal eye images during the training phase. Normal images were obtained from a public database available on the internet, “Mile Research”, while the abnormal ones were obtained from another database, “eyecancer”. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision.
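
The described pipeline (grayscale conversion, edge detection, PCA feature reduction, neural classification) can be sketched compactly with off-the-shelf components. The snippet below runs on random stand-in images and substitutes scikit-image's Prewitt filter and scikit-learn's MLP for the authors' specific network; it illustrates the structure of the pipeline, not the reported result.

```python
# Hedged sketch of the pipeline on synthetic data: grayscale images ->
# Prewitt edges -> PCA features -> small neural classifier.
import numpy as np
from skimage.filters import prewitt
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(3)
imgs = rng.random((40, 64, 64))            # stand-ins for grayscale eye images
y = np.array([0, 1] * 20)                  # 0 = normal, 1 = cancerous

edges = np.stack([prewitt(im) for im in imgs]).reshape(40, -1)
feats = PCA(n_components=10).fit_transform(edges)

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
clf.fit(feats, y)
print("training accuracy:", clf.score(feats, y))
```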

Keywords: iris cancer, intraocular melanoma, cancerous, prewitt edge detection algorithm, sclera

Procedia PDF Downloads 481
1699 Reducing Flood Risk in a Megacity: Using Mobile Application and Value Capture for Flood Risk Prevention and Risk Reduction Financing

Authors: Dedjo Yao Simon, Takahiro Saito, Norikazu Inuzuka, Ikuo Sugiyama

Abstract:

The megacity of Abidjan is a coastal urban area where the number of reported floods and the associated impacts are increasing rapidly due to climate change, uncontrolled urbanization, rapid population growth, and a lack of flood disaster mitigation and citizen awareness. The objective of this research is to reduce, over both the short and long term, the human and socio-economic impact of flooding. Hydrological simulation is applied to free-of-charge global spatial data (digital elevation model, satellite-based rainfall estimates, landuse) to identify flood-prone areas and to map flood risk. Direct interviews with a sample of residents are used to validate the simulation results. A mobile application (Flood Locator) is then prototyped to disseminate the risk information to citizens. In addition, a value capture strategy is proposed to mobilize financial resources for disaster risk reduction (DRRf) to reduce the impact of the flood. The town of Cocody in Abidjan is selected as the case study area. The mapping of the flood risk reveals that the population living in the study area is highly vulnerable. For a 5-year flood, more than 60% of the floodplain is affected by a water depth of at least 0.5 meters, and more than 1000 ha with at least 5000 buildings are directly exposed; the risk becomes higher for 50- and 100-year floods. The interviews also reveal that the majority of citizens are not aware of the risk and severity of flooding in their community. This shortage of information is overcome by Flood Locator and by an urban flood database we prototype for accumulating flood data. The Flood Locator app allows users to view the floodplain and water depth on a digital map; the user can activate the GPS sensor of the mobile device to visualize his or her location on the map. Additional features allow the citizen user to capture flood event and damage information and send it remotely to the database. Furthermore, disclosure of the risk information could result in a decrease (-14%) in the value of properties located inside the floodplain and an increase (+19%) in the value of properties in the suburb area. The tax increment arising from the higher property values in the safer area should be captured to constitute the DRRf, and the fund should be allocated to flood risk reduction for the benefit of people living in flood-prone areas. The flood prevention system discussed in this research will minimize, in the short and long term, the direct damages in the risky area through effective citizen awareness and the availability of the DRRf. It will also contribute to the growth of the urban area in the safer zone and reduce human settlement in the risky area in the long term. Data accumulated in the urban flood database through the warning app will help regenerate Abidjan into a more resilient city by means of risk-avoiding landuse in the master plan.

Keywords: abidjan, database, flood, geospatial techniques, risk communication, smartphone, value capture

Procedia PDF Downloads 261
1698 The Impact of Prior Cancer History on the Prognosis of Salivary Gland Cancer Patients: A Population-based Study from the Surveillance, Epidemiology, and End Results (SEER) Database

Authors: Junhong Li, Danni Cheng, Yaxin Luo, Xiaowei Yi, Ke Qiu, Wendu Pang, Minzi Mao, Yufang Rao, Yao Song, Jianjun Ren, Yu Zhao

Abstract:

Background: The number of patients with multiple cancers is increasing, and the impact of prior cancer history on salivary gland cancer patients remains unclear. Methods: Clinical, demographic and pathological information on salivary gland cancer patients was retrospectively collected from the Surveillance, Epidemiology, and End Results (SEER) database from 2004 to 2017, and the characteristics and prognosis of patients with a prior cancer were compared with those of patients without a prior cancer. Univariate and multivariate Cox proportional hazards regression models were used for the analysis of prognosis. A risk score model was established to examine the impact of treatment on patients with a prior cancer in different risk groups. Results: A total of 9098 salivary gland cancer patients were identified, 1635 of whom had a prior cancer history. Salivary gland cancer patients with a prior cancer had worse survival than those without (p<0.001). Patients with different types of first cancer had distinct prognoses (p<0.001), and a longer latent time was associated with better survival (p=0.006) in the univariate model, although both became nonsignificant in the multivariate model. Salivary gland cancer patients with a prior cancer were divided into low-risk (n=321), intermediate-risk (n=223), and high-risk (n=62) groups, and the results showed that patients at high risk could benefit from surgery, radiation therapy, and chemotherapy, while those at intermediate risk could benefit from surgery. Conclusion: Prior cancer history had an adverse impact on the survival of salivary gland cancer patients, and individualized treatment should be seriously considered for them.
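
The backbone of this kind of analysis is a Cox proportional-hazards regression of survival time on prior-cancer status and covariates. The sketch below uses the lifelines package on toy data; the column names are illustrative, not SEER field names.

```python
# Minimal Cox proportional-hazards sketch on toy data with lifelines.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "months":       [12, 40, 8, 55, 23, 70, 5, 31],   # follow-up time
    "died":         [1, 0, 1, 1, 0, 1, 1, 0],         # event indicator
    "prior_cancer": [1, 0, 1, 0, 1, 0, 1, 0],
    "age":          [67, 54, 72, 49, 61, 58, 75, 63],
})

cph = CoxPHFitter().fit(df, duration_col="months", event_col="died")
print(cph.summary[["coef", "exp(coef)", "p"]])   # exp(coef) = hazard ratio
```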

Keywords: prior cancer history, prognosis, salivary gland cancer, SEER

Procedia PDF Downloads 125
1697 Biofilm Text Classifiers Developed Using Natural Language Processing and Unsupervised Learning Approach

Authors: Kanika Gupta, Ashok Kumar

Abstract:

Biofilms are dense, highly hydrated cell clusters that are irreversibly attached to a substratum, to an interface or to each other, and are embedded in a self-produced gelatinous matrix composed of extracellular polymeric substances. Research in the biofilm field has become very significant, as biofilms show high mechanical resilience and resistance to antibiotic treatment and constitute a significant problem both in healthcare and in other industries related to microorganisms. The information, both stated and hidden, in the biofilm literature is growing exponentially, so it is not feasible for researchers and practitioners to extract and relate information from the different written resources by hand. The current work therefore proposes and discusses the use of text mining techniques for the extraction of information from a biofilm literature corpus containing 34306 documents. It is very difficult and expensive to obtain annotated material for biomedical literature, as the literature is unstructured, i.e., free text. We therefore adopted an unsupervised approach, in which no annotated training material is necessary, and developed a system that classifies the text on the basis of growth and development, drug effects, radiation effects, classification, and physiology of biofilms. For this, a two-step structure was used: the first step extracts keywords from the biofilm literature using a metathesaurus and standard natural language processing tools such as Rapid Miner_v5.3, and the second step discovers relations between the genes extracted from the whole set of biofilm literature using pubmed.mineR_v1.0.11. We applied the unsupervised approach, the machine learning task of inferring a function to describe hidden structure from 'unlabeled' data, to the extracted datasets to develop classifiers using the WinPython-64 bit_v3.5.4.0Qt5 and R studio_v0.99.467 packages, which automatically classify the text using the mentioned sets. The developed classifiers were tested on a large dataset of biofilm literature, which showed that the proposed unsupervised approach is promising as well as suited to semi-automatic labeling of the extracted relations. The entire information was stored in a relational database hosted locally on the server. The generated biofilm vocabulary and gene relations will be significant for researchers dealing with biofilm research, making their searches easy and efficient, as the keywords and genes can be directly mapped to the documents used for database development.
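
One common unsupervised route of this kind is to vectorize documents with TF-IDF, cluster them, and inspect the top terms per cluster as candidate topic labels. The sketch below illustrates that idea on a toy corpus; the documents, cluster count, and tools differ from those used in the paper.

```python
# Illustrative unsupervised text classification: TF-IDF + k-means
# clustering, then top terms per cluster as candidate labels.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

docs = [
    "biofilm growth on catheter surfaces",
    "antibiotic resistance of mature biofilms",
    "radiation effects on biofilm matrix",
    "drug penetration through biofilm layers",
    "physiology of biofilm formation and development",
    "uv radiation damage to biofilm communities",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(docs)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)

terms = tfidf.get_feature_names_out()
for c in range(3):
    top = km.cluster_centers_[c].argsort()[::-1][:3]
    print(f"cluster {c}:", [terms[i] for i in top])
```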

Keywords: biofilms literature, classifiers development, text mining, unsupervised learning approach, unstructured data, relational database

Procedia PDF Downloads 150
1696 Identification of Damage Mechanisms in Interlock Reinforced Composites Using a Pattern Recognition Approach of Acoustic Emission Data

Authors: M. Kharrat, G. Moreau, Z. Aboura

Abstract:

The latest advances in the weaving industry, combined with increasingly sophisticated means of materials processing, have made it possible to produce complex 3D composite structures. Mainly used in aeronautics, composite materials with 3D architecture offer better mechanical properties than 2D reinforced composites. Nevertheless, these materials require a good understanding of their behavior. Because of the complexity of such materials, the damage mechanisms are multiple, and the scenario of their appearance and evolution depends on the nature of the applied loading. The AE technique is a well-established tool for discriminating between damage mechanisms: suitable sensors are used during the mechanical test to monitor the structural health of the material, relevant AE features are extracted from the recorded signals, and the data are analyzed using pattern recognition techniques. In order to better understand the damage scenarios of interlock composite materials, a multi-instrumentation set-up was deployed in this work for tracking damage initiation and development, especially in the vicinity of the first significant damage, called macro-damage. The deployed instrumentation includes video-microscopy, Digital Image Correlation, Acoustic Emission (AE) and micro-tomography. In this study, a multi-variable AE data analysis approach was developed to discriminate between the signal classes representing the different emission sources during testing. An unsupervised classification technique was adopted to perform AE data clustering without a priori knowledge. The multi-instrumentation and the clustered data served to label the different signal families and to build a learning database, which in turn was used to construct a supervised classifier for automatic recognition of AE signals. Several materials with different ingredients were tested under various loadings in order to feed and enrich the learning database. The methodology presented in this work was useful for refining the damage threshold for the new generation of materials, and the damage mechanisms around this threshold were highlighted. The obtained signal classes were assigned to the different mechanisms. The isolation of a 'noise' class makes it possible to discriminate between the signals emitted by damage without resorting to spatial filtering or increasing the AE detection threshold. The approach was validated on different material configurations: for the same material and the same type of loading, the identified classes are reproducible and only slightly perturbed. The supervised classifier constructed from the learning database was able to predict the labels of the classified signals.
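
The cluster-then-classify workflow can be sketched in a few lines: AE features are clustered without labels, the clusters are identified with damage mechanisms (here simply taken as given), and a supervised classifier trained on those labels recognizes new signals. The feature values below are synthetic stand-ins; real AE features would be amplitude, energy, counts, duration, and so on.

```python
# Sketch of the cluster-then-classify workflow on synthetic AE features.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(4)
# three synthetic signal families (e.g. matrix cracking, debonding, noise)
feats = np.vstack([
    rng.normal([2, 1], 0.3, (50, 2)),
    rng.normal([6, 4], 0.3, (50, 2)),
    rng.normal([1, 6], 0.3, (50, 2)),
])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(feats)
clf = SVC().fit(feats, labels)          # learning database -> classifier
new_signal = [[5.8, 4.2]]
print("predicted class:", clf.predict(new_signal))
```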

Keywords: acoustic emission, classifier, damage mechanisms, first damage threshold, interlock composite materials, pattern recognition

Procedia PDF Downloads 140
1695 Parameters Identification of Granular Soils around PMT Test by Inverse Analysis

Authors: Younes Abed

Abstract:

The successful application of in-situ testing of soils heavily depends on the development of interpretation methods for the tests. The pressuremeter test simulates the expansion of a cylindrical cavity, and because it has well-defined boundary conditions, it is more amenable to rigorous theoretical analysis (i.e., cavity expansion theory) than most other in-situ tests. In this article, in order to make the identification process more convenient, we propose a relatively simple procedure for the numerical identification of some mechanical parameters of a granular soil, notably the elastic modulus and the friction angle, from a pressuremeter curve. The procedure, applied here to identify the parameters of the generalized Prager model associated with the Drucker-Prager criterion from a pressuremeter curve, is based on an inverse analysis approach, which consists of minimizing a function representing the difference between the experimental curve and the curve obtained by integrating the model along the loading path of the in-situ test. The numerical process implemented here is based on an established finite element program. We present a validation of the proposed approach on a database of cylindrical cavity expansion tests. This database consists of four types of tests: thick cylinder tests carried out on Hostun RF sand, pressuremeter tests carried out on Hostun sand, in-situ pressuremeter tests carried out at the Fos site with a marine self-boring pressuremeter, and in-situ pressuremeter tests at the Labenne site with a Menard pressuremeter.
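
The inverse-analysis idea itself fits in a few lines: find the model parameters that minimize the misfit between a measured pressuremeter curve and a forward model. In the sketch below, the forward model is a deliberately simple stand-in, not the Drucker-Prager finite element model used in the paper, and all values are synthetic.

```python
# Minimal inverse-analysis sketch: minimize the misfit between a
# "measured" pressuremeter curve and a toy forward model.
import numpy as np
from scipy.optimize import minimize

strain = np.linspace(0.001, 0.05, 30)

def forward(params, eps):
    E, phi = params                       # "elastic modulus", "friction angle"
    return E * eps / (1 + phi * eps)      # toy pressure-expansion law

true = (40.0, 25.0)
measured = forward(true, strain) + np.random.default_rng(5).normal(0, 0.005, 30)

def misfit(params):
    return np.sum((forward(params, strain) - measured) ** 2)

res = minimize(misfit, x0=[10.0, 5.0], method="Nelder-Mead")
print("identified E, phi:", res.x)        # should approach (40, 25)
```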

Keywords: granular soils, cavity expansion, pressuremeter test, finite element method, identification procedure

Procedia PDF Downloads 275
1694 Design of an Automated Deep Learning Recurrent Neural Networks System Integrated with IoT for Anomaly Detection in Residential Electric Vehicle Charging in Smart Cities

Authors: Wanchalerm Patanacharoenwong, Panaya Sudta, Prachya Bumrungkun

Abstract:

The paper focuses on the development of a system that combines Internet of Things (IoT) technologies and deep learning algorithms for anomaly detection in residential Electric Vehicle (EV) charging in smart cities. With the increasing number of EVs, ensuring efficient and reliable charging systems has become crucial. The aim of this research is to develop an integrated IoT and deep learning system for detecting anomalies in residential EV charging and enhancing EV load profiling and event detection in smart cities. IoT devices equipped with infrared cameras collect thermal images and household EV charging profiles from the database of a Thai utility and transmit the data to a cloud database for comprehensive analysis. The methodology applies advanced deep learning techniques, namely Recurrent Neural Networks (RNN) and Long Short-Term Memory (LSTM) algorithms, together with feature-based Gaussian mixture models for EV load profiling and event detection; this combination aids in identifying unique power consumption patterns among EV owners and outperforms existing models in event detection accuracy. The findings demonstrate the effectiveness of the developed system in detecting anomalies and critical profiles in EV charging behavior: the system provides timely alarms to users regarding potential issues and categorizes the severity of detected problems based on a health index for each charging device. This research contributes to the field by showcasing the potential of integrating IoT and deep learning techniques for managing residential EV charging in smart cities. In summary, integrating IoT and deep learning techniques can effectively detect anomalies in residential EV charging and enhance EV load profiling and event detection accuracy, ensuring operational safety and efficiency and contributing to sustainable energy management in smart cities.
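
One standard LSTM-based anomaly detection scheme of this kind trains the network to predict the next reading of a charging-load series and flags large prediction errors as anomalies. The sketch below illustrates that scheme on synthetic data; the architecture, window size, and threshold are illustrative assumptions, not the paper's configuration.

```python
# Hedged sketch of LSTM anomaly detection on a synthetic charging-load
# series: predict the next step; large residuals are flagged.
import numpy as np
from tensorflow.keras import Input, Sequential
from tensorflow.keras.layers import LSTM, Dense

rng = np.random.default_rng(6)
load = np.sin(np.linspace(0, 40, 400)) + rng.normal(0, 0.05, 400)
load[300] += 2.0                               # injected anomaly

win = 20
X = np.stack([load[i:i + win] for i in range(len(load) - win)])[..., None]
y = load[win:]

model = Sequential([Input(shape=(win, 1)), LSTM(16), Dense(1)])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, verbose=0)

err = np.abs(model.predict(X, verbose=0).ravel() - y)
print("anomalous steps:", np.where(err > 5 * np.median(err))[0] + win)
```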

Keywords: cloud computing framework, recurrent neural networks, long short-term memory, IoT, EV charging, smart grids

Procedia PDF Downloads 38
1693 Impact of Interface Soil Layer on Groundwater Aquifer Behaviour

Authors: Hayder H. Kareem, Shunqi Pan

Abstract:

The geological environment from which groundwater is collected is the most important element affecting the behaviour of a groundwater aquifer. As groundwater is a vital resource worldwide, the parameters that affect this source must be known accurately so that the conceptualized mathematical models are acceptable over the broadest ranges. Groundwater models have therefore recently become an effective and efficient tool for investigating groundwater aquifer behaviour. A groundwater aquifer may contain aquitards, aquicludes, or interfaces within its geological formations. Aquitards and aquicludes are geological formations that force modellers to include them within conceptualized groundwater models, while interfaces are commonly omitted from the conceptualization process because modellers believe an interface has no effect on aquifer behaviour. The current research highlights the impact of an interface existing in a real unconfined groundwater aquifer called Dibdibba, located in Al-Najaf City, Iraq, where the Euphrates River passes through the eastern part of the city. The Dibdibba aquifer consists of two types of soil layers separated by an interface soil layer. A groundwater model is built for Al-Najaf City to explore the impact of this interface. Calibration is performed using the PEST (Parameter ESTimation) approach, and the best Dibdibba groundwater model is obtained. When the soil interface is conceptualized, the results show that the groundwater tables are significantly affected by the interface, with dry areas of 56.24 km² and 6.16 km² appearing in the upper and lower layers of the aquifer, respectively, and the Euphrates River leaking 7,359 m³/day into the aquifer. When the soil interface is neglected, these results change: the dry area becomes 0.16 km² and the river leakage 6,334 m³/day. In addition, the conceptualized models (with and without the interface) respond differently to changes in the recharge rates applied to the aquifer in the uncertainty analysis test. The Dibdibba aquifer shows a slight deficit in the amount of water supplied under the current pumping scheme, and the Euphrates River suffers from the stresses applied to the aquifer. Ultimately, this study shows a crucial need to represent the interface soil layer in model conceptualization so that the simulated present and predicted future behaviours are more reliable for planning purposes.

Keywords: Al-Najaf City, groundwater aquifer behaviour, groundwater modelling, interface soil layer, Visual MODFLOW

Procedia PDF Downloads 170
1692 Ontology-Driven Knowledge Discovery and Validation from Admission Databases: A Structural Causal Model Approach for Polytechnic Education in Nigeria

Authors: Bernard Igoche Igoche, Olumuyiwa Matthew, Peter Bednar, Alexander Gegov

Abstract:

This study presents an ontology-driven approach for knowledge discovery and validation from admission databases in Nigerian polytechnic institutions. The research aims to address the challenges of extracting meaningful insights from vast amounts of admission data and utilizing them for decision-making and process improvement. The proposed methodology combines the knowledge discovery in databases (KDD) process with a structural causal model (SCM) ontological framework. The admission database of Benue State Polytechnic Ugbokolo (Benpoly) is used as a case study. The KDD process is employed to mine and distill knowledge from the database, while the SCM ontology is designed to identify and validate the important features of the admission process. The SCM validation is performed using the conditional independence test (CIT) criteria, and an algorithm is developed to implement the validation process. The identified features are then used for machine learning (ML) modeling and prediction of admission status. The results demonstrate the adequacy of the SCM ontological framework in representing the admission process and the high predictive accuracies achieved by the ML models, with k-nearest neighbors (KNN) and support vector machine (SVM) achieving 92% accuracy. The study concludes that the proposed ontology-driven approach contributes to the advancement of educational data mining and provides a foundation for future research in this domain.
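
The SCM-edge validation step rests on conditional independence testing over categorical admission variables. Below is a hedged sketch of one common way to test whether X is independent of Y given Z on such data: stratify by Z, run a chi-square test per stratum, and combine the p-values. The variable names and data generation are invented, and the paper's exact CIT criteria may differ.

```python
# Sketch of a stratified chi-square conditional independence test.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency, combine_pvalues

rng = np.random.default_rng(7)
n = 600
z = rng.integers(0, 2, n)                        # e.g. mode of entry
x = (rng.random(n) < 0.3 + 0.4 * z).astype(int)  # e.g. grade band, driven by z
y = (rng.random(n) < 0.3 + 0.4 * z).astype(int)  # e.g. admitted, driven by z only
df = pd.DataFrame({"x": x, "y": y, "z": z})

pvals = []
for _, g in df.groupby("z"):                     # test x vs y within each stratum
    table = pd.crosstab(g["x"], g["y"])
    pvals.append(chi2_contingency(table)[1])

stat, p = combine_pvalues(pvals, method="fisher")
print("combined p-value:", p)  # large p => consistent with X indep. of Y given Z
```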

Keywords: admission databases, educational data mining, machine learning, ontology-driven knowledge discovery, polytechnic education, structural causal model

Procedia PDF Downloads 34
1691 Automated Detection of Targets and Retrieve the Corresponding Analytics Using Augmented Reality

Authors: Suvarna Kumar Gogula, Sandhya Devi Gogula, P. Chanakya

Abstract:

Augmented reality is defined as the collection of digital (or computer-generated) information, such as images, audio, video and 3D models, overlaid on the real-time environment; it can be thought of as a blend between the completely synthetic and the completely real. Augmented reality has scope in a wide range of industries, such as manufacturing, retail, gaming, advertisement and tourism, and brings new dimensions to the modern digital world. As it overlays content, it helps users enhance their knowledge by blending content with the real world. In this application, we integrated augmented reality with data analytics and with the cloud, so the virtual content is generated on the basis of the data present in the database; we used marker-based augmented reality, where every marker is stored in the database with a corresponding unique ID. This application can be used across industries for different business processes, but in this paper we mainly focus on the marketing industry, helping customers gain knowledge about the products in the market, mainly their prices, customer feedback, quality, and other benefits. The application also provides better market strategy information for marketing managers, who obtain data about stocks, sales, customer response to the product, and so on. In this paper, we also include reports on the feedback gathered from different people after the demonstration, and finally we present the future scope of augmented reality in different business processes through integration with new technologies such as cloud, big data and artificial intelligence.

Keywords: augmented reality, data analytics, catch room, marketing and sales

Procedia PDF Downloads 215
1690 Combination of Geological, Geophysical and Reservoir Engineering Analyses in Field Development: A Case Study

Authors: Atif Zafar, Fan Haijun

Abstract:

A sequence of different reservoir engineering methods and tools for reservoir characterization and field development is presented in this paper, using real data from the Jin Gas Field of the L-Basin of Pakistan. The basic concept behind this work is to highlight the importance of well test analysis in a broader role (i.e., reservoir characterization and field development) rather than merely for determining permeability and skin parameters. Normally we rely on well test analysis to some extent for reservoir characterization, but for field development planning it has become a forgotten tool, specifically for locating new development wells. This paper describes the successful implementation of well test analysis in the Jin Gas Field, where the main uncertainties were identified during the initial stage of field development, when the location of a new development well had been marked only on the basis of G&G (geologic and geophysical) data. The seismic interpretation could not detect one of the boundaries (fault, sub-seismic fault, heterogeneity) near the main and only producing well of the Jin Gas Field, whereas the results of the well-test-analysis model played a crucial role in proposing the location of the second well of the newly discovered field. The results from the different well test analysis methods are also integrated with, and supported by, other reservoir engineering tools, i.e., the Material Balance Method and the Volumetric Method. In this way, a comprehensive workflow and algorithm is obtained for integrating well test analyses with geological and geophysical analyses for reservoir characterization and field development. On this basis, it was established that the proposed location of the new development well was not justified and that it should be placed elsewhere, away from the south direction.
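
For context, the Material Balance Method mentioned above reduces, for a closed volumetric gas reservoir, to the classic p/Z straight line: p/Z declines linearly with cumulative production, and extrapolating to p/Z = 0 estimates the original gas in place (OGIP). The sketch below illustrates that calculation on invented numbers, not Jin Gas Field data.

```python
# p/Z material balance sketch for a volumetric gas reservoir: fit the
# straight line and extrapolate to p/Z = 0 to estimate OGIP.
import numpy as np

Gp = np.array([0.0, 2.1, 4.0, 6.2, 8.5])             # cumulative production, Bscf
p_over_z = np.array([4900, 4560, 4260, 3900, 3530])  # psia

slope, intercept = np.polyfit(Gp, p_over_z, 1)
ogip = -intercept / slope                            # p/Z = 0 intercept
print(f"estimated OGIP ~ {ogip:.1f} Bscf")
```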

Keywords: field development plan, reservoir characterization, reservoir engineering, well test analysis

Procedia PDF Downloads 345
1689 The Development of Digital Commerce in Community Enterprise Products to Promote the Distribution of Samut Songkhram Province

Authors: Natcha Wattanaprapa, Alongkorn Taengtong, Phachaya Chaiwchan

Abstract:

This study investigates and promotes the distribution of community enterprise products of Samut Songkhram province by using e-commerce web technology. It also aims to develop an information system able to operate on multiple platforms, with easy usability on smartphones, to increase the efficiency of product distribution in three areas: the Baan Saraphi learning center, the Bang Noi Floating Market learning center, and the Bang Nang Li learning center. The main structure consists of spreading knowledge about tourist attractions in the community enterprise area, an e-commerce system for community enterprise products, and a Chatbot. The researchers developed the system as an application using a software package for creating and managing content on the internet: the content management system (CMS) WordPress was used for managing web pages, a WordPress add-on was used to create the Chatbot, and phpMyAdmin was used for database management. An evaluation by experts and users of five aspects (system efficiency, accuracy of system operation, convenience and ease of use, design, and promotion of product distribution in Samut Songkhram province), conducted by questionnaire, showed that the promotion of product distribution scored highest, with a mean of 4.20, and the efficiency of the developed system was also rated at the highest level, with a mean of 4.10.

Keywords: community enterprise, digital commerce, promotion of product distribution, Samut Songkhram province

Procedia PDF Downloads 130
1688 The Clinical and Survival Differences between Primary B-Cell and T/NK-Cell Non-Hodgkin Lymphomas in the Nasopharynx, Nasal Cavity, and Nasal Sinus: A Population-Based Study of 3839 Cases in the SEER Database

Authors: Jiajia Peng, Danni Cheng, Jianqing Qiu, Yufang Rao, Minzi Mao, Ke Qiu, Junhong Li, Fei Chen, Feng Liu, Jun Liu, Xiaosong Mu, Wenxin Yu, Wei Zhang, Wei Xu, Yu Zhao, Jianjun Ren

Abstract:

Background: Primary B-cell non-Hodgkin lymphoma (B-NHL) and T/NK-cell non-Hodgkin lymphoma (NKT-NHL) originating from the nasal cavity (NC), nasopharynx (NP) and nasal sinus (NS) are currently not clearly distinguished in the clinic. Objective: We sought to compare the clinical and survival differences of B-NHL and NKT-NHL occurring in the NC, NP, and NS, respectively. Methods: Retrospective data on patients diagnosed with nasal cavity lymphoma (NCL), nasopharyngeal lymphoma (NPL), and nasal sinus lymphoma (NSL) between 1975 and 2017 were collected from the Surveillance, Epidemiology, and End Results (SEER) database. We identified B/NKT-NHL patients based on histological type and performed univariate, multivariate, and Kaplan-Meier analyses to investigate survival rates. Results: Of the 3,101 B-NHL and 738 NKT-NHL patients identified, those with B-NHL in the NP were the majority (43%) and had better cancer-specific survival than those in the NC and NS from 2010 to 2017 (5-year CSS, NC vs. NP vs. NS: 81% vs. 83% vs. 82%). In contrast, most NKT-NHL originated from the NC (68%) and had the highest CSS rate in the most recent seven years (2010-2017, 5-year CSS: 63%). Additionally, the survival outcomes of NKT-NHL-NP patients who had undergone surgery (HR: 1.34, 95% CI: 0.62-2.89, P=0.460) were much worse than those of NKT-NHL-NC (HR: 1.07, 95% CI: 0.75-1.52, P=0.710) and NKT-NHL-NS (HR: 1.11, 95% CI: 0.59-2.07, P=0.740) patients. NKT-NHL-NS patients who received radiation (HR: 0.38, 95% CI: 0.19-0.73, P=0.004) showed the highest survival rates, while chemotherapy (HR: 1.01, 95% CI: 0.43-2.37, P=0.980) showed the opposite. Conclusions: Although B-NHL and NKT-NHL originating from the NC, NP and NS have similar anatomical locations, their clinical characteristics, treatment therapies, and prognoses were different in this study. Our findings suggest that B-NHL and NKT-NHL in the NC, NP, and NS should be treated as different diseases in the clinic.

Keywords: nasopharyngeal lymphoma, nasal cavity lymphoma, nasal sinus lymphoma, B-cell non-Hodgkin lymphoma, T/NK-cell non-Hodgkin lymphoma

Procedia PDF Downloads 161
1687 First Earth Size

Authors: Ibrahim M. Metwally

Abstract:

Have you ever thought that the earth was not the same earth we live on? Was it bigger or smaller? Was it one great continent surrounded by a huge ocean, as Alfred Wegener (1912) claimed? Earth is the most amazing planet in our Milky Way galaxy, and perhaps in the universe. It is the only deformed planet that has a variable orbit around the sun and the only planet that has water on its surface. How did earth's deformation take place? What caused the earth to deform? What are the results of that deformation? How does its orbit around the sun change? Computing the first earth size can be achieved only by considering the quantity of iron and nickel resting in the earth's core. This paper introduces a new theory, the 'Earth Expansion Theory', whose principles lead to new approaches and concepts for interpreting whole-earth dynamics and its geological and environmental changes. This theory is not an attempt to unify the two divergent dominant theories of continental drift, plate tectonic theory and earth expansion theory. It is unique in that it has a mathematical derivation, explains all the changes to and around the earth in terms of geological and environmental changes, and answers the questions left unanswered by other theories. This paper presents the basis of the introduced theory and discusses the mechanism of earth expansion, how it took place, and the forces that drove it. The mechanism by which the earth changed from a spherical shape with a radius of about 3447.6 km to an elliptic shape with a major radius of about 6378.1 km and a minor radius of about 6356.8 km is introduced and discussed. The article also offers a more realistic explanation of the formation of oceans and seas and the preparation for river formation, and addresses the role of iron in the earth enlargement process within the continuum mechanics framework.

Keywords: earth size, earth expansion, continuum mechanics, continental and ocean formation

Procedia PDF Downloads 435
1686 Information Literacy among Faculty and Students of Medical Colleges of Haryana, Punjab and Chandigarh

Authors: Sanjeev Sharma, Suman Lata

Abstract:

With the availability of diverse printed and electronic literature and websites on medical and health-related information, it is impossible for a medical professional to find the information he or she seeks in the shortest possible time; information literacy is the solution to this problem, and it is thus recognized as an important aspect of medical education. In the present study, an attempt has been made to assess the information literacy skills of faculty and students at medical colleges of Haryana, Punjab and Chandigarh. The scope of the study was confined to 12 selected medical colleges of the three states (Haryana, Punjab, and Chandigarh). The findings of the study are based on data collected through 1018 questionnaires filled in by respondents at the medical colleges. It was found that online medical websites (such as WebMD, eMedicine and Mayo Clinic) were frequently used by 63.43% of the respondents of Chandigarh, slightly more than Haryana (61%) and Punjab (55.65%). Likewise, 30.86% of the respondents of Chandigarh, 27.41% of Haryana and 27.05% of Punjab were familiar with controlled vocabulary tools; 25.14% of Chandigarh, 23.80% of Punjab and 23.17% of Haryana were familiar with Boolean operators; 33.05% of Punjab, 28.19% of Haryana and 25.14% of Chandigarh were familiar with the use and importance of keywords when searching an electronic database; and 51.43% of Chandigarh, 44.52% of Punjab and 36.29% of Haryana were able to make effective use of the retrieved information. For accessing information in electronic format, 47.74% of the respondents rated their skills high, while the majority of respondents (76.13%) were unfamiliar with the basic search technique, i.e., the Boolean operators used for searching information in an online database. On the basis of the findings, it was suggested that a comprehensive training program based on medical professionals' information needs should be organized frequently. It was further suggested that information literacy be included as a subject in the health science curriculum, so as to make medical professionals information literate and independent lifelong learners.

Keywords: information, information literacy, medical professionals, medical colleges

Procedia PDF Downloads 131
1685 Performance of Bored Pile on Alluvial Deposit

Authors: K. Raja Rajan, D. Nagarajan

Abstract:

Bored cast in-situ piles are a popular choice among consultants and contractors due to the ability to adjust the pile length if any variation is found in the actual geological strata. Bangladesh's geological strata are dominated by silt. Design is normally based on field tests such as Standard Penetration Test (SPT) N-values. Initially, pile capacity was estimated through the static formula with a correlation between N-value and angle of internal friction. An initial pile load test was then conducted in order to validate the geotechnical parameters assumed in design. The test was conducted on a 1.5 m diameter bored cast in-situ pile, using the Kentledge method to load the pile to 2.5 times its working load: the safe working load of the pile had been estimated as 570 T, so the test load was fixed at 1425 T. The maximum load applied was 777 T, at which the settlement reached around 155 mm, more than 10% of the pile diameter. The pile load test results were therefore not satisfactory and compelled an increase in pile length of approximately 20% of the total length. Due to the unpredictable geotechnical parameters, the length of each pile had to be increased, with a major impact on the project cost as well as on the project schedule. Extra boreholes were then planned, along with laboratory test results, in order to redefine the assumed geotechnical parameters. This article presents the detailed geotechnical design assumptions made at the design stage and the pile load test results that forced those assumptions to be redefined.

Keywords: end bearing, pile load test, settlement, shaft friction

Procedia PDF Downloads 238
1684 Influence of Thermal History on the Undrained Shear Strength of the Bentonite-Sand Mixture

Authors: K. Ravi, Sabu Subhash

Abstract:

Densely compacted bentonite or bentonite-sand mixture has been identified as a suitable buffer in the deep geological repository (DGR) for the safe disposal of high-level nuclear waste (HLW), due to its favourable physicochemical and hydro-mechanical properties. The addition of sand to the bentonite enhances the thermal conductivity and compaction properties and reduces the drying shrinkage of the buffer material. The buffer material may undergo cyclic wetting and drying upon ingress of groundwater from the surrounding rock mass and evaporation due to the high temperature (50–210 °C) derived from the waste canister. These temperature cycles may result in a thermal history, and the hydro-mechanical properties of the buffer material may be affected. This paper examines the influence of thermal history on the undrained shear strength of bentonite and a bentonite-sand mixture. Bentonite from Rajasthan state and sand from Assam state, India, are used in this study. The undrained shear strength values are obtained by conducting unconfined compressive strength (UCS) tests on cylindrical specimens (dry densities 1.30 and 1.5 Mg/m³) of bentonite and of a bentonite-sand mixture consisting of 30% bentonite + 70% sand. The specimens are preheated at temperatures varying from 50-150 °C for one, two and four hours in a hot air oven. The results indicate that the undrained shear strength is increased by the thermal history of the buffer material. The bentonite-sand mixture specimens exhibited a greater increase in strength than the pure bentonite specimens, indicating that the sand content of the mixture plays a vital role in withstanding the thermal stresses on the bentonite buffer under DGR conditions.

Keywords: bentonite, deep geological repository, thermal history, undrained shear strength

Procedia PDF Downloads 331
1683 Characteristics of Tremella fuciformis and Annulohypoxylon stygium for Optimal Cultivation Conditions

Authors: Eun-Ji Lee, Hye-Sung Park, Chan-Jung Lee, Won-Sik Kong

Abstract:

We analyzed the DNA sequence of the ITS (Internal Transcribed Spacer) region of the 18S ribosomal gene and compared it with the gene sequences of T. fuciformis and Hypoxylon sp. in the BLAST database. The sequences of the collected T. fuciformis and Hypoxylon sp. show over 99% homology with the corresponding BLAST database sequences. In order to select the optimal medium for T. fuciformis, five media were tested: Potato Dextrose Agar (PDA), Mushroom Complete Medium (MCM), Malt Extract Agar (MEA), Yeast extract (YM), and Compost Extract Dextrose Agar (CDA). T. fuciformis showed the best growth on PDA, and Hypoxylon sp. showed the best growth on MCM. To investigate the optimum pH and temperature, the pH range was set from pH 4 to pH 8 and the temperature range from 15 °C to 35 °C (at 5 °C intervals). The optimum culture conditions were pH 5 at 25 °C for T. fuciformis and pH 6 at 25 °C for Hypoxylon sp. To identify the most suitable carbon source, we tested fructose, galactose, saccharose, soluble starch, inositol, glycerol, xylose, dextrose, lactose, dextrin, Na-CMC, adonitol, mannitol, mannose, maltose, raffinose, cellobiose, ethanol, salicin, glucose, and arabinose. The optimum carbon source is xylose for T. fuciformis and arabinose for Hypoxylon sp. Using a column test, we identified sawdust substrates suitable for T. fuciformis, since the composition of the sawdust affects the growth of its fruiting bodies. The substrates tested were oak, pine, poplar, and birch sawdust, together with cottonseed meal and cottonseed hull. In artificial cultivation of T. fuciformis on sawdust medium, T. fuciformis and Hypoxylon sp. showed fast mycelial growth on a mixture of oak sawdust, cottonseed hull, and wheat bran.

Keywords: cultivation, optimal condition, Tremella fuciformis, nutritional source

Procedia PDF Downloads 185
1682 Iris Recognition Based on the Low Order Norms of Gradient Components

Authors: Iman A. Saad, Loay E. George

Abstract:

The iris pattern is an important biometric feature of the human body and has become a very active topic in both research and practical applications. In this paper, an algorithm is proposed for iris recognition, and a simple, efficient and fast method is introduced to extract a set of discriminatory features using a first-order gradient operator applied to grayscale images. The gradient-based features are robust, to a certain extent, against variations in the contrast or brightness of iris image samples; such variations mostly occur due to lighting differences and camera changes. At first, the iris region is located; it is then remapped to a rectangular area of size 360x60 pixels. A new method is also proposed for detecting eyelash and eyelid points; it relies on statistical analysis of the image to mark eyelash and eyelid pixels as noise points. In order to accommodate variations in feature localization, the rectangular iris image is partitioned into N overlapped sub-images (blocks); from each block, a set of average directional gradient density values is calculated and used as a texture feature vector. The gradient operators are applied along the horizontal, vertical and diagonal directions, and the low-order norms of the gradient components are used to establish the feature vector. A Euclidean-distance-based classifier is used as the matching metric for determining the degree of similarity between the feature vector extracted from the tested iris image and the template feature vectors stored in the database. Experimental tests were performed using 2639 iris images from the CASIA V4-Interval database; the attained recognition accuracy reached 99.92%.
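
The block-wise feature extraction and Euclidean matching just described can be sketched as follows. The block size, overlap step, norm orders, and the diagonal-gradient proxies are illustrative assumptions, not the paper's tuned configuration.

import numpy as np

def gradient_features(iris_rect, block=(20, 20), step=(10, 10), orders=(0.5, 1.0)):
    """iris_rect: normalized iris as a 2-D float array (e.g., 60 x 360).
    Returns average directional gradient densities per overlapped block."""
    gy, gx = np.gradient(iris_rect.astype(float))  # first-order vertical/horizontal gradients
    gd1 = gx + gy                                  # proxy for one diagonal direction (assumed)
    gd2 = gx - gy                                  # proxy for the other diagonal (assumed)
    feats = []
    h, w = iris_rect.shape
    for y in range(0, h - block[0] + 1, step[0]):
        for x in range(0, w - block[1] + 1, step[1]):
            for g in (gx, gy, gd1, gd2):
                patch = np.abs(g[y:y + block[0], x:x + block[1]])
                for p in orders:                   # low-order norms as average densities
                    feats.append((patch ** p).mean())
    return np.array(feats)

def match(query, templates):
    """Return the index of, and distance to, the closest stored template."""
    d = np.linalg.norm(templates - query, axis=1)  # Euclidean distance to each template
    return int(np.argmin(d)), float(d.min())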

Keywords: iris recognition, contrast stretching, gradient features, texture features, Euclidean metric

Procedia PDF Downloads 313
1681 Classification of Small Towns: Three Methodological Approaches and Their Results

Authors: Jerzy Banski

Abstract:

Small towns represent a key element of the settlement structure and serve a number of important functions associated with servicing the rural areas that surround them. It is in light of this that scientific studies have paid considerable attention to the functional structure of centers of this kind, as well as their relationships with both surrounding rural areas and other urban centers. A preliminary to such research has typically involved attempts to classify the urban centers themselves, which also assists in the planning and shaping of development policy on different spatial scales. The purpose of this work is to test the methods underpinning three different classifications of small urban centers, as well as to offer a preliminary interpretation of the outcomes obtained. The research took in 722 settlement units in Poland granted town rights and populated by fewer than 20,000 inhabitants. A morphologically-based classification, referring to the database of topographic objects as regards land cover within the administrative boundaries of towns and cities, was carried out, making it possible to distinguish the categories of "housing-estate", industrial and R&R towns, as well as towns characterized by dichotomy. Equally, a functional/morphological approach applied to the same database allowed for the identification, via an alternative method, of three main categories of small towns (monofunctional, multifunctional or oligofunctional), which could then be described in far greater detail. A third, multi-criterion classification made simultaneous reference to conditioning of a structural, location-related, and administrative hierarchy-related nature, allowing small towns to be distinguished in 9 different categories. The results obtained allow for multifaceted analysis and interpretation of the geographical differentiation characterizing the distribution of Poland's urban centers across space.

Keywords: small towns, classification, local planning, Poland

Procedia PDF Downloads 65
1680 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms

Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager

Abstract:

This study aims to construct a predictive model capable of foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of inclusions (circular, elliptical, square, triangular), their spatial coordinates within the matrix, orientation, volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed to build this predictive model, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN). Moreover, the research goes beyond prediction by delving into an inverse analysis using genetic algorithms, the intent being to unveil the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database covering the array of input parameters mentioned earlier. This database, enriched with its diversity of input variables, serves as a bedrock for the creation of the machine learning and genetic algorithm-based models. These models are trained not only to predict but also to elucidate the mechanical and thermal behaviour of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions of remarkable accuracy, with scores ranging between 0.97 and 0.99. This achievement demonstrates the potential of this innovative approach in the field of materials engineering.
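
A minimal sketch of the forward-model/inverse-analysis coupling described above follows: a random forest learns a property from microstructure descriptors, then a small genetic algorithm searches descriptor space for a target response. The data here is synthetic and the toy mixing rule, population sizes and mutation scale are illustrative assumptions; the paper's database and descriptors differ.

import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic database: [volume fraction, contrast] -> effective property (toy rule, assumed)
X = np.column_stack([rng.uniform(0.05, 0.4, 2000),   # volume fraction range from the abstract
                     rng.uniform(10, 200, 2000)])    # contrast range from the abstract
y = X[:, 0] * np.log(X[:, 1]) + 1.0                  # illustrative stand-in for simulation output

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def ga_inverse(target, pop=60, gens=40, mut=0.1):
    """Genetic search for descriptors whose predicted property matches `target`."""
    lo, hi = np.array([0.05, 10.0]), np.array([0.4, 200.0])
    P = rng.uniform(lo, hi, (pop, 2))                  # random initial population
    for _ in range(gens):
        err = np.abs(model.predict(P) - target)        # fitness: forward-model error
        P = P[np.argsort(err)][:pop // 2]              # selection: keep the best half
        kids = (P + P[rng.permutation(len(P))]) / 2    # crossover: midpoint of random pairs
        kids += rng.normal(0, mut, kids.shape) * (hi - lo)  # mutation, scaled to bounds
        P = np.clip(np.vstack([P, kids]), lo, hi)
    return P[np.argmin(np.abs(model.predict(P) - target))]

print(ga_inverse(target=2.2))  # e.g. recover a (vf, contrast) pair giving property ~2.2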

Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal properties

Procedia PDF Downloads 43
1679 The Role of Artificial Intelligence in Criminal Procedure

Authors: Herke Csongor

Abstract:

Artificial intelligence (AI) has been used in the decision-making process of the criminal justice system of the United States of America for decades. In the field of law, including criminal law, AI can provide serious assistance in decision-making in many places. The paper reviews four main areas where AI already plays a role in the criminal justice system and where it is expected to play an increasingly important one. The first area is predictive policing: a number of algorithms are used to prevent the commission of crimes (by predicting potential crime locations or perpetrators). This may include so-called hot-spot analysis, crime linking and predictive coding. The second area is Big Data analysis: huge data sets are already opaque to human activity and therefore unprocessable by hand. Law is one of the largest producers of digital documents (because not only decisions but nowadays the entire document material is available digitally), and this volume can only be handled with the help of computer programs, whose development AI systems can increasingly influence. The third area is criminal statistical data analysis. The collection of statistical data using traditional methods required enormous human resources. AI is a huge step forward in that it can analyze the database itself: based on the requested aspects, a collection according to any criterion can be available in a few seconds, and the AI can itself indicate if it finds an important connection from the point of view of either crime prevention or crime detection. Finally, the use of AI in decision-making in both the investigative and judicial fields is analyzed in detail. While some are skeptical about the future role of AI in decision-making, many believe that the question is not whether AI will participate in decision-making, but only when and to what extent it will transform the current decision-making system.

Keywords: artificial intelligence, international criminal cooperation, planning and organizing of the investigation, risk assessment

Procedia PDF Downloads 15