Search results for: geophysical database referenced navigation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2189

1169 Wireless Sensor Anomaly Detection Using Soft Computing

Authors: Mouhammd Alkasassbeh, Alaa Lasasmeh

Abstract:

We live in an era of rapid development driven by significant scientific growth. Like other technologies, wireless sensor networks (WSNs) play one of the main roles. Built on WSNs, ZigBee adds many features to devices, such as low cost and power consumption, and increases the range and connectivity of sensor nodes. ZigBee technology has come to be used in various fields, including science, engineering, and networking, and even in medical applications and intelligent buildings. In this work, we generated two main datasets, the first based on a tree topology and the second on a star topology. The datasets were evaluated with three machine learning (ML) algorithms: J48, meta.J48, and the multilayer perceptron (MLP). For each topology, network traffic was classified as normal or abnormal (attack). The dataset used in our work contained simulated data from Network Simulator 2 (NS2). On each dataset, the meta.J48 classifier achieved the highest accuracy among the classifiers, at 99.7% and 99.2% respectively.
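The J48 and MLP models themselves are not reproduced here; as a hedged illustration of the underlying two-class traffic classification task, the following sketch uses a simple nearest-centroid rule on hypothetical traffic features (packet rate and mean delay are assumptions, not the paper's feature set):

```python
import math

def centroid(rows):
    # Mean of each feature column.
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(sample, centroids):
    # Assign the label of the nearest class centroid (Euclidean distance).
    return min(centroids, key=lambda lab: math.dist(sample, centroids[lab]))

# Hypothetical [packet_rate, mean_delay] features from simulated traffic.
normal = [[50, 0.02], [55, 0.03], [48, 0.025]]
attack = [[200, 0.30], [180, 0.25], [220, 0.35]]
cents = {"normal": centroid(normal), "attack": centroid(attack)}

print(classify([52, 0.02], cents))   # a normal-looking sample
print(classify([190, 0.28], cents))  # an attack-looking sample
```

Real classifiers such as meta.J48 learn far richer decision boundaries; this only shows the normal/attack labelling idea.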

Keywords: IDS, Machine learning, WSN, ZigBee technology

Procedia PDF Downloads 543
1168 Microstructure and High Temperature Deformation Behavior of Cast 310S Alloy

Authors: Jung-Ho Moon, Myung-Gon Yoon, Tae Kwon Ha

Abstract:

The high temperature deformation behavior of cast 310S stainless steel was investigated in this study by performing tensile and compression tests at temperatures from 900 to 1200°C. Rectangular ingots with dimensions of 350×350×100 mm were cast using vacuum induction melting. Phase equilibria were calculated using the FactSage® thermodynamic software and database. The thermal expansion coefficient was also measured on the ingot over the range from room temperature to 1200°C. The tensile strength of cast 310S stainless steel was 9 MPa at 1200°C, slightly higher than that of wrought 310S. As the temperature decreased, tensile strength increased rapidly, reaching 72 MPa at 900°C. Elongation also increased with decreasing temperature. Microstructural observation revealed that σ phase precipitated along the grain boundaries and within the matrix at 1200°C, which is detrimental to high temperature elongation.

Keywords: stainless steel, STS 310S, high temperature deformation, microstructure, mechanical properties

Procedia PDF Downloads 400
1167 A Method for Precise Vertical Position of the Implant When Using Computerized Surgical Guides and Bone Reduction

Authors: Abraham Finkelman

Abstract:

Computerized surgical guides (CSGs) have proven to be a predictable way to place dental implants, with relatively high accuracy compared to the treatment plan. A bone-supported CSG allows us to make the necessary changes to the hard tissue before and after implant placement. The CSG gives us an accurate position for drilling, and during implant placement it allows us to alter the vertical position of the implant, altering the final position of the abutment and avoiding the risk of damage to adjacent anatomical structures. Any changes required to the bone level can be made prior to fixation of the CSG using a reduction guide, which incurs extra surgical fees and requires a second surgical guide. Any change to the bone level after implant placement risks damaging the implant neck surface. The technique consists of a universal system that allows us to remove the excess bone around the implant sockets prior to implant placement, which then enables us to place the implant in the vertical position with the accuracy planned with the CSG. The system consists of hollow pins of different lengths and diameters, depending on the implant system in use: lengths from 6 mm to 16 mm and diameters from 2.6 mm to 4.8 mm. Upon completion of the drilling, the pin is inserted into the implant socket using the insertion tool. Once the insertion tool has been unscrewed from the pin, we can continue with the bone reduction. The bone reduction can be done using conventional methods until all the excess bone around the pin has been removed. The insertion tool is then screwed into the pin, and the pin is removed. The new bone level at the crest of the implant socket now serves as our mark for the vertical position of the implant.
In some cases, when we are placing the implant very close to anatomical structures, any deviation in the vertical position of the implant during surgery can damage those structures, creating irreversible harm such as paresthesia or dysesthesia of the mandibular nerve. If we are planning immediate loading and have made our temporary restoration based on the computerized plan, deviation in the vertical position of the implant will affect the position of the abutment, compromising the fit of the temporary prosthesis and extending the working time until we adapt the prosthesis to the new position.

Keywords: bone reduction, computer aided navigation, dental implant placement, surgical guides

Procedia PDF Downloads 331
1166 How to Perform Proper Indexing?

Authors: Watheq Mansour, Waleed Bin Owais, Mohammad Basheer Kotit, Khaled Khan

Abstract:

Efficient query processing is one of the utmost requisites in any business environment to satisfy consumer needs. This paper investigates the various types of indexing models, viz. primary, secondary, and multi-level, examining the types of queries for which each model performs efficiently. The study also discusses the inherent advantages and disadvantages of each indexing model and how a model can be chosen to suit a particular environment. The paper draws parallels between the various indexing models and provides recommendations that would help a database administrator zero in on an indexing model suited to the needs and requirements of the production environment. In addition, to meet the industry and consumer needs arising from today's colossal data generation, this study proposes two novel indexing techniques that can efficiently index highly unstructured and structured Big Data. The study also briefly discusses some best practices the industry should follow in order to choose an indexing model appropriate to its prerequisites and requirements.
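As a minimal sketch of the primary-versus-secondary distinction the paper surveys (the records and keys are hypothetical, and a real DBMS would use B-trees rather than a sorted list):

```python
import bisect

# Hypothetical records stored in primary-key order.
records = [(10, "alice"), (20, "bob"), (35, "carol"), (50, "dave")]
keys = [k for k, _ in records]  # the primary index over the sort key

def primary_lookup(key):
    # O(log n) binary search on the sorted key column.
    i = bisect.bisect_left(keys, key)
    if i < len(keys) and keys[i] == key:
        return records[i][1]
    return None

# A secondary index maps a non-key attribute back to row positions.
by_name = {name: i for i, (_, name) in enumerate(records)}

print(primary_lookup(35))          # carol
print(records[by_name["bob"]][0])  # 20
```

The trade-off mirrors the paper's discussion: the primary index is free once data is key-ordered, while each secondary index costs extra space and maintenance on every write.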

Keywords: indexing, hashing, latent semantic indexing, B-tree

Procedia PDF Downloads 156
1165 PRISM: An Analytical Tool for Forest Plan Development

Authors: Dung Nguyen, Yu Wei, Eric Henderson

Abstract:

Analytical tools have been used for decades to assist in the development of forest plans. In 2016, a new decision support system, PRISM, was jointly developed by the United States Forest Service (USFS) Northern Region and Colorado State University to support the forest planning process. PRISM has a friendly user interface with functionality for database management, model development, data visualization, and sensitivity analysis. The software is tailored for USFS planning, but it is flexible enough to support planning efforts by other forestland owners and managers. Here, the core capabilities of PRISM and its applications in developing plans for several United States national forests are presented. The strengths of PRISM are also discussed to show its potential as a preferred tool for managers and experts in the domain of forest management and planning.

Keywords: decision support, forest management, forest plan, graphical user interface, software

Procedia PDF Downloads 111
1164 Modeling of Global Solar Radiation on a Horizontal Surface Using Artificial Neural Network: A Case Study

Authors: Laidi Maamar, Hanini Salah

Abstract:

The present work investigates the potential of an artificial neural network (ANN) model to predict horizontal global solar radiation (HGSR). The ANN is developed and optimized using a three-year (2011–2013) meteorological database from the meteorological station of Blida (Blida 1 University, Algeria; latitude 36.5°, longitude 2.81°, 163 m above mean sea level). The optimal configuration of the ANN model was determined by minimizing the root mean square error (RMSE) and maximizing the correlation coefficient (R²) between the observed data and the ANN predictions. To select the best ANN architecture, we conducted several tests using different combinations of parameters. A two-layer ANN model with six hidden neurons was found to be the optimal topology (RMSE = 4.036 W/m², R² = 0.999). A graphical user interface (GUI) was designed around the best network structure and training algorithm to make the model more user-friendly.
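The RMSE and R² selection criteria can be computed directly; a short sketch with hypothetical observed and predicted HGSR values (not the station's actual data):

```python
import math

def rmse(obs, pred):
    # Root mean square error between observed and predicted series.
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def r_squared(obs, pred):
    # Coefficient of determination: 1 - residual SS / total SS.
    mean = sum(obs) / len(obs)
    ss_res = sum((o - p) ** 2 for o, p in zip(obs, pred))
    ss_tot = sum((o - mean) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

# Hypothetical observed vs ANN-predicted HGSR values (W/m^2).
obs = [500.0, 620.0, 710.0, 430.0]
pred = [505.0, 615.0, 700.0, 435.0]
print(round(rmse(obs, pred), 2))
print(round(r_squared(obs, pred), 4))
```

Model selection then amounts to re-running these two metrics for each candidate topology and keeping the one with the lowest RMSE and highest R².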

Keywords: artificial neural network, global solar radiation, solar energy, prediction, Algeria

Procedia PDF Downloads 498
1163 Coupling Large Language Models with Disaster Knowledge Graphs for Intelligent Construction

Authors: Zhengrong Wu, Haibo Yang

Abstract:

In the context of escalating global climate change and environmental degradation, the complexity and frequency of natural disasters are continually increasing. Confronted with an abundance of information on natural disasters, traditional knowledge graph construction methods, which rely heavily on grammatical rules and prior knowledge, perform suboptimally when processing complex, multi-source disaster information. Drawing upon past natural disaster reports, disaster-related literature in both English and Chinese, and data from various disaster monitoring stations, this study constructs question-answer templates based on large language models. Using the P-Tuning method, the ChatGLM2-6B model is fine-tuned, leading to a disaster knowledge graph built on large language models. This serves as a knowledge base supporting disaster emergency response.

Keywords: large language model, knowledge graph, disaster, deep learning

Procedia PDF Downloads 56
1162 The Potential for Recycling Household Wastes Generated from the Residential Areas of Obafemi Awolowo University, Ile-Ife

Authors: Asaolu Olugbenga Stephen, Afolabi Olusegun Temitope

Abstract:

Lack of proper solid waste management is one of the main causes of environmental pollution and degradation in many cities, especially in developing countries. The aim of this study was to estimate the quantity of waste generated per capita per day, determine its composition, and identify the potential for recycling the waste generated. Wastes from selected households in the residential areas were characterized over a 7-day period. The weight of each sorted category of waste was recorded in a structured database that calculated the proportion of each waste component. The results indicated that 85.4% of the sampled waste was recyclable, with an estimated average generation of 1.82 kg/capita/day. The solid waste fractions were organic (64.6%), plastics (15.6%), metals (9.2%), glass (1.6%), and unclassified (8.9%). It was concluded that a large proportion of the waste generated in the OAU campus residential areas is recyclable and that there is a need to enact a policy on waste recycling within the university campus.
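The per-capita rate and composition percentages follow from simple tallies; a sketch with hypothetical weights for one household (the figures below are illustrative, not the study's raw data):

```python
# Hypothetical sorted waste weights (kg) for one household of 5 over 7 days.
fractions = {"organic": 41.2, "plastics": 9.9, "metals": 5.9,
             "glass": 1.0, "unclassified": 5.7}
total = sum(fractions.values())
persons, days = 5, 7

# Generation rate in kg per person per day, and composition shares in %.
per_capita = total / (persons * days)
shares = {k: round(100 * v / total, 1) for k, v in fractions.items()}

print(round(per_capita, 2), shares["organic"])
```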

Keywords: recycling, household wastes, residential, solid waste management

Procedia PDF Downloads 401
1161 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design

Authors: Pegah Eshraghi, Zahra Sadat Zomorodian, Mohammad Tahsildoost

Abstract:

Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings at the early design stage by evaluating zones separately and decomposing the building to eliminate geometric complexity. Within this framework, machine learning algorithms, namely support vector regression (SVR) and an artificial neural network (ANN), are used to predict energy consumption and thermal comfort metrics for a school building as a case study. The database consists of more than 55,000 samples across three climates of Iran. Cross-validation and unseen data were used for validation. For one specific label, cooling energy, prediction accuracy is at least 84% and 89% for SVR and ANN, respectively. Overall, the results show that the SVR performed much better than the ANN.
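The cross-validation used for model evaluation can be sketched as a plain k-fold index split (a generic illustration, not the authors' exact protocol):

```python
def k_fold_indices(n, k):
    # Partition sample indices 0..n-1 into k contiguous validation folds,
    # yielding (train, validation) index lists for each fold.
    fold = n // k
    for i in range(k):
        start = i * fold
        stop = (i + 1) * fold if i < k - 1 else n
        val = list(range(start, stop))
        train = [j for j in range(n) if j < start or j >= stop]
        yield train, val

# Hypothetical 10-sample dataset split into 5 folds.
splits = list(k_fold_indices(10, 5))
print(len(splits), splits[0][1])
```

Each fold trains the model (SVR or ANN) on the `train` indices and scores it on the held-out `val` indices; averaging the fold scores gives the reported accuracy.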

Keywords: early stage of design, energy, thermal comfort, validation, machine learning

Procedia PDF Downloads 98
1160 Relationship between Growth of Non-Performing Assets and Credit Risk Management Practices in Indian Banks

Authors: Sirus Sharifi, Arunima Haldar, S. V. D. Nageswara Rao

Abstract:

The study attempts to analyze the impact of the credit risk management practices of Indian scheduled commercial banks on their non-performing assets (NPAs). The data on credit risk practices were collected by administering a questionnaire to risk managers/executives at different banks. The data on NPAs (from 2012 to 2016) were sourced from Prowess, a database compiled by the Centre for Monitoring Indian Economy (CMIE). The model was estimated using a cross-sectional regression method. As expected, the findings suggest a negative relationship between credit risk management and NPA growth in Indian banks. The study has implications for Indian banks given their high level of losses and the implementation of Basel III norms by the central bank, i.e., the Reserve Bank of India (RBI). It provides evidence on credit risk management in Indian banks and its relationship with the non-performing assets they hold.

Keywords: credit risk, identification, Indian Banks, NPAs, ownership

Procedia PDF Downloads 408
1159 iCCS: Development of a Mobile Web-Based Student Integrated Information System Using the Hill Climbing Algorithm

Authors: Maria Cecilia G. Cantos, Lorena W. Rabago, Bartolome T. Tanguilig III

Abstract:

This paper describes a conducive and structured information exchange environment for the students of the College of Computer Studies at Manuel S. Enverga University Foundation. The system was developed to help students check their academic results, manage their profiles, enlist themselves in courses, and manage their academic status, all viewable on mobile phones as well. Developing class schedules in the traditional way is a long process that involves making a large number of choices. With the hill climbing algorithm, however, class scheduling, particularly with regard to the courses a student should take in line with the curriculum, can be performed automatically and arrive at an optimum solution. The proponents used rapid application development (RAD) as the system development method, with PHP as the programming language and MySQL as the database.
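A minimal sketch of the hill climbing idea applied to scheduling, using a toy objective (the number of courses sharing a time slot) rather than the curriculum constraints of iCCS:

```python
import random

def clashes(slots):
    # Courses minus distinct slots: zero when no two courses share a slot.
    return len(slots) - len(set(slots))

def hill_climb(slots, n_slots, iters=500, seed=1):
    # Greedy local search: move one course at a time, keeping a change
    # only if it reduces the number of clashes.
    rng = random.Random(seed)
    best = list(slots)
    for _ in range(iters):
        cand = list(best)
        cand[rng.randrange(len(cand))] = rng.randrange(n_slots)
        if clashes(cand) < clashes(best):
            best = cand
    return best

# Hypothetical: 5 courses initially crammed into 2 of 6 available slots.
result = hill_climb([0, 0, 0, 1, 1], n_slots=6)
print(clashes(result))
```

Because only improving moves are accepted, the clash count never increases; the well-known limitation, local optima, is why real schedulers often add restarts or randomization.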

Keywords: hill climbing algorithm, integrated system, mobile web-based, student information system

Procedia PDF Downloads 384
1158 Location Uncertainty – A Probabilistic Solution for Automatic Train Control

Authors: Monish Sengupta, Benjamin Heydecker, Daniel Woodland

Abstract:

New train control systems rely mainly on Automatic Train Protection (ATP) and Automatic Train Operation (ATO) to dynamically control speed and hence performance. The ATP and ATO form the vital elements within the CBTC (Communication Based Train Control) and ERTMS (European Rail Traffic Management System) system architectures. Reliable and accurate measurement of train location, speed, and acceleration is vital to the operation of train control systems. In the past, all CBTC and ERTMS systems have deployed a balise or equivalent to correct the uncertainty in train location. Typically, a CBTC train is allowed to miss only one balise on the track, after which the ATP system applies the emergency brake to halt the service, because the location uncertainty that grows within the train control system cannot tolerate missing more than one balise. Balises contribute a significant amount to wayside maintenance, and studies have shown that balises on the track also constrain future changes to track layout and speed profile. This paper investigates the causes of the location uncertainty currently experienced and considers whether it is possible to identify an effective filter to ascertain, in conjunction with appropriate sensors, more accurate speed, distance, and location for a CBTC-driven train without the need for any external balises. An appropriate sensor fusion algorithm and an intelligent sensor selection methodology are deployed to obtain railway location and speed measurements at the highest precision. Similar techniques are already in use in aviation, satellite, submarine, and other navigation systems. Developing a model for speed control and the use of a Kalman filter is a key element of this research.
This paper summarizes the research undertaken and its significant findings, highlighting the potential for alternative approaches to train positioning that would enable the removal of all trackside location correction balises, leading to a huge reduction in maintenance and more flexibility in future track design.
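A scalar Kalman filter of the kind central to this research can be sketched as follows; the noise variances and position fixes are illustrative assumptions, not values from the study:

```python
def kalman_1d(measurements, q=0.01, r=4.0):
    # Scalar Kalman filter: track position from noisy fixes.
    # q: process noise variance, r: measurement noise variance (assumed).
    x, p = measurements[0], 1.0   # state estimate and its variance
    estimates = []
    for z in measurements:
        p += q                    # predict: uncertainty grows between fixes
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update toward the new measurement
        p *= (1 - k)              # posterior variance shrinks
        estimates.append(x)
    return estimates

# Hypothetical noisy position fixes (metres) around a true 100 m location.
est = kalman_1d([98.0, 103.0, 99.5, 101.0, 100.2])
print(round(est[-1], 1))
```

The filtered estimate weighs each new fix against the accumulated prediction, which is exactly the mechanism proposed for bounding location uncertainty without trackside balises; a production system would fuse multiple sensors in a multi-dimensional state.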

Keywords: ERTMS, CBTC, ATP, ATO

Procedia PDF Downloads 410
1157 Settlement Prediction for Tehran Subway Line-3 via FLAC3D and ANFIS

Authors: S. A. Naeini, A. Khalili

Abstract:

Nowadays, tunnels with various applications are being developed, and many of them are subway tunnels. The excavation of shallow tunnels passing under municipal utilities is very important, and surface settlement control is a key design factor. This study sought to analyze settlement and to find an appropriate model for predicting the behavior of the tunnel in Tehran subway line 3. The displacements in these sections were determined using numerical modeling. In addition, the adaptive neuro-fuzzy inference system (ANFIS) method, trained with a hybrid algorithm, was utilized. The database for the optimum network was obtained from 46 subway tunnels in Iran and Turkey constructed by the new Austrian tunneling method (NATM) with similar parameters based on their soil types. The surface settlement was measured, and the acquired results were compared to the predicted values. The results disclosed that computational intelligence is a good substitute for numerical modeling.

Keywords: settlement, subway line, FLAC3D, ANFIS method

Procedia PDF Downloads 233
1156 Mining in Peru and Local Governance: Assessing the Contribution of CSR Projects

Authors: Sandra Carrillo Hoyos

Abstract:

Mining activities in South America have grown significantly during the last decades, given the abundance of natural resources, governmental policies implemented to incentivize foreign investment, and the boom in international prices for metals and oil between 2002 and 2008. While this context allowed the region to occupy a leading position among the top producers of minerals around the world, it has also meant an increase in socio-environmental conflicts, which have generated costs and negative impacts not only for the companies but especially for governments and local communities. During the latest decade, the mining sector in Peru has faced social resistance from a large number of communities, which began organizing actions against the implementation of high-investment projects. This dissatisfaction has resulted in the prevalence of socio-environmental conflicts associated with mining activities, some of them never resolved into an agreement. In order to prevent such conflicts and obtain a social license from local communities, most mining companies have developed diverse initiatives within the framework of corporate social responsibility (CSR) policies and practices. This paper assesses the mining sector's contribution to local development management over the last decade, as part of CSR strategies as well as the policies promoted by the Peruvian State. The assessment found that, in the beginning, these initiatives were based on a philanthropic approach and reacted to pressure from local stakeholders, aiming to maintain the consent of surrounding communities and thereby create a harmonious atmosphere for operations.
Owing to the weak presence of the State, such practices have raised communities' expectations that mining companies will solve structural development problems, especially those related to primary needs, infrastructure, education, and health, among others. In other words, this paper analyzes to what extent these initiatives have promoted local empowerment for development planning and the integrated management of natural resources from a territorial approach. From this perspective, the analysis demonstrates that, while the design and planning of social investment initiatives have improved thanks to the sector's sustainability approach, many companies have taken actions beyond their competence in this process. In some cases, such actions have created dependency in communities, although this relationship has not exempted the companies from conflict situations with unfortunate consequences. Furthermore, the social programs developed have not necessarily had a significant impact on improving the quality of life of the affected populations. In fact, regions with abundant mining resources and investment still face poverty and high dependency on mining production. In spite of the revenues derived from the mining industry, local governments have not been able to translate royalties into sustainable development opportunities. For this reason, this paper suggests some challenges for the mining sector's contribution to local development, based on best practices and lessons learnt from a benchmarking of leading mining companies.

Keywords: corporate social responsibility, local development, mining, socio-environmental conflict

Procedia PDF Downloads 404
1155 An Erudite Technique for Face Detection and Recognition Using Curvature Analysis

Authors: S. Jagadeesh Kumar

Abstract:

Face detection and recognition is an important technology for image database management, video surveillance, and human-computer interaction (HCI). Face recognition is a rapidly developing method that has been extensively deployed in forensics, for example in criminal identification, secure access, and custodial security. This paper proposes a technique using curvature analysis (CA) that has a low incidence of false positives, operates in different lighting environments, and removes the artifacts introduced during image acquisition through a ring correction in polar coordinates (RCP) method. The technique applies mean and median filtering to remove artifacts, but it works in polar coordinates during image acquisition. Experimental results for face detection and recognition confirm good performance even under diagonal orientation and pose variation.

Keywords: curvature analysis, ring correction in polar coordinate method, face detection, face recognition, human computer interaction

Procedia PDF Downloads 287
1154 Blockchain Technology in Supply Chain Management: A Systematic Review and Meta-Analysis

Authors: Mohammad Yousuf Khan, Bhavya Alankar

Abstract:

Blockchain is a promising technology with features such as immutability and a decentralized database. It has applications in various fields such as pharmaceuticals, finance, and the food industry. At its core lies traceability, the most sought-after property in supply chains. However, supply chains have long been hit by scandals and controversies. In this review paper, we explore the advances and research gaps of blockchain technology (BT) in supply chain management (SCM). We used the PRISMA framework for a systematic literature review (SLR) and included a small amount of grey literature to reduce publication bias. We found that supply chain traceability and transparency is the most researched objective in SCM, while there was hardly any research on supply chain resilience. Further, we found that 40% of the papers were application-based. Most articles have focused on the advantages of BT rather than analyzing it critically. This study will help identify gaps and suitable actions to be followed for an efficient implementation of BT in SCM.

Keywords: blockchain technology, supply chain management, supply chain transparency, supply chain resilience

Procedia PDF Downloads 161
1153 Scientometric Analysis of Food Supply Chain Risk Assessment Literature: Based on Web of Science Records 1996-2014

Authors: Mohsen Shirani, Shadi Asadzandi, Micaela Demichela

Abstract:

This paper presents the results of a study assessing crucial aspects and the strength of the scientific basis of a typically interdisciplinary applied field: food supply chain risk assessment research. Our approach is based on an advanced scientometric analysis with novel elements to assess the influence and dissemination of research results and to measure interdisciplinarity. The paper aims to describe the quantity and quality of publication trends in food supply chain risk assessment. The population under study comprised 266 articles from the Web of Science database. The results were analyzed by date of publication; type, language, and source of the documents; subject areas; authors and their affiliations; and the countries involved in developing the articles.

Keywords: food supply chain, risk assessment, scientometrics, web of science

Procedia PDF Downloads 495
1152 Generating Real-Time Visual Summaries from Located Sensor-Based Data with Chorems

Authors: Z. Bouattou, R. Laurini, H. Belbachir

Abstract:

This paper describes a new approach to the automatic generation of visual summaries, combining cartographic visualization methods with real-time sensor data modeling. The concept of chorems seems an interesting candidate for visualizing real-time summaries of geographic databases. Chorems were defined by Roger Brunet (1980) as schematized visual representations of territories; however, time information is not yet handled in existing chorematic map approaches, an issue discussed in this paper. Our approach is based on spatial analysis: the values recorded at the same time by the available sensors provide a set of distributed observations over the study areas, and spatial interpolation methods are used to derive concentration fields. From these fields, and by applying spatial data mining procedures on the fly, it is possible to extract important patterns as geographic rules. Those patterns are then visualized as chorems.
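The interpolation step can be illustrated with inverse distance weighting, one common spatial interpolation method (the paper does not specify which method is used; the stations and readings below are hypothetical):

```python
def idw(x, y, stations, power=2):
    # Inverse-distance-weighted estimate at (x, y) from sensor readings.
    num = den = 0.0
    for sx, sy, value in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0:
            return value          # exactly on a station: return its reading
        w = 1.0 / d2 ** (power / 2)
        num += w * value
        den += w
    return num / den

# Hypothetical sensor readings (x, y, concentration).
stations = [(0, 0, 10.0), (4, 0, 30.0), (0, 4, 20.0)]
print(round(idw(2, 0, stations), 2))
```

Evaluating this estimator over a grid yields the concentration field from which patterns, and ultimately chorems, are extracted.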

Keywords: geovisualization, spatial analytics, real-time, geographic data streams, sensors, chorems

Procedia PDF Downloads 400
1151 The Efficacy of Open Educational Resources in Students’ Performance and Engagement

Authors: Huda Al-Shuaily, E. M. Lacap

Abstract:

Higher education is one of the most essential foundations for the advancement and progress of a country. It needs to be as accessible and as comprehensive as possible. In this paper, we expanded the accessibility and delivery of higher education using open educational resources (OER): freely accessible, openly licensed documents and media for teaching and learning. This study compares students' academic performance on the course Introduction to Database and their engagement with the virtual learning environment (VLE) across two successive semesters, one without OER and the other using OER. We established that there was a significant increase in student engagement with the VLE in the latter semester compared with the former. Using the latter semester's data, we show that student engagement has a positive impact on academic performance. Moreover, after clustering students by academic performance, the impact is higher for low-performing students. The results show that these engagement measures can potentially be used to predict students' learning styles with a high degree of precision.

Keywords: EDM, learning analytics, moodle, OER, student-engagement

Procedia PDF Downloads 339
1150 A Context-Sensitive Algorithm for Media Similarity Search

Authors: Guang-Ho Cha

Abstract:

This paper presents a context-sensitive media similarity search algorithm. One of the central problems in media search is the semantic gap between the low-level features computed automatically from media data and the human interpretation of them. This is because the notion of similarity is usually based on high-level abstraction, but the low-level features do not always reflect human perception. Many media search algorithms have used the Minkowski metric to measure similarity between image pairs. However, such functions cannot adequately capture the characteristics of the human visual system, nor the nonlinear relationships in the contextual information given by the images in a collection. Our search algorithm tackles this problem by employing a similarity measure and a ranking strategy that reflect the nonlinearity of human perception and the contextual information in a dataset. Similarity search in an image database based on this contextual information shows encouraging experimental results.
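The Minkowski baseline that the paper argues against can be sketched as follows; the context-sensitive measure itself is not reproduced, and the feature vectors are hypothetical:

```python
def minkowski(a, b, p=2):
    # Minkowski distance between feature vectors; p=2 is Euclidean,
    # p=1 is Manhattan.
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

# Hypothetical low-level feature vectors for a query image and a collection.
query = [0.2, 0.8]
collection = {"img1": [0.25, 0.75], "img2": [0.9, 0.1], "img3": [0.3, 0.7]}

# Rank the collection by increasing distance from the query.
ranked = sorted(collection, key=lambda k: minkowski(query, collection[k]))
print(ranked)
```

The paper's point is that this ranking depends only on pairwise feature distances; a context-sensitive measure would also adjust each score using the distribution of the other images in the collection.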

Keywords: context-sensitive search, image search, similarity ranking, similarity search

Procedia PDF Downloads 365
1149 Simulation of the Gamma-Ray Attenuation Coefficient for Some Common Shielding Materials Using a Monte Carlo Program

Authors: Cherief Houria, Fouka Mourad

Abstract:

In this work, the attenuation of radiation in a photon detector set-up with different common shielding materials is simulated using a Monte Carlo program called PTM. The aim of the study is to investigate the effect of atomic weight and shielding thickness on gamma radiation attenuation. The linear attenuation coefficients of aluminum (Al), iron (Fe), and lead (Pb) were evaluated at a photon energy of 661.7 keV, corresponding to emission from a standard Cs-137 radioactive point source. Experimental measurements of these linear attenuation coefficients were performed for the three materials using a NaI(Tl) gamma scintillation detector. Our results were compared with simulated linear attenuation coefficients from the XCOM database and the Geant4 code, and they agree well with both sets of simulation data.
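The linear attenuation coefficient follows from the Beer-Lambert law, I = I₀·e^(−μx); a sketch with hypothetical count rates (the indicative μ ≈ 1.2 cm⁻¹ for lead at 661.7 keV is an assumption for illustration, not a value from the study):

```python
import math

def linear_attenuation(i0, i, thickness_cm):
    # mu (cm^-1) from the Beer-Lambert law: I = I0 * exp(-mu * x).
    return math.log(i0 / i) / thickness_cm

def transmitted(i0, mu, thickness_cm):
    # Intensity surviving an absorber of the given thickness.
    return i0 * math.exp(-mu * thickness_cm)

# Hypothetical count rates before/after a 1 cm lead absorber at 661.7 keV.
mu = linear_attenuation(10000, 3012, 1.0)
print(round(mu, 2))
```

Fitting measured counts at several thicknesses to this exponential is the standard way such coefficients are extracted from scintillation-detector data.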

Keywords: gamma photon, Monte Carlo program, radiation attenuation, shielding material, the linear attenuation coefficient

Procedia PDF Downloads 203
1148 Single Cell and Spatial Transcriptomics: A Beginner's Viewpoint from the Conceptual Pipeline

Authors: Leo Nnamdi Ozurumba-Dwight

Abstract:

Messenger ribonucleic acid (mRNA) molecules carry the protein-coding information of the genome. When these protein-encoding mRNA molecules (which collectively constitute the transcriptome) are analyzed by RNA sequencing (RNA-seq), the nature of gene expression in the cell is unveiled. The gene expression obtained provides clues to cellular traits and their dynamics, which can be studied in relation to function and response. RNA-seq is a practical concept in genomics, as it enables the detection and quantitative analysis of mRNA molecules. Single-cell and spatial transcriptomics present complementary avenues for exposing the genomic characteristics of single cells and pooled cells in disease conditions such as cancer, autoimmune diseases, and hematopoietic diseases, among others, from investigated biological tissue samples. Single-cell transcriptomics permits direct assessment of each building unit of a tissue (the cell) during diagnosis and molecular gene expression studies. A typical technique is single-cell RNA sequencing (scRNA-seq), which enables high-throughput gene expression studies. However, this technique generates expression data for many cells while losing the cells' positional coordinates within the tissue. As the field develops, the complementary use of pre-established tissue reference maps, built with molecular and bioinformatics techniques, has emerged to resolve this setback and produce both levels of data in one shot of scRNA-seq analysis. This is an emerging methodological approach for integrative and progressively more dependable transcriptomic analysis. It can support in-situ style analysis for a better understanding of tissue functional organization, unveil new biomarkers for the early-stage detection of disease and for therapeutic targets in drug development, and exposit the nature of cell-to-cell interactions.
These are also vital genomic signatures and characterizations for clinical applications. Over the past decades, RNA-seq has generated a wide array of information that is igniting bespoke breakthroughs and innovations in biomedicine. Spatial transcriptomics, on the other hand, is tissue-level based and is used to study biological specimens with heterogeneous features. It exposits the gross identity of investigated mammalian tissues, which can then be used to study cell differentiation, track cell-lineage trajectory patterns and behavior, and examine regulatory homeostasis in disease states. It also requires referenced positional analysis to build up the genomic signatures assessed from the single cells in the tissue sample. Given these two approaches to RNA transcriptomics, applicable at different scales and with avenues for appropriate resolution, the study of gene expression from mRNA molecules has become interesting, progressive, and developmental, helping to tackle health challenges head-on.

Keywords: transcriptomics, RNA sequencing, single cell, spatial, gene expression

Procedia PDF Downloads 122
1147 Journals' Productivity in the Literature on Malaria in Africa

Authors: Yahya Ibrahim Harande

Abstract:

The purpose of this study was to identify the journals that published articles on malaria in Africa and to determine the core of productive journals among them. The data for the study were culled from the African Index Medicus (AIM) database. A total of 529 articles was gathered from 115 journal titles covering 1979-2011. To obtain the core of productive journals, Bradford's law was applied to the collected data. Five journal titles were identified as the core journals. Analysis of the data showed that the subject literature, malaria, was in conformity with Bradford's law. Regarding the dispersion of the literature, English was found to be the dominant language of the journals (80.9%), followed by French (16.5%), Portuguese (1.7%), and German (0.9%). It is recommended that medical libraries acquire these five journals that constitute the core of the malaria literature for the use of their clients; this could also help streamline their acquisition and selection exercises. Further research in the subject area using bibliometric approaches is recommended.
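As a hedged illustration of the Bradford's-law zoning described above, the sketch below ranks journals by productivity and splits them into three zones holding roughly equal shares of the articles; the per-journal counts are invented for illustration and are not the study's data.

```python
# Sketch of a Bradford's-law zone analysis on hypothetical journal
# productivity counts (illustrative figures, not the study's data).

def bradford_zones(counts, n_zones=3):
    """Split journals (ranked by productivity) into zones holding
    roughly equal numbers of articles; return journals per zone."""
    counts = sorted(counts, reverse=True)
    target = sum(counts) / n_zones
    zones, acc, size = [], 0, 0
    for c in counts:
        acc += c
        size += 1
        if acc >= target and len(zones) < n_zones - 1:
            zones.append(size)   # close the current zone
            acc, size = 0, 0
    zones.append(size)           # remaining journals form the last zone
    return zones

# 14 hypothetical journals, most articles concentrated in a few titles.
counts = [34, 33, 11, 11, 11, 4, 4, 4, 4, 4, 4, 4, 4, 4]
zones = bradford_zones(counts)
```

Bradford's law predicts the number of journals per zone grows roughly as 1 : n : n²; the handful of titles in the first zone are the "core" journals.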

Keywords: productive journals, malaria disease literature, Bradford`s law, core journals, African scholars

Procedia PDF Downloads 345
1146 Corporate Social Responsibility Practices and Financial Performance: The Case of French Unlisted SMEs

Authors: Zineb Abidi, Marc-Arthur Diaye

Abstract:

A large empirical literature exists on the relationship between corporate social responsibility (CSR) and corporate financial performance. This literature, however, applies mainly to large corporations and/or listed firms. To the best of our knowledge, the question of whether meeting CSR requirements impacts the financial performance of unlisted small and medium-sized enterprises (SMEs) has not so far been analyzed. This paper aims to analyze, for the first time, the effect of CSR on the financial performance of such SMEs. Using an original database of 5,257 French SMEs, we show that adopting CSR practices has a positive but weak effect on a firm’s financial performance. To develop this further, we analyzed interactions among CSR practices to assess the combination of CSR components that most positively influences SME financial performance. Our results show that French SMEs benefit more from their pro-social behavior when they choose a combination of CSR components best adapted to their individual characteristics.
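The interaction analysis described above can be sketched as a regression with an interaction term. The snippet below is purely illustrative: the data are synthetic and the component names (environmental and social scores) are assumptions, not the paper's actual variables.

```python
import numpy as np

# Illustrative sketch (synthetic data, hypothetical variable names):
# regress a financial-performance proxy on two CSR component scores
# and their interaction, in the spirit of the abstract's analysis.

rng = np.random.default_rng(0)
n = 500
env = rng.uniform(0, 1, n)   # environmental-practice score (assumed)
soc = rng.uniform(0, 1, n)   # social-practice score (assumed)
# Performance with weak main effects and a positive interaction.
perf = 0.1 * env + 0.1 * soc + 0.3 * env * soc + rng.normal(0, 0.05, n)

# Design matrix: intercept, main effects, interaction term.
X = np.column_stack([np.ones(n), env, soc, env * soc])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
# beta = [intercept, env, soc, interaction]; the interaction
# estimate should land near the true value of 0.3.
```

A significant interaction coefficient is what would indicate that a particular combination of components matters beyond the individual practices.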

Keywords: corporate social responsibility, financial performance, unlisted firms, SMEs

Procedia PDF Downloads 172
1145 Geospatial Data Complexity in Electronic Airport Layout Plan

Authors: Shyam Parhi

Abstract:

The Airports GIS program collects airport data, validates and verifies it, and stores it in a dedicated database. Airports GIS allows authorized users to submit changes to airport data. The verified data is used to develop several engineering applications. One of these applications is the electronic Airport Layout Plan (eALP), whose primary aim is to move the ALP from paper to digital form. The first phase of eALP development was completed recently and was tested for a few pilot-program airports across different regions. We conducted a gap analysis and noted that considerable development work is needed to fine-tune at least six mandatory sheets of the eALP. It is important to note that significant programming is needed to move from out-of-the-box ArcGIS to a heavily customized ArcGIS, which will be discussed. The ArcGIS viewer's capability to display essential features, such as a runway or taxiway and the perpendicular distance between them, will be discussed. An enterprise-level workflow that incorporates the coordination process among different lines of business will be highlighted.

Keywords: geospatial data, geology, geographic information systems, aviation

Procedia PDF Downloads 416
1144 Automatic Facial Skin Segmentation Using Possibilistic C-Means Algorithm for Evaluation of Facial Surgeries

Authors: Elham Alaee, Mousa Shamsi, Hossein Ahmadi, Soroosh Nazem, Mohammad Hossein Sedaaghi

Abstract:

The human face has a fundamental role in an individual's appearance, so the importance of facial surgery is undeniable. Thus, appropriate and accurate facial skin segmentation is needed in order to extract different features. Since the Fuzzy C-Means (FCM) clustering algorithm does not work well for noisy images and outliers, in this paper we exploit the Possibilistic C-Means (PCM) algorithm to segment the facial skin. For this purpose, facial images are first converted from the RGB to the YCbCr color space. To evaluate the performance of the proposed algorithm, the database of Sahand University of Technology, Tabriz, Iran was used. For comparison, the FCM and Expectation-Maximization (EM) algorithms were also applied to facial skin segmentation. The proposed method shows better results than the other segmentation methods, with a misclassification error of 0.032 and a region-area error of 0.045.
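A minimal sketch of the PCM update rule is shown below on synthetic one-dimensional "pixel" intensities with two clusters. The fixed bandwidth and all parameter values are simplifying assumptions; the original PCM formulation (Krishnapuram and Keller) estimates the bandwidth from a preliminary FCM run.

```python
import numpy as np

# Minimal 1-D Possibilistic C-Means sketch on synthetic pixel
# intensities (two clusters); data and parameters are illustrative.

def pcm(x, centers, m=2.0, n_iter=20):
    x = np.asarray(x, dtype=float)
    v = np.asarray(centers, dtype=float)
    # Bandwidth eta_i: fixed here from the overall spread (a
    # simplification; PCM normally estimates it per cluster).
    eta = np.full(len(v), np.var(x))
    for _ in range(n_iter):
        d2 = (x[None, :] - v[:, None]) ** 2          # squared distances
        # Typicality: t = 1 / (1 + (d^2 / eta)^(1/(m-1)))
        t = 1.0 / (1.0 + (d2 / eta[:, None]) ** (1.0 / (m - 1.0)))
        v = (t ** m @ x) / (t ** m).sum(axis=1)      # typicality-weighted means
    return v, t

# Two tight intensity clusters around 0.2 and 0.8.
x = np.concatenate([np.random.default_rng(1).normal(0.2, 0.02, 200),
                    np.random.default_rng(2).normal(0.8, 0.02, 200)])
v, t = pcm(x, centers=[0.0, 1.0])
```

Unlike FCM, the typicalities in each column need not sum to one, which is why PCM is less sensitive to outliers: a noisy pixel far from every center simply receives low typicality in all clusters.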

Keywords: facial image, segmentation, PCM, FCM, skin error, facial surgery

Procedia PDF Downloads 586
1143 Associated Factors of Hypercholesterolemia, Hyperuricemia and the Double Burden of Hyperuricemia-Hypercholesterolemia in Gout Patients: A Hospital-Based Study

Authors: Pierre Mintom, Armel Assiene Agamou, Leslie Toukem, William Dakam, Christine Fernande Nyangono Biyegue

Abstract:

Context: Hyperuricemia, the presence of high levels of uric acid in the blood, is a known precursor to the development of gout. Recent studies have suggested a strong association between hyperuricemia and disorders of lipoprotein metabolism, specifically hypercholesterolemia. Understanding the factors associated with these conditions in gout patients is essential for effective treatment and management. Research Aim: The objective of this study was to determine the prevalence of hyperuricemia, hypercholesterolemia, and the double burden of hyperuricemia-hypercholesterolemia in the gouty population, and to identify the factors associated with these conditions. Methodology: The study utilized a database from a survey of 150 gouty patients recruited at the Laquintinie Hospital in Douala between August 2017 and February 2018. The database contained information on anthropometric parameters, biochemical markers, and the foods and drugs consumed by the patients. Hyperuricemia and hypercholesterolemia were defined based on specific serum uric acid and total cholesterol thresholds, and the double burden was defined as the co-occurrence of both conditions. Findings: The prevalence rates of hyperuricemia, hypercholesterolemia, and the double burden were 61.3%, 76%, and 50.7%, respectively. Factors associated with these conditions included hypertriglyceridemia, the TC/HDL atherogenicity index, the LDL/HDL atherogenicity index, family history, and the consumption of specific foods and drinks. Theoretical Importance: The study highlights the strong association between hyperuricemia and dyslipidemia, providing important insights for guiding treatment strategies in gout patients. It also emphasizes the significance of nutritional education in managing these metabolic disorders, suggesting the need to address eating habits in gout patients.
Data Collection and Analysis Procedures: Data were collected through surveys and the medical records of gouty patients. Information on anthropometric parameters, biochemical markers, and dietary habits was recorded. Prevalence rates and associated factors were determined through statistical analysis, employing odds ratios to assess the risks. Question Addressed: The study aimed to determine the prevalence rates and associated factors of hyperuricemia, hypercholesterolemia, and the double burden in gouty patients, to understand the relationships between these conditions, and to determine their implications for treatment and nutritional education. Conclusion: The findings show that an association exists between hyperuricemia and hypercholesterolemia in gout patients, creating a double burden. They underscore the importance of considering family history and eating habits in addressing the double burden of hyperuricemia-hypercholesterolemia. This study provides valuable insights for guiding treatment approaches and emphasizes the need for nutritional education in gout patients. The study focused specifically on the sick population; a case-control study comparing gouty and non-gouty populations would help to better explain the observed results.
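The analysis above reports odds ratios. As a self-contained sketch of that calculation (the 2x2 counts below are hypothetical, not the study's data), an odds ratio and its 95% Wald confidence interval can be computed as:

```python
import math

# Odds ratio with a 95% Wald confidence interval from a 2x2 table
# (exposed/unexposed vs. condition present/absent).
# The counts below are hypothetical, not the study's data.

def odds_ratio(a, b, c, d):
    """a, b = condition present/absent among exposed;
       c, d = condition present/absent among unexposed."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)

# Hypothetical example: 40/20 with/without the condition among the
# exposed, 25/65 among the unexposed.
or_, ci = odds_ratio(40, 20, 25, 65)          # OR = (40*65)/(20*25) = 5.2
```

An interval excluding 1 would indicate a statistically significant association between the exposure (e.g., a dietary factor) and the condition.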

Keywords: gout, hyperuricemia, hypercholesterolemia, double burden

Procedia PDF Downloads 61
1142 A t-SNE and UMAP Based Neural Network Image Classification Algorithm

Authors: Shelby Simpson, William Stanley, Namir Naba, Xiaodi Wang

Abstract:

Both t-SNE and UMAP are state-of-the-art tools that predominantly preserve local structure, that is, they group neighboring data points together, which provides a very informative visualization of the heterogeneity in data. In this research, we develop a t-SNE and UMAP based neural network image classification algorithm that embeds the original dataset into a corresponding low-dimensional dataset as a preprocessing step, then uses this embedded dataset as input to a specially designed neural network classifier for image classification. Our experiments use the Fashion-MNIST dataset, a labeled set of images of clothing objects. t-SNE and UMAP are used for dimensionality reduction of the dataset and thus produce low-dimensional embeddings. The embeddings from t-SNE and UMAP are then fed into two neural networks. The accuracy of the models from these two networks is compared to that of a dense neural network that does not use an embedding as input, to show which model classifies the images of clothing objects more accurately.
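The embed-then-classify pipeline can be sketched as below. This is an illustration, not the paper's implementation: scikit-learn's small digits set stands in for Fashion-MNIST (which requires a download), and the network size is an assumption. Note that t-SNE has no out-of-sample transform, so the whole set is embedded before the train/test split, matching the preprocessing step the abstract describes.

```python
from sklearn.datasets import load_digits
from sklearn.manifold import TSNE
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Illustrative pipeline in the spirit of the abstract: embed the data
# with t-SNE first, then train a small neural network on the embedding.
X, y = load_digits(return_X_y=True)
X, y = X[:500], y[:500]                      # keep the demo small and fast

# Embed all samples to 2-D (t-SNE cannot embed unseen points later).
emb = TSNE(n_components=2, perplexity=30, init="pca",
           random_state=0).fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(emb, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)                  # accuracy on the embedding
```

The same pipeline with `umap.UMAP` in place of `TSNE` would give the UMAP variant; UMAP additionally supports transforming new points, which avoids the embed-everything-first caveat.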

Keywords: t-SNE, UMAP, fashion MNIST, neural networks

Procedia PDF Downloads 198
1141 Optimization of Vertical Axis Wind Turbine Based on Artificial Neural Network

Authors: Mohammed Affanuddin H. Siddique, Jayesh S. Shukla, Chetan B. Meshram

Abstract:

Neural networks are among the most powerful tools of machine learning. Since the resurgence of neural network research in the early 1980s, neural networks and their applications have grown rapidly. Neural networks are a technique originally developed for pattern recognition. The structure of a neural network consists of neurons connected through synapses. Here, we have investigated different algorithms and cost-function reduction techniques for the optimization of vertical axis wind turbine (VAWT) rotor blades. The aerodynamic force coefficients corresponding to the airfoils are stored in a database along with the airfoil coordinates. A forward-propagation neural network is created with the aerodynamic coefficients as input and the airfoil coordinates as output. In the proposed algorithm, the hidden layer is incorporated into a cost function having linear and non-linear error terms. We observe that artificial neural networks (ANNs) can be used for VAWT optimization.
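The forward pass of such an inverse-design network can be sketched as follows. The layer sizes, the tanh activation, and the choice of lift and drag coefficients as inputs are all assumptions; the weights here are random, so the output illustrates the structure of the mapping, not a trained airfoil.

```python
import numpy as np

# Structural sketch of the forward-propagation network described in
# the abstract: aerodynamic coefficients in, airfoil coordinates out.
# Layer sizes and activation are assumptions; weights are random, so
# the result is illustrative of shape only, not a trained mapping.

rng = np.random.default_rng(0)

def forward(coeffs, sizes=(2, 16, 16, 40)):
    """coeffs: e.g. [Cl, Cd]; returns 20 (x, y) airfoil points."""
    a = np.asarray(coeffs, dtype=float)
    for i, (m, n) in enumerate(zip(sizes[:-1], sizes[1:])):
        W = rng.normal(0, 1 / np.sqrt(m), (m, n))  # random layer weights
        a = a @ W
        if i < len(sizes) - 2:
            a = np.tanh(a)                 # non-linearity on hidden layers
    return a.reshape(-1, 2)                # (x, y) coordinate pairs

coords = forward([1.2, 0.015])             # hypothetical Cl, Cd pair
```

Training would minimize a cost function over the stored coefficient-coordinate database, adjusting the weights so that querying the network with target coefficients yields a candidate blade profile.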

Keywords: VAWT, ANN, optimization, inverse design

Procedia PDF Downloads 324
1140 3D Modeling for Frequency and Time-Domain Airborne EM Systems with Topography

Authors: C. Yin, B. Zhang, Y. Liu, J. Cai

Abstract:

Airborne EM (AEM) is an effective geophysical exploration tool, especially suitable for rugged mountain areas. In such areas, topography has serious effects on AEM system responses; however, few studies have so far been reported on the topographic effect in airborne EM systems. In this paper, an edge-based unstructured finite-element (FE) method is developed for 3D topographic modeling of both frequency- and time-domain airborne EM systems. Starting from the frequency-domain Maxwell equations, a vector Helmholtz equation is derived to obtain a stable and accurate solution. Because the AEM transmitter and receiver are both located in the air, the scattered-field method is used in our modeling. The Galerkin method is applied to discretize the Helmholtz equation into the final FE equations, which are solved to obtain the frequency-domain AEM responses. To accelerate the calculation, the response of the source in free space is used as the primary field, and the PARDISO direct solver is used to handle the problem with multiple transmitting sources. After the frequency-domain AEM responses are calculated, a Hankel transform is applied to obtain the time-domain AEM responses. To check the accuracy of the present algorithm and to analyze the characteristics of the topographic effect on airborne EM systems, both frequency- and time-domain AEM responses are simulated for three model groups: 1) a flat half-space model, which has a semi-analytical solution for the EM response; 2) a valley or hill earth model; and 3) a valley or hill earth with an anomalous body embedded. Numerical experiments show that, close to the node points of the topography, the AEM responses exhibit sharp changes. Special attention needs to be paid to topographic effects when interpreting AEM survey data over rugged areas. Moreover, the profile of the AEM responses mirrors the topographic earth surface.
In comparison to the topographic effect, which occurs mainly at the high-frequency end and in the early time channels, the EM responses of underground conductors occur mainly at low frequencies and in the later time channels. For the same time channel, the dB/dt field reflects changes in conductivity better than the B-field. This research will serve airborne EM in the identification and correction of topographic effects.
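The frequency-to-time conversion step can be illustrated with a simple cosine transform of the real part of a causal response. This is a pedagogical sketch only: it is demonstrated on an analytic transform pair, not the actual AEM kernel, and the authors' Hankel/digital-filter implementation is considerably more involved.

```python
import numpy as np

# Sketch of frequency-to-time conversion for a causal response:
#   f(t) = (2/pi) * Integral_0^inf Re[F(w)] cos(w*t) dw.
# Demonstrated on the analytic pair F(w) = tau/(1 + i*w*tau)  <->
# f(t) = exp(-t/tau). All parameter values are illustrative.

tau = 1e-3                                  # decay constant (s)
w = np.linspace(1e-2, 2e5, 400_000)         # dense angular-frequency grid
dw = w[1] - w[0]
F_re = tau / (1.0 + (w * tau) ** 2)         # Re[F(w)]

def to_time(t):
    """Numerical cosine transform of Re[F(w)] at time t."""
    return (2.0 / np.pi) * np.sum(F_re * np.cos(w * t)) * dw

approx = to_time(0.5e-3)                    # should be close to exp(-0.5)
```

In production codes, the oscillatory integral is evaluated with digital-filter (fast Hankel/cosine transform) coefficients rather than brute-force quadrature, since the response must be computed at many time channels and receiver positions.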

Keywords: 3D, Airborne EM, forward modeling, topographic effect

Procedia PDF Downloads 317