Search results for: mobility features
3618 Optimized Deep Learning-Based Facial Emotion Recognition System
Authors: Erick C. Valverde, Wansu Lim
Abstract:
Facial emotion recognition (FER) systems have recently been developed for more advanced computer vision applications. The ability to identify human emotions would enable smart healthcare facilities to diagnose mental health illnesses (e.g., depression and stress) as well as support better human social interaction with smart technologies. An FER system involves two steps: 1) a face detection task and 2) a facial emotion recognition task. It classifies human expressions into categories such as angry, disgust, fear, happy, sad, surprise, and neutral. Such systems require intensive research to address issues arising from human diversity, unique human expressions, and the variety of human facial features due to age differences. These issues generally reduce the accuracy with which an FER system can detect human emotions. Early FER systems used simple supervised classification algorithms such as K-nearest neighbors (KNN) and artificial neural networks (ANN). These conventional FER systems suffer from low accuracy because they are inefficient at extracting significant features of the various human emotions. To increase the accuracy of FER systems, deep learning (DL)-based methods, such as convolutional neural networks (CNN), have been proposed. These methods can find more complex features in the human face by means of the deeper connections within their architectures. However, the inference speed and computational cost of a DL-based FER system are often disregarded in exchange for higher accuracy. To cope with this drawback, an optimized DL-based FER system is proposed in this study. An extreme version of Inception V3, known as the Xception model, is leveraged by applying different network optimization methods. Specifically, network pruning and quantization are used to lower computational costs and reduce memory usage, respectively.
To support low resource requirements, a 68-landmark face detector from Dlib is used in the early step of the FER system. Furthermore, a DL compiler is utilized to apply advanced optimization techniques to the Xception model to improve the inference speed of the FER system. In comparison to VGG-Net and ResNet50, the proposed optimized DL-based FER system experimentally demonstrates the objectives of the network optimization methods used. As a result, the proposed approach can be used to create an efficient, real-time FER system.
Keywords: deep learning, face detection, facial emotion recognition, network optimization methods
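As an illustration of the two optimization steps named in this abstract, a minimal sketch of magnitude-based pruning and 8-bit affine quantization of a weight vector. This is a generic, hand-rolled example, not the authors' implementation; the function names and parameter values are placeholders.

```python
import random

def prune_by_magnitude(weights, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of the weights."""
    k = int(len(weights) * sparsity)
    if k == 0:
        return list(weights)
    threshold = sorted(abs(w) for w in weights)[k - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]

def quantize_uint8(weights):
    """Affine-quantize floats to 0..255 integer codes (1 byte per weight);
    return (codes, dequantized approximation)."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255.0 or 1.0
    codes = [round((w - lo) / scale) for w in weights]
    return codes, [c * scale + lo for c in codes]

random.seed(0)
weights = [random.gauss(0.0, 1.0) for _ in range(16)]
pruned = prune_by_magnitude(weights, sparsity=0.5)
codes, approx = quantize_uint8(pruned)
print(sum(1 for w in pruned if w != 0.0))  # half of the 16 weights survive
```

Pruning reduces the number of multiply-accumulates (lowering computational cost), while quantization stores each weight in one byte instead of four (reducing memory usage), matching the two roles the abstract assigns to them.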
Procedia PDF Downloads 118
3617 Performance Analysis of Heterogeneous Cellular Networks with Multiple Connectivity
Authors: Sungkyung Kim, Jee-Hyeon Na, Dong-Seung Kwon
Abstract:
Future mobile networks beyond the 5th generation will be characterized by one-thousand-fold gains in capacity, connections for at least one hundred billion devices, and a user experience with extremely low latency and response times. To approach these capacity requirements with higher reliability, advanced technologies have been studied, such as multiple connectivity, small cell enhancement, heterogeneous networking, and advanced interference and mobility management. This paper focuses on multiple connectivity in heterogeneous cellular networks. We investigate coverage and user throughput performance in several deployment scenarios. Using the stochastic geometry approach, the SINR distributions and coverage probabilities are derived for the case of dual connectivity. Also, to compare the user throughput enhancement among the deployment scenarios, we calculate the spectral efficiency and discuss our results.
Keywords: heterogeneous networks, multiple connectivity, small cell enhancement, stochastic geometry
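The stochastic geometry approach mentioned in this abstract can be illustrated with a Monte Carlo estimate of the downlink coverage probability for a user served by the nearest base station of a Poisson point process, under Rayleigh fading and no noise. This is a textbook single-connectivity sketch with placeholder parameters, not the paper's dual-connectivity derivation.

```python
import math
import random

def poisson(mean, rng):
    """Knuth's method for a Poisson-distributed count (fine for moderate means)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def coverage_probability(density=1e-5, threshold=1.0, alpha=4.0,
                         radius=2000.0, trials=2000, seed=1):
    """Monte Carlo P(SINR > threshold) for a user at the origin: base stations
    form a Poisson process of `density` per m^2 in a disk, path-loss exponent
    alpha, i.i.d. Rayleigh (exponential power) fading, interference-limited."""
    rng = random.Random(seed)
    covered = 0
    mean_n = density * math.pi * radius ** 2
    for _ in range(trials):
        n = poisson(mean_n, rng)
        if n == 0:
            continue
        # uniform points in a disk: r = radius * sqrt(u); serve the nearest
        dists = sorted(radius * math.sqrt(rng.random()) for _ in range(n))
        powers = [rng.expovariate(1.0) * d ** (-alpha) for d in dists]
        interference = sum(powers[1:])
        if powers[0] > threshold * interference:
            covered += 1
    return covered / trials

pc = coverage_probability()
print(pc)  # the known closed form for alpha=4, T=0 dB is 1/(1+pi/4) ~ 0.56
```

The closed-form SINR distribution the abstract refers to is derived analytically; a simulation like this is how such derivations are typically sanity-checked.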
Procedia PDF Downloads 331
3616 A Multi-Release Software Reliability Growth Model Incorporating Imperfect Debugging and Change-Point under a Simulated Testing Environment and Software Release Time
Authors: Sujit Kumar Pradhan, Anil Kumar, Vijay Kumar
Abstract:
The testing process during software development is a crucial step, as it makes the software more efficient and dependable. To estimate software reliability through the mean value function, many software reliability growth models (SRGMs) were developed under the assumption that the operating and testing environments are the same. In practice this is not true, because the reliability of software differs when it works in a natural field environment. This article discusses an SRGM comprising a change-point and imperfect debugging in a simulated testing environment, and later extends it in a multi-release direction. Initially, software is released to the market with few features; according to market demand, the software company upgrades the current version by adding new features as time passes. Therefore, we propose a generalized multi-release SRGM in which the change-point and imperfect debugging concepts are addressed in a simulated testing environment. The failure-increasing-rate concept is adopted to determine the change point for each software release. Based on nine goodness-of-fit criteria, the proposed model is validated on two real datasets. The results demonstrate that the proposed model fits the datasets better. We also discuss the optimal release time of the software through a cost model, assuming that the testing and debugging costs are time-dependent.
Keywords: software reliability growth models, non-homogeneous Poisson process, multi-release software, mean value function, change-point, environmental factors
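To make the ingredients of such a model concrete, here is a minimal sketch of an NHPP mean value function of the classic Goel-Okumoto type, extended with a change-point in the fault-detection rate and a simple imperfect-debugging factor. All parameter values are illustrative placeholders; this is a generic textbook form, not the authors' proposed model.

```python
import math

def mean_value(t, a=100.0, b1=0.30, b2=0.12, tau=10.0, alpha=0.05):
    """Expected cumulative faults m(t) for a Goel-Okumoto-style NHPP:
    a     - initial fault content,
    b1/b2 - fault detection rate before/after the change-point tau,
    alpha - imperfect-debugging factor (fraction of fixes introducing
            new faults), which raises the asymptote to a / (1 - alpha)."""
    # integrated detection rate up to time t, with a kink at tau
    B = b1 * t if t <= tau else b1 * tau + b2 * (t - tau)
    return (a / (1.0 - alpha)) * (1.0 - math.exp(-(1.0 - alpha) * B))

for t in (5, 10, 20, 40):
    print(t, round(mean_value(t), 1))
```

Fitting such an m(t) to failure-count data (here, per release) and plugging it into a time-dependent cost function is the standard route to the optimal release time discussed in the abstract.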
Procedia PDF Downloads 74
3615 An Approach for Association Rules Ranking
Authors: Rihab Idoudi, Karim Saheb Ettabaa, Basel Solaiman, Kamel Hamrouni
Abstract:
Medical association rule induction is used to discover useful correlations between pertinent concepts in large medical databases. Nevertheless, AR algorithms produce a huge number of rules and do not guarantee the usefulness and interestingness of the generated knowledge. To overcome this drawback, we propose an ontology-based interestingness measure for AR ranking. According to domain experts, the goal of using ARs is to discover implicit relationships between items of different categories, such as 'clinical features and disorders' or 'clinical features and radiological observations'. That is to say, itemsets composed of 'similar' items are uninteresting. Therefore, the dissimilarity between a rule's items can be used to judge the interestingness of association rules: the more different the items, the more interesting the rule. In this paper, we design a distinct approach for ranking semantically interesting association rules involving the use of an ontology knowledge mining approach. The basic idea is to organize the ontology's concepts into a hierarchical structure of conceptual clusters of targeted subjects, where each cluster encapsulates 'similar' concepts suggesting a specific category of the domain knowledge. The interestingness of an association rule is then defined as the dissimilarity between the corresponding clusters: the further apart the clusters of the items in the rule, the more interesting the rule. We apply the method in our domain of interest, the mammographic domain, using an existing mammographic ontology called Mammo, with the goal of deriving interesting rules from past experiences and discovering implicit relationships between concepts modeling the domain.
Keywords: association rule, conceptual clusters, interestingness measures, ontology knowledge mining, ranking
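The core idea of this abstract, rank rules higher when their items come from different conceptual clusters, can be sketched in a few lines. The cluster assignments and concept names below are hypothetical stand-ins for what the Mammo ontology's hierarchy would provide; the scoring function is an illustrative simplification, not the paper's measure.

```python
# Hypothetical cluster assignments for a few mammographic concepts.
CLUSTER = {
    "mass": "clinical_findings",
    "calcification": "clinical_findings",
    "spiculated_margin": "radiological_observations",
    "malignancy": "disorders",
}

def interestingness(antecedent, consequent):
    """Fraction of antecedent-consequent item pairs whose concepts fall in
    different clusters: 0.0 for same-cluster rules, 1.0 for fully
    cross-cluster rules."""
    pairs = [(a, c) for a in antecedent for c in consequent]
    cross = sum(1 for a, c in pairs if CLUSTER[a] != CLUSTER[c])
    return cross / len(pairs)

rules = [
    ({"mass"}, {"calcification"}),             # same cluster -> uninteresting
    ({"mass"}, {"malignancy"}),                # cross cluster -> interesting
    ({"mass", "spiculated_margin"}, {"malignancy"}),
]
ranked = sorted(rules, key=lambda r: interestingness(*r), reverse=True)
for ant, con in ranked:
    print(sorted(ant), "->", sorted(con), interestingness(ant, con))
```

In the paper's setting the pairwise same/different-cluster test would be replaced by a graded distance between clusters in the ontology hierarchy, but the ranking mechanics are the same.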
Procedia PDF Downloads 322
3614 A 3D Quantum Numerical Simulation Study of HEMT Performance
Authors: A. Boursali, A. Guen-Bouazza
Abstract:
We present a simulation of a HEMT (high electron mobility transistor) structure with and without a field plate. We extract the device characteristics through the analysis of DC, AC and high frequency regimes, as shown in this paper. This work demonstrates the optimal device with a gate length of 15 nm, an InAlN/GaN heterostructure and a field plate structure, making it superior to modern HEMTs when compared with otherwise equivalent devices. This improves the ability of the device to sustain the current density that passes through the channel. We have demonstrated an excellent current density, as high as 2.05 A/mm, a peak extrinsic transconductance of 0.59 S/mm at VDS = 2 V, cutoff frequencies of 638 GHz for the first HEMT and 463 GHz for the field plate HEMT, a maximum frequency of 1.7 THz, a maximum efficiency of 73%, a maximum breakdown voltage of 400 V, a leakage current of 1 x 10^-26 A, a DIBL of 33.52 mV/V and an ON/OFF current ratio higher than 1 x 10^10. These values were determined through the simulation by applying genetic and Monte Carlo algorithms that optimize the design and the future of this technology.
Keywords: HEMT, Silvaco, field plate, genetic algorithm, quantum
Procedia PDF Downloads 349
3613 Clinical Features of Acute Aortic Dissection Patients Initially Diagnosed with ST-Segment Elevation Myocardial Infarction
Authors: Min Jee Lee, Young Sun Park, Shin Ahn, Chang Hwan Sohn, Dong Woo Seo, Jae Ho Lee, Yoon Seon Lee, Kyung Soo Lim, Won Young Kim
Abstract:
Background: Acute myocardial infarction (AMI) concomitant with acute aortic syndrome (AAS) is rare, but prompt recognition of concomitant AAS is crucial, especially in patients with ST-segment elevation myocardial infarction (STEMI), because misdiagnosis followed by early thrombolytic or anticoagulant treatment may result in catastrophic consequences. Objectives: This study investigated the clinical features of patients with STEMI concomitant with AAS that may provide diagnostic clues. Method: Between 1 January 2010 and 31 December 2014, 22 patients with an initial diagnosis of both acute coronary syndrome (AMI or unstable angina) and AAS (aortic dissection, intramural hematoma, or ruptured thoracic aneurysm) in our emergency department were reviewed. We excluded 10 patients who were transferred from other hospitals and 4 patients with non-STEMI, leaving a total of 8 patients with STEMI concomitant with AAS for analysis. Result: The mean age of the study patients was 57.5±16.31 years; five patients had Stanford type A and three had type B aortic dissection. Six patients had ST-segment elevation in anterior leads and two in inferior leads. Most of the patients had acute-onset, severe chest pain, but no patient had chest pain of a dissecting nature. Serum troponin I was elevated in three patients, but all patients had D-dimer elevation. Aortic regurgitation or a regional wall motion abnormality was found in four patients. However, a widened mediastinum was seen in all study patients. Conclusion: When patients with STEMI have an elevated D-dimer and a widened mediastinum, concomitant AAS should be suspected.
Keywords: aortic dissection, myocardial infarction, ST-segment, d-dimer
Procedia PDF Downloads 398
3612 Change Detection Analysis on Support Vector Machine Classifier of Land Use and Land Cover Changes: Case Study on Yangon
Authors: Khin Mar Yee, Mu Mu Than, Kyi Lint, Aye Aye Oo, Chan Mya Hmway, Khin Zar Chi Winn
Abstract:
The dynamic changes of Land Use and Land Cover (LULC) in Yangon have generally resulted in improved human welfare and economic development over the last twenty years. Mapping LULC is crucially important for the sustainable development of the environment. However, it is difficult to obtain exact data on how environmental factors influence the LULC situation at various scales, because the natural environment is composed of non-homogeneous surface features, so the features in the satellite data also contain mixed pixels. The main objective of this study is the calculation of accuracy based on change detection of LULC changes by Support Vector Machines (SVMs). For this research work, the main data were satellite images from 1996, 2006 and 2015. Change detection statistics were computed to compile a detailed tabulation of changes between two classification images, and the SVM process was applied with a soft approach at the allocation as well as the testing stage to reach higher accuracy. The results of this paper showed that vegetation and cultivated areas decreased (by a total of 29% on average from 1996 to 2015) because of conversion to built-up area, which more than doubled (a total of 30% on average from 1996 to 2015). The error matrix and confidence limits led to the validation of the result for LULC mapping.
Keywords: land use and land cover change, change detection, image processing, support vector machines
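The "change detection statistics" step described above is, at its core, a per-pixel tabulation of class transitions between two classification maps. A minimal sketch with toy data (the class names and pixel labels are invented for illustration; real inputs would be classified rasters):

```python
from collections import Counter

CLASSES = ["vegetation", "built_up", "water"]

def change_matrix(before, after):
    """Tabulate per-pixel class transitions between two classification maps
    of equal length, giving the from-class x to-class change statistics."""
    assert len(before) == len(after)
    counts = Counter(zip(before, after))
    return {(b, a): counts.get((b, a), 0) for b in CLASSES for a in CLASSES}

# toy "1996" vs "2015" maps, one class label per pixel
map_1996 = ["vegetation"] * 6 + ["built_up"] * 3 + ["water"] * 1
map_2015 = ["vegetation"] * 3 + ["built_up"] * 6 + ["water"] * 1

m = change_matrix(map_1996, map_2015)
print(m[("vegetation", "built_up")])  # pixels converted to built-up area
```

The diagonal of this matrix holds unchanged pixels and the off-diagonal cells hold conversions, which is exactly the tabulation from which percentage figures like the 29% vegetation loss above are read off.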
Procedia PDF Downloads 139
3611 Emotional and Embodied Knowledge and Responses
Authors: Salman Khokhar
Abstract:
The geopolitical landscape in Pakistan has become shrouded in suspicion between the state and the Ahmadiyya Muslim Community. The study argues that the social mobility of the community has become severely compromised, especially after the inception of the blasphemy laws and their subsequent enhancements in later years. The securitization of the community has ensured that the daily lives of Ahmadi Muslims are severely restricted, as their integration and assimilation into society have come to be defined by their religious identity and beliefs. Consequently, performing congregational prayers or engaging in any other community activity is carried out secretly, as the repercussions of such actions may lead to incarceration or, in some cases, even more extreme measures of apprehension. While the daily lives of Ahmadis are severely curtailed in Pakistan, the community's transnational reach means it must implement specific measures to ensure the safety of its members even in the West. The eyes of suspicion are always on the activities of the Ahmadiyya Muslim Community, and the community's headquarters in Rabwah is always viewed through suspicious lenses. The study considers how secrecy has enveloped the everyday life of the Ahmadi Muslim community and how it embodies characteristics that were thought to have come to an end many years ago.
Keywords: freedom, ideology, Islam, persecution
Procedia PDF Downloads 124
3610 Optical Heterodyning of Injection-Locked Laser Sources: A Novel Technique for Millimeter-Wave Signal Generation
Authors: Subal Kar, Madhuja Ghosh, Soumik Das, Antara Saha
Abstract:
A novel technique has been developed to generate an ultra-stable millimeter-wave signal by optical heterodyning of the outputs from two slave laser (SL) sources injection-locked to the sidebands of a frequency-modulated (FM) master laser (ML). Precise thermal tuning of the SL sources is required to lock each slave laser frequency to the desired FM sideband of the ML. When the output signals from the injection-locked SLs are coherently heterodyned in a fast-response photodetector such as a high electron mobility transistor (HEMT), an extremely stable millimeter-wave signal with a very narrow linewidth can be generated. The scheme may also be used to generate ultra-stable sub-millimeter-wave/terahertz signals.
Keywords: FM sideband injection locking, master-slave injection locking, millimetre-wave signal generation, optical heterodyning
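The arithmetic behind this scheme is simple: two slave lasers locked to the +n-th and -m-th FM sidebands of the master are separated in optical frequency by (n + m) times the modulation frequency, and heterodyning them yields a beat at exactly that separation. A one-line sketch with illustrative numbers (not values from the paper):

```python
def beat_frequency_hz(f_mod_hz, upper_order, lower_order):
    """Beat frequency from heterodyning two slave lasers locked to the
    +upper_order and -lower_order FM sidebands of a master laser: the
    optical lines sit (upper_order + lower_order) * f_mod apart."""
    return (upper_order + lower_order) * f_mod_hz

# e.g. 10 GHz modulation, slaves locked on the +3 and -3 sidebands
print(beat_frequency_hz(10e9, 3, 3) / 1e9, "GHz")
```

Because both sidebands inherit the master's phase, the modulation-frequency multiplication is what gives the beat its narrow linewidth relative to free-running lasers.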
Procedia PDF Downloads 392
3609 3D Quantum Simulation of HEMT Device Performance
Authors: Z. Kourdi, B. Bouazza, M. Khaouani, A. Guen-Bouazza, Z. Djennati, A. Boursali
Abstract:
We present a simulation of a HEMT (high electron mobility transistor) structure with and without a field plate. We extract the device characteristics through the analysis of DC, AC and high frequency regimes, as shown in this paper. This work demonstrates the optimal device with a gate length of 15 nm, an InAlN/GaN heterostructure and a field plate structure, making it superior to modern HEMTs when compared with otherwise equivalent devices. This improves the ability of the device to sustain the current density that passes through the channel. We have demonstrated an excellent current density, as high as 2.05 A/mm, a peak extrinsic transconductance of 590 mS/mm at VDS=2 V, cutoff frequencies of 638 GHz for the first HEMT and 463 GHz for the field plate HEMT, a maximum frequency of 1.7 THz, a maximum efficiency of 73%, a maximum breakdown voltage of 400 V, a DIBL of 33.52 mV/V and an ON/OFF current ratio higher than 1 x 10^10. These values were determined through the simulation by applying genetic and Monte Carlo algorithms that optimize the design and the future of this technology.
Keywords: HEMT, Silvaco, field plate, genetic algorithm, quantum
Procedia PDF Downloads 476
3608 Traffic Congestion Problem and Possible Solution in Kabul City
Authors: Sayed Abdul Rahman Sadaat, Nsenda Lukumwena
Abstract:
Traffic congestion is a worldwide issue, especially in developing countries. This is also the case in Afghanistan, especially in Kabul, the capital city, whose rapid population growth makes it the fifth fastest growing city in the world. Traffic congestion affects not only the mobility of people and goods but also air quality, which leads to numerous deaths (about 3,000 people) every year. Many factors contribute to traffic congestion; the insufficiency and inefficiency of the public transportation system, along with the increase in private vehicles, can be considered among the most important. This paper addresses traffic congestion and attempts to suggest possible solutions that can help improve the current public transportation system in Kabul. To this end, the methodology used in this paper includes fieldwork conducted in Kabul city and a literature review. The outcome suggests that improving the public transportation system is likely to contribute to the reduction of traffic congestion and the improvement of air quality, thereby reducing the number of deaths related to air quality.
Keywords: air quality, Kabul, Afghanistan, public transportation system, improvements, traffic congestion
Procedia PDF Downloads 381
3607 Exclusive Value Adding by iCenter Analytics on Transient Condition
Authors: Zhu Weimin, Allegorico Carmine, Ruggiero Gionata
Abstract:
During decades of Baker Hughes (BH) iCenter experience, it has been demonstrated that, in addition to conventional insights on equipment steady operating conditions, insights on transient conditions can add significant and exclusive value for anomaly detection, downtime saving, and predictive maintenance. Our work shows examples from the BH iCenter experience to introduce the advantages and features of using transient condition analytics: (i) operation under critical engine conditions, e.g., a high level or a high rate of change of temperature, pressure, flow, vibration, etc., that would not be reachable in normal operation; (ii) management of dedicated sub-systems or components, many of which are often bottlenecks for reliability and maintenance; (iii) indirect detection of anomalies in the absence of instrumentation; (iv) repetitive sequences: if data is properly processed, the engineering features of transients provide not only anomaly detection but also problem characterization and prognostic indicators for predictive maintenance; (v) engine variables accounting for fatigue analysis. iCenter has been developing and deploying a series of analytics based on transient conditions. They are contributing exclusive added value in the following areas: (i) reliability improvement, (ii) startup reliability improvement, (iii) predictive maintenance, (iv) reduction of repair/overhaul costs. Illustrative examples for each of the above areas are presented in our study, focusing on challenges and adopted techniques ranging from purely statistical approaches to the implementation of machine learning algorithms. The obtained results demonstrate how value is obtained using transient condition analytics in the BH iCenter experience.
Keywords: analytics, diagnostics, monitoring, turbomachinery
Procedia PDF Downloads 74
3606 Contribution of Geomatics Technology in the Capability to Implement an On-Demand Transport in Oran Wilaya (the Northwest of Algeria)
Authors: Brahmia Nadjet
Abstract:
The growing need for mobility has led advanced countries in this field to install new, specific transport systems able to compensate for deficiencies, especially where regular public transport does not adequately meet the requests of users. In this context, on-demand transport (ODT) systems are very efficient. They rely on techniques based on the location of trip generators, which should be assured effectively with the use of operators responsible for advance reservation, planning and organization, and on studying the different ODT criteria (organizational, technical, geographical, etc.). Like the advanced countries in the field of transport, some developing countries are engaged in adapting the new technologies to reduce the deficit in their communication systems. This paper presents the study of an ODT implementation in the west of Algeria, developing the geomatics side of the study. This part requires the use of specific systems such as a Geographic Information System (GIS) and a Road Database Management System (RDBMS). We therefore developed the process through an application in a mobility environment, using computer tools dedicated to the management of the entities related to the transport field.
Keywords: ODT, geomatics, GIS, transport systems
Procedia PDF Downloads 470
3605 Transport of Reactive Carbo-Iron Composite Particles for in situ Groundwater Remediation Investigated at Laboratory and Field Scale
Authors: Sascha E. Oswald, Jan Busch
Abstract:
The in-situ dechlorination of contamination by chlorinated solvents in groundwater via zero-valent iron (nZVI) is potentially an efficient and prompt remediation method. A key requirement is that the nZVI be introduced into the subsurface in a way that brings substantial quantities of the contaminants into direct contact with the nZVI in the aquifer. Thus it could be a more flexible and precise alternative to permeable reactive barrier techniques using granular iron. However, nZVI is often limited by fast agglomeration and sedimentation in colloidal suspensions, even more so in aquifer sediments, which is a handicap for treating source zones or contaminant plumes. Colloid-supported nZVI shows promising characteristics for overcoming these limitations, and Carbo-Iron Colloids is a newly developed composite material aiming for that. The nZVI is built onto finely ground activated carbon of about a micrometer in diameter, which acts as its carrier. The Carbo-Iron Colloids are often suspended with a polyanionic stabilizer, and carboxymethyl cellulose is one with good properties for this purpose. We have investigated the transport behavior of Carbo-Iron Colloids (CIC) on different scales and under different conditions to assess their mobility in aquifer sediments, a key property for making their application feasible. The transport properties were tested in one-dimensional laboratory columns, a two-dimensional model aquifer, and an injection experiment in the field. These experiments were accompanied by non-invasive tomographic investigations of the transport and filtration processes of CIC suspensions. The laboratory experiments showed that a large part of the CIC can travel at least scales of meters under favorable but realistic conditions, partly even similarly to a dissolved tracer.
Under less favorable conditions this distance can be much smaller, and in all cases a certain fraction of the injected CIC is retained, mainly shortly after entering the porous medium. As a field experiment, a horizontal flow field was established between two wells 5 meters apart in a confined, shallow aquifer at a contaminated site in the North German lowlands. First a tracer test was performed and a basic model was set up to define the design of the CIC injection experiment. Then CIC suspension was introduced into the aquifer at the injection well while the second well was pumped, and samples were taken there to observe the breakthrough of CIC, based on direct visual inspection and on total particle and iron concentrations of water samples analyzed later in the laboratory. It could be concluded that at least 12% of the injected CIC amount reached the extraction well in due course, some of it traveling distances larger than 10 meters in the non-uniform dipole flow field. This demonstrated that these CIC particles have substantial mobility for reaching larger volumes of a contaminated aquifer and for interacting there, through their reactivity, with dissolved contaminants in the pore space. Therefore they seem well suited for groundwater remediation by in-situ formation of reactive barriers for chlorinated solvent plumes, or even for source removal.
Keywords: carbo-iron colloids, chlorinated solvents, in-situ remediation, particle transport, plume treatment
Procedia PDF Downloads 246
3604 Paratransit as a Tool for Peri-Urban Connectivity: A Comparative Case Study of Indore and Bhopal, Madhya Pradesh, India
Authors: Sumit Rahangdale
Abstract:
This research paper is a comparative study of two BRTS cities of Madhya Pradesh (India), Bhopal and Indore. Indore is the largest and most populous city of Madhya Pradesh, with heavy traffic, while Bhopal, though the capital of Madhya Pradesh, is comparatively less developed and shows less traffic. The cities are similar in their peri-urban nature, but variation is observed in transportation fares: Indore has been able to reduce them while Bhopal has not, one of the reasons being its paratransit services. Indore can be considered a successful model due to its low fares, which can be implemented in other parts of the city. The research paper tries to identify the relation of paratransit services to peri-urban connectivity and to provide a solution for the Bhopal case study.
Keywords: demand-supply-fare relationship, mobility and accessibility, paratransit, peri-urban connectivity
Procedia PDF Downloads 172
3603 Developing Pandi-Tekki into a Tourism Destination in Tanglang, Billiri Local Government Area, Gombe State, Nigeria
Authors: Sanusi Abubakar Sadiq
Abstract:
Despite the significance of tourism as a key revenue earner and employment generator, it is still disregarded in many areas, and the prospects of existing resources that could boost development in communities and regions are underused. This study was carried out with a view to developing Pandi-Tekki in Tanglang, Billiri Local Government Area, as a tourism destination. It was primarily aimed at identifying features of Pandi-Tekki that could be developed into a tourism attraction, suggesting ways of developing the prospective site into a tourism destination, and exploring its possible contribution to the tourism sector in Gombe State. Literature was reviewed based on relevant published materials. Data were collected through qualitative and quantitative methods, including personal observation and a structured questionnaire, and analyzed using the Statistical Package for the Social Sciences (SPSS) software. The results show that Pandi-Tekki has potential that can be developed as an attraction. They also show that the local community perceives tourism as a good development that will open it up to the entire world and generate revenue to stimulate its economy. Conclusions were drawn based on the findings with regard to the analysis carried out in this research: Pandi-Tekki can be developed as a tourism destination, and there will be great success toward achieving the aim and objectives of the development. Therefore, recommendations were made on creating awareness of the need to develop Pandi-Tekki as a tourism destination and on the need for government to provide tourism facilities at the destination, since it is a public outfit.
Keywords: attraction, destination, developing, features
Procedia PDF Downloads 287
3602 A Data-Driven Compartmental Model for Dengue Forecasting and Covariate Inference
Authors: Yichao Liu, Peter Fransson, Julian Heidecke, Jonas Wallin, Joacim Rockloev
Abstract:
Dengue, a mosquito-borne viral disease, poses a significant public health challenge in endemic tropical and subtropical countries, including Sri Lanka. To reveal insights into the complex dynamics of this disease and study its drivers, a comprehensive model capable of both robust forecasting and insightful inference of drivers, while capturing the co-circulation of several virus strains, is essential. However, existing studies mostly focus on only one aspect at a time and do not integrate and carry insights across these siloed approaches. While mechanistic models have been developed to capture immunity dynamics, they are often oversimplified and lack integration of all the diverse drivers of disease transmission. On the other hand, purely data-driven methods lack the constraints imposed by immuno-epidemiological processes, making them prone to overfitting and inference bias. This research presents a hybrid model that combines machine learning techniques with mechanistic modelling to overcome the limitations of existing approaches. Leveraging eight years of newly reported dengue case data, along with socioeconomic factors such as human mobility, weekly climate data from 2011 to 2018, genetic data detecting the introduction and presence of new strains, and estimates of seropositivity for different districts in Sri Lanka, we derive a data-driven vector (SEI) to human (SEIR) model across 16 regions in Sri Lanka at the weekly time scale. By conducting ablation studies, lag effects allowing delays of up to 12 weeks for time-varying climate factors were determined. The model demonstrates superior predictive performance over a pure machine learning approach when considering lead times of 5 and 10 weeks on data withheld from model fitting. It further reveals several interesting, interpretable findings on drivers while adjusting for the dynamics and influences of immunity and the introduction of a new strain.
The study uncovers strong influences of socioeconomic variables: population density, mobility, household income, and rural vs. urban population. It reveals substantial sensitivity to the diurnal temperature range and precipitation, while mean temperature and humidity appear less important in the study location. Additionally, the model indicates sensitivity to the vegetation index, both maximum and average. Predictions on testing data reveal high model accuracy. Overall, this study advances knowledge of dengue transmission in Sri Lanka and demonstrates the importance of hybrid modelling techniques that combine biologically informed model structures with flexible data-driven estimates of model parameters. The findings show the potential both for inference of drivers in situations of complex disease dynamics and for robust forecasting models.
Keywords: compartmental model, climate, dengue, machine learning, social-economic
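The compartmental backbone named in this abstract, a vector SEI model coupled to a human SEIR model, can be sketched as a simple discrete-time update. This is a generic single-strain illustration with placeholder rates (not the paper's fitted, covariate-driven parameters), chosen so that both populations are exactly conserved at each step.

```python
def seir_sei_step(h, v, beta_hv=0.05, beta_vh=0.3, sigma_h=1/5.9,
                  gamma=1/7.0, sigma_v=1/10.0, mu_v=1/14.0, dt=1.0):
    """One Euler step: h = (S, E, I, R) humans, v = (Sv, Ev, Iv) vectors.
    Vector births balance deaths, so both population totals are constant."""
    S, E, I, R = h
    Sv, Ev, Iv = v
    N = S + E + I + R
    Nv = Sv + Ev + Iv
    new_inf_h = beta_hv * S * Iv / N * dt    # infectious mosquito -> human
    new_inf_v = beta_vh * Sv * I / N * dt    # infectious human -> mosquito
    births_v = mu_v * Nv * dt                # keeps vector population fixed
    h_next = (S - new_inf_h,
              E + new_inf_h - sigma_h * E * dt,
              I + sigma_h * E * dt - gamma * I * dt,
              R + gamma * I * dt)
    v_next = (Sv + births_v - new_inf_v - mu_v * Sv * dt,
              Ev + new_inf_v - (sigma_v + mu_v) * Ev * dt,
              Iv + sigma_v * Ev * dt - mu_v * Iv * dt)
    return h_next, v_next

h, v = (10_000.0, 0.0, 10.0, 0.0), (50_000.0, 0.0, 100.0)
for _ in range(100):  # 100 daily steps
    h, v = seir_sei_step(h, v)
print(round(h[3]))  # cumulative recovered humans after 100 days
```

In the hybrid approach the abstract describes, rates like the transmission terms would not be fixed constants but data-driven functions of lagged climate and socioeconomic covariates, learned per region and week.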
Procedia PDF Downloads 84
3601 Site Analysis’ Importance as a Valid Factor in Building Design
Authors: Mekwa Eme, Anya Chukwuma
Abstract:
The act of evaluating a particular site physically and socially in order to create a good design solution that addresses the physical and interior environment of the location is known as architectural site analysis. This essay describes site analysis as a useful design component. According to the introduction and supporting research, site evaluation and analysis are crucial to good design in terms of topography, orientation, site size, accessibility, rainfall, wind direction, and times of sunrise and sunset. Methodology: Both quantitative and qualitative analyses are used in this paper. Primary and secondary data were collected through the case study approach, already published literature, journals, the internet, a local poll, oral interviews, inquiries, and in-person interviews. The purpose of this is to clarify the benefits of site analysis for the design process and its implications for the working or building stage. Results: Each site's criteria are unique in terms of factors such as soil, plants, trees, accessibility, topography, and security. This makes it easier for the architect and environmentalist to decide on the concept, shape, and supporting structures of the design. Site analysis is crucial because, before any design work is done, the nature of the target location is determined through site visits and research. This site study covers, among other topics, the location, contours, site features, and accessibility. Site analysis is also a key component of architectural education, enabling students and working architects to understand the nature of the site they will be working on. The building's orientation, the site's circulation, and the sustainability of the site may all be determined through thorough analysis of the site's features.
Keywords: analysis, climate, statistics, design
Procedia PDF Downloads 249
3600 Digital Manufacturing: Evolution and a Process Oriented Approach to Align with Business Strategy
Authors: Abhimanyu Pati, Prabir K. Bandyopadhyay
Abstract:
The paper intends to highlight the significance of Digital Manufacturing (DM) strategy in supporting and achieving the business strategy and goals of any manufacturing organization. Towards this end, DM initiatives have been given a process perspective, while not undermining their technological significance, with a view to linking their benefits directly with the fulfilment of customer needs and expectations in a responsive and cost-effective manner. A digital process model has been proposed to categorize digitally enabled organizational processes with a view to creating synergistic groups, which adopt and use digital tools having similar characteristics and functionalities. This will open up future opportunities for researchers and developers to create a unified technology environment for the integration and orchestration of processes. Secondly, an effort has been made to apply the “what” and “how” features of the Quality Function Deployment (QFD) framework to establish the relationship between customers’ needs, both for external and internal customers, and the features of the various digital processes that support the achievement of these customer expectations. The paper finally concludes that in the present highly competitive environment, business organizations cannot thrive unless they understand the significance of digital strategy and integrate it with their business strategy with a clearly defined implementation roadmap. A process-oriented approach to DM strategy will help business executives and leaders to appreciate its value propositions and its direct link to the organization's competitiveness. Keywords: knowledge management, cloud computing, knowledge management approaches, cloud-based knowledge management
Procedia PDF Downloads 309
3599 Virulence Phenotypes Among Multi-Drug Resistant Uropathogenic Bacteria
Authors: V. V. Lakshmi, Y. V. S. Annapurna
Abstract:
Urinary tract infection (UTI) is one of the most common infectious diseases seen in the community. Susceptible individuals experience multiple episodes and may progress to acute pyelonephritis or uro-sepsis, or develop asymptomatic bacteriuria (ABU). The ability to cause extraintestinal infections depends on several virulence factors required for survival at extraintestinal sites. The presence of virulence phenotypes enhances the pathogenicity of these otherwise commensal organisms and thus augments their ability to cause extraintestinal infections, most frequently urinary tract infections (UTI). The present study focuses on detection of the virulence characters exhibited by the uropathogenic organisms and the most common factors exhibited in the local pathogens. A total of 700 isolates of E. coli and Klebsiella spp. were included in the study. These were isolated over a period of three years from patients in local hospitals reported to be suffering from UTI. Isolation and identification were done based on Gram character and IMViC reactions. Antibiotic sensitivity profiling was carried out by the disc diffusion method, and multi-drug resistant strains with a MAR index of 0.7 were selected for further study. Virulence features examined included the ability to produce exopolysaccharides, protease and gelatinase production, hemolysin production, haemagglutination, and hydrophobicity. Exopolysaccharide production, checked by the congo red method, was the most predominant virulence feature among the isolates. Biofilm production, examined in microtitre plates using an ELISA reader, was confirmed as the major factor contributing to the virulence of the pathogens, followed by hemolysin production. Keywords: Escherichia coli, Klebsiella sp., uropathogens, virulence features
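The MAR (multiple antibiotic resistance) index used above to select strains is conventionally computed as the number of antibiotics an isolate resists divided by the number tested. A minimal sketch follows; the antibiotic panel is hypothetical, not the one used in the study:

```python
def mar_index(resistance_profile):
    """MAR index = antibiotics the isolate resists / antibiotics tested."""
    tested = len(resistance_profile)
    resistant = sum(1 for r in resistance_profile.values() if r)
    return resistant / tested

# Hypothetical disc-diffusion results for one isolate (True = resistant)
profile = {
    "ampicillin": True, "ciprofloxacin": True, "gentamicin": True,
    "ceftazidime": True, "nitrofurantoin": True, "trimethoprim": True,
    "amikacin": True, "imipenem": False, "colistin": False,
    "fosfomycin": False,
}
print(round(mar_index(profile), 1))  # 7 of 10 resistant -> 0.7
```

An isolate with a MAR index above roughly 0.2 is usually interpreted as originating from a high-antibiotic-exposure source, which is why the 0.7 cut-off here selects heavily resistant strains.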
Procedia PDF Downloads 421
3598 Historical Development of Negative Emotive Intensifiers in Hungarian
Authors: Martina Katalin Szabó, Bernadett Lipóczi, Csenge Guba, István Uveges
Abstract:
In this study, an exhaustive analysis was carried out of the historical development of negative emotive intensifiers in the Hungarian language via NLP methods. Intensifiers are linguistic elements which modify or reinforce a variable character in the lexical unit they apply to. Intensifiers therefore appear with other lexical items, such as adverbs, adjectives, and verbs, and infrequently with nouns. Due to the complexity of this phenomenon (a set of sociolinguistic, semantic, and historical aspects), many lexical items can operate as intensifiers. The group of intensifiers is admittedly one of the most rapidly changing elements in the language. From a linguistic point of view, a special group of intensifiers is particularly interesting: the so-called negative emotive intensifiers, which, on their own and without context, have semantic content that can be associated with negative emotion but which, in particular cases, may function as intensifiers (e.g., borzasztóan jó ’awfully good’, which means ’excellent’). Despite their special semantic features, negative emotive intensifiers are scarcely examined in the literature on the basis of large historical corpora via NLP methods. In order to become better acquainted with trends over time concerning these intensifiers, we exhaustively analysed a specific historical corpus, namely the Magyar Történeti Szövegtár (Hungarian Historical Corpus). This corpus (containing 3 million text words) is a collection of texts of various genres and styles, produced between 1772 and 2010. Since the corpus consists of raw texts and does not contain any additional information about the language features of the data (such as stemming or morphological analysis), a large amount of manual work was required to process the data. Thus, based on a lexicon of negative emotive intensifiers compiled in a previous phase of the research, every occurrence of each intensifier was queried, and the results were stored in a separate data frame.
Then, basic linguistic processing (POS-tagging, lemmatization, etc.) was carried out automatically with the ‘magyarlanc’ NLP toolkit. Finally, the frequency and collocation features of all the negative emotive words were automatically analyzed in the corpus. The outcomes of the research reveal in detail how these words have proceeded through grammaticalization over time, i.e., they change from lexical elements to grammatical ones, and they slowly go through a delexicalization process (their negative content diminishes over time). What is more, it was also pointed out which negative emotive intensifiers are at the same stage of this process in the same time period. Taking a closer look at the different domains of the analysed corpus, it also became clear that during this process the importance of the pragmatic role increases: the newer use expresses the speaker's subjective, evaluative opinion at a certain level. Keywords: historical corpus analysis, historical linguistics, negative emotive intensifiers, semantic changes over time
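The query-and-count step described above, collecting occurrences of each lexicon entry together with its right-hand collocate, can be sketched in Python. The three-word lexicon and the sample sentence are illustrative stand-ins, not data from the Magyar Történeti Szövegtár:

```python
from collections import Counter

# Sample lexicon of Hungarian negative emotive intensifiers (illustrative)
intensifiers = {"borzasztóan", "rettenetesen", "szörnyen"}

def count_intensifiers(tokens, lexicon):
    """Count each intensifier and the (intensifier, next word) collocations."""
    freq = Counter()
    collocates = Counter()
    for i, tok in enumerate(tokens):
        if tok in lexicon:
            freq[tok] += 1
            if i + 1 < len(tokens):
                collocates[(tok, tokens[i + 1])] += 1
    return freq, collocates

tokens = "ez borzasztóan jó volt és szörnyen szép".split()
freq, coll = count_intensifiers(tokens, intensifiers)
print(freq["borzasztóan"], coll[("borzasztóan", "jó")])  # 1 1
```

In the actual study the tokens would first pass through the ‘magyarlanc’ lemmatization step, so that inflected forms of an intensifier collapse onto one lexicon entry.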
Procedia PDF Downloads 233
3597 Iris Cancer Detection System Using Image Processing and Neural Classifier
Authors: Abdulkader Helwan
Abstract:
Iris cancer, also called intraocular melanoma, is a cancer that starts in the iris, the colored part of the eye that surrounds the pupil. There is a need for an accurate and cost-effective iris cancer detection system, since the techniques currently available are still not efficient. The combination of image processing and artificial neural networks is highly effective for the diagnosis and detection of iris cancer. Image processing techniques improve the diagnosis by enhancing the quality of the images so that physicians can diagnose properly, while neural networks help in deciding whether the eye is cancerous or not. This paper aims to develop an intelligent system that simulates human visual detection of intraocular melanoma, the so-called iris cancer. The suggested system combines both image processing techniques and neural networks. The images are first converted to grayscale, filtered, and then segmented using the Prewitt edge detection algorithm to detect the iris, the sclera circles, and the cancer. Principal component analysis is used to reduce the image size and to extract features. Those features are then used as inputs to a neural network, which is capable of deciding whether the eye is cancerous or not through the experience gained over many training iterations on different normal and abnormal eye images during the training phase. Normal images are obtained from a public database available on the internet, “Mile Research”, while the abnormal ones are obtained from another database, “eyecancer”. The experimental results for the proposed system show a high accuracy of 100% for detecting cancer and making the right decision. Keywords: iris cancer, intraocular melanoma, cancerous, Prewitt edge detection algorithm, sclera
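The Prewitt edge detection step of the pipeline can be sketched as a plain convolution with the two 3×3 Prewitt kernels. This is a generic illustration of the operator on a synthetic image, not the authors' implementation:

```python
import numpy as np

def prewitt_edges(gray):
    """Prewitt gradient magnitude of a 2-D grayscale image (valid region only)."""
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], float)  # horizontal gradient
    ky = kx.T                                                    # vertical gradient
    h, w = gray.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = gray[i:i + 3, j:j + 3]
            gx[i, j] = (patch * kx).sum()
            gy[i, j] = (patch * ky).sum()
    return np.hypot(gx, gy)

# A synthetic image with a vertical brightness step standing in for an iris boundary
img = np.zeros((10, 10))
img[:, 5:] = 1.0
edges = prewitt_edges(img)
print(edges.max())  # 3.0 at the brightness step, 0 elsewhere
```

Thresholding the resulting gradient magnitude yields the edge map from which the iris and sclera circles can be segmented.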
Procedia PDF Downloads 503
3596 Rapid Design Approach for Electric Long-Range Drones
Authors: Adrian Sauer, Lorenz Einberger, Florian Hilpert
Abstract:
The advancements and technical innovations in the field of electric unmanned aviation over the past years have opened the third dimension in areas like surveillance, logistics, and mobility for a wide range of private and commercial users. Researchers and companies are faced with the task of integrating their technology into airborne platforms. Start-ups and researchers especially require unmanned aerial vehicles (UAV) that can be quickly developed for specific use cases without spending significant time and money. This paper shows a design approach for the rapid development of a lightweight automatic separate-lift-thrust (SLT) electric vertical take-off and landing (eVTOL) UAV prototype, which is able to fulfill basic transportation as well as surveillance missions. The design approach does not require expensive or time-consuming design loop software, so developers can easily understand, adapt, and adjust the presented method for their own projects. The approach is mainly focused on crucial design aspects such as the aerofoil, tuning, and the powertrain. Keywords: aerofoil, drones, rapid prototyping, powertrain
Procedia PDF Downloads 71
3595 From Comfort to Safety: Assessing the Influence of Car Seat Design on Driver Reaction and Performance
Authors: Sabariah Mohd Yusoff, Qamaruddin Adzeem Muhamad Murad
Abstract:
This study investigates the impact of car seat design on driver response time, addressing a critical gap in understanding how ergonomic features influence both performance and safety. Controlled driving experiments were conducted with fourteen participants (11 male, 3 female) across three locations chosen for their varying traffic conditions, to account for differences in driver alertness. Participants interacted with various seat designs while performing driving tasks, and objective metrics such as braking and steering response times were meticulously recorded. Advanced statistical methods, including regression analysis and t-tests, were employed to identify design factors that significantly affect driver response times. Subjective feedback was gathered through detailed questionnaires (focused on driving experience and knowledge of response time) and in-depth interviews. This qualitative data was analyzed thematically to provide insights into driver comfort and usability preferences. The study aims to identify key seat design features that impact driver response time and to gain a deeper understanding of driver preferences for comfort and usability. The findings are expected to inform evidence-based guidelines for optimizing car seat design, ultimately enhancing driver performance and safety. The research offers valuable implications for automotive manufacturers and designers, contributing to the development of seats that improve driver response time and overall driving safety. Keywords: car seat design, driver response time, cognitive driving, ergonomics optimization
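The t-test comparison of response times between seat designs can be illustrated with Welch's t statistic, which does not assume equal variances. The timing data below are hypothetical, purely to show the calculation; the study itself also used regression analysis:

```python
from statistics import mean
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic comparing mean response times of two samples."""
    na, nb = len(a), len(b)
    va = sum((x - mean(a)) ** 2 for x in a) / (na - 1)  # sample variance of a
    vb = sum((x - mean(b)) ** 2 for x in b) / (nb - 1)  # sample variance of b
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

# Hypothetical braking response times (seconds) for two seat designs
seat_a = [0.62, 0.58, 0.65, 0.60, 0.63]
seat_b = [0.71, 0.69, 0.74, 0.70, 0.72]
t = welch_t(seat_a, seat_b)
print(t < 0)  # True: drivers in seat A brake faster on average
```

A strongly negative t value, compared against the Welch-Satterthwaite degrees of freedom, would indicate that the seat-design difference in braking time is statistically significant.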
Procedia PDF Downloads 24
3594 Virulence Phenotypes among Multi Drug Resistant Uropathogenic E. Coli and Klebsiella SPP
Authors: V. V. Lakshmi, Y. V. S. Annapurna
Abstract:
Urinary tract infection (UTI) is one of the most common infectious diseases seen in the community. Susceptible individuals experience multiple episodes and may progress to acute pyelonephritis or uro-sepsis, or develop asymptomatic bacteriuria (ABU). The ability to cause extraintestinal infections depends on several virulence factors required for survival at extraintestinal sites. The presence of virulence phenotypes enhances the pathogenicity of these otherwise commensal organisms and thus augments their ability to cause extraintestinal infections, most frequently urinary tract infections (UTI). The present study focuses on detection of the virulence characters exhibited by the uropathogenic organisms and the most common factors exhibited in the local pathogens. A total of 700 isolates of E. coli and Klebsiella spp. were included in the study. These were isolated over a period of three years from patients in local hospitals reported to be suffering from UTI. Isolation and identification were done based on Gram character and IMViC reactions. Antibiotic sensitivity profiling was carried out by the disc diffusion method, and multi-drug resistant strains with a MAR index of 0.7 were selected for further study. Virulence features examined included the ability to produce exopolysaccharides, protease and gelatinase production, hemolysin production, haemagglutination, and hydrophobicity. Exopolysaccharide production, checked by the congo red method, was the most predominant virulence feature among the isolates. Biofilm production, examined in microtitre plates using an ELISA reader, was confirmed as the major factor contributing to the virulence of the pathogens, followed by hemolysin production. Keywords: Escherichia coli, Klebsiella spp., uropathogens, virulence features
Procedia PDF Downloads 319
3593 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources
Authors: Mustafa Alhamdi
Abstract:
The industrial application of classifying gamma-ray and neutron events is investigated in this study using deep machine learning. Identification using a convolutional neural network and a recursive neural network has shown a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles try to find patterns and relationships that represent the actual spectrum energy in a low-dimensional space. Increasing the level of separation between classes in feature space improves the possibility of enhancing classification accuracy. The nonlinear extraction of features by a neural network involves a variety of transformations and mathematical optimization, while principal component analysis depends on linear transformations to extract features and subsequently improve classification accuracy. In this paper, the isotope spectrum information has been preprocessed by finding the frequency components relative to time and using them as a training dataset. The Fourier transform implementation used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4. The readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, managed to improve the classification accuracy of the neural networks. A single prediction approach using deep machine learning has shown high accuracy in discriminating gamma and neutron events. The paper's findings show the ability to improve classification accuracy by applying the spectrogram preprocessing stage to the gamma and neutron spectra of different isotopes.
Tuning the deep machine learning models by hyperparameter optimization of the neural network models enhanced the separation in the latent space and provided the ability to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction. Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification
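The spectrogram preprocessing stage, a windowed Fourier transform that turns each trace into time-frequency features, can be sketched as follows. The window length, hop size, and the synthetic sinusoid standing in for a simulated detector signal are illustrative choices, not the study's parameters:

```python
import numpy as np

def spectrogram(signal, win_len=64, hop=32):
    """Short-time Fourier magnitudes with a Hann window, usable as training features."""
    window = np.hanning(win_len)  # windowing suppresses spectral leakage
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        frame = signal[start:start + win_len] * window
        frames.append(np.abs(np.fft.rfft(frame)))
    return np.array(frames)  # shape: (n_frames, win_len // 2 + 1)

# Synthetic sinusoid standing in for a simulated isotope trace
t = np.arange(1024)
sig = np.sin(2 * np.pi * 0.1 * t)
S = spectrogram(sig)
print(S.shape)  # (31, 33)
```

Each row of `S` is then a fixed-length frequency-feature vector, which is the form fed to the convolutional or recursive classifier described above.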
Procedia PDF Downloads 150
3592 Analysis of NFC and Biometrics in the Retail Industry
Authors: Ziwei Xu
Abstract:
The increasing emphasis on mobility has driven the application of innovative communication technologies across various industries. In the retail sector, Near Field Communication (NFC) has emerged as a significant and transformative technology, particularly in the payment and retail supermarket sectors. NFC enables new payment methods, such as electronic wallets, and enhances information management in supermarkets, contributing to the growth of the trade. This report presents a comprehensive analysis of NFC technology, focusing on five key aspects. Firstly, it provides an overview of NFC, including its application methods and development history. Additionally, it incorporates Arthur's work on combinatorial evolution to elucidate the emergence and impact of NFC technology, while acknowledging the limitations of that model in analyzing NFC. The report then summarizes the positive influence of NFC on the retail industry along with its associated constraints. Furthermore, it explores the adoption of NFC from both organizational and individual perspectives, employing the Best Predictors of organizational IT adoption and UTAUT2 models, respectively. Finally, the report discusses the potential future replacement of NFC with biometrics technology, highlighting its advantages over NFC and leveraging Arthur's model to investigate its future development prospects. Keywords: innovation, NFC, industry, biometrics
Procedia PDF Downloads 75
3591 Applying the Regression Technique for Prediction of the Acute Heart Attack
Authors: Paria Soleimani, Arezoo Neshati
Abstract:
Myocardial infarction is one of the leading causes of death in the world. Some of these deaths occur even before the patient reaches the hospital. Myocardial infarction occurs as a result of impaired blood supply. Because most of these deaths are due to coronary artery disease, awareness of the warning signs of a heart attack is essential. Some heart attacks are sudden and intense, but most start slowly, with mild pain or discomfort, so early detection and successful treatment of these symptoms are vital to saving patients. The importance and usefulness of a system designed to assist physicians in the early diagnosis of acute heart attacks are therefore obvious. The purpose of this study is to determine how well a predictive model performs based only on patient-reportable clinical history factors, without using diagnostic tests or physical exams. This type of prediction model might have applications outside the hospital setting, giving accurate advice that influences patients to seek care in appropriate situations. For this purpose, data were collected on 711 heart patients in Iranian hospitals, and 28 clinical attributes that can be reported by patients were studied. Three logistic regression models were built on the basis of the 28 features to predict the risk of heart attack. The best logistic regression model in terms of performance had a C-index of 0.955 and an accuracy of 94.9%. The variables severe chest pain, back pain, cold sweats, shortness of breath, nausea, and vomiting were selected as the main features. Keywords: coronary heart disease, acute heart attacks, prediction, logistic regression
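A logistic-regression risk score of the kind described maps reported symptoms through a weighted sum and the sigmoid function. The coefficients below are invented for illustration only, not the fitted values from the 711-patient model:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict_risk(features, weights, bias):
    """Logistic-regression risk estimate from binary symptom indicators."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return sigmoid(z)

# Hypothetical coefficients for: severe chest pain, back pain, cold sweats,
# shortness of breath, nausea, vomiting (1 = symptom reported, 0 = absent)
weights = [2.1, 0.8, 1.3, 1.1, 0.6, 0.7]
bias = -3.5
high_risk = predict_risk([1, 0, 1, 1, 0, 0], weights, bias)
low_risk = predict_risk([0, 0, 0, 0, 0, 0], weights, bias)
print(high_risk > 0.5 > low_risk)  # True
```

Because every input can be self-reported, such a score could run entirely outside the hospital setting, which is exactly the deployment scenario the study targets.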
Procedia PDF Downloads 449
3590 Failure Analysis of Fuel Pressure Supply from an Aircraft Engine
Authors: M. Pilar Valles-gonzalez, Alejandro Gonzalez Meije, Ana Pastor Muro, Maria Garcia-Martinez, Beatriz Gonzalez Caballero
Abstract:
This paper studies a failure case of a fuel pressure supply tube from an aircraft engine. Multiple fracture cases of the fuel pressure control tube from aircraft engines have been reported. The studied set was composed of the mentioned tube, a welded connecting pipe, where the fracture occurred, and a union nut. The fracture occurred in one of the most critical zones of the tube, in a region next to the supporting body of the union nut to the connector. The tube material was X6CrNiTi18-10, an austenitic stainless steel. The chemical composition was determined using an X-ray fluorescence spectrometer (XRF) and combustion equipment. Furthermore, the material was characterized mechanically, by hardness testing, and microstructurally, using a stereomicroscope and an optical microscope. The results confirmed that it is within specifications. To determine the macrofractographic features, a visual examination and a stereomicroscope inspection of the tube fracture surface were carried out. The results revealed plastic macrodeformation of the tube, surface damage, and signs of a possible corrosion process. The fracture surface was also inspected by scanning electron microscopy (FE-SEM), equipped with a microanalysis system by X-ray dispersive energy (EDX), to determine the microfractographic features and find out the failure mechanism involved in the fracture. Fatigue striations, which are typical of a progressive fracture by a fatigue mechanism, were observed. The origin of the fracture was located at defects on the outer wall of the tube, leading to a final overload fracture. Keywords: aircraft engine, fatigue, FE-SEM, fractography, fracture, fuel tube, microstructure, stainless steel
Procedia PDF Downloads 155
3589 On the Possibility of Real Time Characterisation of Ambient Toxicity Using Multi-Wavelength Photoacoustic Instrument
Authors: Tibor Ajtai, Máté Pintér, Noémi Utry, Gergely Kiss-Albert, Andrea Palágyi, László Manczinger, Csaba Vágvölgyi, Gábor Szabó, Zoltán Bozóki
Abstract:
To the best knowledge of the authors, here we experimentally demonstrate, for the first time, a quantified correlation between real-time measured optical features of the ambient aerosol and off-line measured toxicity data. Using these correlations, we present a novel methodology for real-time characterisation of ambient toxicity based on multi-wavelength aerosol-phase photoacoustic measurement. Ambient carbonaceous particulate matter is one of the most intensively studied atmospheric constituents in climate science nowadays. Beyond its climatic impact, atmospheric soot also plays an important role as an air pollutant that harms human health. Moreover, according to the latest scientific assessments, ambient soot is the second most important anthropogenic emission source, while in the health aspect it is one of the most harmful atmospheric constituents as well. Despite its importance, a generally accepted standard methodology for the quantitative determination of ambient toxicology is not yet available. Ambient toxicology measurement is dominantly based on posterior analysis of filter-accumulated aerosol with limited time resolution. Most toxicological studies are based on operational definitions using different measurement protocols; therefore, comprehensive analysis of the existing data set is limited in many cases. The situation is further complicated by the fact that, even during its relatively short residence time, the physicochemical features of the aerosol can be masked significantly by the actual ambient factors. Therefore, improving the time resolution of the existing methodology and developing real-time methodology for air quality monitoring are pressing issues in air pollution research.
During the last decades, many experimental studies have verified that there is a relation between the chemical composition of carbonaceous particulate matter and its absorption features, quantified by the Absorption Angström Exponent (AAE). Although the scientific community agrees that PhotoAcoustic Spectroscopy (PAS) is so far the only methodology that can measure light absorption by aerosol in an accurate and reliable way, multi-wavelength PAS instruments able to selectively characterise the wavelength dependency of absorption have become available only in the last decade. In this study, the first results of an intensive measurement campaign focusing on the physicochemical and toxicological characterisation of ambient particulate matter are presented. Here we demonstrate the complete microphysical characterisation of wintertime urban ambient aerosol, including optical absorption and scattering as well as size distribution, using our recently developed state-of-the-art multi-wavelength photoacoustic instrument (4λ-PAS), an integrating nephelometer (Aurora 3000), and a single mobility particle sizer and optical particle counter (SMPS+C). Beyond this on-line characterisation of the ambient aerosol, we also demonstrate the results of eco-, cyto-, and genotoxicity measurements of ambient aerosol based on posterior analysis of filter-accumulated aerosol with 6 h time resolution. We demonstrate a diurnal variation of toxicities and of AAE data deduced directly from the multi-wavelength absorption measurement results. Keywords: photoacoustic spectroscopy, absorption Angström exponent, toxicity, Ames-test
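The Absorption Angström Exponent referred to above follows from the power-law assumption b_abs(λ) ∝ λ^(−AAE), so absorption coefficients at two wavelengths suffice to estimate it. A minimal sketch follows; the absorption values and wavelength pair are hypothetical, not campaign data:

```python
import math

def absorption_angstrom_exponent(b1, b2, lam1, lam2):
    """AAE from absorption coefficients b1, b2 measured at wavelengths lam1, lam2,
    assuming the power law b_abs(lam) ~ lam**(-AAE)."""
    return -math.log(b1 / b2) / math.log(lam1 / lam2)

# Hypothetical absorption coefficients (Mm^-1) at 470 nm and 880 nm
aae = absorption_angstrom_exponent(12.0, 6.0, 470.0, 880.0)
print(round(aae, 1))  # ~1.1, a value typically associated with black carbon
```

With a 4-wavelength instrument the exponent is usually obtained more robustly as the slope of a log-log fit across all channels rather than from a single wavelength pair.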
Procedia PDF Downloads 302