Search results for: database approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 14517

14337 An Automatic Large Classroom Attendance Conceptual Model Using Face Counting

Authors: Sirajdin Olagoke Adeshina, Haidi Ibrahim, Akeem Salawu

Abstract:

Large lecture theatres cannot be covered by a single camera because of their size, shape, and seating arrangements; although an ordinary classroom can be captured with a single camera, a large lecture hall requires a multicamera setup, and the design and implementation of such a setup were therefore considered. Researchers have emphasized the impact of class attendance on the academic performance of students. However, the traditional method of taking attendance falls short, especially in large lecture theatres, because of the student population, the time required, the sophistication and exhaustiveness of the exercise, and its susceptibility to manipulation. An automated large classroom attendance system is therefore imperative. The common approach is face detection and recognition, where known student faces are captured and stored for recognition purposes. This approach requires constant updates of the face database because facial features change over time. Alternatively, face counting can be performed by cropping the localized faces in the video or image into a folder and then counting them. This research aims to develop a face localization-based approach to detect student faces in classroom images captured using a multicamera setup. A Haar-like feature cascade face detector, trained with an asymmetric goal that minimizes the False Rejection Rate (FRR) relative to the False Acceptance Rate (FAR), was deployed on a Raspberry Pi 4B. A relationship between the two factors (FRR and FAR) was established using a constant (λ) as a trade-off between them, enabling automatic adjustment during training. An evaluation of the proposed approach and the conventional AdaBoost detector on classroom datasets shows an 8% improvement in True Positive Rate (TPR), the outcome of a low FRR, and a 7% reduction in FRR. The proposed approach is also faster, with an average execution time of 1.19 s per image compared to 2.38 s for the improved AdaBoost. Consequently, the proposed approach achieved 97% TPR with an overhead time of 22.9 s, compared to 46.7 s for the improved AdaBoost, when evaluated on images obtained from a large lecture hall (DK5) at USM.
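
The cascade-based localization and counting step can be illustrated with a short sketch. This is a minimal example, not the authors' asymmetrically trained detector; it assumes OpenCV's bundled frontal-face Haar cascade and hypothetical image file names, and it simply crops and counts the detected faces from each camera view.

```python
import os
import cv2  # OpenCV

# Minimal sketch of Haar-cascade face localization and counting: load the
# stock frontal-face cascade and count faces over the multicamera images.
cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
detector = cv2.CascadeClassifier(cascade_path)

camera_images = ["cam1.jpg", "cam2.jpg", "cam3.jpg"]  # hypothetical file names
output_dir = "cropped_faces"
os.makedirs(output_dir, exist_ok=True)

total_faces = 0
for image_path in camera_images:
    image = cv2.imread(image_path)
    if image is None:
        continue  # skip missing files
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    # detectMultiScale returns one (x, y, w, h) box per detected face
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for i, (x, y, w, h) in enumerate(faces):
        crop = image[y:y + h, x:x + w]
        out_name = f"{os.path.basename(image_path)}_{i}.png"
        cv2.imwrite(os.path.join(output_dir, out_name), crop)
    total_faces += len(faces)

print("Estimated attendance (face count):", total_faces)
```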

Keywords: automatic attendance, face detection, Haar-like cascade, manual attendance

Procedia PDF Downloads 48
14336 The Future of Hospitals: A Systematic Review in the Field of Architectural Design with a Disruptive Research and Development Approach

Authors: María Araya Léon, Ainoa Abella, Aura Murillo, Ricardo Guasch, Laura Clèries

Abstract:

Objectives: This article aims to examine scientific theory framed within the term 'hospitals of the future' from a multidisciplinary and cross-sectional perspective, to understand how the various cross-sectional areas studied connect with architectural spaces, and to determine the future outlook of the works examined and how they can be classified into the categories of need/solution, evolution/revolution, collective/individual, and preventive/corrective. Background: The changes currently taking place within the context of healthcare demonstrate how important these projects are and the need for companies to face future changes. Method: A systematic review was carried out, focused on what the hospitals of the future will be like in relation to the elements that form part of their use, design, and architectural space experience, using the WOS database from 2016 to 2019. Results: The large number of works about sensing & big data and the scarce amount related to the area of materials are worth highlighting. Furthermore, no growth concerning future issues is envisaged over time. Regarding the classifications, the articles reviewed address evolutionary and collective solutions more, while preventive and corrective solutions were found at a similar level. Conclusions: Although our research focused on the future of hospitals, there is little evidence representing this approach. We also detected that, given the relevance of research on how the built environment influences human health and well-being, such studies should be promoted within the context of healthcare.

Keywords: hospitals, future, architectural space, disruptive approach

Procedia PDF Downloads 59
14335 Development of a Software System for Management and Genetic Analysis of Biological Samples for Forensic Laboratories

Authors: Mariana Lima, Rodrigo Silva, Victor Stange, Teodiano Bastos

Abstract:

Due to the high reliability reached by DNA tests, since the 1980s this kind of test has allowed the identification of a growing number of criminal cases, including old unsolved cases that now have a chance of being solved with this technology. Currently, the use of genetic profiling databases is a typical method to increase the scope of genetic comparison. Forensic laboratories must process, analyze, and generate genetic profiles for a growing number of samples, which requires time and great storage capacity. Therefore, it is essential to develop methodologies, supported by software tools, capable of organizing the work and minimizing the time spent on both biological sample processing and the analysis of genetic profiles. Thus, the present work aims at the development of a software system for forensic genetics laboratories that allows the management of samples, criminal cases, and a local database, minimizing the time spent in the workflow and helping to compare genetic profiles. For the development of this software system, all data related to the storage and processing of samples, the workflows, and the requirements incorporated in the system were considered. The system uses the following languages: HTML, CSS, and JavaScript as web technologies, with the Node.js platform as the server, which offers great efficiency in the input and output of data. In addition, the data are stored in a relational database (MySQL), which is free, allowing better acceptance by users. The software system developed here brings more agility to the workflow and the analysis of samples, contributing to the rapid insertion of genetic profiles in the national database and to increasing the resolution of crimes. The next step of this research is its validation, so that the system operates in accordance with current Brazilian national legislation.

Keywords: database, forensic genetics, genetic analysis, sample management, software solution

Procedia PDF Downloads 346
14334 An Attempt to Decipher the Meaning of a Mithraic Motif

Authors: Attila Simon

Abstract:

The subject of this research is an element of Mithras' iconography. It is a new element in the series of research begun with the study of the Bull in the Boat motif. The stylized altars represented by seven adjacent rectangles appear on only a small group of Mithraic reliefs, which may explain why they have received less attention and fewer attempts at decipherment than other motifs. Using Vermaseren's database, CIMRM (Corpus Inscriptionum et Monumentorum Religionis Mithriacae), the author collected all the cases containing the motif under investigation, created a database of them grouped by location, then used a comparative method to compare the different forms of the motif and to isolate these cases, and finally evaluated the results. The aim of this research is to interpret the iconographic element in question and attempt to determine its place of origin. The study may provide an interpretation of a Mithraic representation that, to the best of the author's knowledge, has not been explained so far, and the question may generate scientific discourse.

Keywords: Roman history, religion, Mithras, iconography

Procedia PDF Downloads 57
14333 Forensic Speaker Verification in Noisy Environments by Enhancing the Speech Signal Using an ICA Approach

Authors: Ahmed Kamil Hasan Al-Ali, Bouchra Senadji, Ganesh Naik

Abstract:

We propose a system that addresses real environmental noise and channel mismatch in forensic speaker verification. This method is based on suppressing various types of real environmental noise by using the independent component analysis (ICA) algorithm. The enhanced speech signal is then processed with mel frequency cepstral coefficients (MFCC) or MFCC feature warping to extract the essential characteristics of the speech signal. Channel effects are reduced using an intermediate vector (i-vector) and probabilistic linear discriminant analysis (PLDA) approach for classification. The proposed algorithm is evaluated on an Australian forensic voice comparison database combined with car, street, and home noises from QUT-NOISE at signal-to-noise ratios (SNR) ranging from -10 dB to 10 dB. Experimental results indicate that MFCC feature warping with ICA achieves reductions in equal error rate of about 48.22%, 44.66%, and 50.07% over MFCC feature warping alone when the test speech signals are corrupted with random sessions of street, car, and home noises at -10 dB SNR.
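
A minimal sketch of the MFCC feature-warping front end mentioned above is shown below. It is illustrative only: it assumes a hypothetical audio file, uses librosa for MFCC extraction, and approximates feature warping with a rank-based Gaussianization over a sliding context window; the ICA-based enhancement and the i-vector/PLDA back end are not reproduced here.

```python
import numpy as np
import librosa
from scipy.stats import norm

def feature_warp(feats, win=301):
    """Warp each MFCC coefficient to a standard normal distribution over a
    sliding window via rank-based mapping (illustrative sketch only)."""
    warped = np.empty_like(feats)
    half = win // 2
    n = feats.shape[0]
    for t in range(n):
        lo, hi = max(0, t - half), min(n, t + half + 1)
        window = feats[lo:hi]
        for c in range(feats.shape[1]):
            # rank of the current frame within the window, mapped through
            # the inverse normal CDF (Gaussianization)
            rank = np.sum(window[:, c] < feats[t, c]) + 0.5
            warped[t, c] = norm.ppf(rank / window.shape[0])
    return warped

signal, sr = librosa.load("utterance.wav", sr=16000)       # hypothetical file
mfcc = librosa.feature.mfcc(y=signal, sr=sr, n_mfcc=20).T  # (frames, coeffs)
warped_mfcc = feature_warp(mfcc)
print(warped_mfcc.shape)
```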

Keywords: noisy forensic speaker verification, ICA algorithm, MFCC, MFCC feature warping

Procedia PDF Downloads 381
14332 Project and Module Based Teaching and Learning

Authors: Jingyu Hou

Abstract:

This paper proposes a new teaching and learning approach: Project and Module Based Teaching and Learning (PMBTL). The PMBTL approach incorporates the merits of project/problem-based and module-based learning methods and overcomes the limitations of these methods. The correlation between teaching, learning, practice, and assessment is emphasized in this approach, and new methods have been proposed accordingly. The distinct features of these new methods differentiate the PMBTL approach from conventional teaching approaches. Evaluation of this approach on practical teaching and learning activities demonstrates its effectiveness and stability in improving the performance and quality of teaching and learning. The approach proposed in this paper also lends itself intuitively to the design of other teaching units.

Keywords: computer science education, project and module based, software engineering, module based teaching and learning

Procedia PDF Downloads 464
14331 A Psychophysiological Evaluation of an Effective Recognition Technique Using Interactive Dynamic Virtual Environments

Authors: Mohammadhossein Moghimi, Robert Stone, Pia Rotshtein

Abstract:

Recording psychological and physiological correlates of human performance within virtual environments and interpreting their impacts on human engagement, ‘immersion’ and related emotional or ‘affective’ states is both academically and technologically challenging. By exposing participants to an affective, real-time (game-like) virtual environment, designed and evaluated in an earlier study, a psychophysiological database containing the EEG, GSR and heart rate of 30 male and female gamers, exposed to 10 games, was constructed. Some 174 features were subsequently identified and extracted from a number of windows with 28 different lengths (e.g., 2, 3, 5 seconds). After reducing the number of features to 30 using a feature selection technique, K-Nearest Neighbour (KNN) and Support Vector Machine (SVM) methods were employed for the classification process. The classifiers categorised the psychophysiological database into four affective clusters (defined in a 3-dimensional space of valence, arousal and dominance) and eight emotion labels (relaxed, content, happy, excited, angry, afraid, sad, and bored). The KNN and SVM classifiers achieved average cross-validation accuracies of 97.01% (±1.3%) and 92.84% (±3.67%), respectively. However, no significant differences were found in the classification process based on affective clusters or emotion labels.
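
The classification stage can be sketched as follows. This is a schematic example on synthetic data (the real psychophysiological features and labels are not available here), using scikit-learn's KNN and SVM with cross-validation as described in the abstract.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Schematic stand-in for the 30 selected psychophysiological features
# (EEG/GSR/heart-rate derived) and the four affective-cluster labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 30))    # 300 windows x 30 selected features
y = rng.integers(0, 4, size=300)  # 4 affective clusters (toy labels)

knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))

for name, clf in [("KNN", knn), ("SVM", svm)]:
    scores = cross_val_score(clf, X, y, cv=10)  # 10-fold cross-validation
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```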

Keywords: virtual reality, affective computing, affective VR, emotion-based affective physiological database

Procedia PDF Downloads 207
14330 COVID-19 and Heart Failure Outcomes: Readmission Insights from the 2020 United States National Readmission Database

Authors: Induja R. Nimma, Anand Reddy Maligireddy, Artur Schneider, Melissa Lyle

Abstract:

Background: Although heart failure is one of the most common causes of hospitalization in adult patients, there is limited knowledge of outcomes following initial hospitalization for COVID-19 with heart failure (HCF-19). We therefore analyzed 30-day readmission causes and outcomes among patients with HCF-19 using real-world big data from the United States National Readmission Database. Objective: The aim is to describe the rate and causes of readmission and the morbidity of heart failure with coinciding COVID-19 in the United States, using the 2020 National Readmission Database (NRD). Methods: A descriptive, retrospective study was conducted on the 2020 NRD, a nationally representative sample of all US hospitalizations. Adult (>18 years) inpatient admissions with COVID-19 and HF, and readmissions within 30 days, were selected based on International Classification of Diseases, Tenth Revision codes. Results: In 2020, 260,372 adult patients were hospitalized with COVID-19 and HF. The median age was 74 years (IQR: 64-83), and 47% were female. The median length of stay was 7 (IQR: 4-13) days, and the median total cost of stay was 62,025 (IQR: 31,956-130,670) United States dollars. Among the index hospital admissions, 61,527 (23.6%) patients died, and 22,794 (11.5%) were readmitted within 30 days. The median age of patients readmitted within 30 days was 73 years (IQR: 63-82), 45% were female, and 1,962 (16%) died. The most common principal diagnoses for readmission were COVID-19 (34.8%), sepsis (16.5%), HF (7.1%), acute kidney injury (2.2%), respiratory failure with hypoxia (1.7%), and pneumonia (1%). Conclusion: The rate of readmission in patients with heart failure exacerbations is increasing yearly. COVID-19 was the most common principal diagnosis in patients readmitted within 30 days. Complicated hypertension, chronic pulmonary disease, complicated diabetes, renal failure, alcohol use, drug use, and peripheral vascular disorders are risk factors associated with readmission. Familiarity with the most common causes and predictors of readmission helps guide the development of initiatives to minimize adverse outcomes and the cost of medical care.
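
The 30-day readmission logic used in analyses of this kind can be sketched in a few lines. The snippet below is a simplified illustration on hypothetical column names (not the actual NRD schema or the authors' code): it links each index admission to the same patient's next admission and flags those occurring within 30 days of discharge.

```python
import pandas as pd

# Hypothetical admission records (the real NRD uses verified patient linkage
# numbers and ICD-10 codes; columns here are illustrative only).
admissions = pd.DataFrame({
    "patient_id":     [1, 1, 2, 3, 3],
    "admit_date":     pd.to_datetime(["2020-03-01", "2020-03-20", "2020-04-02",
                                      "2020-05-10", "2020-07-01"]),
    "discharge_date": pd.to_datetime(["2020-03-08", "2020-03-27", "2020-04-10",
                                      "2020-05-15", "2020-07-06"]),
    "principal_dx":   ["COVID-19", "Sepsis", "COVID-19", "HF", "COVID-19"],
})

admissions = admissions.sort_values(["patient_id", "admit_date"])
# Time from discharge of one stay to the next admission of the same patient
admissions["next_admit"] = admissions.groupby("patient_id")["admit_date"].shift(-1)
admissions["days_to_next"] = (admissions["next_admit"] - admissions["discharge_date"]).dt.days
admissions["readmit_30d"] = admissions["days_to_next"].between(0, 30)

rate = admissions["readmit_30d"].mean()
print(f"30-day readmission rate: {rate:.1%}")
```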

Keywords: COVID-19, heart failure, national readmission database, readmission outcomes

Procedia PDF Downloads 52
14329 Using India’s Traditional Knowledge Digital Library on Traditional Tibetan Medicine

Authors: Chimey Lhamo, Ngawang Tsering

Abstract:

Traditional Tibetan medicine, known as Sowa Rigpa (science of healing), originated more than 2,500 years ago with an insightful background, and it has been attracting significant attention in many Asian countries, such as China, India, Bhutan, and Nepal. In particular, the Indian government has recognized Traditional Tibetan medicine as one of its major Indian medical systems, alongside Ayurveda. Although Traditional Tibetan medicine has a long history and growing interest, it is not easily recognized worldwide because its literature exists only in the Tibetan language; it is neither accessible to nor understood by patent examiners at international patent offices, and data about Traditional Tibetan medicine are not yet broadly available on the Internet. Exploitation of Traditional Tibetan medicine has also been increasing. The Traditional Knowledge Digital Library is a database aiming to prevent the patenting and misappropriation of India's traditional medical knowledge; using India's Traditional Knowledge Digital Library on Sowa Rigpa helps prevent its exploitation at international patent offices with the help of information technology tools and an innovative classification system, the Traditional Knowledge Resource Classification (TKRC). To date, more than 3,000 Sowa Rigpa formulations have been transcribed into the Traditional Knowledge Digital Library database. In this paper, we present India's Traditional Knowledge Digital Library for Traditional Tibetan medicine; this database system helps to preserve Sowa Rigpa and to prevent its exploitation, and it is expected to gradually gain global approval and acceptance.

Keywords: traditional Tibetan medicine, India's traditional knowledge digital library, traditional knowledge resources classification, international patent classification

Procedia PDF Downloads 105
14328 Human Action Recognition Using Wavelets of Derived Beta Distributions

Authors: Neziha Jaouedi, Noureddine Boujnah, Mohamed Salim Bouhlel

Abstract:

In the framework of enhancing human-machine interaction systems, we focus in this paper on human behavior analysis and action recognition. Human behavior is characterized by the duality of actions and reactions (movements, psychological modification, verbal and emotional expression). It is worth noting that much information is hidden behind gestures, sudden motions, point trajectories, and speeds, and many research works have framed this as an information retrieval problem. In our work, we focus on motion extraction, tracking, and action recognition using wavelet network approaches. Our contribution uses an analysis of human silhouette subtraction by a Gaussian Mixture Model (GMM) and of body movement through trajectory models of motion constructed with a Kalman filter. These models remove noise by extracting the main motion features and constitute a stable basis for identifying the evolution of human activity. Each modality is used to recognize a human action using the wavelets of derived beta distributions approach. The proposed approach has been validated successfully on subsets of the KTH and UCF Sports databases.
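
The motion-extraction front end (GMM background subtraction followed by Kalman-filter tracking of the moving silhouette) can be sketched with OpenCV as below. This is a generic illustration, not the authors' exact pipeline, and it assumes a hypothetical video file.

```python
import cv2
import numpy as np

# GMM-based background subtraction to segment the moving person, then a
# constant-velocity Kalman filter smoothing the centroid of the largest blob.
subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

kalman = cv2.KalmanFilter(4, 2)  # state: (x, y, vx, vy), measurement: (x, y)
kalman.transitionMatrix = np.array([[1, 0, 1, 0],
                                    [0, 1, 0, 1],
                                    [0, 0, 1, 0],
                                    [0, 0, 0, 1]], dtype=np.float32)
kalman.measurementMatrix = np.array([[1, 0, 0, 0],
                                     [0, 1, 0, 0]], dtype=np.float32)
kalman.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2

cap = cv2.VideoCapture("action_clip.avi")  # hypothetical KTH-style clip
trajectory = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    prediction = kalman.predict()
    if contours:
        largest = max(contours, key=cv2.contourArea)
        (x, y), _ = cv2.minEnclosingCircle(largest)
        kalman.correct(np.array([[x], [y]], dtype=np.float32))
        trajectory.append((float(prediction[0, 0]), float(prediction[1, 0])))
cap.release()
print("Tracked", len(trajectory), "trajectory points")
```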

Keywords: feature extraction, human action classifier, wavelet neural network, beta wavelet

Procedia PDF Downloads 385
14327 A Computational Cost-Effective Clustering Algorithm in Multidimensional Space Using the Manhattan Metric: Application to the Global Terrorism Database

Authors: Semeh Ben Salem, Sami Naouali, Moetez Sallami

Abstract:

The increasing amount of collected data has limited the performance of current analysis algorithms. Thus, developing new cost-effective algorithms in terms of complexity, scalability, and accuracy has raised significant interest. In this paper, a modified, effective k-means based algorithm is developed and evaluated experimentally. The new algorithm aims to reduce the computational load without significantly affecting the quality of the clusterings. The algorithm uses the City Block distance and a new stopping criterion to guarantee convergence. Experiments conducted on a real data set show its high performance when compared with the original k-means version.
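
A minimal sketch of a City Block (Manhattan) variant of k-means is shown below. It is a generic illustration rather than the authors' exact algorithm (their stopping criterion is not reproduced), and it uses the coordinate-wise median as the centroid update, which is the natural choice under the L1 metric.

```python
import numpy as np

def kmeans_manhattan(X, k, n_iter=100, tol=1e-6, seed=0):
    """Generic k-means variant with the City Block (L1) distance.

    Illustrative sketch only: assignment uses Manhattan distance and the
    update uses the coordinate-wise median (the L1 minimizer).
    """
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # distances[i, j] = L1 distance between point i and center j
        distances = np.abs(X[:, None, :] - centers[None, :, :]).sum(axis=2)
        labels = distances.argmin(axis=1)
        new_centers = np.array([
            np.median(X[labels == j], axis=0) if np.any(labels == j) else centers[j]
            for j in range(k)
        ])
        if np.abs(new_centers - centers).sum() < tol:  # simple convergence check
            centers = new_centers
            break
        centers = new_centers
    return labels, centers

# Toy usage on random 2-D data
X = np.random.default_rng(1).normal(size=(500, 2))
labels, centers = kmeans_manhattan(X, k=3)
print(centers)
```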

Keywords: pattern recognition, global terrorism database, Manhattan distance, k-means clustering, terrorism data analysis

Procedia PDF Downloads 353
14326 Wavelets' Contribution to Textual Data Analysis

Authors: Habiba Ben Abdessalem

Abstract:

The emergence of giant sets of textual data has pushed researchers to invest in this field. The purpose of textual data analysis methods is to facilitate access to such data by providing various graphical visualizations. Applying these methods requires a corpus pretreatment step, whose standards are set according to the objective of the problem studied. This step determines the list of forms contained in the contingency table by keeping only those that carry information. This step may, however, lead to noisy contingency tables, hence the use of a wavelet denoising function. The validity of the proposed approach is tested on a text database covering economic and political events in Tunisia over a well-defined period.
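
The wavelet denoising of the contingency table can be sketched with PyWavelets as follows. This is an illustrative 2-D soft-thresholding example on a random matrix; the specific wavelet, decomposition level, and threshold used in the study may differ.

```python
import numpy as np
import pywt  # PyWavelets

def denoise_contingency_table(table, wavelet="db2", level=2):
    """Illustrative 2-D wavelet denoising of a (forms x documents) table."""
    coeffs = pywt.wavedec2(table, wavelet=wavelet, level=level)
    # Universal threshold estimated from the finest-scale detail coefficients
    detail = np.concatenate([c.ravel() for c in coeffs[-1]])
    sigma = np.median(np.abs(detail)) / 0.6745
    threshold = sigma * np.sqrt(2 * np.log(table.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(c, threshold, mode="soft") for c in level_coeffs)
        for level_coeffs in coeffs[1:]
    ]
    reconstructed = pywt.waverec2(denoised, wavelet=wavelet)
    return reconstructed[: table.shape[0], : table.shape[1]]

# Toy usage on a random "noisy" contingency table
rng = np.random.default_rng(0)
table = rng.poisson(3.0, size=(64, 40)).astype(float)
clean = denoise_contingency_table(table)
print(clean.shape)
```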

Keywords: textual data, wavelet, denoising, contingency table

Procedia PDF Downloads 257
14325 Stress-Strain Relation for Hybrid Fiber Reinforced Concrete at Elevated Temperature

Authors: Josef Novák, Alena Kohoutková

Abstract:

The performance of concrete structures in fire depends on several factors, which include, among others, the change in material properties due to the fire. Today, fiber reinforced concrete (FRC) belongs to the materials that have been widely used for various structures and elements. While FRC behavior under ambient temperature is well known, the effect of elevated temperature on its behavior still has to be investigated in depth. This paper deals with an experimental investigation and stress-strain relations for hybrid fiber reinforced concrete (HFRC), which contains siliceous aggregates, polypropylene fibers, and steel fibers. The main objective of the experimental investigation is to enhance the database of mechanical properties of concrete composites with added fibers subjected to elevated temperature, as well as to validate existing stress-strain relations for HFRC. Within the investigation, a unique heat transport test, a compressive test, and a splitting tensile test were performed on 150 mm cubes heated up to 200, 400, and 600 °C, with the aim of determining the time period required for uniform heat distribution in the test specimens and the mechanical properties of the investigated concrete composite, respectively. Both the findings obtained from the presented experimental tests and the experimental data collected from scientific papers so far served to validate the computational accuracy of the investigated stress-strain relations for HFRC, which have been developed during the last few years. Owing to the presence of steel and polypropylene fibers, HFRC becomes a unique material whose structural performance differs from that of conventional plain concrete when exposed to elevated temperature. Polypropylene fibers in HFRC lower the risk of concrete spalling, as the fibers burn out quickly with increasing temperature due to their low ignition point, and as a consequence pore pressure decreases. On the other hand, the resulting increase in concrete porosity might affect the mechanical properties of the material. Validating this hypothesis requires enhancing the existing database of results, which is very limited and does not contain enough data. As a result of the poor database, only a few stress-strain relations have been developed so far to describe the structural performance of HFRC at elevated temperature. Moreover, many of them are inconsistent and need to be refined. Most of them also do not take into account the effect of both fiber type and fiber content. Such an approach might be imprecise, especially when a high amount of polypropylene fibers is used. Therefore, the existing relations should be validated in detail based on further experimental results.

Keywords: elevated temperature, fiber reinforced concrete, mechanical properties, stress-strain relation

Procedia PDF Downloads 309
14324 Classification of Small Towns: Three Methodological Approaches and Their Results

Authors: Jerzy Banski

Abstract:

Small towns represent a key element of the settlement structure and serve a number of important functions associated with servicing the rural areas that surround them. It is in light of this that scientific studies have paid considerable attention to the functional structure of centers of this kind, as well as their relationships with both surrounding rural areas and other urban centers. A preliminary step in such research has typically involved attempts at classifying the urban centers themselves, which also assists the planning and shaping of development policy on different spatial scales. The purpose of this work is to test the methods underpinning three different classifications of small urban centers and to offer a preliminary interpretation of the outcomes obtained. The research covered 722 settlement units in Poland that have been granted town rights and are populated by fewer than 20,000 inhabitants. A morphologically based classification, referring to the database of topographic objects as regards land cover within the administrative boundaries of towns and cities, was carried out, and it proved possible to distinguish the categories of “housing-estate”, industrial and R&R towns, as well as towns characterized by dichotomy. Equally, a functional/morphological approach taken with the same database allowed for the identification, via an alternative method, of three main categories of small towns (i.e., monofunctional, multifunctional, or oligofunctional), which could then be described in far greater detail. A third, multi-criterion classification made simultaneous reference to conditioning of a structural, location-related, and administrative hierarchy-related nature, allowing distinctions to be drawn between small towns in nine different categories. The results obtained allow for a multifaceted analysis and interpretation of the geographical differentiation characterizing the distribution of Poland's urban centers across the country.

Keywords: small towns, classification, local planning, Poland

Procedia PDF Downloads 50
14323 Methodology to Achieve Non-Cooperative Target Identification Using High Resolution Range Profiles

Authors: Olga Hernán-Vega, Patricia López-Rodríguez, David Escot-Bocanegra, Raúl Fernández-Recio, Ignacio Bravo

Abstract:

Non-Cooperative Target Identification has become a key research domain in the defense industry since it provides the ability to recognize targets at long distance and under any weather condition. High Resolution Range Profiles, one-dimensional radar images in which the reflectivity of a target is projected onto the radar line of sight, are widely used for the identification of flying targets. Accordingly, to address this problem, an approach to Non-Cooperative Target Identification based on applying Singular Value Decomposition to a matrix of range profiles is presented. Target identification based on one-dimensional radar images compares a collection of profiles of a given target, namely the test set, with the profiles included in a pre-loaded database, namely the training set. The classification is improved by using Singular Value Decomposition since it allows each aircraft to be modeled as a subspace and recognition to be accomplished in a transformed domain where the main features are easier to extract, hence reducing unwanted information such as noise. Singular Value Decomposition permits the definition of a signal subspace, which contains the highest percentage of the energy, and a noise subspace, which is discarded. This way, only the valuable information of each target is used in the recognition process. The identification algorithm is based on finding the target that minimizes the angle between subspaces and takes place in a transformed domain. Two metrics based on Singular Value Decomposition, F1 and F2, are used in the identification process. In the case of F2, the angle is weighted, since the top vectors set the importance of the contribution to the formation of a target signal; F1, on the contrary, simply uses the unweighted angle. In order to have a wide database of radar signatures and to evaluate the performance, range profiles are obtained through numerical simulation of seven civil aircraft at defined trajectories taken from an actual measurement. Given the nature of the datasets, the main drawback of using simulated profiles instead of actual measured profiles is that the former imply an ideal identification scenario, since measured profiles suffer from noise, clutter, and other unwanted information while simulated profiles do not. In this case, the test and training samples have a similar nature and usually a similarly high signal-to-noise ratio, so, to assess the feasibility of the approach, the addition of noise was considered before the creation of the test set. The identification results obtained with the unweighted and weighted metrics are analysed to demonstrate which algorithm provides the best robustness against noise in a realistic scenario. To confirm the validity of the methodology, identification experiments on profiles coming from electromagnetic simulations are conducted, revealing promising results. Considering the dissimilarities between the test and training sets when noise is added, the recognition performance improved when weighting was applied. Future experiments with larger sets are expected to be conducted, with the aim of finally using actual profiles as test sets in a real hostile situation.
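
The subspace-angle classification described above can be illustrated with a short NumPy sketch. It is a generic implementation of an unweighted (F1-like) criterion on synthetic profiles, not the authors' exact metrics or data.

```python
import numpy as np

def signal_subspace(profiles, rank):
    """Left singular vectors spanning the signal subspace of a matrix whose
    columns are high resolution range profiles (HRRPs) of one aircraft."""
    U, _, _ = np.linalg.svd(profiles, full_matrices=False)
    return U[:, :rank]

def subspace_angle(U_train, U_test):
    """Largest principal angle (radians) between two subspaces."""
    s = np.linalg.svd(U_train.T @ U_test, compute_uv=False)
    s = np.clip(s, -1.0, 1.0)
    return float(np.arccos(s.min()))  # smallest singular value -> largest angle

# Toy usage: 7 classes of synthetic HRRPs; identify the class minimizing the angle
rng = np.random.default_rng(0)
rank, n_bins = 5, 128
train = {c: signal_subspace(rng.normal(size=(n_bins, 40)), rank) for c in range(7)}
test_profiles = rng.normal(size=(n_bins, 20))
U_test = signal_subspace(test_profiles, rank)
predicted = min(train, key=lambda c: subspace_angle(train[c], U_test))
print("Identified target class:", predicted)
```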

Keywords: HRRP, NCTI, simulated/synthetic database, SVD

Procedia PDF Downloads 328
14322 The Management Information System for Convenience Stores: Case Study in 7 Eleven Shop in Bangkok

Authors: Supattra Kanchanopast

Abstract:

The purpose of this research is to design and develop a management information system for a 7-Eleven shop in Bangkok. The system was designed and developed to meet users' requirements over the Internet, using application software such as MySQL for database management, Apache HTTP Server as the web server, and PHP Hypertext Preprocessor as the interface between the web server, the database, and the users. The system was divided into two subsystems: the main system, for the head office, and the branch system, for branch shops. These consisted of three parts, classified by user management as shop management, inventory management, and Point of Sale (POS) management. The implementation of the MIS for the mini-mart shop can lessen the amount of paperwork and reduce repetitive tasks, so it may decrease costs for the business and support the extension of branches in the future as well.

Keywords: convenience store, the management information system, inventory management, 7 eleven shop

Procedia PDF Downloads 431
14321 Innovative Predictive Modeling and Characterization of Composite Material Properties Using Machine Learning and Genetic Algorithms

Authors: Hamdi Beji, Toufik Kanit, Tanguy Messager

Abstract:

This study aims to construct a predictive model proficient in foreseeing the linear elastic and thermal characteristics of composite materials, drawing on a multitude of influencing parameters. These parameters encompass the shape of inclusions (circular, elliptical, square, triangle), their spatial coordinates within the matrix, orientation, volume fraction (ranging from 0.05 to 0.4), and variations in contrast (spanning from 10 to 200). A variety of machine learning techniques are deployed, including decision trees, random forests, support vector machines, k-nearest neighbors, and an artificial neural network (ANN), to facilitate this predictive model. Moreover, this research goes beyond the predictive aspect by delving into an inverse analysis using genetic algorithms. The intent is to unveil the intrinsic characteristics of composite materials by evaluating their thermomechanical responses. The foundation of this research lies in the establishment of a comprehensive database that accounts for the array of input parameters mentioned earlier. This database, enriched with this diversity of input variables, serves as a bedrock for the creation of machine learning and genetic algorithm-based models. These models are meticulously trained not only to predict but also to elucidate the mechanical and thermal behavior of composite materials. Remarkably, the coupling of machine learning and genetic algorithms has proven highly effective, yielding predictions with remarkable accuracy, with scores ranging between 0.97 and 0.99. This achievement marks a significant breakthrough, demonstrating the potential of this innovative approach in the field of materials engineering.
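
A compact sketch of the forward/inverse workflow is given below: a random-forest surrogate predicts an effective property from microstructural descriptors, and a simple genetic algorithm then searches the input space for parameters that reproduce a target response. It is a schematic illustration on synthetic data with only two of the descriptors named above, not the study's trained models.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for the database: inputs = (volume fraction, contrast),
# output = a toy effective property (the real database uses more descriptors).
X = np.column_stack([rng.uniform(0.05, 0.4, 2000), rng.uniform(10, 200, 2000)])
y = X[:, 0] * np.log(X[:, 1]) + 0.05 * rng.normal(size=2000)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def genetic_inverse(target, pop_size=60, generations=40):
    """Toy GA: find (volume fraction, contrast) whose predicted property ~ target."""
    low, high = np.array([0.05, 10.0]), np.array([0.4, 200.0])
    pop = rng.uniform(low, high, size=(pop_size, 2))
    for _ in range(generations):
        fitness = -np.abs(surrogate.predict(pop) - target)   # minimize error
        parents = pop[np.argsort(fitness)[-pop_size // 2:]]  # selection
        n_children = pop_size - len(parents)
        # crossover: average two random parents, then mutate
        idx_a = rng.integers(0, len(parents), n_children)
        idx_b = rng.integers(0, len(parents), n_children)
        children = 0.5 * (parents[idx_a] + parents[idx_b])
        children += rng.normal(0.0, 0.02, children.shape) * (high - low)
        pop = np.clip(np.vstack([parents, children]), low, high)
    best = pop[np.argmax(-np.abs(surrogate.predict(pop) - target))]
    return best

print("Recovered (volume fraction, contrast):", genetic_inverse(target=0.8))
```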

Keywords: machine learning, composite materials, genetic algorithms, mechanical and thermal properties

Procedia PDF Downloads 36
14320 Operational Excellence Performance in Pharmaceutical Quality Control Labs: An Empirical Investigation of the Effectiveness and Efficiency Relation

Authors: Stephan Koehler, Thomas Friedli

Abstract:

Performance measurement has evolved over time from a unidimensional, short-term, efficiency-focused approach into a balanced multidimensional approach. Today, integrated performance measurement frameworks are often used to avoid local optimization and to encourage continuous improvement of an organization. In the literature, the multidimensional character of performance measurement is often described by competitive priorities. At the same time, at the highest level of abstraction, an effectiveness and an efficiency dimension of performance measurement can be distinguished. This paper aims at a better understanding of the composition of effectiveness and efficiency and their relation in pharmaceutical quality control labs. The research comprises a lab-specific operationalization of effectiveness and efficiency and examines how the two dimensions are interlinked. The basis for the analysis is a database of the University of St. Gallen comprising a diverse set of 40 different pharmaceutical quality control labs. The research provides empirical evidence that labs with high effectiveness also exhibit high efficiency. Lab effectiveness explains 29.5% of the variance in lab efficiency. In addition, labs with an above-median operational excellence performance have statistically significantly higher lab effectiveness and lab efficiency compared to the below-median performing labs.

Keywords: empirical study, operational excellence, performance measurement, pharmaceutical quality control lab

Procedia PDF Downloads 132
14319 Thermochemical Study of the Degradation of the Panels of Wings in a Space Shuttle by Utilization of HSC Chemistry Software and Its Database

Authors: Ahmed Ait Hou

Abstract:

The wing leading edge and nose cone of the space shuttle are fabricated from a reinforced carbon/carbon material. This material derives its durability from a diffusion coating of silicon carbide (SiC) and a glass sealant. During re-entry into the atmosphere, the material is subjected to an oxidizing, high-temperature environment. Thermochemical calculations based on the HSC Chemistry software and its database allow us to interpret the phenomena of oxidation and chloridation observed on the wing leading edge and nose cone of the space shuttle during its mission in space. The first study is the monitoring of the oxidation reaction of SiC. It has been demonstrated that thermal oxidation of SiC gives the two compounds SiO₂(s) and CO(g). Under the extreme conditions of very low oxygen partial pressures and high temperatures, there is a reaction between SiC and SiO₂, leading to SiO(g) and CO(g). We represented the phase stability diagram of the Si-C-O system calculated with HSC Chemistry at 1300 °C. The principal characteristic of this predominance diagram is the SiC + SiO₂ coexistence line. The second study is the monitoring of the chloridation reaction of SiC. The other problem encountered, in addition to oxidation, is the phenomenon of chloridation due to the presence of NaCl. Indeed, after many missions, the leading edge wing surfaces have exhibited small pinholes. We have used the HSC Chemistry database to analyze these various reactions. Our calculations are consistent with the phenomena reported in research work from the NASA Lewis Research Center.
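
For reference, the two oxidation regimes discussed above correspond to the following reactions, written out from the species named in the text (standard passive and active oxidation of SiC):

```latex
% Passive oxidation (protective SiO2 scale forms):
\mathrm{SiC(s)} + \tfrac{3}{2}\,\mathrm{O_2(g)} \longrightarrow \mathrm{SiO_2(s)} + \mathrm{CO(g)}

% Active regime at very low oxygen partial pressure and high temperature:
\mathrm{SiC(s)} + 2\,\mathrm{SiO_2(s)} \longrightarrow 3\,\mathrm{SiO(g)} + \mathrm{CO(g)}
```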

Keywords: thermochemical calculations, HSC software, oxidation and chloridation, wings in space

Procedia PDF Downloads 90
14318 AI-Based Autonomous Plant Health Monitoring and Control System with Visual Health-Scoring Models

Authors: Uvais Qidwai, Amor Moursi, Mohamed Tahar, Malek Hamad, Hamad Alansi

Abstract:

This paper focuses on the development and implementation of an advanced plant health monitoring system with an AI backbone and IoT sensory network. Our approach involves addressing the critical environmental factors essential for preserving a plant’s well-being, including air temperature, soil moisture, soil temperature, soil conductivity, pH, water levels, and humidity, as well as the presence of essential nutrients like nitrogen, phosphorus, and potassium. Central to our methodology is the utilization of computer vision technology, particularly a night vision camera. The captured data is then compared against a reference database containing different health statuses. This comparative analysis is implemented using an AI deep learning model, which enables us to generate accurate assessments of plant health status. By combining the AI-based decision-making approach, our system aims to provide precise and timely insights into the overall health and well-being of plants, offering a valuable tool for effective plant care and management.

Keywords: deep learning image model, IoT sensing, cloud-based analysis, remote monitoring app, computer vision, fuzzy control

Procedia PDF Downloads 2
14317 An Approach for the Assessment of Semi-Elliptical Surface Crack

Authors: Muhammad Naweed, Usman Tariq Murtaza, Waseem Siddique

Abstract:

A pallet body approach is a finite element-based computational approach used for the modeling and assessment of a three-dimensional surface crack. The approach is capable of inserting the crack into an engineering structure and generating a high-quality hexahedral mesh in the cracked region of the structure. The approach is capable of computing the stress intensity factors along a semi-elliptical surface crack numerically. The objective of this work is to show that the stress intensity factors produced by the approach can be used with confidence for capturing the parameters of fatigue crack growth.
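
Once the stress intensity factors along the crack front are available, fatigue crack growth is commonly characterized with the Paris law; the standard form is given below only to make the link between the computed SIFs and the growth parameters explicit (the abstract does not state which growth law the authors use).

```latex
\frac{da}{dN} = C\,(\Delta K)^{m}, \qquad \Delta K = K_{\max} - K_{\min}
```

Here da/dN is the crack extension per load cycle, ΔK is the stress intensity factor range over the cycle, and C and m are material constants fitted from test data.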

Keywords: pallet body approach, semi-elliptical surface crack, stress intensity factors, fatigue crack growth

Procedia PDF Downloads 68
14316 Regional Dynamics of Innovation and Entrepreneurship in the Optics and Photonics Industry

Authors: Mustafa İlhan Akbaş, Özlem Garibay, Ivan Garibay

Abstract:

The economic entities in innovation ecosystems form various industry clusters, in which they compete and cooperate to survive and grow. Within a successful and stable industry cluster, the entities acquire different roles that complement each other in the system. Universities and research centers are accepted as having a critical role in these systems for the creation and development of innovations. However, the real effect of research institutions on regional economic growth is difficult to assess. In this paper, we present our approach for identifying the impact of research activities on regional entrepreneurship for a specific high-tech industry: optics and photonics. Optics and photonics has been defined as an enabling industry, which combines high-tech photonics technology with the developing optics industry. The recent literature suggests that the growth of optics and photonics firms depends on three important factors: the embedded regional specializations in the labor market, the research and development infrastructure, and a dynamic small-firm network capable of absorbing new technologies, products, and processes. Therefore, the role of each factor and the dynamics among them must be understood to identify the requirements of entrepreneurship activities in the optics and photonics industry. There are three main contributions of our approach. Recent studies show that innovation in the optics and photonics industry is mostly located around metropolitan areas. There are also studies mentioning the importance of research center locations and universities in the regional development of the optics and photonics industry. These studies are mostly limited to the number of patents received within a short period of time or to limited survey results. Therefore, the first contribution of our approach is a comprehensive analysis of the state and recent history of photonics and optics research in the US. For this purpose, both the research centers specialized in optics and photonics and the related research groups in various departments of institutions (e.g., Electrical Engineering, Materials Science) are identified, and a geographical study of their locations is presented. The second contribution of the paper is the analysis of regional entrepreneurship activities in optics and photonics in recent years. We use the membership data of the International Society for Optics and Photonics (SPIE) and the regional photonics clusters to identify the optics and photonics companies in the US. The profiles and activities of these companies are then gathered by extracting and integrating the related data from the National Establishment Time Series (NETS) database, the ES-202 database, and the data sets from the regional photonics clusters. The number of start-ups, their employee numbers, and their sales are some examples of the extracted data for the industry. Our third contribution is the utilization of the collected data to investigate the impact of research institutions on regional optics and photonics industry growth and entrepreneurship. In this analysis, the regional and periodical conditions of the overall market are taken into consideration while discovering and quantifying the statistical correlations.

Keywords: entrepreneurship, industrial clusters, optics, photonics, emerging industries, research centers

Procedia PDF Downloads 386
14315 Risk of Heatstroke Occurring in Indoor Built Environment Determined with Nationwide Sports and Health Database and Meteorological Outdoor Data

Authors: Go Iwashita

Abstract:

The paper describes how the frequencies of heatstroke occurring in indoor built environments are related to the outdoor thermal environment, using big statistical data. The nationwide accident data on heatstroke were obtained from the National Agency for the Advancement of Sports and Health (NAASH). The meteorological database of the Japan Meteorological Agency supplied data on 1-hour average temperature, humidity, wind speed, solar radiation, and so forth. Each heatstroke data point from the NAASH database was linked to the meteorological data point acquired from the meteorological station nearest to where the accident occurred. This analysis was performed for a 10-year period (2005–2014). During this period, 3,819 cases of heatstroke were reported in the NAASH database for the investigated secondary/high schools of the nine representative Japanese cities. Heatstroke most commonly occurred in the outdoor schoolyard at a wet-bulb globe temperature (WBGT) of 31°C and in the indoor gymnasium during athletic club activities at a WBGT > 31°C. The accident ratio (number of accidents during each club activity divided by the club's population) was highest in the gymnasium during female badminton club activities. Although badminton is played in a gymnasium, these WBGT results show that the risk level during badminton under hot and humid conditions is equal to that of baseball or rugby played in the schoolyard. Apart from sports, a high risk of heatstroke was also observed in school buildings during cultural activities. Based on these WBGT results, the risk level for indoor environments under hot and humid conditions is comparable to that for outdoor environments. Therefore, control measures against hot and humid indoor conditions, such as installing air conditioning not only in schools but also in residences, are needed.
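
For reference, the wet-bulb globe temperature used throughout the analysis is the standard weighted index computed from the natural wet-bulb temperature T_nwb, the globe temperature T_g, and the dry-bulb air temperature T_a (the abstract does not restate it, so the standard definition is given here):

```latex
\mathrm{WBGT}_{\text{outdoor}} = 0.7\,T_{nwb} + 0.2\,T_{g} + 0.1\,T_{a},
\qquad
\mathrm{WBGT}_{\text{indoor}} = 0.7\,T_{nwb} + 0.3\,T_{g}
```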

Keywords: accidents in schools, club activity, gymnasium, heatstroke

Procedia PDF Downloads 196
14314 Design and Optimization of a Small Hydraulic Propeller Turbine

Authors: Dario Barsi, Marina Ubaldi, Pietro Zunino, Robert Fink

Abstract:

A design and optimization procedure is proposed and developed to provide the geometry of a high-efficiency, compact hydraulic propeller turbine for low head. For the preliminary design of the machine, classic design criteria are used, based on statistical correlations for the definition of the fundamental geometric parameters and the blade shapes. These relationships are based on the fundamental design parameters (i.e., specific speed, flow coefficient, work coefficient) in order to provide a simple yet reliable procedure. Particular attention is paid, from the initial steps, to the correct conformation of the meridional channel and the correct arrangement of the blade rows. The preliminary geometry thus obtained is used as the starting point for the hydrodynamic optimization procedure, carried out using CFD calculation software coupled with a genetic algorithm that generates and updates a large database of turbine geometries. The optimization process is performed using a commercial code that solves the turbulent Navier-Stokes equations (RANS) by exploiting the axisymmetric geometry of the machine. The geometries generated within the database are then calculated in order to determine the corresponding overall performance. To speed up the optimization calculation, an artificial neural network (ANN) based on the use of an objective function is employed. The procedure was applied to the specific case of a propeller turbine with an innovative modular design, intended for applications characterized by very low heads. The procedure is tested in order to verify its validity and its ability to automatically obtain the targeted net head and the maximum total-to-total internal efficiency.
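
The fundamental non-dimensional design parameters mentioned above are typically defined as follows; these are the standard turbomachinery definitions, given here for completeness, and the paper's exact conventions may differ:

```latex
n_s = \frac{\omega\,\sqrt{Q}}{(gH)^{3/4}}, \qquad
\varphi = \frac{Q}{\omega D^{3}}, \qquad
\psi = \frac{gH}{\omega^{2} D^{2}}
```

Here n_s is the (dimensionless) specific speed, φ the flow coefficient, ψ the work (head) coefficient, ω the rotational speed, Q the volumetric flow rate, H the net head, D the runner diameter, and g the gravitational acceleration.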

Keywords: renewable energy conversion, hydraulic turbines, low head hydraulic energy, optimization design

Procedia PDF Downloads 119
14313 Applying Neural Networks for Solving Record Linkage Problem via Fuzzy Description Logics

Authors: Mikheil Kalmakhelidze

Abstract:

The record linkage (RL) problem has become more and more important in recent years due to the growing interest in big data analysis. The problem can be formulated in a very simple way: given two entries a and b of a database, decide whether they represent the same object or not. There are two classical ways of solving the RL problem: deterministic and probabilistic. Using a simple Bayes classifier produces useful results in many cases, but sometimes the results turn out to be poor. In recent years, several successful approaches have been made towards solving specific RL problems with neural network algorithms, including the single-layer perceptron, multilayer backpropagation networks, etc. In our work, we model the RL problem for a specific dataset of student applications in fuzzy description logic (FDL), where the linkage of a specific pair (a, b) depends on the truth value of the corresponding formula A(a, b) in a canonical FDL model. As the main result, we build a neural network for deciding the truth value of FDL formulas in a canonical model and thus link the RL problem to machine learning. We apply the approach to a dataset with 10,000 entries and also compare it to classical RL solving approaches. The results turn out to be more accurate than those of the standard probabilistic approach.
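
A schematic sketch of casting pairwise record linkage as a learning problem is shown below. It uses simple string-similarity features and scikit-learn's naive Bayes and MLP classifiers on toy records; it only illustrates the comparison described in the abstract and is not the fuzzy-description-logic model itself.

```python
from difflib import SequenceMatcher

import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier

def similarity(a, b):
    """Simple string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def pair_features(rec_a, rec_b):
    """Field-wise similarities for a candidate record pair (toy schema)."""
    return [similarity(rec_a["name"], rec_b["name"]),
            similarity(rec_a["birth"], rec_b["birth"]),
            similarity(rec_a["school"], rec_b["school"])]

# Toy labeled pairs: 1 = same applicant, 0 = different applicants
pairs = [
    ({"name": "Anna Smith", "birth": "1999-04-02", "school": "High School 1"},
     {"name": "A. Smith",   "birth": "1999-04-02", "school": "High School #1"}, 1),
    ({"name": "Anna Smith", "birth": "1999-04-02", "school": "High School 1"},
     {"name": "John Doe",   "birth": "1998-11-23", "school": "High School 7"}, 0),
    ({"name": "Giorgi B.",  "birth": "2000-01-15", "school": "School 12"},
     {"name": "Giorgi Beridze", "birth": "2000-01-15", "school": "School 12"}, 1),
    ({"name": "Giorgi Beridze", "birth": "2000-01-15", "school": "School 12"},
     {"name": "Nino K.",    "birth": "2001-06-30", "school": "School 3"}, 0),
]
X = np.array([pair_features(a, b) for a, b, _ in pairs])
y = np.array([label for _, _, label in pairs])

models = (GaussianNB(),
          MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0))
for model in models:
    model.fit(X, y)
    print(type(model).__name__, model.predict(X))
```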

Keywords: description logic, fuzzy logic, neural networks, record linkage

Procedia PDF Downloads 245
14312 A Novel Approach to 3D Thrust Vectoring CFD via Mesh Morphing

Authors: Umut Yıldız, Berkin Kurtuluş, Yunus Emre Muslubaş

Abstract:

Thrust vectoring, especially in military aviation, is a concept that sees much use to improve maneuverability in already agile aircraft. As this concept is fairly new and cost intensive to design and test, computational methods are useful in easing the preliminary design process. Computational Fluid Dynamics (CFD) can be utilized in many forms to simulate nozzle flow, and there exist various CFD studies in both 2D mechanical and 3D injection based thrust vectoring, and yet, 3D mechanical thrust vectoring analyses, at this point in time, are lacking variety. Additionally, the freely available test data is constrained to limited pitch angles and geometries. In this study, based on a test case provided by NASA, both steady and unsteady 3D CFD simulations are conducted to examine the aerodynamic performance of a mechanical thrust vectoring nozzle model and to validate the utilized numerical model. Steady analyses are performed to verify the flow characteristics of the nozzle at pitch angles of 0, 10 and 20 degrees, and the results are compared with experimental data. It is observed that the pressure data obtained on the inner surface of the nozzle at each specified pitch angle and under different flow conditions with pressure ratios of 1.5, 2 and 4, as well as at azimuthal angle of 0, 45, 90, 135, and 180 degrees exhibited a high level of agreement with the corresponding experimental results. To validate the CFD model, the insights from the steady analyses are utilized, followed by unsteady analyses covering a wide range of pitch angles from 0 to 20 degrees. Throughout the simulations, a mesh morphing method using a carefully calculated mathematical shape deformation model that simulates the vectored nozzle shape exactly at each point of its travel is employed to dynamically alter the divergent part of the nozzle over time within this pitch angle range. The mesh morphing based vectored nozzle shapes were compared with the drawings provided by NASA, ensuring a complete match was achieved. This computational approach allowed for the creation of a comprehensive database of results without the need to generate separate solution domains. The database contains results at every 0.01° increment of nozzle pitch angle. The unsteady analyses, generated using the morphing method, are found to be in excellent agreement with experimental data, further confirming the accuracy of the CFD model.
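
The shape-deformation idea behind the mesh morphing can be illustrated with a small sketch: nodes of the divergent section downstream of a hinge station are rotated rigidly about the hinge as a function of the instantaneous pitch angle. This is a simplified stand-in for the actual deformation model used with the NASA geometry; the hinge location and node data below are hypothetical.

```python
import numpy as np

def morph_divergent_section(nodes, pitch_deg, hinge_x=0.0):
    """Rotate mesh nodes downstream of a hinge station about the y-axis.

    Simplified stand-in for the mathematical shape deformation model:
    nodes is an (N, 3) array of (x, y, z) coordinates; points with
    x >= hinge_x are rotated rigidly by the pitch angle about the hinge line.
    """
    theta = np.radians(pitch_deg)
    rot = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
                    [ 0.0,           1.0, 0.0],
                    [-np.sin(theta), 0.0, np.cos(theta)]])
    morphed = nodes.copy()
    mask = nodes[:, 0] >= hinge_x
    shifted = nodes[mask] - np.array([hinge_x, 0.0, 0.0])
    morphed[mask] = shifted @ rot.T + np.array([hinge_x, 0.0, 0.0])
    return morphed

# Sweep the pitch angle in 0.01-degree increments, as in the unsteady analyses
nodes = np.random.default_rng(0).uniform(-1.0, 1.0, size=(1000, 3))
for pitch in np.arange(0.0, 20.0 + 1e-9, 0.01):
    deformed = morph_divergent_section(nodes, pitch)
print(deformed.shape)
```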

Keywords: thrust vectoring, computational fluid dynamics, 3D mesh morphing, mathematical shape deformation model

Procedia PDF Downloads 57
14311 Expert System: Debugging Using MD5 Process Firewall

Authors: C. U. Om Kumar, S. Kishore, A. Geetha

Abstract:

An operating system (OS) is software that manages computer hardware and software resources by providing services to computer programs. One of the important user expectations of the operating system is that it protect information from unauthorized access, disclosure, modification, inspection, recording, or destruction. The operating system is always vulnerable to attacks by malware such as computer viruses, worms, Trojan horses, backdoors, ransomware, spyware, adware, scareware, and more. Anti-virus software was therefore created to ensure security against prominent computer viruses by applying a dictionary-based approach. However, anti-virus programs are not guaranteed to provide security against the new viruses proliferating every day. To address this issue and to secure the computer system, our proposed expert system concentrates on letting the administrator authorize processes as wanted or unwanted for execution. The expert system maintains a database consisting of the hash codes of the processes that are to be allowed. These hash codes are generated using the MD5 message-digest algorithm, a widely used cryptographic hash function. The administrator approves the wanted processes that are to be executed on clients in a Local Area Network by implementing a client-server architecture, and only the processes that match the entries in the database table are executed, thereby restricting many malicious processes from infecting the operating system. An additional advantage of the proposed expert system is that it limits CPU usage and minimizes resource utilization. Thus, data and information security is ensured by our system, along with increased performance of the operating system.
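
The hash-based whitelisting described above can be sketched in a few lines. This is a minimal illustration using Python's hashlib; the actual system's client-server architecture, shared database, and process interception are not reproduced, and the paths and digests shown are hypothetical.

```python
import hashlib

def md5_of_file(path, chunk_size=8192):
    """MD5 digest of an executable, computed in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Whitelist of administrator-approved process hashes (stand-in for the
# database table shared from the server in the real system).
approved_hashes = {
    "9e107d9d372bb6826bd81d3542a419d6",  # example digest of an allowed binary
}

def is_execution_allowed(executable_path):
    """Allow execution only if the binary's MD5 matches an approved entry."""
    return md5_of_file(executable_path) in approved_hashes

# Example (hypothetical path): block anything whose hash is not whitelisted
print(is_execution_allowed("/usr/bin/python3"))
```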

Keywords: virus, worm, Trojan horse, backdoors, ransomware, spyware, adware, scareware, sticky software, process table, MD5, CPU usage, resource utilization

Procedia PDF Downloads 389
14310 Sequential Pattern Mining from Medical Record Data with the Sequential Pattern Discovery Using Equivalent Classes (SPADE) Algorithm (A Case Study: Bolo Primary Health Care, Bima)

Authors: Rezky Rifaini, Raden Bagus Fajriya Hakim

Abstract:

This research was conducted at the Bolo Primary Health Care (PHC) in Bima Regency. The purpose of the research is to find the association patterns formed in the medical record database of Bolo Primary Health Care's patients. The data used are secondary data from the PHC's medical records database. Sequential pattern mining is the technique used for the analysis. Transaction data were generated from Patient_ID, Check_Date, and diagnosis. Sequential Pattern Discovery Using Equivalent Classes (SPADE) is one of the algorithms in sequential pattern mining; it finds frequent sequences of transaction data using a vertical database layout and a sequence join process. The result of the SPADE algorithm is a set of frequent sequences, which are then used to form rules. This technique is used to find the association patterns between item combinations. Based on the sequential association rule analysis with the SPADE algorithm, for a minimum support of 0.03 and a minimum confidence of 0.75, three sequential association patterns were obtained from the sequence of Patient_ID, Check_Date, and diagnosis data in the Bolo PHC.
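
The transaction-building step (grouping diagnoses by patient and ordering them by check date) can be sketched with pandas as below. The frequent-sequence search is only illustrated with a naive count of ordered diagnosis pairs, not the full SPADE vertical-format equivalence-class join, and the records shown are hypothetical.

```python
from collections import Counter
from itertools import combinations

import pandas as pd

# Hypothetical medical-record transactions (Patient_ID, Check_Date, diagnosis)
records = pd.DataFrame({
    "Patient_ID": [1, 1, 1, 2, 2, 3, 3],
    "Check_Date": pd.to_datetime(["2019-01-05", "2019-02-10", "2019-03-01",
                                  "2019-01-20", "2019-02-25",
                                  "2019-03-03", "2019-04-15"]),
    "diagnosis":  ["ISPA", "Hypertension", "Gastritis",
                   "ISPA", "Gastritis",
                   "Hypertension", "Gastritis"],
})

# One ordered diagnosis sequence per patient
sequences = (records.sort_values("Check_Date")
                    .groupby("Patient_ID")["diagnosis"]
                    .apply(list))

# Naive frequent 2-sequence count (illustration only; SPADE does this
# efficiently with vertical id-lists and equivalence-class joins).
min_support = 0.3
pair_counts = Counter()
for seq in sequences:
    seen = set(combinations(seq, 2))  # ordered pairs preserving sequence order
    pair_counts.update(seen)

n_sequences = len(sequences)
frequent = {pair: count / n_sequences
            for pair, count in pair_counts.items()
            if count / n_sequences >= min_support}
print(frequent)
```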

Keywords: diagnosis, primary health care, medical record, data mining, sequential pattern mining, SPADE algorithm

Procedia PDF Downloads 373
14309 Lean Models Classification: Towards a Holistic View

Authors: Y. Tiamaz, N. Souissi

Abstract:

The purpose of this paper is to present a classification of Lean models that aims to capture all the concepts related to this approach and thus facilitate its implementation. This classification allows the identification of the most relevant models according to several dimensions. From this perspective, we present a review and an analysis of the Lean models literature, and we propose dimensions for the classification of the current proposals while respecting, among others, the axes of the Lean approach, the maturity of the models, and their application domains. This classification allowed us to conclude that researchers essentially consider the Lean approach as a toolbox and design their models to solve problems related to a specific environment. Since the Lean approach is no longer intended only for the automotive sector, where it was invented, but for all fields (IT, hospitals, ...), we consider that this approach requires a generic model that is capable of being implemented in all areas.

Keywords: lean approach, lean models, classification, dimensions, holistic view

Procedia PDF Downloads 406
14308 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Data mining is one of the main phases in Knowledge Discovery in Databases (KDD), responsible for finding hidden and useful knowledge in databases. There are many different tasks in data mining, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years, a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.

Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction

Procedia PDF Downloads 507