Search results for: universal testing machine
4233 Reinforced Concrete Foundation for Turbine Generators
Authors: Siddhartha Bhattacharya
Abstract:
Steam Turbine-Generators (STG) and Combustion Turbine-Generators (CTG) are used in almost all modern petrochemical, LNG and power plant facilities. The reinforced concrete table top foundations required to support these high-speed rotating heavy machines are among the most critical and challenging structures on any industrial project. The paper illustrates, through a practical example, the step-by-step procedure adopted in designing a table top foundation supported on piles for a steam turbine generator with an operating speed of 60 Hz. A finite element model of the table top foundation is generated in ANSYS. Piles are modeled as spring-damper elements (COMBIN14). Basic loads are adopted in the analysis and design of the foundation based on the vendor requirements, industry standards, and relevant ASCE and ACI code provisions. Static serviceability checks are performed with the help of the Misalignment Tolerance Matrix (MTM) method, in which the percentage of misalignment at a given bearing due to displacement at another bearing is calculated and kept within the criteria stipulated by the vendor, so that the machine rotor can sustain the stresses developed due to this misalignment. Dynamic serviceability checks are performed through modal and forced vibration analysis, where the foundation is checked for resonance and allowable amplitudes, as stipulated by the machine manufacturer. Reinforced concrete design of the foundation is performed by calculating the axial force, bending moment and shear at each of the critical sections. These values are calculated through area integrals of the element stresses at these critical locations. Design is done as per ACI 318-05.
Keywords: steam turbine generator foundation, finite element, static analysis, dynamic analysis
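A minimal sketch of the kind of resonance screening described above, assuming a single-degree-of-freedom idealization with placeholder stiffness and mass values rather than the paper's ANSYS pile-spring (COMBIN14) model:

```python
# Single-degree-of-freedom resonance check for a pile-supported foundation mass.
# Stiffness, mass and the +/-20% separation margin are assumed placeholder values.
import math

k = 2.5e9            # combined vertical pile stiffness, N/m (assumed)
m = 1.2e6            # foundation + machine mass, kg (assumed)
operating_hz = 60.0  # machine operating speed
margin = 0.20        # commonly used frequency-separation criterion (assumed)

f_n = math.sqrt(k / m) / (2.0 * math.pi)   # undamped natural frequency, Hz
lo, hi = operating_hz * (1 - margin), operating_hz * (1 + margin)
print(f"natural frequency = {f_n:.1f} Hz")
print("resonance risk" if lo <= f_n <= hi else "outside resonance band")
```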
Procedia PDF Downloads 297
4232 Enhancing the Interpretation of Group-Level Diagnostic Results from Cognitive Diagnostic Assessment: Application of Quantile Regression and Cluster Analysis
Authors: Wenbo Du, Xiaomei Ma
Abstract:
With the empowerment of Cognitive Diagnostic Assessment (CDA), various domains of language testing and assessment have been investigated to extract more diagnostic information. What is noticeable is that most of the extant empirical CDA-based research puts much emphasis on individual-level diagnostic purposes, with very little concerned with learners’ group-level performance. Even though personalized diagnostic feedback is the unique feature that differentiates CDA from other assessment tools, group-level diagnostic information cannot be overlooked, as it may be more practical in classroom settings. Additionally, the group-level diagnostic information obtained via current CDA often results in a “flat pattern”, that is, mastery or non-mastery of all tested skills accounts for the two highest proportions. In that case, the outcome offers little benefit beyond the original total score. To address these issues, the present study applies cluster analysis for group classification and quantile regression analysis to pinpoint learners’ performance at different proficiency levels (beginner, intermediate and advanced), and thus enhance the interpretation of the CDA results extracted from a group of EFL learners’ reading performance on a diagnostic reading test designed by the PELDiaG research team from a key university in China. The results show that the EM method in cluster analysis yields more appropriate classification results than CDA, and that quantile regression analysis paints a more insightful picture of learners with different reading proficiencies. The findings are helpful and practical for instructors seeking to refine EFL reading curricula and tailor instructional plans based on the group classification and quantile regression results. Meanwhile, these statistical methods could also make up for the deficiencies of CDA and push forward the development of language testing and assessment in the future.
Keywords: cognitive diagnostic assessment, diagnostic feedback, EFL reading, quantile regression
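A minimal sketch of the analysis pipeline described above, pairing EM-based clustering (scikit-learn's GaussianMixture) with quantile regression; the skill columns and simulated scores are hypothetical stand-ins for the PELDiaG test data:

```python
# EM clustering of skill-mastery probabilities into three proficiency groups,
# followed by quantile regression of total score on skills at several quantiles.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
df = pd.DataFrame(rng.uniform(size=(300, 3)), columns=["skill1", "skill2", "skill3"])
df["score"] = 40 * df.mean(axis=1) + rng.normal(0, 5, 300) + 50  # simulated totals

# EM clustering into beginner / intermediate / advanced groups
gmm = GaussianMixture(n_components=3, random_state=0)
df["group"] = gmm.fit_predict(df[["skill1", "skill2", "skill3"]])

# Quantile regression at the 25th, 50th and 75th percentiles of the total score
for q in (0.25, 0.50, 0.75):
    res = smf.quantreg("score ~ skill1 + skill2 + skill3", df).fit(q=q)
    print(f"q={q}:", res.params.round(2).to_dict())
```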
Procedia PDF Downloads 146
4231 Factors That Stimulate Employee Development in Polish Small Enterprises
Authors: Ewa Rak
Abstract:
This paper is part of a broader research project on employee development in small enterprises, financed by the Polish National Science Centre. The project results will serve as the basis for a doctoral dissertation. The paper draws on literature studies and qualitative research conducted in small enterprises operating in the Lower Silesia region of Poland. It aims to identify some of the factors that stimulate employee development in small companies operating in Poland. The great variety of business pursuits and applications represented by this sector makes it hard to determine a universal configuration of factors that would offer the best possible conditions for employee development. Research results suggest that each of the examined companies had one or two such factors in focus, serving as the basis for the entire pro-development system. These include employment security (both for employee and entrepreneur) and the extensive knowledge and experience of entrepreneurs, but only if combined with a willingness and ability to share it.
Keywords: employee development, factors that stimulate employee development, human resources development, Poland, small enterprises, training
Procedia PDF Downloads 268
4230 Application of Mathematics in Real-Life Situation
Authors: Abubakar Attahiru
Abstract:
Mathematics plays an important role in real-life situations. The development of the study of mathematics is a result of the need of man to survive and interact with one another in society. Mathematics is the universal language that is applied in almost every aspect of life. Mathematics gives us a way to understand patterns, define relationships, and predict the future. Changes in the content and methods of studying mathematics follow the trends in societal needs and developments; likewise, developments in mathematics affect developments in society. Generally, education helps to develop society, while the activities and needs of society dictate the educational policy of any society. Among all the academic subjects studied at school, mathematics has contributed more distinctly to the objectives of the general education of man than any other subject. This is a result of the application of mathematics to all spheres of human endeavor. This paper looks at the meaning of the basic concepts of mathematics, science, and technology, the application of mathematics in real-life situations, and their relationships with society. The paper also shows how mathematics, science, and technology affect the existence and development of society and how society determines the nature of the mathematics studied through its educational system.
Keywords: application, mathematics, real life, situation
Procedia PDF Downloads 158
4229 Speech Intelligibility Improvement Using Variable Level Decomposition DWT
Authors: Samba Raju Chiluveru, Manoj Tripathy
Abstract:
Intelligibility is an essential characteristic of a speech signal that helps in understanding the information the signal carries. Background noise in the environment can deteriorate the intelligibility of recorded speech. In this paper, we present a simple variance-subtracted, variable level discrete wavelet transform that improves the intelligibility of speech. The proposed algorithm does not require an explicit estimation of the noise, i.e., prior knowledge of the noise; hence, it is easy to implement and reduces the computational burden. The proposed algorithm decides a separate decomposition level for each frame based on signal-dominant and noise-dominant criteria. The performance of the proposed algorithm is evaluated with the short-time objective intelligibility (STOI) measure, and the results obtained are compared with universal Discrete Wavelet Transform (DWT) thresholding and Minimum Mean Square Error (MMSE) methods. The experimental results reveal that the proposed scheme outperforms the competing methods.
Keywords: discrete wavelet transform, speech intelligibility, STOI, standard deviation
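A sketch of a per-frame, variable-level DWT denoiser in the spirit described above; the level-selection rule, wavelet and threshold are illustrative assumptions, not the authors' exact criterion:

```python
# Per-frame wavelet denoising with a variance-based decomposition-level choice.
import numpy as np
import pywt

def denoise_frame(frame, global_var, wavelet="db4", max_level=5):
    # Deeper decomposition for low-variance (noise-dominant) frames;
    # the 0.5*global_var rule is an assumed stand-in for the paper's criterion.
    level = max_level if np.var(frame) < 0.5 * global_var else 2
    coeffs = pywt.wavedec(frame, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # robust noise scale
    thr = sigma * np.sqrt(2 * np.log(len(frame)))          # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(frame)]

fs = 16000
t = np.arange(fs) / fs
noisy = np.sin(2 * np.pi * 200 * t) + 0.3 * np.random.randn(fs)  # synthetic speech-like signal
frames = noisy.reshape(-1, 400)                                   # 25 ms frames
gv = np.var(noisy)
enhanced = np.concatenate([denoise_frame(f, gv) for f in frames])
```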
Procedia PDF Downloads 150
4228 Unsupervised Echocardiogram View Detection via Autoencoder-Based Representation Learning
Authors: Andrea Treviño Gavito, Diego Klabjan, Sanjiv J. Shah
Abstract:
Echocardiograms serve as pivotal resources for clinicians in diagnosing cardiac conditions, offering non-invasive insights into a heart’s structure and function. When echocardiographic studies are conducted, no standardized labeling of the acquired views is performed. Employing machine learning algorithms for automated echocardiogram view detection has emerged as a promising solution to enhance efficiency in echocardiogram use for diagnosis. However, existing approaches predominantly rely on supervised learning, necessitating labor-intensive expert labeling. In this paper, we introduce a fully unsupervised echocardiographic view detection framework that leverages convolutional autoencoders to obtain lower dimensional representations and the K-means algorithm for clustering them into view-related groups. Our approach focuses on discriminative patches from echocardiographic frames. Additionally, we propose a trainable inverse average layer to optimize decoding of average operations. By integrating both public and proprietary datasets, we obtain a marked improvement in model performance when compared to utilizing a proprietary dataset alone. Our experiments show boosts of 15.5% in accuracy and 9.0% in the F-1 score for frame-based clustering, and 25.9% in accuracy and 19.8% in the F-1 score for view-based clustering. Our research highlights the potential of unsupervised learning methodologies and the utilization of open-sourced data in addressing the complexities of echocardiogram interpretation, paving the way for more accurate and efficient cardiac diagnoses.
Keywords: artificial intelligence, echocardiographic view detection, echocardiography, machine learning, self-supervised representation learning, unsupervised learning
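A minimal sketch of the representation-plus-clustering idea described above: a small convolutional autoencoder (assumed architecture, not the authors' network) whose bottleneck features are clustered with K-means into view groups:

```python
# Convolutional autoencoder on echo-like image patches; bottleneck features
# are clustered with K-means. Patch size, latent size and cluster count are assumed.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans

class ConvAE(nn.Module):
    def __init__(self, latent=32):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Flatten(), nn.Linear(32 * 16 * 16, latent))
        self.dec = nn.Sequential(
            nn.Linear(latent, 32 * 16 * 16), nn.Unflatten(1, (32, 16, 16)),
            nn.ConvTranspose2d(32, 16, 2, stride=2), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 2, stride=2), nn.Sigmoid())

    def forward(self, x):
        z = self.enc(x)
        return self.dec(z), z

patches = torch.rand(256, 1, 64, 64)   # stand-in for echocardiographic patches
model = ConvAE()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(5):                      # short demo training loop
    recon, _ = model(patches)
    loss = nn.functional.mse_loss(recon, patches)
    opt.zero_grad()
    loss.backward()
    opt.step()

with torch.no_grad():
    _, z = model(patches)
views = KMeans(n_clusters=6, n_init=10).fit_predict(z.numpy())  # view-related groups
```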
Procedia PDF Downloads 39
4227 1-D Convolutional Neural Network Approach for Wheel Flat Detection for Freight Wagons
Authors: Dachuan Shi, M. Hecht, Y. Ye
Abstract:
With the trend of digitalization in railway freight transport, a large number of freight wagons in Germany have been equipped with telematics devices, commonly placed on the wagon body. A telematics device contains a GPS module for tracking and a 3-axis accelerometer for shock detection. Besides these basic functions, it is desirable to use the integrated accelerometer for condition monitoring without any additional sensors. Wheel flats, a common type of failure on the wheel tread, cause large impacts on wagons and infrastructure as well as impulsive noise. A large wheel flat may even cause safety issues such as derailments. In this sense, this paper proposes a machine learning approach for wheel flat detection using car body accelerations. Due to the suspension systems, impulsive signals caused by wheel flats are damped significantly and could thus be buried in signal noise and disturbances. Therefore, it is very challenging to detect wheel flats using car body accelerations. The proposed algorithm considers the envelope spectrum of car body accelerations to eliminate the effect of noise and disturbances. Subsequently, a 1-D convolutional neural network (CNN), a well-known deep learning method, is constructed to automatically extract features in the envelope-frequency domain and conduct classification. The constructed CNN is trained and tested on field test data measured on the underframe of a tank wagon with a wheel flat of 20 mm length under operational conditions. The test results demonstrate the good performance of the proposed algorithm for real-time fault detection.
Keywords: fault detection, wheel flat, convolutional neural network, machine learning
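A sketch of the envelope-spectrum front end and a small 1-D CNN, with a synthetic signal and an assumed sampling rate standing in for the field-test data:

```python
# Envelope spectrum of a car-body acceleration signal (Hilbert transform)
# fed to a small, untrained 1-D CNN classifier for shape checking.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import hilbert

fs = 1000                                   # assumed sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
signal = 0.2 * np.random.randn(t.size)
signal[::250] += 3.0                        # crude periodic impacts (wheel-flat-like)

envelope = np.abs(hilbert(signal))          # demodulate the impulsive content
env_spec = np.abs(np.fft.rfft(envelope - envelope.mean()))
x = torch.tensor(env_spec[:512], dtype=torch.float32).view(1, 1, -1)

cnn = nn.Sequential(
    nn.Conv1d(1, 8, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
    nn.Conv1d(8, 16, kernel_size=9, padding=4), nn.ReLU(), nn.AdaptiveAvgPool1d(1),
    nn.Flatten(), nn.Linear(16, 2))         # healthy vs. wheel flat
logits = cnn(x)                             # untrained output, illustrative only
```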
Procedia PDF Downloads 131
4226 Milling Process of Rigid Flex Printed Circuit Board to Which Polyimide Covers the Whole Surface
Authors: Daniela Evtimovska, Ivana Srbinovska, Padraig O’Rourke
Abstract:
Kostal Macedonia faces the challenge of milling a rigid-flex printed circuit board (PCB). The PCB elaborated in this paper is made of FR4 material covered with polyimide over the whole surface on one side, including the tabs where the PCBs need to be separated. After milling only 1.44 meters, the updraft routing tool is no longer effective and causes polyimide debris on all PCB cuts if milling continues with the same tool. The updraft routing tool is used for every other product in Kostal Macedonia and is changed after milling 60 meters. Changing the tool adds 80 seconds to the cycle time. One solution is to use a laser-cutting machine, but buying a laser-cutting machine for cutting only one product does not make financial sense. The focus is therefore on finding an internal solution, among the options under review, to solve the issue of polyimide debris. In the paper, the design of the rigid-flex panel is described in depth. A downdraft routing tool is evaluated as a possible solution for this specific rigid-flex panel. A comparison is made between updraft and downdraft routing tools from technical and financial points of view, taking into consideration the customer requirements for the rigid-flex PCB. The results show that using the downdraft routing tool is the best solution in this case. This tool is 0.62 euros per piece more expensive than the updraft tool. The downdraft routing tool needs to be changed after milling 43.44 meters, compared with the updraft tool, which needs to be changed after milling only 1.44 meters. An analysis is made of the actions that should be taken for further improvement and to maximize the service life of the downdraft routing tool.
Keywords: Kostal Macedonia, rigid flex PCB, polyimide, debris, milling process, up/down draft routing tool
Procedia PDF Downloads 194
4225 An Efficient Machine Learning Model to Detect Metastatic Cancer in Pathology Scans Using Principal Component Analysis Algorithm, Genetic Algorithm, and Classification Algorithms
Authors: Bliss Singhal
Abstract:
Machine learning (ML) is a branch of Artificial Intelligence (AI) in which computers analyze data and find patterns in the data. The study focuses on the detection of metastatic cancer using ML. Metastatic cancer is the stage at which cancer has spread to other parts of the body and is the cause of approximately 90% of cancer-related deaths. Normally, pathologists spend hours each day manually classifying whether tumors are benign or malignant. This tedious task contributes to metastases being mislabeled over 60% of the time and emphasizes the importance of being aware of human error and other inefficiencies. ML is a good candidate to improve the correct identification of metastatic cancer, saving thousands of lives, and can also improve the speed and efficiency of the process, thereby requiring fewer resources and less time. So far, the deep learning methodology of AI has been used in research to detect cancer. This study is a novel approach to determining the potential of using preprocessing algorithms combined with classification algorithms in detecting metastatic cancer. The study used two preprocessing algorithms, principal component analysis (PCA) and the genetic algorithm, to reduce the dimensionality of the dataset, and then used three classification algorithms, logistic regression, decision tree classifier, and k-nearest neighbors, to detect metastatic cancer in the pathology scans. The highest accuracy of 71.14% was produced by the ML pipeline comprising PCA, the genetic algorithm, and the k-nearest neighbors algorithm, suggesting that preprocessing and classification algorithms have great potential for detecting metastatic cancer.
Keywords: breast cancer, principal component analysis, genetic algorithm, k-nearest neighbors, decision tree classifier, logistic regression
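An illustrative pipeline of the PCA plus k-nearest-neighbors stages described above, evaluated on synthetic data; the genetic-algorithm feature-selection step is not reproduced here:

```python
# Dimensionality reduction with PCA followed by a KNN classifier on synthetic
# features standing in for pathology-scan descriptors.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=2000, n_features=200, n_informative=30,
                           random_state=0)          # stand-in for image features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = make_pipeline(StandardScaler(), PCA(n_components=30),
                    KNeighborsClassifier(n_neighbors=5))
clf.fit(X_tr, y_tr)
print("hold-out accuracy:", round(clf.score(X_te, y_te), 3))
```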
Procedia PDF Downloads 83
4224 Gestational Diabetes Mellitus (GDM) Increasing Postpartum Screening to Prevent T2D
Authors: Boma Nellie S, Nambiar Ritu, K. Kanchanmala, T. Rashida, Israell Imelda, Moul Khusnud, Michael Marina
Abstract:
Gestational diabetes mellitus (GDM) imparts an increased lifelong risk of developing Type 2 Diabetes Mellitus (T2DM) and cardiovascular disease in women. Once diagnosed with GDM, women have up to a 74% increased cumulative risk of developing T2DM within 10-15 years. Identifying women at increased risk of developing T2DM and offering them pharmacological and lifestyle management interventions will delay or eliminate the development of diabetes in this population. While the ADA recommends that all gestational diabetics be offered postnatal screening, worldwide screening rates range from 35-75%, and Al Rahba Hospital, despite a robust universal antenatal screening program for GDM, was at a dismal 9% in 2011. A multidisciplinary team was put together, involving OB/Gyn physicians, midwives, nurses (ward and OPD), diabetic educators, dietitians, medical records, laboratory and IT, and multiple strategies were implemented to increase the uptake of postpartum screening of gestational diabetics.
Keywords: GDM, postnatal screening, preventing type 2 diabetes, lifestyle management
Procedia PDF Downloads 523
4223 Early Detection of Major Earthquakes Using Broadband Accelerometers
Authors: Umberto Cerasani, Luca Cerasani
Abstract:
Methods for earthquake forecasting have been intensively investigated in the last decades, but there is still no universal solution agreed upon by seismologists. Rock failure is most often preceded by a tiny elastic movement in the failure area and by the appearance of micro-cracks. These micro-cracks could be detected at the soil surface and represent useful earthquake precursors. The aim of this study was to verify whether tiny raw acceleration signals (in the 10⁻¹ to 10⁻⁴ cm/s² range) prior to the arrival of the main primary waves could be exploitable and related to earthquake magnitude. Mathematical tools such as the Fast Fourier Transform (FFT), moving average and wavelets have been applied to raw acceleration data available on the ITACA web site, and the study focused on one of the most unpredictable earthquakes, i.e., the one that occurred in the central Italy area on August 24th, 2016, at 01:36. It appeared that these tiny acceleration signals preceding the main P-waves have different patterns, in both the frequency and time domains, for high-magnitude earthquakes compared to lower ones.
Keywords: earthquake, accelerometer, earthquake forecasting, seism
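A minimal sketch of the pre-arrival analysis described above (moving average and FFT of the window preceding the main P-wave), using a synthetic accelerometer trace and an assumed arrival time:

```python
# Moving average and FFT of the acceleration window preceding the P-wave arrival.
import numpy as np

fs = 100.0                                           # assumed sampling rate, Hz
acc = 1e-3 * np.random.randn(60 * int(fs))           # 60 s of tiny accelerations, cm/s^2
p_arrival = 45 * int(fs)                             # assumed P-wave arrival index

pre = acc[:p_arrival]
window = 50
smooth = np.convolve(pre, np.ones(window) / window, mode="valid")   # moving average
spectrum = np.abs(np.fft.rfft(pre * np.hanning(pre.size)))
freqs = np.fft.rfftfreq(pre.size, d=1 / fs)
dominant = freqs[np.argmax(spectrum[1:]) + 1]        # skip the DC bin
print(f"dominant pre-arrival frequency: {dominant:.2f} Hz")
```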
Procedia PDF Downloads 146
4222 Pros and Cons of Nanoparticles on Health
Authors: Amber Shahi, Ayesha Tazeen, Abdus Samad, Shama Parveen
Abstract:
Nanoparticles (NPs) are tiny particles. According to the International Organization for Standardization, the size of NPs lies in the nanometer range (1-100 nm). They show distinct properties that are not shown by larger particles of the same material. NPs are currently being used in different fields due to their unique physicochemical nature. NPs are a boon for medical sciences, environmental sciences, electronics, and the textile industry. However, there is growing concern about their potential adverse effects on human health. This poster presents a comprehensive review of the current literature on the pros and cons of NPs for human health. The poster will discuss the various types of interactions of NPs with biological systems. There are a number of beneficial uses of NPs in the field of health and environmental welfare. NPs are very useful in disease diagnosis, antimicrobial action, and the treatment of diseases like Alzheimer’s. They can also cross the blood-brain barrier, making them capable of treating brain diseases. Additionally, NPs can target specific tumors and be used for cancer treatment. In the environmental context, NPs act as catalysts that help reduce pollution. On the other hand, NPs also have some negative impacts on the human body, such as being cytotoxic and genotoxic. They can also affect the reproductive system, such as the testis and ovary, and sexual behavior. The poster will further discuss the routes of exposure to NPs. The poster will conclude with a discussion of the current regulations and guidelines on the use of NPs in various applications. It will highlight the need for further research and the development of standardized toxicity testing methods to ensure the safe use of NPs in various applications. When using NPs in diagnosis and treatment, their safe concentration in the body should also be taken into consideration. Overall, this poster aims to provide a comprehensive overview of the pros and cons of NPs for human health and to promote awareness and understanding of the potential risks and benefits associated with their use.
Keywords: disease diagnosis, human health, nanoparticles, toxicity testing
Procedia PDF Downloads 81
4221 Hearing Conservation Program for Vector Control Workers: Short-Term Outcomes from a Cluster-Randomized Controlled Trial
Authors: Rama Krishna Supramanian, Marzuki Isahak, Noran Naqiah Hairi
Abstract:
Noise-induced hearing loss (NIHL) is one of the most frequently recorded occupational diseases, despite being preventable. A Hearing Conservation Program (HCP) is designed to protect workers' hearing and prevent them from developing hearing impairment due to occupational noise exposure. However, there is still a lack of evidence regarding the effectiveness of this program. The purpose of this study was to determine the effectiveness of a Hearing Conservation Program (HCP) in preventing or reducing audiometric threshold changes among vector control workers. This study adopts a cluster-randomized controlled trial design, with district health offices as the unit of randomization. Nine district health offices were randomly selected, and 183 vector control workers were randomized to the intervention or control group. The intervention included a safety and health policy, noise exposure assessment, noise control, distribution of appropriate hearing protection devices, a training and education program and audiometric testing. The control group only underwent audiometric testing. The audiometric threshold changes observed in the intervention group showed improvement in the hearing threshold level for all frequencies except 500 Hz and 8000 Hz for the left ear. The hearing threshold changes range from 1.4 dB to 5.2 dB, with the largest improvement at the higher frequencies, mainly 4000 Hz and 6000 Hz. Meanwhile, for the right ear, the mean hearing threshold level remained similar at 4000 Hz and 6000 Hz after 3 months of intervention. The Hearing Conservation Program (HCP) is effective in preserving the hearing of vector control workers involved in fogging activity as well as increasing their knowledge, attitude and practice regarding noise-induced hearing loss (NIHL).
Keywords: adult, hearing conservation program, noise-induced hearing loss, vector control worker
Procedia PDF Downloads 171
4220 Fight against Money Laundering with Optical Character Recognition
Authors: Saikiran Subbagari, Avinash Malladhi
Abstract:
Anti-Money Laundering (AML) regulations are designed to prevent money laundering and terrorist financing activities worldwide. Financial institutions around the world are legally obligated to identify, assess and mitigate the risks associated with money laundering and report any suspicious transactions to the governing authorities. With increasing volumes of data to analyze, financial institutions seek to automate their AML processes. With the rise of financial crimes, optical character recognition (OCR), in combination with machine learning (ML) algorithms, serves as a crucial tool for automating AML processes by extracting data from documents and identifying suspicious transactions. In this paper, we examine the utilization of OCR for AML and delve into various OCR techniques employed in AML processes. These techniques encompass template-based, feature-based and neural network-based approaches, natural language processing (NLP), hidden Markov models (HMMs), conditional random fields (CRFs), binarization, pattern matching and the stroke width transform (SWT). We evaluate each technique, discussing their strengths and constraints. We also emphasize how OCR can improve the accuracy of customer identity verification by comparing the extracted text with the Office of Foreign Assets Control (OFAC) watchlist, and discuss how OCR helps to overcome language barriers in AML compliance. Finally, we address the implementation challenges that OCR-based AML systems may face and offer recommendations for financial institutions based on data from previous research studies, which illustrate the effectiveness of OCR-based AML.
Keywords: anti-money laundering, compliance, financial crimes, fraud detection, machine learning, optical character recognition
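A hedged sketch of the watchlist-matching step described above: OCR-extracted text (the pytesseract call is assumed and left commented out) fuzzy-matched against a hypothetical list using only the standard library:

```python
# Fuzzy matching of names extracted from a document against a small watchlist.
# The watchlist entries, sample text and threshold are hypothetical, not OFAC data.
from difflib import SequenceMatcher

# text = pytesseract.image_to_string(Image.open("kyc_document.png"))  # assumed OCR step
text = "Remitter: Jon Doe, account 4711, beneficiary ACME Trading LLC"

watchlist = ["John Doe", "ACME Trading LLC", "Global Shell Holdings"]

def best_match(token, names, threshold=0.85):
    scored = [(SequenceMatcher(None, token.lower(), n.lower()).ratio(), n) for n in names]
    score, name = max(scored)
    return (name, score) if score >= threshold else (None, score)

for candidate in ["Jon Doe", "ACME Trading LLC"]:   # names parsed from the OCR text
    hit, score = best_match(candidate, watchlist)
    if hit:
        print(f"possible watchlist hit: '{candidate}' ~ '{hit}' ({score:.2f})")
```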
Procedia PDF Downloads 146
4219 The Importance of Imaging and Functional Tests for Early Detection of Occupational Diseases in Kosovo's Miners
Authors: Krenare Shabani, Kreshnike Dedushi Hoti, Serbeze Kabashi, Jeton Shatri, Arben Rroji, Mrikë Bunjaku, Leotrim Berisha, Jona Kosova, Edmond Puca, Bleriana Shabani
Abstract:
Introduction: Workers in Kosovo's mining industry are subjected to hazardous working conditions and airborne particles, such as silica dust, which can cause silicosis and other severe respiratory illnesses. The purpose of this research is to assess the health impacts of such exposures, as well as the importance of imaging and functional testing in detecting pathological changes early on. Methodology: The study is prospective and cross-sectional and was carried out during the year 2024. 626 people (446 miners and 180 non-miners) were enrolled in the study. Subjects underwent spirometry and chest radiography. Data were analysed with SPSS 24. Results: The average age of the participants is 48 years. Demographics and Smoking: Smoking was common among young miners. Radiological Changes: Radiographic abnormalities in the lungs were seen in 23.1% of miners and 10.6% of non-miners, including small irregular opacities and emphysematous changes. Lung Function: The FEV1/FVC ratio decreased with increased exposure time, indicating a decline in pulmonary function. Impact of Exposure Duration: Longer exposure duration was associated with a higher number of miners experiencing coughs and requiring medical consultations such as CT scans and biopsies. Conclusions: Medical imaging and functional testing are critical for early diagnosis of lung abnormalities in miners. Findings demonstrate a strong correlation between extended exposure to mine dust and the development of respiratory disorders, emphasising the importance of preventative measures and routine health monitoring.
Keywords: silicosis, miners, imaging, spirometry
Procedia PDF Downloads 29
4218 The Role of Artificial Intelligence Algorithms in Decision-Making Policies
Authors: Marisa Almeida AraúJo
Abstract:
Artificial intelligence (AI) tools are being used (including in the criminal justice system) and are becoming increasingly popular. The neuralgic center of the many questions that these (future) super-beings pose is rooted in the (old) problematic relationship between rationality and morality. For instance, if we follow a Kantian perspective in which morality derives from rationality, an AI that surpasses man in rationality will also surpass him in ethical and moral standards, questioning the nature of mind, the consciousness of self and others, and morals. The recognition of superior intelligence in a non-human being puts us in the contingency of having to recognize a peer in a new form of coexistence and social relationship. Just think of the humanoid robot Sophia, capable of reasoning and conversation (and who has been granted Saudi citizenship, a fact that symbolically demonstrates our empathy with the being). Machines may have a more intelligent mind and even, eventually, higher ethical standards, to which, under the alluded categorical imperative, we would have to subject ourselves under penalty of contradiction with the universal Kantian law. Recognizing the complex ethical and legal issues and the significant impact on human rights and democratic functioning itself is the goal of our work.
Keywords: ethics, artificial intelligence, legal rules, principles, philosophy
Procedia PDF Downloads 199
4217 Teaching Practices for Subverting Significant Retentive Learner Errors in Arithmetic
Authors: Michael Lousis
Abstract:
The systematic identification of the most conspicuous and significant errors made by learners during three years of testing of their progress in learning Arithmetic, throughout the development of the Kassel Project in England and Greece, was accomplished. How retentive these errors were over three years in the officially provided school instruction of Arithmetic in these countries has also been shown. The learners’ errors in Arithmetic stemmed from a sample comprising two hundred (200) English students and one hundred and fifty (150) Greek students. The sample was purposefully selected according to the students’ participation in each testing session in the development of the three-year project, in both domains, Arithmetic and Algebra, simultaneously. Specific teaching practices have been devised and are presented in this study for subverting those learners’ errors that were found to be retentive at the level of the nationally provided mathematical education of each country. The invention and development of these proposed teaching practices were founded on the rationale of the theoretical accounts concerning the explanation, prediction and control of the errors, on conceptual metaphor, and on an analysis that tried to identify the required cognitive components and skills of the specific tasks, in terms of Psychology and Cognitive Science as applied to information processing. The aim of implementing these instructional practices is not only the subversion of these errors but the achievement of mathematical competence, defined as consisting of three elements: appropriate representations, appropriate meaning, and appropriately developed schemata. However, praxis is of paramount importance, because there is no ‘real truth’ independent of science and because praxis serves as quality control when it takes the form of a cognitive method.
Keywords: arithmetic, cognitive science, cognitive psychology, information-processing paradigm, Kassel project, level of the nationally provided mathematical education, praxis, remedial mathematical teaching practices, retentiveness of errors
Procedia PDF Downloads 317
4216 Adapting Inclusive Residential Models to Match Universal Accessibility and Fire Protection
Authors: Patricia Huedo, Maria José Ruá, Raquel Agost-Felip
Abstract:
Ensuring the sustainable development of urban environments means guaranteeing adequate environmental conditions, being resilient, and meeting conditions of safety and inclusion for all people, regardless of their condition. All existing buildings should meet basic safety conditions and be equipped with safe and accessible routes, along with visual, acoustic and tactile signals to protect their users or potential visitors, regardless of whether they undergo rehabilitation or change-of-use processes. Moreover, from a social perspective, we consider the need to prioritize buildings occupied by the most vulnerable groups of people, which currently do not have specific regulations tailored to their needs. Some residential models in operation are not only outside the scope of application of the regulations in force; they also lack a project or technical data that would allow the fire behavior of the construction materials to be known. However, the difficulty and cost involved in adapting the entire building stock to current regulations can never justify a lack of safety for people. Hence, this work develops a simplified model to assess compliance with the basic safety conditions in case of fire and their compatibility with the specific accessibility needs of each user. The purpose is to support the designer in decision making, as well as to contribute to the development of a basic fire safety certification tool to be applied in inclusive residential models. This work has developed a methodology to support designers in adapting Social Services Centers, usually intended for vulnerable people. It incorporates a checklist of 9 items and information from sources or standards that designers can use to justify compliance or propose solutions. For each item, the verification system is justified, and possible sources of consultation are provided, considering the possibility of lacking technical documentation of construction systems or building materials. The procedure is based on diagnosing the degree of compliance with fire conditions of residential models used by vulnerable groups, considering the special accessibility conditions required by each user group. Based on visual inspection and site surveying, the verification model can serve as a support tool, significantly streamlining and simplifying the diagnostic phase and reducing the number of tests to be requested by over 75%. To illustrate the methodology, two different buildings in the Valencian Region (Spain) have been selected. One case study is a mental health facility for residential purposes, located in a rural area on the outskirts of a small town; the other is a day care facility for individuals with intellectual disabilities, located in a medium-sized city. The comparison between the case studies allows the model to be validated under distinct conditions. Verifying compliance with a basic security level could allow a quality seal and a public register of buildings adapted to fire regulations to be established, similarly to what is being done with other attributes such as energy performance.
Keywords: fire safety, inclusive housing, universal accessibility, vulnerable people
Procedia PDF Downloads 24
4215 A 3D Cell-Based Biosensor for Real-Time and Non-Invasive Monitoring of 3D Cell Viability and Drug Screening
Authors: Yuxiang Pan, Yong Qiu, Chenlei Gu, Ping Wang
Abstract:
In the past decade, three-dimensional (3D) tumor cell models have attracted increasing interest in the field of drug screening due to their great advantages in simulating more accurately the heterogeneous tumor behavior in vivo. Drug sensitivity testing based on 3D tumor cell models can provide more reliable in vivo efficacy prediction. Gold-standard fluorescence staining, however, can hardly achieve real-time and label-free monitoring of the viability of 3D tumor cell models. In this study, a micro-groove impedance sensor (MGIS) was specially developed for dynamic and non-invasive monitoring of 3D cell viability. 3D tumor cells were trapped in the micro-grooves with opposite gold electrodes for in-situ impedance measurement. A change in the number of live cells causes an inversely proportional change in the impedance magnitude of the entire cell/matrigel construct, reflecting the proliferation and apoptosis of the 3D cells. It was confirmed that the 3D cell viability detected by the MGIS platform is highly consistent with standard live/dead staining. Furthermore, the accuracy of the MGIS platform was demonstrated quantitatively using a 3D lung cancer model and sophisticated drug sensitivity testing. In addition, the parameters of the micro-groove impedance chip processing and measurement experiments were optimized in detail. The results demonstrate that the MGIS-based 3D cell biosensor would be a promising platform to improve the efficiency and accuracy of cell-based anti-cancer drug screening in vitro.
Keywords: micro-groove impedance sensor, 3D cell-based biosensors, 3D cell viability, micro-electromechanical systems
Procedia PDF Downloads 129
4214 The Economics of Justice as Fairness
Authors: Antonio Abatemarco, Francesca Stroffolini
Abstract:
In the economic literature, Rawls’ Theory of Justice is usually interpreted in a two-stage setting, where priority to the worst-off individual is imposed as a distributive value judgment. In this paper, instead, we model Rawls’ Theory in a three-stage setting, that is, a separating line is drawn between the original position, the educational stage, and working life. Hence, in this paper, we challenge the common interpretation of Rawls’ Theory of Justice as Fairness by showing that this Theory goes well beyond the definition of a distributive value judgment, in such a way as to embrace efficiency issues as well. In our model, inequalities are shown to be permitted insofar as they stimulate greater effort in education in the population, and thus economic growth. To our knowledge, this is the only possibility for the inequality to be ‘bought’ by both the most- and, above all, the least-advantaged individual, as suggested by the Difference Principle. Finally, by recalling the old tradition of ‘universal ex-post efficiency’, we show that a unique optimal social contract does not exist behind the veil of ignorance; more precisely, only the set of potentially Rawls-optimal social contracts can be identified a priori, and partial justice orderings derived accordingly.
Keywords: justice, Rawls, inequality, social contract
Procedia PDF Downloads 225
4213 The Impact of Introspective Models on Software Engineering
Authors: Rajneekant Bachan, Dhanush Vijay
Abstract:
The visualization of operating systems has refined the Turing machine, and current trends suggest that the emulation of 32-bit architectures will soon emerge. After years of technical research into Web services, we demonstrate the synthesis of gigabit switches, which embodies the robust principles of theory. Loam, our new algorithm for forward-error correction, is the solution to all of these challenges.
Keywords: software engineering, architectures, introspective models, operating systems
Procedia PDF Downloads 539
4212 Experimental Simulation Set-Up for Validating Out-Of-The-Loop Mitigation when Monitoring High Levels of Automation in Air Traffic Control
Authors: Oliver Ohneiser, Francesca De Crescenzio, Gianluca Di Flumeri, Jan Kraemer, Bruno Berberian, Sara Bagassi, Nicolina Sciaraffa, Pietro Aricò, Gianluca Borghini, Fabio Babiloni
Abstract:
An increasing degree of automation in air traffic will also change the role of the air traffic controller (ATCO). ATCOs will fulfill significantly more monitoring tasks compared to today. However, this rather passive role may lead to Out-Of-The-Loop (OOTL) effects comprising vigilance decrement and reduced situation awareness. The project MINIMA (Mitigating Negative Impacts of Monitoring high levels of Automation) has conceived a system to control and mitigate such OOTL phenomena. In order to demonstrate the MINIMA concept, an experimental simulation set-up has been designed. This set-up consists of two parts: 1) a Task Environment (TE) comprising a Terminal Maneuvering Area (TMA) simulator and 2) a Vigilance and Attention Controller (VAC) based on neurophysiological data recording such as electroencephalography (EEG) and eye-tracking devices. The current vigilance level and the attention focus of the controller are measured during the ATCO’s active work in front of the human machine interface (HMI). The derived vigilance level and attention focus trigger adaptive automation functionalities in the TE to avoid OOTL effects. This paper describes the full-scale experimental set-up and the component development work towards it. Hence, it encompasses a pre-test whose results influenced the development of the VAC as well as the functionalities of the final TE and the VAC’s two sub-components.
Keywords: automation, human factors, air traffic controller, MINIMA, OOTL (Out-Of-The-Loop), EEG (Electroencephalography), HMI (Human Machine Interface)
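A conceptual sketch only, not the MINIMA VAC implementation: a simple EEG band-power vigilance index that triggers adaptive-automation support when it drifts above a hypothetical calibration threshold:

```python
# (theta + alpha) / beta band-power ratio as a rough vigilance index on one EEG channel.
# Sampling rate, band limits and threshold are assumed illustrative values.
import numpy as np
from scipy.signal import welch

fs = 256                                      # assumed EEG sampling rate, Hz
eeg = np.random.randn(30 * fs)                # 30 s stand-in for one EEG channel

f, pxx = welch(eeg, fs=fs, nperseg=fs * 2)

def band(lo, hi):
    return pxx[(f >= lo) & (f < hi)].sum()    # power in a frequency band

vigilance_index = (band(4, 8) + band(8, 13)) / band(13, 30)
THRESHOLD = 1.5                               # hypothetical calibration value
if vigilance_index > THRESHOLD:
    print("low vigilance detected -> activate adaptive automation support")
else:
    print("vigilance nominal -> keep baseline HMI")
```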
Procedia PDF Downloads 384
4211 Analyzing the Effect of Multilingualism, Language 1, and Language 2 on Reading Comprehension
Authors: Judith Hanke
Abstract:
Due to the increase in students with reading difficulties, a digital reading support with diagnostics was developed to foster individual students' reading comprehension. The digital reading support focused on the reading comprehension of elementary school students. The digital reading packages consist of literary texts with aligned reading exercises. The number of students with German as a second language is growing in Germany. Students with multilingualism, language 1, and language 2 learn German together in school. The research focuses on determining whether and to what extent multilingualism, language 1, and language 2 affect reading comprehension. For the methodology, an ABA design was selected for the intervention study to examine the reading support. The study was conducted from April 2023 until July 2023 and collected quantitative data on individuals, groups, and classes. It comprised a survey group (N = 58) and a control group (N = 53). Quantitative data were collected from 3 classes of 3 teachers and 47 students at all three test times. To show differences between the groups, a standardized reading comprehension test was used at the three test times: pretest, posttest, and follow-up. The standardized test consists of three subtests covering word comprehension, sentence comprehension, and text comprehension. The main findings include that students who spoke German as their first language had the best test scores. Interestingly, students with a different language had better test scores than students with German as the first language and (an)other language/s. Also, the students with another language outperformed the native language speakers in one of the subtests of the post-testing. The variables of spoken language at home and German as a second language were also examined and correlated with the test results. One significant correlation was found between spoken language at home and the text comprehension test of the pretesting. Additionally, the variable German as a second language had multiple significant correlations in the pretest, posttest and follow-up. The study's significance lies in understanding the influence of several languages, language 1, and language 2 on reading comprehension.
Keywords: multilingualism, language 1, language 2, reading comprehension, second language
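A minimal sketch of the correlation analysis described above, using simulated data and hypothetical column names for the language-background variables and the three test times:

```python
# Pearson correlations between language-background indicators and reading scores
# at pretest, posttest and follow-up. Data and column names are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
n = 47
df = pd.DataFrame({
    "german_l2": rng.integers(0, 2, n),           # 1 = German as second language
    "other_home_language": rng.integers(0, 2, n), # 1 = other language spoken at home
    "pretest_text": rng.normal(50, 10, n),
    "posttest_text": rng.normal(55, 10, n),
    "followup_text": rng.normal(54, 10, n),
})

corr = df.corr(method="pearson")[["pretest_text", "posttest_text", "followup_text"]]
print(corr.loc[["german_l2", "other_home_language"]].round(2))
```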
Procedia PDF Downloads 33
4210 An In-Situ Integrated Micromachining System for Intricate Micro-Parts Machining
Authors: Shun-Tong Chen, Wei-Ping Huang, Hong-Ye Yang, Ming-Chieh Yeh, Chih-Wei Du
Abstract:
This study presents a novel, versatile, high-precision integrated micromachining system that combines contact and non-contact micromachining techniques to machine intricate micro-parts precisely. Two broad methods of microfabrication, 1) volume additive (micro co-deposition) and 2) volume subtractive (nanometric flycutting, ultrafine w-EDM (wire Electrical Discharge Machining), and micro honing), are integrated in the developed micromachining system, and their effectiveness is verified. A multidirectional headstock that supports various machining orientations is designed to evaluate the feasibility of multifunctional micromachining. An exchangeable working tank that allows for various machining mechanisms is also incorporated into the system. Hence, the micro tool and workpiece need not be unloaded or repositioned until all the planned tasks have been completed. By using the designed servo rotary mechanism, a nanometric flycutting approach with a concentric rotary accuracy of 5 nm is constructed and utilized with the system to machine a diffraction-grating element with a nanometric-scale V-groove array. To improve the wear resistance of the micro tool, the micro co-deposition function is used to provide a micro-abrasive coating by an electrochemical method. The construction of ultrafine w-EDM facilitates the fabrication of micro slots with a width of less than 20 µm on a hardened tool. The hardened tool can thus be employed as a micro honing tool to hone a micro hole with an internal diameter of 200 µm in SKD-11 mold steel. Experimental results prove that intricate micro-parts can be manufactured in-situ with high precision by the developed integrated micromachining system.
Keywords: integrated micromachining system, in-situ micromachining, nanometric flycutting, ultrafine w-EDM, micro honing
Procedia PDF Downloads 411
4209 Semantic Features of Turkish and Spanish Phraseological Units with a Somatic Component ‘Hand’
Authors: Narmina Mammadova
Abstract:
In modern linguistics, the comparative study of languages is becoming increasingly popular, and the typology and comparison of languages with different structures is expanding and deepening. Of particular interest is the study of phraseological units, which makes it possible to identify the specific features of the compared languages in all their national identity. This paper gives a brief analysis of the comparative study of somatic phraseological units (SFU) of the Spanish and Turkish languages with the component "hand" in the semantic aspect: the identification of equivalents, analogs and non-equivalent units, as well as a description of methods of translation of non-equivalent somatic phraseological units. The comparative study of the phraseology of unrelated languages is of particular relevance since it allows us to identify both general, universal features and the differential, specific features characteristic of a particular language. Based on a generalization of the results of the study, it can be assumed that phraseological units containing a somatic component have a high interlingual phraseological activity, which contributes to an increase in the degree of interlingual equivalence.
Keywords: linguoculturology, Turkish, Spanish, language picture of the world, phraseological units, semantic microfield
Procedia PDF Downloads 197
4208 Testing of Complicated Bus Bar Protection Using Smart Testing Methodology
Authors: K. N. Dinesh Babu
Abstract:
In this paper, the protection of a complicated bus arrangement with a dual bus coupler and bus sectionalizer, using low-impedance differential protection applicable to very high voltages such as 220 kV and 400 kV, is discussed. In many power generation stations, several operational procedures are implemented to utilize the transfer bus as the main bus and to facilitate the maintenance of circuit breakers and current transformers (in each section) without shutting down the bay(s). Owing to this fact, the complications in operational philosophy have posed challenges for bus bar protection implementation. Many bus topologies allow any one of the main buses available in the station to be used as an auxiliary bus. In such a system, pre-defined precautions and procedures are laid down as guidelines, to be followed before assigning any bus as an auxiliary bus. The procedure involves shifting links, changing rotary switches, inserting test blocks, and so on, which can cause unreliable operation. This kind of unreliable operation or inadvertent procedural lapse may result in the isolation of the bus bar from the grid due to the unpredictable operation of the bus bar protection relay, a commonly occurring phenomenon due to manual mistakes. With the sophisticated configuration and implementation of logic in modern intelligent electronic devices, the operator is free to select the transfer arrangement without sacrificing the protection required by a bus differential system for reliable operation, and labor-intensive processes are completely eliminated. This paper deals with the procedure to test the security logic for such special scenarios using a Megger-make SMRT relay test set on the bus bar protection relay, to assure system stability and get rid of all the specific operational precautions and procedures.
Keywords: bus bar protection, by-pass isolator, blind spot, breaker failure, intelligent electronic device, end fault, bus unification, directional principle, zones of protection, breaker re-trip, under voltage security, smart megger relay tester
Procedia PDF Downloads 69
4207 Proton Irradiation Testing on Commercial Enhancement Mode GaN Power Transistor
Authors: L. Boyaci
Abstract:
Two basic pieces of equipment in the electrical power subsystem of space satellites are the Power Conditioning Unit (PCU) and the Power Distribution Unit (PDU). Today, the main switching element used in power equipment in satellites is the silicon (Si) based radiation-hardened MOSFET. GaNFETs have superior performance over MOSFETs in terms of their conduction and switching characteristics. GaNFETs have started to take the MOSFET's place in many industrial applications, especially by virtue of their switching performance. If GaNFETs can also be used in equipment for space applications, this would be a great revolution for future space power subsystem designs. In this study, the effect of proton irradiation on Gallium Nitride based power transistors was investigated. Four commercial enhancement mode GaN power transistors from Efficient Power Conversion Corporation (EPC) were irradiated with 30 MeV protons while the devices were switching. A flux of 8.2x10⁹ protons/cm²/s was applied for 12.5 seconds to reach an ultimate fluence of 10¹¹ protons/cm². Vgs-Ids characteristics were measured and recorded for each device before, during and after irradiation, and the devices were observed for destructive events. Proton-induced permanent damage was not observed on the devices. All the devices remained healthy and continued to operate. For two of these devices, further irradiation was applied with the same flux for 30 minutes, up to a total fluence level of 1.476x10¹³ protons/cm². We observed that the GaNFETs are fully functional under this high level of radiation, and no destructive events or irreversible failures took place for the transistors. The results reveal that the GaNFETs irradiated in this experiment have radiation tolerance under proton testing and are very important candidates for being one of the future power switching elements in space.
Keywords: enhancement mode GaN power transistors, proton irradiation effects, radiation tolerance
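A quick arithmetic check of the reported exposure levels (flux multiplied by time gives fluence):

```python
# Fluence = flux x exposure time, using the values reported in the abstract.
flux = 8.2e9                                        # protons/cm^2/s
print(f"{flux * 12.5:.2e} protons/cm^2")            # ~1.0e11 after 12.5 s
print(f"{flux * 30 * 60:.3e} protons/cm^2")         # 1.476e13 after a further 30 min
```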
Procedia PDF Downloads 154
4206 Predicting Emerging Agricultural Investment Opportunities: The Potential of Structural Evolution Index
Authors: Kwaku Damoah
Abstract:
The agricultural sector is characterized by continuous transformation, driven by factors such as demographic shifts, evolving consumer preferences, climate change, and migration trends. This dynamic environment presents complex challenges for key stakeholders, including farmers, governments, and investors, who must navigate these changes to achieve optimal investment returns. To effectively predict market trends and uncover promising investment opportunities, a systematic, data-driven approach is essential. This paper introduces the Structural Evolution Index (SEI), a machine learning-based methodology. The SEI is specifically designed to analyse long-term trends and forecast the potential of emerging agricultural products for investment. Versatile in application, it evaluates various agricultural metrics such as production, yield, trade, land use, and consumption, providing a comprehensive view of the evolution within agricultural markets. By harnessing data from the UN Food and Agriculture Organization's statistical database (FAOSTAT), this study demonstrates the SEI's capabilities through comparative exploratory analysis and an evaluation of international trade in agricultural products, focusing on Malaysia and Singapore. The SEI methodology reveals intricate patterns and transitions within the agricultural sector, enabling stakeholders to strategically identify and capitalize on emerging markets. This predictive framework is a powerful tool for decision-makers, offering crucial insights that help anticipate market shifts and align investments with anticipated returns.
Keywords: agricultural investment, algorithm, comparative exploratory analytics, machine learning, market trends, predictive analytics, structural evolution index
Procedia PDF Downloads 63
4205 Early Impact Prediction and Key Factors Study of Artificial Intelligence Patents: A Method Based on LightGBM and Interpretable Machine Learning
Authors: Xingyu Gao, Qiang Wu
Abstract:
Patents play a crucial role in protecting innovation and intellectual property. Early prediction of the impact of artificial intelligence (AI) patents helps researchers and companies allocate resources and make better decisions. Understanding the key factors that influence patent impact can assist researchers in gaining a better understanding of the evolution of AI technology and innovation trends. Therefore, identifying highly impactful patents early and providing support for them holds immeasurable value in accelerating technological progress, reducing research and development costs, and mitigating market positioning risks. Despite the extensive research on AI patents, accurately predicting their early impact remains a challenge. Traditional methods often consider only single factors or simple combinations, failing to comprehensively and accurately reflect the actual impact of patents. This paper utilized the artificial intelligence patent database from the United States Patent and Trademark Office and the Lens.org patent retrieval platform to obtain specific information on 35,708 AI patents. Using six machine learning models, namely Multiple Linear Regression, Random Forest Regression, XGBoost Regression, LightGBM Regression, Support Vector Machine Regression, and K-Nearest Neighbors Regression, and using early indicators of patents as features, the paper comprehensively predicted the impact of patents from three aspects: technical, social, and economic. These aspects include the technical leadership of patents, the number of citations they receive, and their shared value. The SHAP (SHapley Additive exPlanations) metric was used to explain the predictions of the best model, quantifying the contribution of each feature to the model's predictions. The experimental results on the AI patent dataset indicate that, for all three target variables, LightGBM regression shows the best predictive performance. Specifically, patent novelty has the greatest impact on predicting the technical impact of patents and has a positive effect. Additionally, the number of owners, the number of backward citations, and the number of independent claims are all crucial and have a positive influence on predicting technical impact. In predicting the social impact of patents, the number of applicants is considered the most critical input variable, but it has a negative impact on social impact. At the same time, the number of independent claims, the number of owners, and the number of backward citations are also important predictive factors, and they have a positive effect on social impact. For predicting the economic impact of patents, the number of independent claims is considered the most important factor and has a positive impact on economic impact. The number of owners, the number of sibling countries or regions, and the size of the extended patent family also have a positive influence on economic impact. The study primarily relies on data from the United States Patent and Trademark Office for artificial intelligence patents. Future research could consider more comprehensive data sources, including artificial intelligence patent data, from a global perspective. While the study takes into account various factors, there may still be other important features not considered. In the future, factors such as patent implementation and market applications may be considered, as they could have an impact on the influence of patents.
Keywords: patent influence, interpretable machine learning, predictive models, SHAP
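A sketch under assumptions of the LightGBM-plus-SHAP workflow described above, using synthetic placeholder features for the early patent indicators:

```python
# LightGBM regression on synthetic patent-indicator features, explained with SHAP.
# Feature names and the target relationship are illustrative placeholders.
import numpy as np
import pandas as pd
import lightgbm as lgb
import shap

rng = np.random.default_rng(42)
X = pd.DataFrame({
    "novelty": rng.uniform(0, 1, 2000),
    "n_owners": rng.integers(1, 6, 2000),
    "n_backward_citations": rng.integers(0, 50, 2000),
    "n_independent_claims": rng.integers(1, 20, 2000),
})
y = 2 * X["novelty"] + 0.1 * X["n_backward_citations"] + rng.normal(0, 1, 2000)

model = lgb.LGBMRegressor(n_estimators=300, learning_rate=0.05)
model.fit(X, y)

explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
mean_abs = np.abs(shap_values).mean(axis=0)          # mean |SHAP| per feature
print(dict(zip(X.columns, mean_abs.round(3))))
```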
Procedia PDF Downloads 50
4204 Prediction of Remaining Life of Industrial Cutting Tools with Deep Learning-Assisted Image Processing Techniques
Authors: Gizem Eser Erdek
Abstract:
This study investigates the prediction of the remaining life of industrial cutting tools used in the production process with deep learning methods. When the life of cutting tools decreases, they damage the raw material they are processing. This study aims to predict the remaining life of a cutting tool based on the damage caused by the cutting tool to the raw material. For this, hole photos were collected from the hole-drilling machine for 8 months. The photos were labeled in 5 classes according to hole quality. In this way, the problem was transformed into a classification problem. Using the prepared data set, a model was created with convolutional neural networks, which is a deep learning method. In addition, the VGGNet and ResNet architectures, which have been successful in the literature, were tested on the data set. A hybrid model using convolutional neural networks and support vector machines was also used for comparison. When all models are compared, it has been determined that the model in which convolutional neural networks are used gives successful results with a 74% accuracy rate. In the preliminary studies, the data set was arranged to include only the best and worst classes, and the study gave ~93% accuracy when the binary classification model was applied. The results of this study showed that the remaining life of cutting tools can be predicted by deep learning methods based on the damage to the raw material. Experiments have proven that deep learning methods can be used as an alternative for cutting tool life estimation.
Keywords: classification, convolutional neural network, deep learning, remaining life of industrial cutting tools, ResNet, support vector machine, VggNet
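A minimal sketch of a 5-class hole-quality classifier (assumed layer sizes and input resolution, not the study's exact network):

```python
# Small CNN mapping hole photos to five hole-quality classes, with one
# illustrative training step on random stand-in data.
import torch
import torch.nn as nn

class HoleQualityCNN(nn.Module):
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 128 -> 64
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 64 -> 32
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1))
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = HoleQualityCNN()
batch = torch.rand(8, 3, 128, 128)                  # stand-in for hole photos
targets = torch.randint(0, 5, (8,))                 # stand-in quality labels
loss = nn.CrossEntropyLoss()(model(batch), targets)
loss.backward()                                     # one illustrative training step
```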
Procedia PDF Downloads 79