Search results for: k-means clustering based feature weighting
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28587

24117 Optimizing Coal Yard Management Using Discrete Event Simulation

Authors: Iqbal Felani

Abstract:

A coal-fired power plant operates integrated facilities that move coal from three separate coal yards to the bunkers of eight power plant units. These facilities, however, are no longer reliable enough to support the system, so management planned to invest in additional facilities to increase reliability. Management also planned to adopt a single coal specification for all units, called Single Quality Coal (SQC). This simulation compares the system before and after the improvement under two scenarios, First In First Out (FIFO) and Last In First Out (LIFO). Parameters such as stay time, reorder point, and safety stock are determined by the simulation, which was built in the discrete event simulation software Flexsim 5.0. Based on the simulation, Single Quality Coal with the FIFO scenario has the shortest stay time, at 8.38 days.
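
As an illustrative sketch (not the Flexsim model used in the study), the effect of the retrieval policy on stay time can be reproduced with a toy yard simulation; the arrival rate, consumption rate, and horizon below are invented for demonstration:

```python
from collections import deque

def simulate_stay_times(arrivals_per_day, consumed_per_day, days, policy="FIFO"):
    """Toy coal-yard model: batches arrive daily and are drawn for the
    boilers under a FIFO or LIFO policy; returns the mean stay time
    (in days) of the batches actually consumed."""
    yard = deque()          # each entry is the arrival day of one batch
    stay_times = []
    for day in range(days):
        for _ in range(arrivals_per_day):
            yard.append(day)
        for _ in range(consumed_per_day):
            if not yard:
                break
            # FIFO draws the oldest batch, LIFO the newest
            arrived = yard.popleft() if policy == "FIFO" else yard.pop()
            stay_times.append(day - arrived)
    return sum(stay_times) / len(stay_times) if stay_times else 0.0
```

Note the caveat the toy model makes visible: LIFO shows short stay times for the batches it consumes, but old stock accumulates unconsumed at the bottom of the pile, which is why yard studies track the full stay-time distribution rather than the consumed-only average.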

Keywords: Coal Yard Management, Discrete Event Simulation, First In First Out, Last In First Out

Procedia PDF Downloads 652
24116 Improving Neonatal Abstinence Syndrome Assessments

Authors: Nancy Wilson

Abstract:

In utero drug exposure is prevalent among newborns in birthing facilities. Assessment tools for neonatal abstinence syndrome (NAS) are often cumbersome and ill-fitting, and they harbor immense subjectivity. This leads the clinical assessor to be hypervigilant when assessing the newborn for subtle symptoms of NAS, which are often mistaken for normal newborn behaviors. As a quality improvement initiative, this project introduced a more adaptable NAS tool termed eat, sleep, console (ESC). This function-based NAS assessment scores the infant on the ability to accomplish three basic newborn necessities: to eat, to sleep, and to be consoled. The literature supports that the ESC methodology improves patient and family outcomes while providing more cost-effective care.

Keywords: neonatal abstinence syndrome, neonatal opioid withdrawal, maternal substance abuse, pregnancy and addiction, Finnegan neonatal abstinence syndrome tool, eat, sleep, console

Procedia PDF Downloads 123
24115 Learning with Music: The Effects of Musical Tension on Long-Term Declarative Memory Formation

Authors: Nawras Kurzom, Avi Mendelsohn

Abstract:

The effects of background music on learning and memory are inconsistent, partly due to the intrinsic complexity and variety of music and partly to individual differences in music perception and preference. A prominent musical feature that is known to elicit strong emotional responses is musical tension. Musical tension can be brought about by building anticipation of rhythm, harmony, melody, and dynamics. Delaying the resolution of dominant-to-tonic chord progressions, as well as using dissonant harmonics, can elicit feelings of tension, which can, in turn, affect memory formation of concomitant information. The aim of the presented studies was to explore how forming declarative memory is influenced by musical tension, brought about within continuous music as well as in the form of isolated chords with varying degrees of dissonance/consonance. The effects of musical tension on long-term memory of declarative information were studied in two ways: 1) by evoking tension within continuous music pieces by delaying the release of harmonic progressions from dominant to tonic chords, and 2) by using isolated single complex chords with various degrees of dissonance/roughness. Musical tension was validated through subjective reports of tension, as well as physiological measurements of skin conductance response (SCR) and pupil dilation responses to the chords. In addition, music information retrieval (MIR) was used to quantify musical properties associated with tension and its release. Each experiment included an encoding phase, wherein individuals studied stimuli (words or images) with different musical conditions. Memory for the studied stimuli was tested 24 hours later via recognition tasks. In three separate experiments, we found positive relationships between tension perception and physiological measurements of SCR and pupil dilation. As for memory performance, we found that background music, in general, led to superior memory performance as compared to silence. 
We also detected a trade-off between tension perception and memory: individuals who perceived musical tension as such displayed reduced memory performance for images encoded during musical tension, whereas tense music benefited memory in those who were less sensitive to the perception of musical tension. Musical tension thus interacts in complex ways with perception, emotional responses, and cognitive performance in individuals with and without musical training. Delineating the conditions and mechanisms that underlie the interactions between musical tension and memory can benefit our understanding of musical perception at large and of the diverse effects that music has on the ongoing processing of declarative information.

Keywords: musical tension, declarative memory, learning and memory, musical perception

Procedia PDF Downloads 79
24114 Programmed Speech to Text Summarization Using Graph-Based Algorithm

Authors: Hamsini Pulugurtha, P. V. S. L. Jagadamba

Abstract:

Programmed speech-to-text and text summarization using graph-based algorithms can be used in meetings to obtain a short description of the meeting for future reference. The system performs signature verification using a Siamese neural network to confirm the identity of the user, then converts a user-provided audio recording in English into English text using the speech recognition package available in Python. Often only a summary of the meeting is required, and text summarization provides the solution: the transcript is summarized using natural language processing approaches such as unsupervised extractive text summarization algorithms.

Keywords: Siamese neural network, English speech, English text, natural language processing, unsupervised extractive text summarization

Procedia PDF Downloads 187
24113 The Implementation of Child Adoption as Legal Protection of Children

Authors: Sonny Dewi Judiasih

Abstract:

The purpose of a marriage is to establish a happy and lasting family based on the will of God. The family has a fundamental role in society, and the nuclear family consists of father, mother, and children. Thus, every family wishes to have children who will continue it. However, not every family is blessed with children, and a childless family will make every effort to fulfill its wish for children. One of these is to adopt a child. Child adoption is usually undertaken by families without children, but sometimes by families who already have children. Its implementation must be based on the interests of the welfare and development of the child, and on the social responsibility of the individual in accordance with the evolving traditional values that form part of the national culture. That child adoption is now conducted for the welfare of the child demonstrates a change in its basic motive (value): in the past, adoption served the wish of the foster parents to have children in the family, whereas today its purpose is not merely the interest of the foster parents but, in particular, the interest, welfare, and future of the child. The development of society has caused changes of perspective that create a need for new law, and the courts reflect these changes, as evidenced by court orders for child adoption issued within a framework of legal certainty. The change in the motives (values) of child adoption can be fully understood once society understands that the ultimate purpose of the Indonesian nation is to achieve a just and prosperous society, i.e., social welfare for all Indonesian people.

Keywords: child adoption, family law, legal protection, children

Procedia PDF Downloads 448
24112 Mitigating Food Insecurity and Malnutrition by Promoting Carbon Farming via a Solar-Powered Enzymatic Composting Bioreactor with Arduino-Based Sensors

Authors: Molin A., De Ramos J. M., Cadion L. G., Pico R. L.

Abstract:

Malnutrition and food insecurity represent significant global challenges affecting millions of individuals, particularly in low-income and developing regions. The researchers created a solar-powered enzymatic composting bioreactor with an Arduino-based monitoring system for pH, humidity, and temperature. It manages mixed municipal solid wastes, incorporating industrial enzymes and whey additives for accelerated composting and a minimized carbon footprint. Within 15 days, the bioreactor yielded 54.54% compost compared to 44.85% from traditional methods, increasing yield by nearly 10 percentage points. Tests showed that the bioreactor compost had 4.84% NPK and passed heavy-metal analysis standards, while the traditional pit compost had 3.86% NPK; both are suitable for agriculture. Statistical analyses, including ANOVA and Tukey's HSD test, revealed significant differences in agricultural yield across compost types based on leaf length, leaf width, and number of leaves. The study compared the effects of different composts on the growth of Brassica rapa subsp. chinensis (Pechay) and Brassica juncea (Mustasa). For Pechay, significant effects of compost type on leaf length (F(5,84) = 62.33, η² = 0.79) and leaf width (F(5,84) = 12.35, η² = 0.42) were found. For Mustasa, significant effects of compost type on leaf length (F(4,70) = 20.61, η² = 0.54), leaf width (F(4,70) = 19.24, η² = 0.52), and number of leaves (F(4,70) = 13.17, η² = 0.43) were observed. This study explores the effectiveness of the enzymatic composting bioreactor and its viability in promoting carbon farming as a solution to food insecurity and malnutrition.
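
The F and η² statistics quoted above come from one-way ANOVA, where η² = SS_between / SS_total. The sketch below reproduces that computation from raw group samples; the toy data are invented for illustration, not the study's measurements:

```python
def one_way_anova(groups):
    """One-way ANOVA from raw samples. Returns (F, eta_squared), where
    eta² = SS_between / SS_total is the effect size reported in the study."""
    all_vals = [x for g in groups for x in g]
    n = len(all_vals)
    grand = sum(all_vals) / n
    # between-group sum of squares: weighted squared deviations of group means
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_total = sum((x - grand) ** 2 for x in all_vals)
    ss_within = ss_total - ss_between
    df_between = len(groups) - 1
    df_within = n - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return f_stat, ss_between / ss_total
```

For example, two toy compost groups of leaf lengths `[1, 2, 3]` and `[4, 5, 6]` give F(1, 4) = 13.5 and η² ≈ 0.77.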

Keywords: malnutrition, food insecurity, enzymatic composting bioreactor, arduino-based monitoring system, enzymes, carbon farming, whey additive, NPK level

Procedia PDF Downloads 31
24111 A Novel Spectral Index for Automatic Shadow Detection in Urban Mapping Based on WorldView-2 Satellite Imagery

Authors: Kaveh Shahi, Helmi Z. M. Shafri, Ebrahim Taherzadeh

Abstract:

In remote sensing, shadow causes problems in many applications, such as change detection and classification. Shadows are cast by elevated objects and can directly affect the accuracy of extracted information. For these reasons, it is very important to detect shadows, particularly in urban high-spatial-resolution imagery, where they pose a significant problem. This paper focuses on automatic shadow detection based on a new spectral index for multispectral imagery, the Shadow Detection Index (SDI). The new spectral index was tested on different areas of WorldView-2 images, and the results demonstrate that it has great potential to extract shadows effectively and automatically.

Keywords: spectral index, shadow detection, remote sensing images, WorldView-2

Procedia PDF Downloads 510
24110 Assessment of the Performance of Fly Ash Based Geo-Polymer Concrete under Sulphate and Acid Attack

Authors: Talakokula Visalakshi

Abstract:

Concrete is the most commonly used construction material across the globe; its usage is second only to water. It is prepared using ordinary Portland cement, whose production contributes 5-8% of total carbon emissions in the world. On the other hand, the fly ash by-product from power plants is produced in huge quantities, termed waste, and disposed of in landfills. To address these issues, it is essential that other binding materials be developed in place of cement for making concrete. Geopolymer concrete is one such alternative, developed by Davidovits in the 1980s. Geopolymers do not form calcium silicate hydrates for matrix formation and strength but undergo polycondensation of silica and alumina precursors to attain structural strength. Their setting mechanism depends upon polymerization rather than hydration. As a result, geopolymer concrete is able to achieve its strength in 3-5 days, whereas ordinary concrete requires about a month to do the same. The objective of this research is to assess the performance of geopolymer concrete under sulphate and acid attack, based on experiments conducted on geopolymer concrete. If geopolymer concrete proves more durable than normal concrete, it could be a competitive replacement for concrete, lead to a significant reduction of the carbon footprint, and have a positive impact on the environment. Fly ash based geopolymer concrete offers an opportunity to completely remove the cement content from concrete, thereby making concrete a greener construction material of the future.

Keywords: fly ash, geo polymer, geopolymer concrete, construction material

Procedia PDF Downloads 475
24109 Automated Transformation of 3D Point Cloud to BIM Model: Leveraging Algorithmic Modeling for Efficient Reconstruction

Authors: Radul Shishkov, Orlin Davchev

Abstract:

The digital era has revolutionized architectural practices, with building information modeling (BIM) emerging as a pivotal tool for architects, engineers, and construction professionals. However, the transition from traditional methods to BIM-centric approaches poses significant challenges, particularly in the context of existing structures. This research introduces a technical approach to bridge this gap through the development of algorithms that facilitate the automated transformation of 3D point cloud data into detailed BIM models. The core of this research lies in the application of algorithmic modeling and computational design methods to interpret and reconstruct point cloud data (a collection of data points in space, typically produced by 3D scanners) into comprehensive BIM models. This process involves complex stages of data cleaning, feature extraction, and geometric reconstruction, which are traditionally time-consuming and prone to human error. By automating these stages, our approach significantly enhances the efficiency and accuracy of creating BIM models for existing buildings. The proposed algorithms are designed to identify key architectural elements within point clouds, such as walls, windows, doors, and other structural components, and to translate these elements into their corresponding BIM representations. This includes the integration of parametric modeling techniques to ensure that the generated BIM models are not only geometrically accurate but also embedded with essential architectural and structural information. Our methodology has been tested on several real-world case studies, demonstrating its capability to handle diverse architectural styles and complexities. The results showcase a substantial reduction in time and resources required for BIM model generation while maintaining high levels of accuracy and detail.
This research contributes significantly to the field of architectural technology by providing a scalable and efficient solution for the integration of existing structures into the BIM framework. It paves the way for more seamless and integrated workflows in renovation and heritage conservation projects, where the accuracy of existing conditions plays a critical role. The implications of this study extend beyond architectural practices, offering potential benefits in urban planning, facility management, and historic preservation.
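
The abstract does not disclose the extraction algorithms themselves; one standard building block for identifying planar elements such as walls and floors in point clouds is RANSAC plane fitting, sketched here under that assumption:

```python
import random

def plane_from_points(p1, p2, p3):
    """Plane n·x = d through three points; returns None if collinear."""
    ux, uy, uz = (p2[i] - p1[i] for i in range(3))
    vx, vy, vz = (p3[i] - p1[i] for i in range(3))
    n = (uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx)
    norm = sum(c * c for c in n) ** 0.5
    if norm == 0:
        return None
    n = tuple(c / norm for c in n)
    return n, sum(n[i] * p1[i] for i in range(3))

def ransac_plane(points, iters=200, tol=0.02, seed=0):
    """Find the dominant plane (e.g. a wall or floor) in a point cloud:
    repeatedly fit a plane to 3 random points and keep the fit with the
    most inliers within distance `tol`."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue
        n, d = model
        inliers = [p for p in points
                   if abs(sum(n[i] * p[i] for i in range(3)) - d) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers
```

In a full pipeline, the extracted plane's extent and orientation would then be translated into the parametric BIM element (wall, slab) the text describes.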

Keywords: BIM, 3D point cloud, algorithmic modeling, computational design, architectural reconstruction

Procedia PDF Downloads 35
24108 Faster, Lighter, More Accurate: A Deep Learning Ensemble for Content Moderation

Authors: Arian Hosseini, Mahmudul Hasan

Abstract:

To address the increasing need for efficient and accurate content moderation, we propose an efficient and lightweight deep classification ensemble structure. Our approach is based on a combination of simple visual features, designed for high-accuracy classification of violent content with low false positives. Our ensemble architecture utilizes a set of lightweight models with narrowed-down color features, and we apply it to both images and videos. We evaluated our approach using a large dataset of explosion and blast contents and compared its performance to popular deep learning models such as ResNet-50. Our evaluation results demonstrate significant improvements in prediction accuracy, while benefiting from 7.64x faster inference and lower computation cost. While our approach is tailored to explosion detection, it can be applied to other similar content moderation and violence detection use cases as well. Based on our experiments, we propose a "think small, think many" philosophy in classification scenarios. We argue that transforming a single, large, monolithic deep model into a verification-based step model ensemble of multiple small, simple, and lightweight models with narrowed-down visual features can possibly lead to predictions with higher accuracy.
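
The "verification-based step model ensemble" can be sketched as a short-circuiting cascade; in this hedged sketch the stage models are placeholder callables standing in for the lightweight color-feature classifiers the abstract describes:

```python
def step_ensemble(models, sample):
    """Verification-based step ensemble: each lightweight stage must
    confirm the positive (e.g. 'explosion') label. The first negative
    verdict short-circuits, so benign content is rejected cheaply and
    a positive is flagged only when every stage agrees, lowering
    false positives."""
    for model in models:
        if not model(sample):
            return False   # early exit on the first disagreeing stage
    return True
```

For example, with three placeholder stages thresholding a single toy feature, only samples passing all thresholds are flagged.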

Keywords: deep classification, content moderation, ensemble learning, explosion detection, video processing

Procedia PDF Downloads 27
24107 A Systemic Maturity Model

Authors: Emir H. Pernet, Jeimy J. Cano

Abstract:

Maturity models, used descriptively to explain changes in reality or normatively to guide managers in making interventions that render organizations more effective and efficient, are based on the principles of statistical quality control promulgated by Shewhart in the 1930s and on the principles of PDCA continuous improvement (Plan, Do, Check, Act) developed by Deming and Juran. Frameworks developed around the concept of maturity models include COBIT, CMM, and ITIL. This paper presents some limitations of traditional maturity models, most of them based on points of reflection and analysis made by other authors. Almost all of these limitations relate to the mechanistic and reductionist approach of the principles on which those models are built. As systems theory helps in understanding the dynamics of organizations and organizational change, the development of a systemic maturity model can help to overcome some of those limitations. This document proposes a systemic maturity model, based on a systemic conceptualization of organizations, focused on the study of the functioning of the parts, the relationships among them, and their behavior as a whole. From the systems theory perspective, maturity is conceptually defined as an emergent property of the organization that arises as a result of the degree of alignment and integration of its processes. This concept is operationalized through a systemic function that measures the maturity of an organization and is finally validated by measuring maturity in organizations. For its operationalization and validation, the model was applied to measure the maturity of organizational governance, risk, and compliance (GRC) processes.

Keywords: GRC, maturity model, systems theory, viable system model

Procedia PDF Downloads 294
24106 Oxidation and Reduction Kinetics of Ni-Based Oxygen Carrier for Chemical Looping Combustion

Authors: J. H. Park, R. H. Hwang, K. B. Yi

Abstract:

Carbon capture and storage (CCS) is an important technology for reducing CO₂ emissions from large stationary sources such as power plants. Among the carbon technologies for power plants, chemical looping combustion (CLC) has attracted much attention due to its higher thermal efficiency and lower cost of electricity. A CLC process consists of a fuel reactor and an air reactor, which are interconnected fluidized bed reactors. In the fuel reactor, an oxygen carrier (OC) is reduced by a fuel gas such as CH₄, H₂, or CO. The OC is then sent to the air reactor and oxidized by air or O₂ gas. The oxidation and reduction of the OC occur repeatedly between the two reactors. In a CLC system, a high concentration of CO₂ can easily be obtained from the fuel reactor by steam condensation alone. It is very important to understand the oxidation and reduction characteristics of the oxygen carrier in the CLC system in order to determine the solids circulation rate between the air and fuel reactors and the amount of solid bed material. In this study, we conducted experiments and interpreted the oxidation and reduction characteristics by observing the weight change of a Ni-based oxygen carrier in a TGA while varying gas concentration and temperature. The oxygen carrier was characterized by BET and SEM. The reaction rate increased with increasing temperature and increasing inlet gas concentration. We also compared the experimental results with a basic reaction kinetic model, the JMA model. The JMA model is a nucleation and nuclei growth model that can explain the delay time in the early part of the reaction. As a result, the model and experimental data agree over the examined conversion range and time, with an overall variance (R²) greater than 98%. We also calculated the activation energy, pre-exponential factor, and reaction order through the Arrhenius plot and compared them with previous Ni-based oxygen carriers.
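
The JMA (Avrami-type) kinetics referred to above have the standard form X(t) = 1 − exp(−(kt)ⁿ), whose exponent n > 1 reproduces the early delay (induction) time, with k following the Arrhenius law. The parameters in this sketch are illustrative, not the values fitted in the study:

```python
import math

def jma_conversion(t, k, n):
    """JMA conversion X(t) = 1 - exp(-(k*t)**n). For n > 1 the curve is
    nearly flat at small t, reproducing the observed induction period."""
    return 1.0 - math.exp(-((k * t) ** n))

def arrhenius(pre_exp, activation_energy, temperature, gas_const=8.314):
    """Rate constant k = A * exp(-Ea / (R*T)), Ea in J/mol, T in K."""
    return pre_exp * math.exp(-activation_energy / (gas_const * temperature))
```

For instance, at t = 1/k the conversion is 1 − e⁻¹ ≈ 0.632 regardless of n, while for n = 3 the conversion at t = 0.1/k is still below 1%, which is the delay-time behavior the abstract mentions.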

Keywords: chemical looping combustion, kinetic, nickel-based, oxygen carrier, spray drying method

Procedia PDF Downloads 190
24105 ICT-based Methodologies and Students’ Academic Performance and Retention in Physics: A Case with Newton Laws of Motion

Authors: Gabriel Ocheleka Aniedi A. Udo, Patum Wasinda

Abstract:

The study was carried out to appraise the impact of ICT-based teaching methodologies (video-taped instructions and PowerPoint presentations) on the academic performance and retention of secondary school students in Physics, with particular interest in Newton's laws of motion. The study was conducted in Cross River State, Nigeria, with a quasi-experimental research design using non-randomised pre-test and post-test control groups. The sample consisted of 176 SS2 students drawn from four intact classes in four secondary schools within the study area. The Physics Achievement Test (PAT), with a reliability coefficient of 0.85, was used for data collection. Means and analysis of covariance (ANCOVA) were used in the treatment of the obtained data. The results showed a significant difference in the academic performance and retention of students taught using video-taped instructions and those taught using PowerPoint presentations: students taught using video-taped instructions had higher academic performance and retention. The study concludes that the use of blended ICT-based teaching methods can improve learners' academic performance and retention.

Keywords: video-taped instruction (VTI), PowerPoint presentation (PPT), academic performance, retention, physics

Procedia PDF Downloads 62
24104 Clustered Regularly Interspaced Short Palindromic Repeat/cas9-Based Lateral Flow and Fluorescence Diagnostics for Rapid Pathogen Detection

Authors: Mark Osborn

Abstract:

Clustered, regularly interspaced short palindromic repeat (CRISPR/Cas) proteins can be designed to bind specified DNA and RNA sequences and hold great promise for the accurate detection of nucleic acids for diagnostics. Commercially available reagents were integrated into a CRISPR/Cas9-based lateral flow assay that can detect severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequences with single-base specificity. This approach requires minimal equipment and represents a simplified platform for field-based deployment. A rapid, multiplex fluorescence CRISPR/Cas9 nuclease cleavage assay capable of detecting and differentiating SARS-CoV-2, influenza A and B, and respiratory syncytial virus in a single reaction was also developed. These findings provide proof of principle for CRISPR/Cas9 point-of-care diagnosis that can detect specific SARS-CoV-2 strain(s). Further, Cas9 cleavage allows for a scalable fluorescent platform for identifying respiratory viral pathogens with overlapping symptomology. Collectively, this approach is a facile platform for diagnostics with broad application to user-defined sequence interrogation and detection.

Keywords: CRISPR/Cas9, lateral flow assay, SARS-CoV-2, single-nucleotide resolution

Procedia PDF Downloads 165
24103 The Return of Daily Life — Improvement Experiments on Urban Village in the Post-Urban Village Era

Authors: Gan Lu, Xu Lei

Abstract:

This is an era in which the urban village is disappearing in China. A series of social phenomena in the post-urban village era is forcing a rethinking of the future of the urban village. The existing monotonous urban renewal mode based on gentrification is being questioned, the social value of the urban village is gaining increasing attention, and the daily life and spatial power of the underclass are coming into focus. Based on a consensus on the positive meaning of the urban village phenomenon, social actors have undertaken numerous improvement experiments to explore the possibility of a modern transition of the urban village premised on its continued existence. These experiments reveal that tremendous urban changes greatly affect everyday social life, and they point out that the responsibility of architects and the definition of the urban need to be brought up for discussion again.

Keywords: post-urban village era, gentrification, social value, daily life, improvement experiment

Procedia PDF Downloads 490
24102 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method widely used in the data science community. It serves two main tasks: displaying results by coloring items according to their class or a feature value, and, for forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which not all neighbors in high-dimensional space can be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation in which cluster area is proportional to cluster size and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is described but not learned, and two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While highly scalable, this approach can map points to exactly the same position, making them indistinguishable, and such a model is unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once with each newly obtained embedding.
The successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. The method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing observation of the birth, evolution, and death of clusters. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of high-dimensional datasets' dynamics.
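
The claimed complexity reduction is simple arithmetic over pairwise similarity evaluations; a quick check, assuming n is divisible by k:

```python
def pairwise_cost(n):
    """Order of pairwise similarity evaluations a single t-SNE run
    needs for n points (quadratic complexity)."""
    return n * n

def split_cost(n, k):
    """Cost of embedding k subsets of n/k points each, reusing the
    support embedding: k * (n/k)^2 = n^2 / k evaluations."""
    m = n // k
    return k * m * m
```

For n = 10,000 points in k = 10 subsets, the count drops from 10⁸ to 10⁷ evaluations, a factor of k, matching the O(n²) to O(n²/k) claim.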

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 128
24101 Dimensioning of Circuit Switched Networks by Using Simulation Code Based On Erlang (B) Formula

Authors: Ali Mustafa Elshawesh, Mohamed Abdulali

Abstract:

This paper presents an approach to dimensioning circuit-switched networks and finding the relationships between network parameters under a specified probability of call blocking. We created simulation code based on the Erlang B formula that draws graphs, each showing two curves, one simulated and one calculated. These curves represent the relationships of the average number of calls and the average call duration with the probability of call blocking. The simulation code facilitates selecting appropriate parameters for circuit-switched networks.
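
The Erlang B blocking probability underlying the calculated curves can be computed with the standard numerically stable recursion; this is a sketch of the formula itself, not the authors' simulation code:

```python
def erlang_b(traffic_erlangs, channels):
    """Blocking probability B(E, m) via the recursion
        B_0 = 1,  B_j = E * B_{j-1} / (j + E * B_{j-1}),
    which avoids the large factorials of the closed-form Erlang B formula."""
    b = 1.0
    for j in range(1, channels + 1):
        b = traffic_erlangs * b / (j + traffic_erlangs * b)
    return b

def channels_needed(traffic_erlangs, max_blocking):
    """Smallest number of channels whose blocking probability does not
    exceed the target grade of service, the core dimensioning question."""
    m = 1
    while erlang_b(traffic_erlangs, m) > max_blocking:
        m += 1
    return m
```

For example, 2 Erlangs of offered traffic on 2 channels gives B = 0.4, and keeping blocking at or below 1% for that load requires 7 channels.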

Keywords: Erlang B formula, call blocking, telephone system dimension, Markov model, link capacity

Procedia PDF Downloads 587
24100 Autonomous Kuka Youbot Navigation Based on Machine Learning and Path Planning

Authors: Carlos Gordon, Patricio Encalada, Henry Lema, Diego Leon, Dennis Chicaiza

Abstract:

The following work presents a proposal for the autonomous navigation of mobile robots, implemented on an omnidirectional Kuka Youbot robot. We integrated the Robot Operating System (ROS) with machine learning algorithms, using two ROS distributions, ROS Hydro and ROS Kinetic. ROS Hydro manages the nodes for odometry, kinematics, and path planning, with statistical and probabilistic global and local algorithms based on Adaptive Monte Carlo Localization (AMCL) and Dijkstra. Meanwhile, ROS Kinetic is responsible for the detection of dynamic objects that may lie on the planned trajectory, obstructing the path of the Kuka Youbot. Detection is managed by an artificial vision module with a neural network trained on the Single Shot MultiBox Detector (SSD) system, where the main dynamic objects to detect are human beings and domestic animals, among others. When objects are detected, the system modifies the trajectory or waits on the movement of the dynamic obstacle. Finally, the obstacles are excluded from the planned trajectory, and the Kuka Youbot can reach its goal thanks to the machine learning algorithms.
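
A minimal version of the Dijkstra global planner mentioned above, on a 4-connected occupancy grid; this is a sketch only, since the ROS planner operates on weighted costmaps rather than binary grids:

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest collision-free path on a 4-connected occupancy grid
    (0 = free cell, 1 = obstacle). Returns the path as a list of
    (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    if goal not in dist:
        return None
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]
```

When the SSD detector reports a dynamic obstacle, its cells would be marked as occupied and the planner re-run, which is the replanning behavior the abstract describes.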

Keywords: autonomous navigation, machine learning, path planning, robotic operative system, open source computer vision library

Procedia PDF Downloads 156
24099 A Data-Mining Model for Protection of FACTS-Based Transmission Line

Authors: Ashok Kalagura

Abstract:

This paper presents a data-mining model for fault-zone identification in flexible AC transmission system (FACTS)-based transmission lines including a thyristor-controlled series compensator (TCSC) and a unified power-flow controller (UPFC), using ensembles of decision trees. Given the randomness of the ensemble of decision trees stacked inside the random forests model, it provides an effective decision for fault-zone identification. Half-cycle post-fault current and voltage samples from fault inception are used as the input vector, against a target output of ‘1’ for a fault after the TCSC/UPFC and ‘0’ for a fault before the TCSC/UPFC. The algorithm is tested on simulated fault data with wide variations in the operating parameters of the power system network, including a noisy environment, providing a reliability measure of 99% with a fast response time (3/4 of a cycle from fault inception). The results of the presented approach using the RF model indicate reliable identification of the fault zone in FACTS-based transmission lines.
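
The random-forest decision reduces to a majority vote over the trees' verdicts; in this sketch the trees are placeholder callables rather than trained models, and the class labels follow the abstract's convention:

```python
def forest_predict(trees, sample):
    """Random-forest style verdict: each tree votes 1 (fault after the
    TCSC/UPFC) or 0 (fault before it) on the half-cycle post-fault
    sample vector; the majority verdict is returned."""
    votes = [tree(sample) for tree in trees]
    return 1 if sum(votes) > len(votes) / 2 else 0
```

In the full model, each tree would be fitted on a bootstrap sample of the simulated fault records, which is the source of the randomness the abstract credits for the ensemble's robustness.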

Keywords: distance relaying, fault-zone identification, random forests, RFs, support vector machine, SVM, thyristor-controlled series compensator, TCSC, unified power-flow controller, UPFC

Procedia PDF Downloads 412
24098 Evaluation of a Method for the Virtual Design of a Software-based Approach for Electronic Fuse Protection in Automotive Applications

Authors: Dominic Huschke, Rudolf Keil

Abstract:

New driving functionalities such as highly automated driving have a major impact on the electrics/electronics architecture of future vehicles and inevitably lead to higher safety requirements. Partly due to these increased requirements, the vehicle industry is increasingly looking at semiconductor switches as an alternative to conventional melting fuses. The protective functionality of semiconductor switches can be implemented in hardware as well as in software. A current approach discussed in science and industry is the implementation of a model of the protected low-voltage power cable on a microcontroller to calculate its temperature. Here, the current information is provided by the continuous current measurement of the semiconductor switch, and the microcontroller issues the signal to open the switch when a previously defined limit for the temperature of the low-voltage power cable is exceeded. A setup for testing the described principle of electronic fuse protection of a low-voltage power cable is built and afterwards successfully validated with experiments. The evaluation criterion is the deviation of the measured temperature of the low-voltage power cable from the specified limit temperature when the semiconductor switch is opened; the analysis is carried out with an assumed as well as with a measured ambient temperature. Subsequently, the experimentally performed investigations are simulated in a virtual environment, with an explicit focus on simulating the behavior of the microcontroller, including its implemented model of a low-voltage power cable, in a real-time environment. The generated results are then compared with those of the experiments, and on this basis the completely virtual design of the described approach is assumed to be valid.
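The cable model at the core of this approach can be sketched as a first-order lumped thermal network: ohmic heating I²R charges a thermal capacitance, which leaks to ambient through a thermal resistance, and the switch opens when the computed temperature crosses the limit. The electrical resistance, thermal parameters, and trip limit below are illustrative assumptions, not values from the paper.

```python
def simulate_efuse(current_a, dt=0.01, t_limit=105.0, t_amb=25.0,
                   r_elec=0.005, r_th=8.0, c_th=60.0):
    """Euler integration of dT/dt = (I^2*R - (T - T_amb)/R_th) / C_th.
    Returns (time_to_trip_s, temp) -- trip when cable temperature exceeds t_limit,
    or (None, temp) if the cable settles below the limit within an hour."""
    temp, t = t_amb, 0.0
    while temp <= t_limit:
        p_loss = current_a ** 2 * r_elec            # ohmic heating in the cable
        temp += dt * (p_loss - (temp - t_amb) / r_th) / c_th
        t += dt
        if t > 3600.0:                              # steady state below the limit
            return None, temp
    return t, temp

trip_time, _ = simulate_efuse(60.0)      # overload: the software fuse must trip
no_trip, steady = simulate_efuse(20.0)   # nominal load: stays below the limit
```

With these assumed parameters the overload case reaches the limit after a few hundred seconds, while the nominal case settles near 41 °C; on a microcontroller the same update runs once per current-measurement sample.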

Keywords: automotive wire harness, electronic fuse protection, low voltage power cable, semiconductor-based fuses, software-based validation

Procedia PDF Downloads 90
24097 Development of Automated Quality Management System for the Management of Heat Networks

Authors: Nigina Toktasynova, Sholpan Sagyndykova, Zhanat Kenzhebayeva, Maksat Kalimoldayev, Mariya Ishimova, Irbulat Utepbergenov

Abstract:

Any business needs stable operation and continuous improvement; it is therefore necessary to constantly interact with the environment, to analyze the work of the enterprise from the standpoint of employees, executives, and consumers, and to correct any inconsistencies in individual processes and in their aggregate. In the case of heat-supply organizations, local legislation must be considered in addition to suppliers, since it is often the main regulator of service pricing. In this context, the process approach used to build a functional organizational structure in such businesses in Kazakhstan is a challenge not only in implementation but also in the analysis of employees' salaries. To solve these problems, we investigated the management system of a heat-supply enterprise, including strategic planning based on the balanced scorecard (BSC), quality management in accordance with the Quality Management System (QMS) standard ISO 9001, and analysis of the system based on expert judgment using fuzzy inference. To carry out our work we used the theory of fuzzy sets, the QMS in accordance with ISO 9001, the BSC according to the method of Kaplan and Norton, business-process modeling in the IDEF0 notation, Matlab simulation tools, and LabVIEW graphical programming. The results of the work are as follows: we determined possibilities for improving the management of a heat-supply plant based on the QMS; after justification and adaptation, a software tool was used to automate a series of management functions, to reduce resource use, and to keep the system up to date; and an application for the analysis of the QMS based on fuzzy inference was created, with a novel organization of communication between the software and the application, enabling the analysis of relevant enterprise-management data.
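The fuzzy-inference analysis step can be illustrated with a minimal sketch using triangular membership functions and singleton rule outputs. The single "process conformity" input, the membership shapes, and the rule consequents below are illustrative assumptions, not the authors' actual QMS model.

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def qms_score(conformity):
    """Map a 0-100 process-conformity rating to a 0-10 quality score
    via fuzzification, singleton rules, and weighted-average defuzzification."""
    low = tri(conformity, -1, 0, 50)
    med = tri(conformity, 20, 50, 80)
    high = tri(conformity, 50, 100, 101)
    # Rules: low conformity -> poor (2), medium -> fair (5), high -> good (9).
    num = low * 2 + med * 5 + high * 9
    den = low + med + high
    return num / den if den else None

score = qms_score(75)   # partially medium, partially high conformity
```

A conformity of 75 fires the "medium" rule at degree 1/6 and the "high" rule at degree 1/2, blending to a score of 8.0; the same machinery extends to multiple expert-judgment inputs and a full rule base.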

Keywords: balanced scorecard, heat supply, quality management system, the theory of fuzzy sets

Procedia PDF Downloads 347
24096 Defence Ethics: A Performance Measurement Framework for the Defence Ethics Program

Authors: Allyson Dale, Max Hlywa

Abstract:

The Canadian public expects the highest moral standards from Canadian Armed Forces (CAF) members and Department of National Defence (DND) employees. The Chief, Professional Conduct and Culture (CPCC) stood up in April 2021 with the mission of ensuring that the defence culture and members’ conduct are aligned with the ethical principles and values that the organization aspires towards. The Defence Ethics Program (DEP), which stood up in 1997, is a values-based ethics program for individuals and organizations within the DND/CAF and now falls under CPCC. The DEP is divided into five key functional areas: policy, communications, collaboration, training and education, and advice and guidance. The main focus of the DEP is to foster an ethical culture within defence so that members and organizations perform to the highest ethical standards. The measurement of organizational ethics is often complex and challenging. In order to monitor whether the DEP is achieving its intended outcomes, a performance measurement framework (PMF) was developed using the Director General Military Personnel Research and Analysis (DGMPRA) PMF development process, an evidence-based process drawing on subject-matter expertise from the defence team. The goal of this presentation is to describe each stage of the DGMPRA PMF development process and to present and discuss the products of the DEP PMF (e.g., the logic model). Specifically, first, a strategic framework was developed to provide a high-level overview of the strategic objectives, mission, and vision of the DEP. Next, Key Performance Questions were created based on the objectives in the strategic framework. A logic model detailing the activities, outputs (what is produced by the program activities), and intended outcomes of the program was developed to demonstrate how the program works. Finally, Key Performance Indicators were developed based on both the intended outcomes in the logic model and the Key Performance Questions in order to monitor program effectiveness. The Key Performance Indicators measure aspects of organizational ethics such as ethical conduct and decision-making, DEP collaborations, and knowledge and awareness of the Defence Ethics Code, while leveraging ethics-related items from multiple DGMPRA surveys where appropriate.

Keywords: defence ethics, ethical culture, organizational performance, performance measurement framework

Procedia PDF Downloads 82
24095 Artificial Intelligence Based Meme Generation Technology for Engaging Audience in Social Media

Authors: Andrew Kurochkin, Kostiantyn Bokhan

Abstract:

In this study, a new meme dataset of ~650K meme instances was created, a meme-generation technology based on a state-of-the-art deep learning model (GPT-2) was researched, and a comparative analysis of machine-generated and human-created memes was conducted. We justified that Amazon Mechanical Turk workers can be used for the approximate estimation of users' behavior in a social network, more precisely to measure engagement. It was shown that generated memes cause the same engagement as human-created memes that historically produced low engagement in the social network. Thus, generated memes are less engaging than random memes created by humans.

Keywords: content generation, computational social science, memes generation, Reddit, social networks, social media interaction

Procedia PDF Downloads 114
24094 Scheduling in Cloud Networks Using Chakoos Algorithm

Authors: Masoumeh Ali Pouri, Hamid Haj Seyyed Javadi

Abstract:

Nowadays, cloud processing is one of the important issues in information technology. Since the scheduling of a task graph is an NP-hard problem, approaches based on non-deterministic methods such as evolutionary computation, mostly genetic and cuckoo algorithms, are effective. Therefore, an efficient algorithm has been proposed for scheduling a task graph to obtain an appropriate schedule with minimum time. In this algorithm, the new approach is based on shortening the critical path and reducing the communication cost. Finally, the results obtained from the implementation of the presented method show that this algorithm performs the same as other algorithms on graphs without communication cost, and performs more quickly and better than algorithms such as the DSC and MCP algorithms on graphs involving communication cost.
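The objective the algorithm works to minimize — the critical-path length of the task graph, where edge costs are paid only when an edge crosses processors — can be computed as below. The small task graph, its weights, and the two-processor assignment are illustrative assumptions, not part of the proposed method.

```python
from functools import lru_cache

# Task graph: node -> compute cost; edge (u, v) -> communication cost.
compute = {"A": 2, "B": 3, "C": 4, "D": 2}
edges = {("A", "B"): 5, ("A", "C"): 1, ("B", "D"): 2, ("C", "D"): 3}

def path_length(assignment):
    """Longest path through the DAG under a processor assignment; an edge's
    communication cost is paid only when it crosses processors, so placing
    both endpoints on one processor 'zeroes' the edge (clustering)."""
    succ = {}
    for (u, v), w in edges.items():
        succ.setdefault(u, []).append((v, w))

    @lru_cache(maxsize=None)
    def longest_from(u):
        tails = [(w if assignment[u] != assignment[v] else 0) + longest_from(v)
                 for v, w in succ.get(u, [])]
        return compute[u] + max(tails, default=0)

    return max(longest_from(u) for u in compute)

all_one = path_length({t: 0 for t in compute})          # clustering zeroes every edge
split = path_length({"A": 0, "B": 1, "C": 0, "D": 0})   # A->B and B->D now cross
```

A search method such as a genetic or cuckoo algorithm explores the space of assignments, using a function like `path_length` as the fitness to be minimized.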

Keywords: cloud computing, scheduling, tasks graph, chakoos algorithm

Procedia PDF Downloads 44
24093 Study on an Integrated Real-Time Sensor in Droplet-Based Microfluidics

Authors: Tien-Li Chang, Huang-Chi Huang, Zhao-Chi Chen, Wun-Yi Chen

Abstract:

Droplet-based microfluidics is used to form micro-reactors for chemical and biological assays; hence, the precise addition of reagents into the droplets is essential for this function in the scope of lab-on-a-chip applications. To obtain the characteristics of droplets (size, velocity, pressure, and frequency of production), this study describes an integrated on-chip method of real-time signal detection. By controlling and manipulating the fluids, the flow behavior in the droplet-based microfluidics can be obtained. The detection method uses a type of infrared sensor. Through the variation of droplets in the microfluidic devices, the real-time velocity and pressure conditions are obtained from the sensors. Here, the microfluidic devices are fabricated from polydimethylsiloxane (PDMS). To measure the droplets, sensor signal acquisition and LabVIEW program control are established for the microchannel devices. The devices can generate droplets of different sizes, where the flow rate of the oil phase is fixed at 30 μl/hr and the flow rates of the water phase range from 20 μl/hr to 80 μl/hr. The experimental results demonstrate that the sensors are able to measure the time difference of droplets at different velocities at voltages from 0 V to 2 V. Consequently, a fastest droplet speed of 1.6 mm/s and the related flow behaviors were measured, which can help develop and integrate practical microfluidic applications.
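The velocity measurement described here reduces to a time-of-flight calculation between two sensor events, and the production frequency to the inverse of the droplet-to-droplet period. The 250 µm sensing-point spacing and the timestamps below are illustrative assumptions chosen to reproduce the abstract's 1.6 mm/s figure.

```python
def droplet_metrics(t1_s, t2_s, spacing_m, period_s):
    """Velocity from the transit time between two IR-sensor edges,
    and production frequency from the droplet-to-droplet period."""
    velocity = spacing_m / (t2_s - t1_s)   # m/s
    frequency = 1.0 / period_s             # droplets per second
    return velocity, frequency

# A droplet crosses two sensing points 0.25 mm apart in 156.25 ms,
# with a new droplet arriving every 0.5 s.
v, f = droplet_metrics(0.0, 0.15625, 250e-6, 0.5)
v_mm_s = v * 1e3
```

In the actual setup the two timestamps would come from threshold crossings in the acquired infrared-sensor signal (e.g., inside the LabVIEW acquisition loop).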

Keywords: microfluidic, droplets, sensors, single detection

Procedia PDF Downloads 471
24092 Savinglife®: An Educational Technology for Basic and Advanced Cardiovascular Life Support

Authors: Naz Najma, Grace T. M. Dal Sasso, Maria de Lourdes de Souza

Abstract:

The development of information and communication technologies and the accessibility of mobile devices have increased the possibilities of teaching and learning anywhere and anytime. Mobile and web applications allow the production of constructive teaching and learning models in various educational settings, showing the potential for active learning in nursing. The objective of this study was to present the development of an educational technology (Savinglife®, an app) for learning cardiopulmonary resuscitation and advanced cardiovascular life support training. Savinglife® is a technological production based on the concepts of virtual learning and the problem-based learning approach. The study was developed from January 2016 to November 2016, using the five phases (analyze, design, develop, implement, evaluate) of the instructional systems development process. The technology presents 10 scenarios and 12 simulations, covering different aspects of basic and advanced cardiac life support. The contents can be accessed in a non-linear way, leaving students free to build their knowledge based on their previous experience. Each scenario is presented through interactive tools such as scenario description, assessment, diagnosis, intervention, and reevaluation. Animated ECG rhythms, text documents, images, and videos are provided to support procedural and active learning reflecting real-life situations. Accessible equally on small to large devices, with or without an internet connection, Savinglife® offers a dynamic, interactive, and flexible tool, placing students at the center of the learning process. Savinglife® can contribute to students' learning in the assessment and management of basic and advanced cardiac life support in a safe and ethical way.

Keywords: problem-based learning, cardiopulmonary resuscitation, nursing education, advanced cardiac life support, educational technology

Procedia PDF Downloads 288
24091 Optimum Structural Wall Distribution in Reinforced Concrete Buildings Subjected to Earthquake Excitations

Authors: Nesreddine Djafar Henni, Akram Khelaifia, Salah Guettala, Rachid Chebili

Abstract:

Reinforced concrete shear walls and vertical plate-like elements play a pivotal role in efficiently managing a building's response to seismic forces. This study investigates how the performance of reinforced concrete buildings equipped with shear walls featuring different shear wall-to-frame stiffness ratios aligns with the requirements stipulated in the Algerian seismic code RPA99v2003, particularly in high-seismicity regions. Seven distinct 3D finite element models are developed and evaluated through nonlinear static analysis. Engineering Demand Parameters (EDPs) such as lateral displacement, inter-story drift ratio, shear force, and bending moment along the building height are analyzed. The findings reveal two predominant categories of induced responses: force-based and displacement-based EDPs. Furthermore, as the shear wall-to-frame ratio increases, there is a concurrent increase in force-based EDPs and a decrease in displacement-based ones. Examining the distribution of shear walls from both force and displacement perspectives, model G with the highest stiffness ratio, concentrating stiffness at the building's center, intensifies induced forces. This configuration necessitates additional reinforcements, leading to a conservative design approach. Conversely, model C, with the lowest stiffness ratio, distributes stiffness towards the periphery, resulting in minimized induced shear forces and bending moments, representing an optimal scenario with maximal performance and minimal strength requirements.
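The displacement-based EDP discussed above, the inter-story drift ratio, is computed from the lateral-displacement profile along the building height. The story height and displacement values below are illustrative assumptions, not results from the seven analyzed models.

```python
def drift_ratios(displacements_mm, story_height_mm):
    """Inter-story drift ratio per story from a ground-to-roof
    lateral-displacement profile (one entry per floor level)."""
    return [(top - bot) / story_height_mm
            for bot, top in zip(displacements_mm, displacements_mm[1:])]

# Lateral displacements at ground..roof of a 4-story frame (mm), 3000 mm stories.
disp = [0.0, 6.0, 14.0, 20.0, 24.0]
drifts = drift_ratios(disp, 3000.0)
max_drift = max(drifts)          # peak inter-story drift ratio
critical_story = drifts.index(max_drift) + 1
```

In a pushover study this calculation is repeated at each load step and for each wall configuration, and the peak drift profile is compared against the code limit (e.g., the RPA99v2003 drift cap).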

Keywords: dual RC buildings, RC shear walls, modeling, static nonlinear pushover analysis, optimization, seismic performance

Procedia PDF Downloads 38
24090 Comparative Growth Kinetic Studies of Two Strains Saccharomyces cerevisiae Isolated from Dates and a Commercial Strain

Authors: Nizar Chaira

Abstract:

Dates, the main products of the oases, are considered a highly nutritious fruit owing to their therapeutic interest. Several studies on the biotechnological valorization and processing of dates have been made, and several products have already been prepared. Isolating the yeast Saccharomyces cerevisiae naturally present in date scraps, optimizing its growth in a medium based on date syrup, and producing biomass can potentially expand the range of secondary date products. To this end, this paper studies the suitability of dates for technological and biotechnological processing, using date pulp as a carbon source for biological transformation. Two strains of Saccharomyces cerevisiae isolated from date syrup (S1, S2) and a commercial strain were used for this study. After optimization of the culture conditions, production in a fermenter on two different media (date syrup and beet molasses) was performed, followed by a study of the kinetics of growth, protein production, and sugar consumption in cultures of strains 1 and 2 and the commercial strain on both media. The results obtained show that a concentration of 2% sugar, 2.5 g/l yeast extract, pH 4.5, and a temperature between 25 and 35 °C are the optimal conditions for cultivation in a bioreactor. In the exponential phase, the specific growth rate of strain 1 is about 0.3625 h⁻¹ on the date-syrup medium and 0.3521 h⁻¹ on beet molasses, with a generation time of 1.912 h; on the date-syrup medium, the yeast preferentially consumes the reducing sugars. Protein production enters its exponential phase when the medium starts to run out of reducing sugars. For strain 2, the specific growth rate is about 0.261 h⁻¹ on the date-syrup medium and 0.207 h⁻¹ on beet molasses; on the date-syrup medium this yeast also preferentially consumes the reducing sugars. Invertase and the other metabolites increase rapidly after exhaustion of the reducing sugars. A comparison of productivity between the three strains on the date-syrup medium shows that the maximum value is obtained with strain 2: p = 1.072 g/l/h, versus 0.923 g/l/h for strain 1 and 0.644 g/l/h for the commercial strain. Thus, the isolates from date syrup are more competitive than the commercial strain and can give the same performance in a shorter time, with an energy gain.
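The reported growth parameters are linked by the standard exponential-growth relations, μ = ln(X₁/X₀)/Δt and generation time t_g = ln 2 / μ. The sketch below checks the strain-1 figures (μ ≈ 0.3625 h⁻¹ giving t_g ≈ 1.912 h); the biomass readings fed to the μ calculation are illustrative, not measured values from the study.

```python
import math

def specific_growth_rate(x0, x1, dt_h):
    """mu = ln(X1/X0) / dt for two exponential-phase biomass readings (g/l)."""
    return math.log(x1 / x0) / dt_h

def generation_time(mu):
    """Doubling time t_g = ln(2) / mu."""
    return math.log(2.0) / mu

# Illustrative readings: biomass doubles from 1.0 to 2.0 g/l in 1.912 h.
mu = specific_growth_rate(1.0, 2.0, 1.912)
t_gen = generation_time(0.3625)   # strain 1 on the date-syrup medium
```

The two numbers are mutually consistent: a doubling every 1.912 h corresponds to a specific growth rate of about 0.3625 h⁻¹, and vice versa.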

Keywords: date palm, fermentation, molasses, Saccharomyces, syrup

Procedia PDF Downloads 305
24089 Estimation of Thermal Conductivity of Nanofluids Using MD-Stochastic Simulation-Based Approach

Authors: Sujoy Das, M. M. Ghosh

Abstract:

The thermal conductivity of a fluid can be significantly enhanced by dispersing nano-sized particles in it, and the resultant fluid is termed a "nanofluid". A theoretical model for estimating the thermal conductivity of a nanofluid has been proposed here. It is based on the mechanism that evenly dispersed nanoparticles within a nanofluid undergo Brownian motion, in the course of which they repeatedly collide with the heat source; during each collision a rapid heat transfer occurs owing to the solid-solid contact. Molecular dynamics (MD) simulation of the collision of nanoparticles with the heat source has shown that there is a pulse-like pick-up of heat by the nanoparticles within 20-100 ps, the extent of which depends not only on the thermal conductivity of the nanoparticles but also on their elastic and other physical properties. After the collision, the nanoparticles undergo Brownian motion in the base fluid and release the excess heat to the surrounding base fluid within 2-10 ms. The Brownian motion and the associated temperature variation of the nanoparticles have been modeled by stochastic analysis. Repeated occurrence of these events by the suspended nanoparticles significantly contributes to the characteristic thermal conductivity of the nanofluids, which has been estimated by the present model for an ethylene glycol-based nanofluid containing Cu nanoparticles of size ranging from 8 to 20 nm, with a Gaussian size distribution. The prediction of the present model shows reasonable agreement with the experimental data available in the literature.
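The collision-then-release cycle described above can be sketched as a stochastic process: a nanoparticle picks up a heat pulse on contact with the source, then sheds the excess temperature exponentially while diffusing through the base fluid. The time constants, pulse size, and step counts below are illustrative assumptions, not fitted parameters of the proposed model.

```python
import math
import random

def particle_cycle(steps, dt_s, tau_s, dT0, seed=1):
    """Track one nanoparticle after a collision: a 1-D Brownian walk
    superposed with exponential release of the excess temperature dT."""
    rng = random.Random(seed)
    x = 0.0    # position, arbitrary units
    dT = dT0   # excess temperature from the collision pulse (K)
    for _ in range(steps):
        x += rng.gauss(0.0, math.sqrt(dt_s))   # Wiener increment
        dT *= math.exp(-dt_s / tau_s)          # heat released to the base fluid
    return x, dT

# Pulse picked up at the heat source, released over a ~5 ms time constant.
_, residual = particle_cycle(steps=1000, dt_s=5e-5, tau_s=5e-3, dT0=10.0)
```

After ten time constants the excess temperature is essentially gone, matching the picture of full heat release within 2-10 ms before the next collision; averaging many such cycles gives the Brownian contribution to the effective conductivity.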

Keywords: Brownian dynamics, molecular dynamics, nanofluid, thermal conductivity

Procedia PDF Downloads 360
24088 Assessing the Attitude and Belief towards Online Advertisement in Pakistan and China Mainland

Authors: Prih Bukhari

Abstract:

The purpose of the proposed paper is to determine whether the perception of online advertisement formed by attitudes and beliefs varies between two different countries. Specifically, it seeks to find out how people from China and Pakistan perceive online advertisement. Public attitudes and beliefs towards advertising have been a focus of attention in exploring a path to a better advertising strategy. The ‘belief’ factor was analyzed through items including product information, entertainment, and increase in the economy, whereas the ‘attitude’ factor was analyzed through questions based on four items: ‘overall, I consider online advertising a good thing’; ‘overall, I like online advertising’; ‘I consider online advertising very essential’; and ‘I would describe my overall attitude toward online advertising very favorably’. As such, the study provides a theoretical basis to explain similarities and differences in beliefs and attitudes towards advertising across the two countries. A mixed-method approach, combining a quantitative questionnaire-based survey with qualitative focus-group interviews, was used to carry out the research. The sample size was 500 participants; 497 survey copies were received for analysis, and focus-group interviews were conducted in both countries. The findings showed that in both countries the belief factor had no significant relation with the perception of online advertisement, whereas attitude had a significant relation with it. It was also observed that, despite different backgrounds, perceptions of online advertisement based on beliefs and attitudes were largely similar. Implications and future studies are provided.

Keywords: attitude, belief, online advertisement, perception

Procedia PDF Downloads 130