Search results for: tool tuning
4683 Optimizing the Efficiency of Measuring Instruments in Ouagadougou-Burkina Faso
Authors: Moses Emetere, Marvel Akinyemi, S. E. Sanni
Abstract:
At the moment, the AERONET and AMMA databases show a large volume of data loss. With only about 47% of the data set available to scientists, it is evident that accurate nowcasts or forecasts cannot be guaranteed. The calibration constants of most radiosondes or weather stations are not compatible with the atmospheric conditions of the West African climate. A dispersion model was developed to incorporate salient mathematical representations such as a Unified number. The Unified number was derived to describe the turbulence of aerosol transport in the frictional layer of the lower atmosphere. A fourteen-year data set from the Multi-angle Imaging SpectroRadiometer (MISR) was tested using the dispersion model. A yearly estimation of the atmospheric constants over Ouagadougou using the model was obtained with about 87.5% accuracy. It further revealed that the average atmospheric constants for Ouagadougou are a_1 = 0.626 and a_2 = 0.7999, and the tuning constants are n_1 = 0.09835 and n_2 = 0.266. Also, the yearly atmospheric constants affirmed that the lower atmosphere of Ouagadougou is very dynamic. Hence, it is recommended that radiosonde and weather station manufacturers constantly review the atmospheric constants over a geographical location to enable about eighty percent data retrieval.
Keywords: aerosols retention, aerosols loading, statistics, analytical technique
Procedia PDF Downloads 315
4682 Estimation of the Curve Number and Runoff Height Using the Arc CN-Runoff Tool in Sartang Ramon Watershed in Iran
Authors: L. Jowkar, M. Samiee
Abstract:
Models or systems based on rainfall and runoff are numerous and have been formulated and applied depending on the precipitation regime, temperature, and climate. In this study, the ArcCN-Runoff rain-runoff modeling tool was used to estimate the spatial variability of the rainfall-runoff relationship in the Sartang Ramon watershed in Jiroft, Iran. Runoff was estimated from a 6-hour rainfall. The results showed that, based on the hydrological soil group map, soils with hydrological groups A, B, C, and D covered 1, 2, 55, and 41% of the basin, respectively. Given that the majority of the area has a slope above 60 percent, and considering the soil hydrologic groups, one can conclude that the Sartang Ramon Basin has a relatively high potential for producing runoff. The average runoff height for a 6-hour rainfall with a 2-year return period is 26.6 mm. The runoff volume for the 2-year return period was calculated as the runoff height of each polygon multiplied by the area of the polygon, which amounts to 137,913,486 m³ for the whole basin.
Keywords: ArcCN-Runoff, rain-runoff, return period, watershed
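The curve-number method that ArcCN-Runoff implements rests on the standard NRCS runoff equation, which can be sketched as follows. This is a generic illustration of the method, not code from the study; the curve number, storm depth, and initial-abstraction ratio in the example are hypothetical.

```python
def scs_runoff_mm(p_mm: float, cn: float, ia_ratio: float = 0.2) -> float:
    """Runoff depth Q (mm) from rainfall P (mm) via the NRCS curve-number method.

    S is the potential maximum retention; the initial abstraction is
    Ia = ia_ratio * S (0.2 is the classical assumption).
    Returns 0 when P <= Ia (all rainfall is abstracted).
    """
    s = 25400.0 / cn - 254.0          # retention S in mm for curve number CN
    ia = ia_ratio * s                 # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Hypothetical example: a 60 mm storm on a soil-cover complex with CN = 85
q = scs_runoff_mm(60.0, 85.0)
```

Multiplying the returned runoff depth (converted to metres) by each polygon's area and summing, as the study does, yields the basin runoff volume.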
Procedia PDF Downloads 127
4681 Mechanical Properties of D2 Tool Steel Cryogenically Treated Using Controllable Cooling
Authors: A. Rabin, G. Mazor, I. Ladizhenski, R. Shneck, Z.
Abstract:
The hardness and hardenability of AISI D2 cold work tool steel under conventional quenching (CQ), deep cryogenic quenching (DCQ), and rapid deep cryogenic quenching heat treatments, the latter enabled by a temporary porous coating based on magnesium sulfate, were investigated. Each of the cooling processes was examined from the perspective of full process efficiency and heat flux in the austenite-martensite transformation range, followed by characterization of the temporary porous magnesium sulfate layer using confocal laser scanning microscopy (CLSM), and of surface and core hardness and hardenability using the Vickers hardness technique. The results show that the cooling rate (CR) in the austenite-martensite transformation range has a high influence on the hardness of the studied steel.
Keywords: AISI D2, controllable cooling, magnesium sulfate coating, rapid cryogenic heat treatment, temporary porous layer
Procedia PDF Downloads 137
4680 The Acceptable Roles of Artificial Intelligence in the Judicial Reasoning Process
Authors: Sonia Anand Knowlton
Abstract:
There are some cases where we as a society feel deeply uncomfortable with the use of Artificial Intelligence (AI) tools in the judicial decision-making process, and justifiably so. A perfect example is COMPAS, an algorithmic model that predicts recidivism rates of offenders to assist in the determination of their bail conditions. COMPAS turned out to be extremely racist: it massively overpredicted recidivism rates of Black offenders and underpredicted recidivism rates of white offenders. At the same time, there are certain uses of AI in the judicial decision-making process that many would feel more comfortable with and even support. Take, for example, a “super-breathalyzer,” an (albeit imaginary) tool that uses AI to deliver highly detailed information about the subject of the breathalyzer test to the legal decision-makers analyzing their drunk-driving case. This article evaluates the point at which a judge’s use of AI tools begins to undermine the public’s trust in the administration of justice. It argues that the answer to this question depends on whether the AI tool is in a role in which it must perform a moral evaluation of a human being.
Keywords: artificial intelligence, judicial reasoning, morality, technology, algorithm
Procedia PDF Downloads 86
4679 Software User Experience Enhancement through Collaborative Design
Authors: Shan Wang, Fahad Alhathal, Daniel Hobson
Abstract:
User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper reports on a 6-month knowledge exchange collaboration project between an academic institution and an industry partner in 2023, which aimed to improve the user experience of a digital platform used as a knowledge management tool: to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. The research implemented user-centered design through co-design workshops for testing user onboarding experiences, one of the most effective methods because it involves the active participation of users in the design process. More specifically, in January 2023, we organized eight workshops with a diverse group of 11 individuals. Throughout these sessions, we accumulated a total of 11 hours of qualitative data in both video and audio formats. Subsequently, we conducted an analysis of user journeys, identifying common issues and potential areas for improvement. This analysis was pivotal in guiding the knowledge management software in prioritizing feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software management tool company. These solutions were targeted at refining onboarding user experiences, workplace interfaces, and interactive design. Some of these design solutions were translated into tangible interfaces for the knowledge management tool.
By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.
Keywords: user experiences, co-design, design process, knowledge management tool, user-centered design
Procedia PDF Downloads 68
4678 Creation of a Clinical Tool for Diagnosis and Treatment of Skin Disease in HIV Positive Patients in Malawi
Authors: Alice Huffman, Joseph Hartland, Sam Gibbs
Abstract:
Dermatology is often a neglected specialty in low-resource settings, despite the high morbidity associated with skin disease. This becomes even more significant when associated with HIV infection, as dermatological conditions are more common and aggressive in HIV positive patients. African countries have the highest HIV infection rates, and skin conditions are frequently misdiagnosed and mismanaged because of a lack of dermatological training and educational material. The frequent lack of diagnostic tests in the African setting renders basic clinical skills all the more vital. This project aimed to improve the diagnosis and treatment of skin disease in the HIV population in a district hospital in Malawi. A basic dermatological clinical tool was developed and produced in collaboration with local staff, based on the available literature and data collected from clinics. The aim was to improve diagnostic accuracy and provide guidance for the treatment of skin disease in HIV positive patients. A literature search within Embase, Medline, and Google Scholar was performed and supplemented with data obtained from attending five antiretroviral clinics. From the literature, conditions were selected for inclusion in the resource if they were described as specific to, more prevalent in, or more extensive in the HIV population, or as having more adverse outcomes if they develop in HIV patients. Resource-appropriate treatment options were decided using Malawian Ministry of Health guidelines and textbooks specific to African dermatology. After the collection of data and discussion with local clinical and pharmacy staff, a list of 15 skin conditions was included and a booklet created using the simple layout of a picture, a diagnostic description of the disease, and treatment options. Clinical photographs were collected from local clinics (with the full consent of the patient) or from the book ‘Common Skin Diseases in Africa’ (permission granted if fully acknowledged and used in a not-for-profit capacity).
This tool was evaluated by the local staff, alongside an educational teaching session on skin disease. This project aimed to reduce uncertainty in diagnosis and provide guidance for appropriate treatment in HIV patients by gathering information into one practical and manageable resource. To further this project, we hope to review the effectiveness of the tool in practice.
Keywords: dermatology, HIV, Malawi, skin disease
Procedia PDF Downloads 206
4677 A Method to Identify Areas for Hydraulic Fracturing by Using Production Logging Tools
Authors: Armin Shirbazo, Hamed Lamei Ramandi, Mohammad Vahab, Jalal Fahimpour
Abstract:
Hydraulic fracturing, especially multi-stage hydraulic fracturing, is a practical solution for wells with uneconomic production. The wide range of applications must be appraised appropriately to achieve stable well production. The production logging tool, known as the PLT in the oil and gas industry, is counted among the most reliable methods to evaluate the efficiency of fracturing jobs. This tool has a number of benefits and can be used to prevent subsequent production failure. It also distinguishes different problems that occur during well production. In this study, the effectiveness of hydraulic fracturing jobs is examined by using the PLT in various cases and situations. The performance of hydraulically fractured wells is investigated. Then, the PLT is employed to give more information about the properties of different layers. The PLT is also used to select an optimum fracturing design. The results show that one-stage and three-stage fractures behave differently. In general, the one-stage fracture should be created in high-quality areas of the reservoir to obtain better performance; conversely, for three-stage fractures, low-quality areas are a better candidate for fracturing.
Keywords: multi-stage fracturing, horizontal well, PLT, fracture length, number of stages
Procedia PDF Downloads 195
4676 Carbide Structure and Fracture Toughness of High Speed Tool Steels
Authors: Jung-Ho Moon, Tae Kwon Ha
Abstract:
M2 steels, the typical Co-free high speed steels (HSS) possessing a hardness level of 63~65 HRc, are most widely used for cutting tools. On the other hand, Co-containing HSSs, such as M35 and M42, show a higher hardness level of 65~67 HRc and are used for high quality cutting tools. In the fabrication of HSSs, it is very important to control the cleanliness and eutectic carbide structure of the ingot, and it is required to increase productivity at the same time. Production of HSS ingots includes a variety of processes such as casting, electro-slag remelting (ESR), forging, blooming, and wire rod rolling. In the present study, the electro-slag rapid remelting (ESRR) process, an advanced ESR process combined with continuous casting, was successfully employed to fabricate HSS billets of M2, M35, and M42 steels. The distribution and structure of eutectic carbides in the billets were analysed, and the cleanliness, hardness, and composition profile of the billets were also evaluated.
Keywords: high speed tool steel, eutectic carbide, microstructure, hardness, fracture toughness
Procedia PDF Downloads 445
4675 A Rapid Prototyping Tool for Suspended Biofilm Growth Media
Authors: Erifyli Tsagkari, Stephanie Connelly, Zhaowei Liu, Andrew McBride, William Sloan
Abstract:
Biofilms play an essential role in treating water in biofiltration systems. The biofilm morphology and function are inextricably linked to the hydrodynamics of flow through a filter, and yet engineers rarely explicitly engineer this interaction. We develop a system that links computer simulation and 3-D printing to optimize and rapidly prototype filter media for biofilm function, with the hypothesis that biofilm function is intimately linked to the flow passing through the filter. A computational model that numerically solves the incompressible time-dependent Navier-Stokes equations coupled to a model for biofilm growth and function is developed. The model is embedded in an optimization algorithm that allows the model domain to adapt until criteria on biofilm functioning are met. This is applied to optimize the shape of filter media in a simple flow channel to promote biofilm formation. The computer code links directly to a 3-D printer, and this allows us to prototype the design rapidly. Its validity is tested in flow visualization experiments and by microscopy. As proof of concept, the code was constrained to explore a small range of potential filter media, where the medium acts as an obstacle in the flow that sheds a von Karman vortex street, which was found to enhance the deposition of bacteria on surfaces downstream. The flow visualization and microscopy in the 3-D printed realization of the flow channel validated the predictions of the model and hence its potential as a design tool. Overall, it is shown that the combination of our computational model and 3-D printing can be effectively used as a design tool to prototype filter media to optimize biofilm formation.
Keywords: biofilm, biofilter, computational model, von Karman vortices, 3-D printing
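The vortex shedding exploited above has a well-known back-of-envelope characterization: for a bluff obstacle, the shedding frequency follows the Strouhal relation f = St·U/D, with St close to 0.2 for a circular cylinder over a wide range of Reynolds numbers. The sketch below is a generic illustration of that relation, not code from the paper; the obstacle size and flow speed are hypothetical.

```python
def reynolds(u: float, d: float, nu: float = 1.0e-6) -> float:
    """Reynolds number for flow speed u (m/s), obstacle size d (m),
    and kinematic viscosity nu (m^2/s; ~1e-6 for water at 20 C)."""
    return u * d / nu

def shedding_frequency(u: float, d: float, strouhal: float = 0.2) -> float:
    """Vortex-shedding frequency f = St * U / D.
    St ~ 0.2 holds for a circular cylinder roughly over 300 < Re < 2e5."""
    return strouhal * u / d

# Hypothetical filter medium: a 5 mm obstacle in a 0.05 m/s channel flow
re = reynolds(0.05, 0.005)            # dimensionless
f = shedding_frequency(0.05, 0.005)   # Hz
```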
Procedia PDF Downloads 143
4674 Calibration and Validation of the AquaCrop Model for Simulating Growth and Yield of Rain-Fed Sesame (Sesamum Indicum L.) Under Different Soil Fertility Levels in the Semi-arid Areas of Tigray, Ethiopia
Authors: Abadi Berhane, Walelign Worku, Berhanu Abrha, Gebre Hadgu
Abstract:
Sesame is an important oilseed crop in Ethiopia and the country's second most exported agricultural commodity next to coffee. However, soil fertility management is poor and there is no research-led farming system for the crop. The AquaCrop model was applied as a decision-support tool; it performs a semi-quantitative approach to simulate the yield of crops under different soil fertility levels. The objective of this experiment was to calibrate and validate the AquaCrop model for simulating the growth and yield of sesame under different nitrogen fertilizer levels and to test the performance of the model as a decision-support tool for improved sesame cultivation in the study area. The experiment was laid out as a randomized complete block design (RCBD) in a factorial arrangement in the 2016, 2017, and 2018 main cropping seasons. Four nitrogen fertilizer rates (0, 23, 46, and 69 kg/ha nitrogen) and three improved varieties (Setit-1, Setit-2, and Humera-1) were evaluated. In the meantime, growth, yield, and yield components of sesame were collected from each treatment. The coefficient of determination (R2), root mean square error (RMSE), normalized root mean square error (N-RMSE), model efficiency (E), and degree of agreement (D) were used to test the performance of the model. The results indicated that the AquaCrop model successfully simulated soil water content, with R2 varying from 0.92 to 0.98, RMSE from 6.5 to 13.9 mm, E from 0.78 to 0.94, and D from 0.95 to 0.99; the corresponding values for aboveground biomass (AB) varied from 0.92 to 0.98, 0.33 to 0.54 tons/ha, 0.74 to 0.93, and 0.9 to 0.98, respectively. The results on the canopy cover of sesame also showed that the model acceptably simulated canopy cover, with R2 varying from 0.95 to 0.99 and an RMSE of 5.3 to 8.6%.
The AquaCrop model was appropriately calibrated to simulate soil water content, canopy cover, aboveground biomass, and sesame yield; the results indicated that the model adequately simulated the growth and yield of sesame under the different nitrogen fertilizer levels. The AquaCrop model might therefore be an important tool for improved soil fertility management and yield enhancement strategies for sesame. Hence, the model might be applied as a decision-support tool in soil fertility management in sesame production.
Keywords: AquaCrop model, normalized water productivity, nitrogen fertilizer, canopy cover, sesame
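The model-performance statistics used in this kind of calibration study (RMSE, normalized RMSE, model efficiency E, and degree of agreement D) have standard definitions that can be computed directly. A minimal sketch with hypothetical observed and simulated soil water values, not data from the study:

```python
import math

def rmse(obs, sim):
    """Root mean square error between observed and simulated series."""
    n = len(obs)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / n)

def nrmse(obs, sim):
    """RMSE normalized by the observed mean, expressed as a percentage."""
    mean_obs = sum(obs) / len(obs)
    return 100.0 * rmse(obs, sim) / mean_obs

def nash_sutcliffe(obs, sim):
    """Model efficiency E: 1 is perfect; 0 means no better than the observed mean."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

def willmott_d(obs, sim):
    """Willmott's degree of agreement D in [0, 1]; 1 indicates perfect agreement."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((abs(s - mean_obs) + abs(o - mean_obs)) ** 2 for o, s in zip(obs, sim))
    return 1.0 - num / den

# Hypothetical observed vs. simulated soil water content (mm)
obs = [120.0, 135.0, 150.0, 142.0, 128.0]
sim = [118.0, 138.0, 147.0, 145.0, 125.0]
```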
Procedia PDF Downloads 80
4673 Design and Creation of a BCI Videogame for Training and Measure of Sustained Attention in Children with ADHD
Authors: John E. Muñoz, Jose F. Lopez, David S. Lopez
Abstract:
Attention Deficit Hyperactivity Disorder (ADHD) is a disorder that affects 1 out of 5 Colombian children, making it a real public health problem in the country. Conventional treatments such as medication and neuropsychological therapy have proved insufficient to decrease the high incidence of ADHD in the principal Colombian cities. This work presents the design and development of a videogame that uses a brain-computer interface not only as an input device but also as a tool to monitor neurophysiological signals. The videogame, named “The Harvest Challenge”, is set in the cultural context of a Colombian coffee grower; the player uses his/her avatar in three mini games created to reinforce four fundamental abilities: i) waiting, ii) planning, iii) following instructions, and iv) achieving objectives. The details of this collaborative process of designing the multimedia tool according to exact clinical necessities, and the description of the interaction proposals, are presented through the mental stages of attention and relaxation. The final videogame is presented as a tool for sustained attention training in children with ADHD, using as its action mechanism the neuromodulation of beta and theta waves through an electrode located over the central part of the frontal lobe. The electroencephalographic signal is processed automatically inside the videogame, allowing the generation of a report of the evolution of the theta/beta ratio, a biological marker that has been demonstrated to be a sufficient measure to discriminate between children with and without the deficit.
Keywords: BCI, neuromodulation, ADHD, videogame, neurofeedback, theta/beta ratio
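The theta/beta ratio reported by the videogame is simply the ratio of EEG power in the theta band (roughly 4-8 Hz) to power in the beta band (roughly 13-30 Hz). A minimal sketch with a synthetic signal follows; the sampling rate and amplitudes are illustrative only, and a real pipeline would use Welch-averaged spectra rather than this naive DFT:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum of DFT power over bins whose frequency lies in [f_lo, f_hi] Hz.
    Naive O(n^2) DFT; acceptable for short EEG epochs."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            x = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            power += abs(x) ** 2
    return power

def theta_beta_ratio(signal, fs):
    """Theta (4-8 Hz) over beta (13-30 Hz) power: the biomarker the game reports."""
    return band_power(signal, fs, 4.0, 8.0) / band_power(signal, fs, 13.0, 30.0)

# Synthetic one-second "epoch": dominant 6 Hz theta plus weaker 20 Hz beta
fs, n = 128, 128
epoch = [2.0 * math.sin(2 * math.pi * 6 * t / fs)
         + 0.5 * math.sin(2 * math.pi * 20 * t / fs)
         for t in range(n)]
ratio = theta_beta_ratio(epoch, fs)
```

With both tones exactly bin-aligned, the power ratio equals the squared amplitude ratio, (2.0 / 0.5)² = 16.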
Procedia PDF Downloads 372
4672 Psychodiagnostic Tool Development for Measurement of Social Responsibility in Ukrainian Organizations
Authors: Olena Kovalchuk
Abstract:
How Ukrainian companies understand social responsibility issues remains an open question. One practical application of social responsibility research is the development of a diagnostic tool for educational, business, or scientific purposes. The purpose of this research is therefore to develop a tool for the measurement of social responsibility in organizations. Methodology: A 21-item questionnaire, the “Organization Social Responsibility Scale”, was developed. The tool was adapted for the Ukrainian sample and based on the questionnaire “Perceived Role of Ethics and Social Responsibility”, which connects ethical and socially responsible behavior to different aspects of organizational effectiveness. After surveying the respondents, factor analysis was performed by the principal components method with orthogonal VARIMAX rotation. On the basis of the obtained results, the 21-item questionnaire was developed (Cronbach’s alpha = 0.768; inter-item correlations = 0.34). Participants: 121 managers at all levels of Ukrainian organizations (57 males; 65 females) took part in the research. Results: Factor analysis revealed five ethical dilemmas concerning the compatibility of social responsibility and profit in Ukrainian organizations. Below we make an attempt to interpret them: — Social responsibility vs profit. Corporate social responsibility can be a way to reduce operational costs. A firm’s first priority is employees’ morale. Being ethical and socially responsible is the priority of the organization. The most loaded question is "Corporate social responsibility can reduce operational costs". The significant effect of this factor is 0.768. — Profit vs social responsibility. Efficiency is much more important to a firm than ethics or social responsibility. Making a profit is the most important concern for a firm. The dominant question is "Efficiency is much more important to a firm than whether or not the firm is seen as ethical or socially responsible".
The significant effect of this factor is 0.793. — A balanced combination of social responsibility and profit. An organization with a social responsibility policy is more attractive to its stakeholders. The most loaded question is "Social responsibility and profitability can be compatible". The significant effect of this factor is 0.802. — The role of social responsibility in successful organizational performance. Understanding the value of social responsibility and business ethics. Well-being and welfare of the society. The dominant question is "Good ethics is often good business". The significant effect of this factor is 0.727. — A global vision of social responsibility. Issues related to global social responsibility and sustainability. Innovative approaches to poverty reduction. Awareness of climate change problems. A global vision for successful business. The dominant question is "The overall effectiveness of a business can be determined to a great extent by the degree to which it is ethical and socially responsible". The significant effect of this factor is 0.842. Theoretical contribution: the perspective of the study is to develop a tool for measuring social responsibility in organizations and to test the questionnaire’s adequacy for the social and cultural context. Practical implications: the research results can be applied to designing a training programme for business school students to form their global vision for successful business as well as the ability to solve ethical dilemmas in managerial practice. Researchers interested in social responsibility issues are welcome to join the project.
Keywords: corporate social responsibility, Cronbach’s alpha, ethical behaviour, psychodiagnostic tool
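The Cronbach's alpha reported above (0.768) is an internal-consistency coefficient computed from item variances and the variance of total scores. A minimal self-contained sketch with hypothetical Likert responses, not the study's data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale.

    `items` is a list of item columns: one list of respondent scores per item.
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    """
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = sum(var(col) for col in items)
    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - item_vars / var(totals))

# Hypothetical 3-item, 5-respondent example on a 1-5 Likert scale
items = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 4, 2, 4, 3],
]
alpha = cronbach_alpha(items)
```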
Procedia PDF Downloads 364
4671 A Study of the Trade-off Energy Consumption-Performance-Schedulability for DVFS Multicore Systems
Authors: Jalil Boudjadar
Abstract:
Dynamic Voltage and Frequency Scaling (DVFS) multicore platforms are promising execution platforms that enable high computational performance, lower energy consumption, and flexibility in scheduling the system processes. However, the resulting interleaving and memory interference, together with per-core frequency tuning, make real-time guarantees hard to deliver. Besides, energy consumption represents a strong constraint for the deployment of such systems in energy-limited settings. Identifying the system configurations that would achieve high performance and consume less energy while guaranteeing system schedulability is a complex task in the design of modern embedded systems. This work studies the trade-off between energy consumption, core utilization, and memory bottleneck, and their impact on the schedulability of DVFS multicore time-critical systems with a hierarchy of shared memories. We build a model-based framework using the parametrized timed automata of UPPAAL to analyze the mutual impact of performance, energy consumption, and schedulability of DVFS multicore systems, and demonstrate the trade-off on an actual case study.
Keywords: time-critical systems, multicore systems, schedulability analysis, energy consumption, performance analysis
Procedia PDF Downloads 108
4670 Reliability of Self-Reported Language Proficiency Measures in L1 Attrition Research: A Closer Look at the Can-Do-Scales
Authors: Anastasia Sorokina
Abstract:
Self-reported language proficiency measures have been widely used by researchers and have been proven to be an accurate tool to assess actual language proficiency. L1 attrition researchers also rely on self-reported measures. More specifically, can-do-scales have gained popularity in the discipline of L1 attrition research. The can-do-scales usually contain statements about language (e.g., “I can write e-mails”); participants are asked to rate each statement on a scale from 1 (I cannot do it at all) to 5 (I can do it without any difficulties). Despite their popularity, no studies have examined the can-do-scales’ reliability at measuring the actual level of L1 attrition. Do can-do-scales positively correlate with lexical diversity, syntactic complexity, and fluency? The present study analyzed speech samples of 35 Russian-English attriters to examine whether their self-reported proficiency correlates with their actual L1 proficiency. The results of a Pearson correlation demonstrated that can-do-scales correlated with lexical diversity, syntactic complexity, and fluency. These findings provide a valuable contribution to L1 attrition research by demonstrating that can-do-scales can be used as a reliable tool to measure L1 attrition.
Keywords: L1 attrition, can-do-scales, lexical diversity, syntactic complexity
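The Pearson correlation used to relate self-ratings to the objective measures can be computed directly; a minimal sketch with hypothetical scores (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: can-do self-ratings (1-5) vs. a lexical diversity index
can_do = [4.2, 3.1, 4.8, 2.5, 3.9, 4.5]
lex_div = [0.62, 0.48, 0.71, 0.40, 0.58, 0.66]
r = pearson_r(can_do, lex_div)
```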
Procedia PDF Downloads 247
4669 Optimization of Friction Stir Spot Welding Process Parameters for Joining 6061 Aluminum Alloy Using Taguchi Method
Authors: Mohammed A. Tashkandi, Jawdat A. Al-Jarrah, Masoud Ibrahim
Abstract:
This paper investigates the shear strength of joints produced by the friction stir spot welding (FSSW) process. FSSW parameters such as tool rotational speed, plunge depth, shoulder diameter of the welding tool, and dwell time play the major role in determining the shear strength of the joints. The effect of these four parameters on the FSSW process as well as on the shear strength of the welded joints was studied via five levels of each parameter. The Taguchi method was used to minimize the number of experiments required to determine the fracture load of the friction stir spot-welded joints by incorporating independently controllable FSSW parameters. Taguchi analysis was applied to optimize the FSSW parameters to attain the maximum shear strength of the spot weld for this type of aluminum alloy.
Keywords: friction stir spot welding, Al6061 alloy, shear strength, FSSW process parameters
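For a maximize-the-response objective like shear strength, Taguchi analysis ranks parameter levels by the larger-the-better signal-to-noise ratio. A minimal sketch of that statistic; the replicate values are hypothetical, not the study's measurements:

```python
import math

def sn_larger_the_better(responses):
    """Taguchi larger-the-better signal-to-noise ratio (dB) for the replicated
    responses of one experimental run: S/N = -10 * log10(mean(1 / y^2)).
    Higher S/N indicates a better (stronger, more consistent) run."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in responses) / n)

# Hypothetical shear-strength replicates (kN) for two FSSW parameter combinations
run_a = [4.1, 4.3, 4.2]
run_b = [3.2, 3.0, 3.4]
```

Computing S/N for every run in the orthogonal array and averaging per factor level identifies the level combination that maximizes shear strength.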
Procedia PDF Downloads 434
4668 Mental Imagery as an Auxiliary Tool to the Performance of Elite Competitive Swimmers of the University of the East Manila
Authors: Hillary Jo Muyalde
Abstract:
Introduction: Elite athletes train regularly to enhance their physical endurance, but sometimes training sessions are not enough. When competition comes, these athletes struggle to find focus. Mental imagery is a psychological technique that helps condition the mind to focus and eventually helps improve performance. This study aims to help elite competitive swimmers of the University of the East improve their performance with mental imagery as an auxiliary tool. Methodology: The study design was quasi-experimental with a purposive sampling technique and a within-subject design. It was conducted with a total of 41 participants. The participants were given the Sport Imagery Ability Questionnaire (SIAQ) to measure imagery ability, and the Mental Imagery Program. The study utilized a paired t-test for data analysis, where six weeks without mental imagery training were compared to six weeks with the Mental Imagery Program (MIP). The researcher recorded the personal best time of participants in their respective specialty strokes. Results: The results of the study showed a t-value of 17.804 for butterfly stroke events, 9.922 for backstroke events, 7.787 for breaststroke events, and 17.440 for freestyle. This indicated that the MIP had a positive effect on participants’ performance. The SIAQ results also showed a large difference: -10.443 for butterfly events, -5.363 for backstroke, -7.244 for breaststroke events, and -10.727 for freestyle events, which meant the participants were able to image better than before the MIP. Conclusion: The findings of this study showed that there is indeed an improvement in the performance of the participants after the application of the Mental Imagery Program. It is recommended that the participants continue to use mental imagery as an auxiliary tool in their training regimen for continuous positive results.
Keywords: mental imagery, personal best time, SIAQ, specialty stroke
Procedia PDF Downloads 80
4667 Crystal Nucleation in 3D Printed Polymer Scaffolds in Tissue Engineering
Authors: Amani Alotaibi
Abstract:
3D printing has emerged as a pivotal technique for scaffold development, particularly in the field of bone tissue regeneration, due to its ability to customize scaffolds to fit complex geometries of bone defects. Among the various methods available, fused deposition modeling (FDM) is particularly promising as it avoids the use of solvents or toxic chemicals during fabrication. This study investigates the effects of three key parameters, extrusion temperature, screw rotational speed, and deposition speed, on the crystallization and mechanical properties of polycaprolactone (PCL) scaffolds. Three extrusion temperatures (70°C, 80°C, and 90°C), three screw speeds (10 RPM, 15 RPM, and 20 RPM), and three deposition speeds (8 mm/s, 10 mm/s, and 12 mm/s) were evaluated. The scaffolds were characterized using X-ray diffraction (XRD), differential scanning calorimetry (DSC), and tensile testing to assess changes in crystallinity and mechanical properties. Additionally, the scaffolds were analyzed for crystal size and biocompatibility. The results demonstrated that increasing the extrusion temperature to 80°C, combined with a screw speed of 15 RPM and a deposition speed of 10 mm/s, significantly improved the crystallinity, compressive modulus, and thermal resistance of the PCL scaffolds. These findings suggest that by fine-tuning basic 3D printing parameters, it is possible to modulate the structural and mechanical properties of the scaffold, thereby enhancing its suitability for bone tissue regeneration.
Keywords: 3D printing, polymer, scaffolds, tissue engineering, crystallization
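The degree of crystallinity extracted from DSC data follows the standard ratio of the measured melting enthalpy to that of a 100% crystalline polymer. A minimal sketch; the reference enthalpy for PCL below is a commonly cited literature value (reported values vary between sources), and the measured enthalpy is hypothetical, not from the study:

```python
def crystallinity_percent(dh_melt, dh_ref=139.5, filler_fraction=0.0):
    """Degree of crystallinity X_c (%) from a DSC melting endotherm.

    dh_melt: measured melting enthalpy of the sample (J/g)
    dh_ref: melting enthalpy of the 100% crystalline polymer (J/g);
            ~139.5 J/g is an often-quoted literature value for PCL
    filler_fraction: mass fraction of non-polymer filler, if any
    """
    return 100.0 * dh_melt / (dh_ref * (1.0 - filler_fraction))

# Hypothetical neat-PCL scaffold with measured melting enthalpy of 62 J/g
xc = crystallinity_percent(62.0)
```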
Procedia PDF Downloads 14
4666 Enhancing Code Security with AI-Powered Vulnerability Detection
Authors: Zzibu Mark Brian
Abstract:
As software systems become increasingly complex, ensuring code security is a growing concern. Traditional vulnerability detection methods often rely on manual code reviews or static analysis tools, which can be time-consuming and prone to errors. This paper presents a distinct approach to enhancing code security by leveraging artificial intelligence (AI) and machine learning (ML) techniques. Our proposed system utilizes a combination of natural language processing (NLP) and deep learning algorithms to identify and classify vulnerabilities in real-world codebases. By analyzing vast amounts of open-source code data, our AI-powered tool learns to recognize patterns and anomalies indicative of security weaknesses. We evaluated our system on a dataset of over 10,000 open-source projects, achieving an accuracy rate of 92% in detecting known vulnerabilities. Furthermore, our tool identified previously unknown vulnerabilities in popular libraries and frameworks, demonstrating its potential for improving software security.
Keywords: AI, machine learning, code security
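The accuracy figure quoted above is one of several standard metrics for a binary vulnerability classifier; a minimal sketch of how accuracy, precision, and recall are derived from the confusion matrix (the labels below are hypothetical):

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, and recall for binary labels (1 = vulnerable)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

# Hypothetical ground-truth and predicted labels for 10 code samples
y_true = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]
acc, prec, rec = classification_metrics(y_true, y_pred)
```

Precision (how many flagged samples are truly vulnerable) is the metric most directly tied to the false-positive burden that such tools aim to minimize.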
Procedia PDF Downloads 404665 Optimizing Machine Learning Algorithms for Defect Characterization and Elimination in Liquids Manufacturing
Authors: Tolulope Aremu
Abstract:
The key process steps in producing liquid detergent products, such as formulation, mixing, filling, and packaging, can introduce defects that compromise product quality, consumer safety, and operational efficiency. Real-time identification and characterization of such defects are of prime importance for maintaining high standards and reducing waste and costs. Defect detection is usually performed by human inspection or rule-based systems, which are time-consuming, inconsistent, and error-prone. The present study overcomes these limitations by optimizing defect characterization in liquid detergent manufacturing with machine learning algorithms. Several models were tested for detecting and classifying defects such as incorrect viscosity, color deviations, improper bottle filling, and packaging anomalies: Support Vector Machines, Decision Trees, Random Forests, and Convolutional Neural Networks. These algorithms benefited significantly from optimization techniques, including hyperparameter tuning and ensemble learning, which greatly improved detection accuracy while minimizing false positives. The study draws on a rich dataset of more than 100,000 samples covering defect types and production parameters, together with real-time sensor data, imaging technologies, and historical production records. The results show that optimized machine learning models significantly improve defect detection compared to traditional methods. For instance, the CNNs, fine-tuned with real-time imaging data, reached 98% and 96% accuracy in detecting packaging anomalies and bottle-filling inconsistencies, respectively, with a reduction in false positives of about 30%. The optimized SVM model reached 94% accuracy in detecting formulation defects such as viscosity and color variation.
These performance metrics represent a large leap in defect detection accuracy compared with the roughly 80% level achieved to date by rule-based systems. Moreover, the optimized models speed up defect characterization: with real-time data processing, detection time drops below 15 seconds from an average of 3 minutes with manual inspection. This time saving is combined with a 25% reduction in production downtime thanks to proactive defect identification, which can save millions annually in recall and rework costs. Integrating real-time, machine-learning-driven monitoring also drives predictive maintenance and corrective measures, yielding a 20% improvement in overall production efficiency. The optimization of machine learning algorithms for defect characterization therefore gives liquid detergent manufacturers scalability, efficiency, and improved operational performance at higher levels of product quality. More generally, this method could be applied across the fast-moving consumer goods industry, leading to improved quality control processes.Keywords: liquid detergent manufacturing, defect detection, machine learning, support vector machines, convolutional neural networks, defect characterization, predictive maintenance, quality control, fast-moving consumer goods
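The hyperparameter tuning the abstract mentions can be as simple as an exhaustive grid search over candidate settings. A minimal sketch follows; the score function, its toy surface, and the parameter names `C` and `gamma` are illustrative placeholders standing in for cross-validated model accuracy, not the authors' actual pipeline:

```python
from itertools import product

# Exhaustive (grid) hyperparameter search: try every combination in the
# grid and keep the best-scoring one. score_fn is a stand-in for a
# cross-validated accuracy measure of an SVM or CNN.

def grid_search(score_fn, grid):
    best_params, best_score = None, float("-inf")
    keys = sorted(grid)
    for values in product(*(grid[k] for k in keys)):
        params = dict(zip(keys, values))
        score = score_fn(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Hypothetical score surface peaking at C=10, gamma=0.1.
toy = lambda C, gamma: -((C - 10) ** 2) - 100 * (gamma - 0.1) ** 2
params, score = grid_search(toy, {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]})
print(params)  # {'C': 10, 'gamma': 0.1}
```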
Procedia PDF Downloads 214664 Use of Concept Maps as a Tool for Evaluating Students' Understanding of Science
Authors: Aregamalage Sujeewa Vijayanthi Polgampala, Fang Huang
Abstract:
This study explores the genesis and development of concept mapping as a useful tool for science education: its effectiveness as a technique for teaching, learning, and evaluation in secondary school science, and the role played by National College of Education science teachers. Concept maps, when carefully employed and executed, serve as an integral part of a teaching method, a measure of teaching effectiveness, and a tool for evaluation. Research has shown that science concept maps can positively influence student learning and motivation. The success of concept maps in an instructional setting depends on the type of theme selected, the development of learning outcomes, and the flexibility of instruction in providing a library unit equipped with multimedia where learners can interact. The study was restricted to 15 respondents (6 male and 9 female), third-year pre-service science teachers on internship in Gampaha district, Sri Lanka. Data were collected through a 15-item questionnaire given to learners, in-depth interviews, and observations of 18 science classes. The two hypotheses generated for the study were rejected, while the results revealed significant differences among the factors influencing teachers' choice of concept maps, their usefulness, and the problems hindering their effectiveness in the teaching and learning of secondary science. The study found that concept maps can be used as an effective measure to evaluate students' understanding of concepts and their misconceptions. Even the teacher trainees could not identify that the key concept belongs at the top, with subordinate concepts falling below it. It is recommended that pre-service science teacher trainees be given thorough training in using concept maps as an evaluation instrument.Keywords: concept maps, evaluation, learning science, misconceptions
Procedia PDF Downloads 2744663 Accuracy of Trauma on Scene Triage Screen Tool (Shock Index, Reverse Shock Index Glasgow Coma Scale, and National Early Warning Score) to Predict the Severity of Emergency Department Triage
Authors: Chaiyaporn Yuksen, Tapanawat Chaiwan
Abstract:
Introduction: Emergency medical service (EMS) care for trauma patients must provide on-scene assessment and essential treatment, with appropriate transport to a trauma center. The shock index (SI), reverse shock index multiplied by the Glasgow Coma Scale (rSIG), and National Early Warning Score (NEWS) triage tools are easy to use in a prehospital setting, but there is no standardized on-scene triage protocol in prehospital care. The primary objective was to determine the accuracy of SI, rSIG, and NEWS in predicting the severity of trauma patients in the emergency department (ED). Methods: This was a retrospective cross-sectional diagnostic study of injured patients who received prehospital care and were transported by the EMS team to the ED of Ramathibodi Hospital, a university-affiliated super-tertiary care hospital in Bangkok, Thailand, from January 2015 to September 2022. We compared the on-scene parameters (SI, rSIG, and NEWS) with the ED triage result (Emergency Severity Index, ESI) using the area under the ROC curve. Results: 218 trauma patients were transported by EMS to the ED; 161 were ESI level 1-2 and 57 were level 3-5. NEWS discriminated the severity of trauma patients more accurately than rSIG and SI, with areas under the ROC curve of 0.743 (95% CI 0.70-0.79), 0.649 (95% CI 0.59-0.70), and 0.582 (95% CI 0.52-0.65), respectively (p < 0.001). The NEWS cut-off for discrimination was 6 points. Conclusions: NEWS was the most accurate prehospital triage tool for trauma patients.Keywords: on-scene triage, trauma patient, ED triage, accuracy, NEWS
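The two index-based tools compared above have simple closed forms (SI = HR/SBP; rSIG = SBP/HR × GCS, with lower values indicating greater severity), whereas NEWS is a multi-parameter scoring table and is omitted here. A minimal sketch with a hypothetical patient; the vital-sign values are invented for illustration:

```python
def shock_index(hr, sbp):
    """SI = heart rate / systolic blood pressure (higher = more severe)."""
    return hr / sbp

def rsig(hr, sbp, gcs):
    """Reverse shock index x GCS = (SBP / HR) * GCS (lower = more severe)."""
    return (sbp / hr) * gcs

# Hypothetical trauma patient: HR 120/min, SBP 90 mmHg, GCS 12.
print(round(shock_index(120, 90), 2))  # 1.33
print(round(rsig(120, 90, 12), 1))     # 9.0
```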
Procedia PDF Downloads 1274662 Software User Experience Enhancement through User-Centered Design and Co-design Approach
Authors: Shan Wang, Fahad Alhathal, Hari Subramanian
Abstract:
User-centered design skills play an important role in crafting a positive and intuitive user experience for software applications. Embracing a user-centric design approach involves understanding the needs, preferences, and behaviors of the end-users throughout the design process. This mindset not only enhances the usability of the software but also fosters a deeper connection between the digital product and its users. This paper covers a 6-month knowledge-exchange collaboration between an academic institution and an industry partner in the UK in 2023. It aims to improve the user experience of a digital platform used as a knowledge management tool: to understand users' preferences for features, identify sources of frustration, and pinpoint areas for enhancement. The research applied one of the most effective methods for implementing user-centered design, co-design workshops testing the user onboarding experience, which involve the active participation of users in the design process. In January 2023, we organized eight co-design workshops with a diverse group of 11 individuals, accumulating a total of 11 hours of qualitative data in video and audio formats. We then analyzed the user journeys, identifying common issues and potential areas for improvement and distilling them into three insights. This analysis was pivotal in guiding the knowledge management software's prioritization of feature enhancements and design improvements. Employing a user-centered design thinking process, we developed a series of graphic design solutions in collaboration with the software company, targeting the onboarding experience, workplace interfaces, and interaction design. Some of these design solutions were translated into tangible interfaces for the knowledge management tool.
By actively involving users in the design process and valuing their input, developers can create products that are not only functional but also resonate with the end-users, ultimately leading to greater success in the competitive software landscape. In conclusion, this paper not only contributes insights into designing onboarding user experiences for software within a co-design approach but also presents key theories on leveraging the user-centered design process in software design to enhance overall user experiences.Keywords: user experiences design, user centered design, co-design approach, knowledge management tool
Procedia PDF Downloads 134661 Competency Model as a Key Tool for Managing People in Organizations: Presentation of a Model
Authors: Andrea Čopíková
Abstract:
Competency-based management is a new approach to management that addresses an organization's challenges with complexity, aiming to identify and solve the organization's problems and to learn how to avoid them in the future. It teaches organizations to create, beyond a merely temporary state of stability, a vital organization that is permanently able to exploit and profit from internal and external opportunities. The aim of this paper is to propose a process of competency model design and, based on it, to create a competency model for a financial department manager in a production company. Competency models are a very useful tool in many personnel processes in any organization. They are used for recruiting and selecting employees, designing training and development activities, and evaluating employees, and they can serve as a guide for career planning and as a tool for succession planning, especially for managerial positions. The competency model is created using the Analytic Hierarchy Process (AHP) and quantitative pairwise comparison (Saaty's method), which are among the most widely used methods for determining weights. The introduction of the paper presents research results on the use of competency models in practice and then explains the concepts of competency and competency models. The application part describes in detail the proposed methodology for creating competency models, on the basis of which the competency model for the position of financial department manager in a foreign manufacturing company is created. The conclusion presents the final competency model for the above-mentioned position. The model divides the selected competencies into three groups: managerial, interpersonal, and functional.
The model describes in detail the individual levels of each competency, its target (required) level, and its level of importance.Keywords: analytic hierarchy process, competency, competency model, quantitative pairwise comparison
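Saaty's quantitative pairwise comparison mentioned above reduces a pairwise comparison matrix to priority weights; a common approximation of the principal eigenvector normalizes the row geometric means. A minimal sketch in which the 3x3 matrix values (comparing the three competency groups) are invented for illustration, not taken from the paper:

```python
from math import prod

# Saaty-style priority weights from a pairwise comparison matrix, using
# the geometric-mean approximation of the principal eigenvector: take
# the geometric mean of each row, then normalize the means to sum to 1.

def ahp_weights(matrix):
    n = len(matrix)
    geo = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(geo)
    return [g / total for g in geo]

# Hypothetical judgments: managerial vs interpersonal vs functional.
A = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]
w = ahp_weights(A)
print([round(x, 3) for x in w])  # weights sum to 1, largest first
```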
Procedia PDF Downloads 2444660 A Simple and Easy-To-Use Tool for Detecting Outer Contour of Leukocytes Based on Image Processing Techniques
Authors: Retno Supriyanti, Best Leader Nababan, Yogi Ramadhani, Wahyu Siswandari
Abstract:
Blood cell morphology is an important parameter in hematology testing. Currently, in developing countries, much hematology work is done manually, either by physicians or by laboratory staff. Given the limitations of the human eye, manual examination yields lower precision and accuracy. In addition, manual hematology testing further complicates diagnosis in areas that lack competent medical personnel. This research aims to develop a simple computer-based tool for detecting blood cell morphology; in this paper, we focus on detecting the outer contour of leukocytes. The results show that the system we developed is promising for detecting blood cell morphology automatically. We expect that implementing this method will help solve the problems of accuracy, precision, and limited medical staffing.Keywords: morphology operation, developing countries, hematology test, limitation of medical personnel
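One way to realize the morphology operation the keywords refer to is a morphological gradient: dilating a binary cell mask and subtracting the mask leaves its outer contour. A minimal sketch on a synthetic mask; the authors' actual segmentation pipeline is not specified in the abstract, so this only illustrates the contour-extraction step:

```python
import numpy as np

# Outer contour of a binary cell mask via a morphological gradient:
# dilate the mask with a 3x3 structuring element, then remove the
# original mask pixels, leaving the one-pixel ring just outside it.

def dilate3x3(mask):
    padded = np.pad(mask, 1)
    out = np.zeros_like(mask)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= padded[1 + dy : 1 + dy + mask.shape[0],
                          1 + dx : 1 + dx + mask.shape[1]]
    return out

def outer_contour(mask):
    return dilate3x3(mask) & ~mask

mask = np.zeros((7, 7), dtype=bool)
mask[2:5, 2:5] = True            # a synthetic 3x3 "cell"
contour = outer_contour(mask)
print(int(contour.sum()))        # 16: the ring of pixels around the cell
```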
Procedia PDF Downloads 3394659 Neighborhood-Scape as a Methodology for Enhancing Gulf Region Cities' Quality of Life: Case of Doha, Qatar
Authors: Eman AbdelSabour
Abstract:
Sustainability is increasingly considered a critical aspect of shaping the urban environment and serves as a foundation for innovation in global urban growth. Different models and structures currently influence how the criteria defining a sustainable city are interpreted. There is a collective need to shift urban growth onto a more durable path by presenting suggestions for multi-scale initiatives. The global rise in urbanization has increased the demand for, and pressure on, urban planning to offer better, more sustainable urban alternatives. The trend toward increasingly sustainable urban development (SUD) has prompted the need for an assessment tool at the urban scale. The neighborhood scale is being addressed by a growing research community, since it is a pertinent scale at which economic, environmental, and social impacts can be examined. Although neighborhood design is a comparatively old practice, it was only in the early years of the 21st century that environmentalists and planners started developing sustainability assessments at the neighborhood level. At this scale, urban reality can be considered more broadly, addressing themes beyond the size of a single building, while remaining small enough for concrete measures to be analyzed. A neighborhood assessment tool plays a crucial role in helping neighborhood sustainability fulfill its objectives through a set of themes and criteria. Such instruments are also known as neighborhood assessment tools, district assessment tools, and sustainable community rating tools. Research has focused primarily on the economic and environmental aspects of sustainability, whereas socio-cultural issues are rarely addressed. This research is therefore based on Doha, Qatar, and discusses the current urban conditions of its neighborhoods.
The research problem focuses on spatial features in relation to socio-cultural aspects. The study is outlined in three parts. The first reviews the latest use of well-being assessment methods to enhance decisions on retrofitting the physical features of the neighborhood. The second discusses urban settlement development, regulations, and the decision-making process; it also includes an analysis of urban development policy with reference to neighborhood development and a historical review of the urban growth of the neighborhoods that form the atoms of Doha's city system. The last part develops quantified indicators of subjective well-being through a participatory approach. Additionally, GIS is applied as a tool to visualize the quality of life (QoL) improvements needed in the neighborhood areas as an assessment approach. Envisaging the present QoL situation in Doha's neighborhoods is a step toward improving current conditions, since neighborhoods host many of residents' day-to-day activities and are therefore dynamic areas.Keywords: neighborhood, subjective wellbeing, decision support tools, Doha, retrofitting
Procedia PDF Downloads 1384658 Sustainable Landscape Development Assessment Tools
Authors: Nur Azemah Aminludin, Osman Mohd Tahir
Abstract:
Dynamic landscape development is important for providing the healthy ecosystems that support all life. Many initiatives toward sustainable development have now been published, leading to better living and more efficient use of natural resources to sustain long-term ecological, economic, and social benefits. To date, many assessment tools related to the built environment have been established and practiced in this region, mostly for assessing the environmental performance of buildings. Hence, an assessment tool focusing on sustainable landscape development itself is a necessity. This paper reviews assessment criteria and indicators suitable for sustainable landscape development practices. Local and global assessment tools for landscape development are investigated, analyzed, and discussed critically. Consideration is also given to integrating the assessment tools with the surrounding environmental, social, and economic aspects. In addition, the assessment criteria and indicators for assessing landscape development in Malaysia are reviewed and discussed. In conclusion, this paper reviews, analyzes, and discusses the available local and global landscape development assessment tools for sustainability.Keywords: assessment tool, sustainable landscape development, assessment criteria, assessment indicator
Procedia PDF Downloads 3934657 Electrochemistry of Metal Chalcogenides Semiconductor Materials; Theory and Practical Applications
Authors: Mahmoud Elrouby
Abstract:
Metal chalcogenide materials exhibit a wide spectrum of properties, allowing their use in electronics, optics, magnetics, solar energy conversion, catalysis, passivation, ion sensing, batteries, and fuel cells. This work examines how these materials can be obtained simply via electrochemical methods for further applications, with particular regard to systems involving the sulphur sub-group elements: sulphur, selenium, and tellurium. The role of electrochemistry in the synthesis, development, and characterization of metal chalcogenide materials and related devices is vital. As a preparation tool, electrochemical methods offer the advantages of soft chemistry for accessing bulk materials, thin and nano films, and epitaxial growth of a wide range of alloys and compounds, while as a characterization tool they provide exceptional assistance in specifying the physicochemical properties of materials. Moreover, many important applications and modern devices base their operation on electrochemical principles. Our scope was therefore, in the first place, to organize the existing facts on the electrochemistry of metal chalcogenides regarding their synthesis, properties, and applications.Keywords: electrodeposition, metal chalcogenides, semiconductors, applications
Procedia PDF Downloads 2994656 Benchmarking of Pentesting Tools
Authors: Esteban Alejandro Armas Vega, Ana Lucila Sandoval Orozco, Luis Javier García Villalba
Abstract:
Benchmarking of tools for dynamic analysis of vulnerabilities in web applications is done periodically, because these tools update their knowledge bases and search algorithms from time to time in order to improve their accuracy. Unfortunately, the vast majority of these evaluations are made by software enthusiasts who publish their results on blogs or non-academic websites, always with the same evaluation methodology. Similarly, most academics who have carried out this type of analysis from a scientific standpoint work within the same methodology as the empirical authors. This paper is motivated by the interest in answering questions that many users of such tools have been asking for years: does the tool truly test and evaluate every vulnerability it claims to, and does it really deliver a faithful report of all the vulnerabilities tested and exploited? These questions have motivated previous work, but without real answers. The aim of this paper is to present results that truly answer, at least for the tested tools, all those unanswered questions. All the results have been obtained by changing the common benchmarking model used in those previous works.Keywords: cybersecurity, IDS, security, web scanners, web vulnerabilities
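Answering whether a scanner really delivers a faithful report requires scoring its output against a benchmark suite's known (injected) flaws. A minimal sketch of such ground-truth scoring follows; the vulnerability identifiers and counts are invented for illustration and do not come from the paper:

```python
# Ground-truth scoring for a vulnerability scanner: compare the set of
# findings the tool reports against the benchmark's known flaws and
# compute the detection rate (TPR) and false-alarm rate (FPR).

def score_tool(reported, true_vulns, total_negatives):
    tp = len(reported & true_vulns)       # known flaws the tool found
    fp = len(reported - true_vulns)       # reports with no real flaw
    tpr = tp / len(true_vulns)            # detection rate
    fpr = fp / total_negatives            # false-alarm rate
    return tpr, fpr

# Hypothetical benchmark: 4 injected flaws, 96 clean test cases.
true_vulns = {"sqli-01", "xss-03", "xss-07", "lfi-02"}
reported = {"sqli-01", "xss-03", "csrf-99"}   # one false positive
tpr, fpr = score_tool(reported, true_vulns, total_negatives=96)
print(round(tpr, 2), round(fpr, 3))  # 0.5 0.01
```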
Procedia PDF Downloads 3194655 Method for Evaluating the Monetary Value of a Customized Version of the Digital Twin for the Additive Manufacturing
Authors: Fabio Oettl, Sebastian Hoerbrand, Tobias Wittmeir, Johannes Schilp
Abstract:
By combining the additive manufacturing (AM) process with digital concepts such as the digital twin (DT), or the downsized concept based on it, the digital part file (DPF), the competitiveness of additive manufacturing is enhanced and new use cases like decentralized production are enabled. In the literature, however, one cannot find any quantitative approach to valuing the use of a DT or DPF in AM. For this reason, such an approach is developed in this paper in order to further promote, or advise against, the use of these concepts. The focus is set on production as an early lifecycle phase, meaning that the AM production process is analyzed with regard to the potential advantages of using a DPF. These advantages are converted into a monetary value with this approach; by accounting for the costs of the DPF, an overall monetary value results. A tool based on a simulation environment is then constructed, in which the algorithms are transformed into a program. Applying this tool shows that an overall value of 20.81 € for the DPF can be realized in one particular use case. For future applications of the DPF, the recommendation is to integrate sustainability information in particular, since a higher value of the DPF can be expected from it.Keywords: additive manufacturing, digital concept costs, digital part file, digital twin, monetary value estimation
Procedia PDF Downloads 2024654 Experimental Quantification of the Intra-Tow Resin Storage Evolution during RTM Injection
Authors: Mathieu Imbert, Sebastien Comas-Cardona, Emmanuelle Abisset-Chavanne, David Prono
Abstract:
Short-cycle-time Resin Transfer Molding (RTM) applications appear to be of great interest for the mass production of automotive or aeronautical lightweight structural parts. During the RTM process, the two components of a resin are mixed on-line and injected into the cavity of a mold in which a fibrous preform has been placed. Injection and polymerization occur simultaneously in the preform, inducing evolutions of temperature, degree of cure, and viscosity that in turn affect flow and curing. In order to adjust the processing conditions and reduce the cycle time, it is therefore essential to understand and quantify the physical mechanisms occurring in the part during injection. In a previous study, a dual-scale simulation tool was developed to help determine the optimum injection parameters. This tool finely tracks the distribution of the resin and the evolution of its properties during reactive injections with on-line mixing. Tows and channels of the fibrous material are considered separately to account for the dual-scale morphology of continuous-fiber textiles. The simulation tool reproduces the unsaturated area at the flow front generated by the tow/channel difference in permeability. Resin "storage" in the tows after saturation is also taken into account, as it may significantly affect the distribution and evolution of temperature, degree of cure, and viscosity in the part during reactive injections. The aim of the current study is to understand and quantify, through experiments, the "storage" evolution in the tows in order to adjust and validate the numerical tool. The presented study is based on four experimental repeats conducted on three different types of textiles: a unidirectional non-crimp fabric (NCF), a triaxial NCF, and a satin weave. Model fluids, dyes, and image analysis are used to study quantitatively the resin flow in the saturated area of the samples.
The textile characteristics affecting the evolution of resin "storage" in the tows are also analyzed. Finally, fully coupled reactive injections with on-line mixing are conducted to validate the numerical model.Keywords: experimental, on-line mixing, high-speed RTM process, dual-scale flow
Procedia PDF Downloads 167