Search results for: tool validation
5669 Artificial Neural Networks Application on Nusselt Number and Pressure Drop Prediction in Triangular Corrugated Plate Heat Exchanger
Authors: Hany Elsaid Fawaz Abdallah
Abstract:
This study presents a new artificial neural network (ANN) model to predict the Nusselt number and pressure drop for turbulent flow in a triangular corrugated plate heat exchanger with forced air and turbulent water flow. An experimental investigation was performed to create a new dataset of Nusselt number and pressure drop values over the following range of dimensionless parameters: plate corrugation angle from 0° to 60°, Reynolds number from 10,000 to 40,000, pitch-to-height ratio from 1 to 4, and Prandtl number from 0.7 to 200. Based on the ANN performance graph, a three-layer structure with {12-8-6} hidden neurons was chosen. The training procedure consists of feed-forward propagation of the input parameters, evaluation of the loss function on the training and validation datasets, and back-propagation with adjustment of the weights and biases. A linear activation function was used at the output layer, while the rectified linear unit (ReLU) activation function was used in the hidden layers. To accelerate the ANN training, the loss function was minimized with the adaptive moment estimation (Adam) algorithm. "MinMax" normalization was applied to avoid the increase in training time caused by drastic differences in the loss-function gradients with respect to the weight values. Since the test dataset is not used for training, a cross-validation technique was applied to the network using the new data. This procedure was repeated until the loss function converged, or for at most 4000 epochs with a batch size of 200 points. The program code was written in Python 3 using open-source ANN libraries such as scikit-learn, TensorFlow and Keras. The ANN model achieved mean absolute percentage errors of 9.4% for the Nusselt number and 8.2% for the pressure drop, a higher accuracy than that of the generalized correlations.
The performance validation of the obtained model was based on a comparison of predicted data with the experimental results, yielding excellent accuracy.
Keywords: artificial neural networks, corrugated channel, heat transfer enhancement, Nusselt number, pressure drop, generalized correlations
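The pipeline the abstract describes (MinMax scaling of the inputs, ReLU hidden layers, a linear output layer) can be sketched in plain Python. This is an illustrative toy, not the authors' Keras code; the layer shapes and weights below are hypothetical.

```python
# Illustrative sketch (not the authors' code): "MinMax" feature scaling and a
# forward pass through a small fully connected network with ReLU hidden
# activations and a linear output, as described in the abstract.

def minmax_scale(column):
    """Rescale a list of values to the [0, 1] range."""
    lo, hi = min(column), max(column)
    return [(x - lo) / (hi - lo) for x in column]

def relu(v):
    return [max(0.0, x) for x in v]

def dense(x, weights, biases):
    """One fully connected layer: y_j = b_j + sum_i x_i * w_ij."""
    return [b + sum(xi * wi for xi, wi in zip(x, col))
            for col, b in zip(weights, biases)]

def forward(x, hidden_layers, output_layer):
    for w, b in hidden_layers:
        x = relu(dense(x, w, b))   # ReLU in every hidden layer
    w, b = output_layer
    return dense(x, w, b)          # linear activation at the output

# Toy usage: Reynolds numbers rescaled to [0, 1] before training
scaled_re = minmax_scale([10000, 25000, 40000])
```

In practice the same structure would be expressed in a few lines of Keras; the sketch only makes the normalization and activation choices concrete.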
Procedia PDF Downloads 87
5668 Current Practices of Permitted Daily Exposure (PDE) Calculation and Selection
Authors: Annie Ramanbhai Mecwan
Abstract:
Cleaning validation in a pharmaceutical manufacturing facility is documented evidence that a cleaning process has effectively removed contaminants, residues from previous drug products, and cleaning agents from the reusable tools and parts of equipment to below a pre-defined threshold. In shared manufacturing facilities, more than one drug product is prepared. After the reusable tools and parts of equipment are cleaned following the manufacture of one drug product, some residues of drug substance from the previously manufactured product may be retained on the equipment and carried forward to the next drug product, causing cross-contamination. Health-based limits, derived as a safe threshold value called the permitted daily exposure (PDE) for the residues of drug substances, should be employed to identify the risks posed at these manufacturing facilities. The PDE represents a substance-specific dose that is unlikely to cause an adverse effect if an individual is exposed to this dose, or below it, every day for a lifetime. There are different practices for calculating the PDE. Data for all APIs in the public domain are considered when calculating the PDE value, although the final PDE value may vary from company to company based on different toxicologists' perspectives or subjective evaluations. Hence, regulatory agencies should take responsibility for publishing PDE values for all APIs, as is done for elemental PDEs. This would harmonize PDE values all over the world and prevent an unnecessary load on manufacturers for cleaning validation.
Keywords: active pharmaceutical ingredient, good manufacturing practice, NOAEL, no observed adverse effect level, permitted daily exposure
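As a concrete illustration of the PDE derivation discussed above, the health-based formula from the EMA/ICH guidance can be sketched as follows. The default adjustment factors shown (F1-F5 and a 50 kg body-weight adjustment) are illustrative assumptions that a toxicologist would choose case by case, not values prescribed by this abstract.

```python
def pde(noael_mg_per_kg_day, body_weight_kg=50.0,
        f1=5, f2=10, f3=1, f4=1, f5=1):
    """Permitted daily exposure in mg/day:
    PDE = NOAEL x body-weight adjustment / (F1 x F2 x F3 x F4 x F5),
    where F1..F5 are species, variability, duration, severity and
    no-NOAEL uncertainty factors (defaults here are illustrative)."""
    return noael_mg_per_kg_day * body_weight_kg / (f1 * f2 * f3 * f4 * f5)

# Hypothetical NOAEL of 5 mg/kg/day with the default factors:
# 5 * 50 / (5 * 10) = 5.0 mg/day
example_pde = pde(5.0)
```

The divergence the abstract criticizes comes precisely from different choices of F1-F5, which is why published, harmonized PDE values would remove the subjectivity.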
Procedia PDF Downloads 90
5667 Measurement of Solids Concentration in Hydrocyclone Using ERT: Validation Against CFD
Authors: Vakamalla Teja Reddy, Narasimha Mangadoddy
Abstract:
Hydrocyclones are used to separate particles into different size fractions in the mineral processing, chemical, and metallurgical industries. High-speed video imaging, laser Doppler anemometry (LDA), X-ray and gamma-ray tomography have previously been used to measure the two-phase flow characteristics in the cyclone. However, investigation of the solids flow characteristics inside the cyclone is often impeded by the nature of the process, due to slurry opaqueness and solid metal wall vessels. In this work, dual-plane high-speed electrical resistance tomography (ERT) is used to measure hydrocyclone internal flow dynamics in situ. Experiments are carried out in a 3-inch hydrocyclone for feed solids concentrations in the range of 0-50%. ERT data analysis through the optimized FEM mesh size and reconstruction algorithms on air-core and solids concentration tomograms is assessed. Results are presented in terms of the air-core diameter and solids volume fraction contours, obtained using Maxwell's equation, for various hydrocyclone operational parameters. ERT confirms that the area occupied by the air core and the wall solids conductivity levels decrease with increasing feed solids concentration. An algebraic slip mixture based multi-phase computational fluid dynamics (CFD) model is used to predict the air-core size and the solids concentrations in the hydrocyclone. Validation of the air-core size and mean solids volume fractions from the ERT measurements against the CFD simulations is attempted.
Keywords: air-core, electrical resistance tomography, hydrocyclone, multi-phase CFD
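The conversion from ERT conductivity tomograms to solids volume fraction via Maxwell's relation, mentioned above, can be sketched for the common simplifying case of a non-conducting dispersed phase. This is an illustrative generic formula, not the authors' reconstruction code.

```python
def solids_fraction(sigma_mix, sigma_liquid):
    """Invert Maxwell's relation for a non-conducting dispersed phase,
    sigma_mix / sigma_liquid = 2(1 - phi) / (2 + phi),
    to obtain the solids volume fraction phi from the measured mixture
    conductivity and the carrier-liquid conductivity."""
    return 2.0 * (sigma_liquid - sigma_mix) / (2.0 * sigma_liquid + sigma_mix)

# A lower measured mixture conductivity maps to a higher solids fraction,
# consistent with the trend reported in the abstract.
phi_dense = solids_fraction(0.8, 1.0)
phi_dilute = solids_fraction(0.9, 1.0)
```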
Procedia PDF Downloads 379
5666 The Validation and Reliability of the Arabic Effort-Reward Imbalance Model Questionnaire: A Cross-Sectional Study among University Students in Jordan
Authors: Mahmoud M. AbuAlSamen, Tamam El-Elimat
Abstract:
Amid the economic crisis in Jordan, the Jordanian government has opted for a knowledge economy where education is promoted as a means of economic development. University education usually comes at the expense of study-related stress that may adversely impact the health of students. Since stress is a latent variable that is difficult to measure, a valid tool should be used to do so. The effort-reward imbalance (ERI) model is used as a measurement tool for occupational stress. The model was built on the notion of reciprocity, which relates 'effort' to 'reward' through the mediating construct of 'overcommitment'. Reciprocity assumes equilibrium between effort and reward, where 'high' effort is adequately compensated with 'high' reward. When this equilibrium is violated (i.e., high effort with low reward), negative emotions and stress may be elicited, which have been correlated with adverse health conditions. The ERI theory has been tested in many different parts of the world, and associations with chronic diseases and the health of workers have been explored at length. While much effort-reward imbalance research has investigated work settings, there has been growing interest in the validity of the ERI model when applied to other social settings such as schools and universities. The ERI questionnaire was recently developed in Arabic to measure ERI among high school teachers. However, little information is available on the validity of the ERI questionnaire in university students. A cross-sectional study was conducted on 833 students in Jordan to measure the validity and reliability of the Arabic ERI questionnaire among university students. Reliability, as measured by Cronbach's alpha for the effort, reward, and overcommitment scales, was 0.73, 0.76, and 0.69, respectively, suggesting satisfactory reliability. The factorial structure was explored using principal axis factoring.
The results fitted a five-factor solution in which both effort and overcommitment were uni-dimensional, while the reward scale was three-dimensional with factors named 'support', 'esteem', and 'security'. The solution explained 56% of the variance in the data. The established ERI theory was replicated with excellent validity in this study. The effort-reward ratio in university students was 1.19, which suggests a slight degree of failed reciprocity. The study also investigated the association of effort, reward, overcommitment, and ERI with participants' demographic factors and self-reported health. ERI was found to be significantly associated with absenteeism (p < 0.0001), a past history of failed courses (p = 0.03), and poor academic performance (p < 0.001). Moreover, ERI was found to be associated with poor self-reported health among university students (p = 0.01). In conclusion, the Arabic ERI questionnaire is reliable and valid for measuring effort-reward imbalance in university students in Jordan. The results of this research are important in informing higher education policy in Jordan.
Keywords: effort-reward imbalance, factor analysis, validity, self-reported health
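The two statistics reported above, Cronbach's alpha and the effort-reward ratio, can be reproduced with short routines. This is a generic sketch of the standard formulas (with Siegrist's item-count correction factor assumed for the ratio), not the authors' analysis code; the numbers in the usage example are made up.

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha = k/(k-1) * (1 - sum of item variances / total variance).
    `items` is a list of per-item score lists, respondents in the same order."""
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]   # total score per respondent
    item_var = sum(statistics.variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

def er_ratio(effort_sum, reward_sum, n_effort_items, n_reward_items):
    """Effort-reward ratio e / (r * c), where c = effort items / reward items
    corrects for unequal numbers of items on the two scales."""
    c = n_effort_items / n_reward_items
    return effort_sum / (reward_sum * c)

# Hypothetical usage: 6 effort items summing to 12, 11 reward items summing to 24
ratio = er_ratio(12, 24, 6, 12)
```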
Procedia PDF Downloads 116
5665 Awareness and Utilization of E-Learning Technologies in Teaching and Learning of Human Kinetics and Health Education Courses in Nigeria Universities
Authors: Ibrahim Laro ABUBAKAR
Abstract:
The study examined the availability and utilization of e-learning technologies in the teaching of Human Kinetics and Health Education courses in Nigerian universities, specifically universities in Kwara State. Two purposes were formulated to guide the study, from which two research questions and two hypotheses were raised. A descriptive research design was used. Three hundred respondents (100 lecturers and 200 students) made up the population for the study; there was no sampling, as the population was not large. A structured questionnaire tagged 'Availability and Utilization of E-Learning Technologies in Teaching and Learning Questionnaire' (AUETTLQ) was used for data collection. The questionnaire was subjected to face and content validation and was equally pilot tested; the validation yielded a reliability coefficient of 0.78. The data collected were statistically analyzed using frequency and percentage counts for the personal data of the respondents, and means and standard deviations to answer the research questions. The null hypotheses were tested at the 0.05 level of significance using the independent t-test. One finding of this study, among others, was that lecturers and students are aware of synchronous e-learning technologies for teaching and learning Human Kinetics and Health Education but seldom utilize them. It was recommended, among other things, that lecturers and students be sensitized through seminars and workshops on the need to maximally utilize available e-learning technologies in the teaching and learning of Human Kinetics and Health Education courses in universities.
Keywords: awareness, utilization, e-learning, technologies, human kinetics, synchronous
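The hypothesis test named above, the independent-samples t-test at the 0.05 level, can be computed as follows. This is a generic textbook sketch with the pooled (equal-variance) form, not the study's own analysis script, and the sample values are hypothetical.

```python
import math
import statistics

def independent_t(a, b):
    """Pooled two-sample t statistic (equal variances assumed):
    t = (mean_a - mean_b) / sqrt(sp^2 * (1/n1 + 1/n2)),
    sp^2 = ((n1-1)*s1^2 + (n2-1)*s2^2) / (n1 + n2 - 2)."""
    n1, n2 = len(a), len(b)
    sp2 = ((n1 - 1) * statistics.variance(a)
           + (n2 - 1) * statistics.variance(b)) / (n1 + n2 - 2)
    return (statistics.mean(a) - statistics.mean(b)) / math.sqrt(sp2 * (1 / n1 + 1 / n2))

# Hypothetical lecturer vs. student utilization scores
t_stat = independent_t([2.0, 4.0], [1.0, 3.0])
```

The resulting t statistic would then be compared against the critical value for the chosen significance level and degrees of freedom.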
Procedia PDF Downloads 119
5664 Lessons Learned from Interlaboratory Noise Modelling in Scope of Environmental Impact Assessments in Slovenia
Abstract:
Noise assessment methods are regularly used in the scope of Environmental Impact Assessments to assess (predict) the expected noise emissions of planned projects, and different assessment methods may be applied. In recent years, we had the opportunity to collaborate in noise assessment procedures in which the noise assessments of different laboratories were performed simultaneously, and we identified some significant differences in noise assessment results between laboratories in Slovenia. Although good georeferenced input data for setting up acoustic models exist in Slovenia, there is no clear consensus on methods for predictive noise modelling of planned projects. We analyzed the input data, methods and results of predictive noise modelling for two planned industrial projects, each assessed independently by two laboratories. We also analyzed the data, methods and results of two interlaboratory collaborative noise models for two existing noise sources (a railway and a motorway). In the cases of predictive noise modelling, the acoustic models were validated by noise measurements of surrounding existing noise sources, but over varying durations; the acoustic characteristics of existing buildings were also not described identically, and the planned noise sources were described and digitized differently. Differences in noise assessment results between laboratories ranged up to 10 dBA, which considerably exceeds the acceptable uncertainty of 3 to 6 dBA. Contrary to predictive noise modelling, in the collaborative noise modelling of the two existing noise sources, the possibility of performing validation noise measurements greatly increased the comparability of the modelling results. In both collaborative cases, for the existing motorway and railway, the modelling results of the different laboratories were comparable.
Differences in noise modelling results between laboratories were below 5 dBA, the acceptable uncertainty set by the organizer of the interlaboratory noise modelling. The lessons learned from the study were: 1) predictive noise calculation using the formulae of the international standard SIST ISO 9613-2:1997 is not an appropriate method to predict the noise emissions of planned projects, since, due to the complexity of the procedure, the formulae are not applied strictly; 2) noise measurements are an important tool for minimizing noise assessment errors for planned projects and, in predictive noise modelling, should be performed at least for validation of the acoustic model; 3) national guidelines should be issued on the appropriate data, methods, noise source digitalization, validation of the acoustic model, etc., in order to unify predictive noise models and their results in the scope of Environmental Impact Assessments for planned projects.
Keywords: environmental noise assessment, predictive noise modelling, spatial planning, noise measurements, national guidelines
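As an illustration of one term of the ISO 9613-2 calculation referred to in lesson 1, the geometric divergence attenuation of a point source is A_div = 20 lg(d/d0) + 11 dB, with reference distance d0 = 1 m. The sketch below shows only this single term; the full standard adds atmospheric absorption, ground effect, barriers and other corrections, which is where the interlaboratory divergence arises.

```python
import math

def a_div(distance_m, d0_m=1.0):
    """ISO 9613-2 geometric divergence attenuation in dB for a point source:
    A_div = 20 * lg(d / d0) + 11, with d0 = 1 m by convention."""
    return 20.0 * math.log10(distance_m / d0_m) + 11.0

# Every tenfold increase in distance adds 20 dB of divergence attenuation:
att_100m = a_div(100.0)   # 51 dB
```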
Procedia PDF Downloads 234
5663 The Work Book Tool, a Lifelong Chronicle: Part of the "Designprogrammet" at the Design School of the University in Kalmar, Sweden
Authors: Henriette Jarild-Koblanck, Monica Moro
Abstract:
The research has been implemented for several years at Kalmar University, now LNU Linnaeus University, inside the Design Program (Designprogrammet). The Work Book tool was created within the framework of the Bologna Declaration. The project concerns primarily pedagogy and design methodology, focusing on how we evaluate artistic work processes and projects, and on how we can develop the preconditions for cross-disciplinary work. The original idea of the Work Book springs from the steady habit of the Swedish researcher, now retired full professor and dean, Henriette Koblanck of putting images, things and colours in a notebook, right from her childhood, writing down impressions and reflections. Building on this preliminary idea of making use of a work book in a form freely chosen by the user, she began to develop the Design Program (Designprogrammet) applied at Kalmar University, now LNU Linnaeus University, where she called on a number of professionals to collaborate, among them Monica Moro, an Italian designer, researcher, and teacher in the field of colour and shape. The educational intention is that the Work Book should become a tool that is both an inspiration for the process of thinking and intuitive creating, and a personal support for rational and technical thinking. The students were to use the Work Book not only to document visually and graphically their results from investigations, experiments and thoughts, but also as a tool to present their work to others: students, tutors and teachers, or other stakeholders with whom they discussed the proceedings. To help the students, a number of matrices, based on the Bologna Declaration, were developed to evaluate the projects in elaboration.
In conclusion, the feedback from the students is excellent; many are still using the Work Book as a professional tool because, in their words, they consider it a rather accurate representation of their working process, and furthermore of themselves, so much so that many of them have used it as a portfolio when applying for jobs.
Keywords: academic program, art, assessment of student's progress, Bologna Declaration, design, learning, self-assessment
Procedia PDF Downloads 338
5662 Design of a Service-Enabled Dependable Integration Environment
Authors: Fuyang Peng, Donghong Li
Abstract:
The aim of information systems integration is to integrate all data sources, applications and business flows into the new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: a software architecture that supports resource integration, and an adaptor development tool that helps the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components: a dependable service integration platform and a legacy application integration tool. For the dependable service integration platform, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, its functionalities and the dependability measures taken are presented. Due to its service-oriented integration model, its light-weight extensible container, its service component combination-oriented p-lattice structure, and other features, SDIE has advantages in openness, flexibility, performance-price ratio and feature support over commercial products, and is better than most open-source integration software in functionality, performance and dependability support.
Keywords: application integration, dependability, legacy, SOA
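The service registry and discovery component named above follows a well-known pattern that can be sketched in a few lines. The class and method names here are hypothetical illustrations of the pattern, not SDIE's actual API.

```python
# Minimal registry-and-discovery sketch: providers register named endpoints,
# consumers look them up by name instead of hard-coding addresses.
class ServiceRegistry:
    def __init__(self):
        self._services = {}   # service name -> list of endpoints

    def register(self, name, endpoint):
        self._services.setdefault(name, []).append(endpoint)

    def discover(self, name):
        endpoints = self._services.get(name)
        if not endpoints:
            raise LookupError(f"no provider registered for {name!r}")
        return endpoints[0]   # trivial selection policy; real buses load-balance

# Hypothetical usage
registry = ServiceRegistry()
registry.register("billing", "http://10.0.0.5:8080")
```

A dependable platform would extend this with health checks and failover across the registered endpoints.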
Procedia PDF Downloads 360
5661 Simulation of Climatic Change Effects on the Potential Fishing Zones of Dorado Fish (Coryphaena hippurus L.) in the Colombian Pacific under Scenarios RCP Using CMIP5 Model
Authors: Adriana Martínez-Arias, John Josephraj Selvaraj, Luis Octavio González-Salcedo
Abstract:
In the Colombian Pacific, the Dorado fish (Coryphaena hippurus L.) fishery is of great commercial interest. However, its habitat and fisheries may be affected by climatic change, especially by the current increase in sea surface temperature; hence it is of interest to study the dynamics of this species' fishing zones. In this study, we developed artificial neural network (ANN) models to predict catch per unit effort (CPUE) as an indicator of species abundance. The model was based on four oceanographic variables (chlorophyll a, sea surface temperature, sea level anomaly and bathymetry) derived from satellite data. CPUE datasets for model training and cross-validation were obtained from the logbooks of commercial fishing vessels. Sea surface temperature for the Colombian Pacific was projected under Representative Concentration Pathway (RCP) scenarios 4.5 and 8.5 using the Coupled Model Intercomparison Project Phase 5 (CMIP5), and CPUE maps were created. Our results indicated that an increase in sea surface temperature reduces the potential fishing zones of this species in the Colombian Pacific. We conclude that ANNs are a reliable tool for the simulation of climate change effects on potential fishing zones. This research opens a future agenda for other species that have been affected by climate change.
Keywords: climatic change, artificial neural networks, dorado fish, CPUE
Procedia PDF Downloads 243
5660 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines
Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl
Abstract:
Large-scale machine tools for the manufacturing of large workpieces, e.g. blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass, which is counterproductive to the needs of sustainable manufacturing as it leads to higher resource consumption both in material and in energy. Recent research activities have achieved higher resource efficiency through radical mass reduction, relying on control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction method parameters according to the current state of the machine. The paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. It starts with a theoretical introduction to the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of pose-dependent dynamic behavior is corroborated by the results of an experimental modal analysis of a lightweight test structure. Afterwards, the consequences of the pose-dependent dynamic behavior of lightweight machine tool structures for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived.
The paper concludes with an outlook describing the approach for a general pose-dependent model of the dynamic behavior of large lightweight machine tools, which provides the necessary input for the aforementioned vibration avoidance and reduction methods to properly tackle machine vibrations.
Keywords: dynamic behavior, lightweight, machine tool, pose-dependency
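The link stated above between small damping coefficients and long vibration decay times can be made concrete with a single-mode sketch: the envelope of a lightly damped mode decays as exp(-zeta * w_n * t), so the time to fall to 1% of the initial amplitude is ln(100) / (zeta * w_n). The numerical values below are illustrative, not measurements from the paper's test structure.

```python
import math

def decay_time_to_1pct(zeta, f_n_hz):
    """Time for the vibration envelope exp(-zeta * w_n * t) of a lightly
    damped mode to decay to 1 % of its initial amplitude, with
    w_n = 2 * pi * f_n the natural angular frequency."""
    w_n = 2.0 * math.pi * f_n_hz
    return math.log(100.0) / (zeta * w_n)

# Halving the damping ratio doubles the decay time, which is why radical
# mass reduction without active damping prolongs disturbances in production.
t_baseline = decay_time_to_1pct(0.01, 10.0)
t_lightweight = decay_time_to_1pct(0.005, 10.0)
```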
Procedia PDF Downloads 459
5659 Programming without Code: An Approach and Environment to Conditions-On-Data Programming
Authors: Philippe Larvet
Abstract:
This paper presents the concept of an object-based programming language in which tests (if... then... else) and control structures (while, repeat, for...) disappear and are replaced by conditions on data. Following the object paradigm, data are still embedded inside objects as variable-value couples, but object methods are expressed in the form of logical propositions ('conditions on data', or CODs). For instance: variable1 = value1 AND variable2 > value2 => variable3 = value3. Implementing this approach, a central inference engine iterates over the objects, examining them one after another and collecting all the CODs of each object. CODs are treated as rules in a rule-based system: the left part of each proposition (left of the '=>' sign) is the premise and the right part is the conclusion. Premises are evaluated and conclusions are fired; conclusions modify the variable-value couples of the object, and the engine moves on to examine the next object. The paper develops the principles of writing CODs instead of complex algorithms. Through samples, the paper also presents several hints for implementing a simple mechanism able to process this 'COD language'. The proposed approach can be used within the context of simulation, process control, industrial systems validation, etc. By writing simple and rigorous conditions on data, instead of using classical, hard-to-learn languages, engineers and specialists can easily simulate and validate the functioning of complex systems.
Keywords: conditions on data, logical proposition, programming without code, object-oriented programming, system simulation, system validation
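The inference mechanism described above can be sketched as a tiny forward-chaining engine. This is an illustrative toy interpreter for the COD idea, not the environment presented in the paper; premises and conclusions are modeled as plain callables over a dict of variable-value couples.

```python
# Minimal COD sketch: object state is a dict of variable-value couples, and
# each COD is a (premise, conclusion) pair. The engine evaluates premises and
# fires conclusions until no rule changes the object (forward chaining).
def run_cods(obj, cods):
    changed = True
    while changed:
        changed = False
        for premise, conclusion in cods:
            if premise(obj):
                before = dict(obj)
                conclusion(obj)
                if obj != before:
                    changed = True
    return obj

# The COD from the example in the abstract:
# variable1 = value1 AND variable2 > value2  =>  variable3 = value3
cods = [(lambda o: o["variable1"] == 1 and o["variable2"] > 2,
         lambda o: o.__setitem__("variable3", 3))]
state = run_cods({"variable1": 1, "variable2": 5, "variable3": 0}, cods)
```

The termination test (re-fire only while some conclusion still changes the state) is what replaces explicit loops in the COD style.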
Procedia PDF Downloads 221
5658 The Use of Language as a Cognitive Tool in French Immersion Teaching
Authors: Marie-Josée Morneau
Abstract:
A literacy-based approach, centred on the use of the language of instruction as a cognitive tool, can increase the L2 communication skills of French immersion students. Academic subject areas such as science and mathematics offer an authentic language-learning context in which students can become more proficient speakers while using specific vocabulary and language structures to learn, interact and communicate their reasoning, when provided the opportunities and guidance to do so. In this Canadian quasi-experimental study, the effects of teaching specific language elements during mathematics classes through literacy-based activities in Early French Immersion programming were compared between two Grade 7/8 groups: the experimental group, which received literacy-based teaching for a six-week period, and the control group, which received regular teaching instruction. The results showed that the participants in the experimental group made more progress in their mathematical communication skills, which suggests that targeting the L2 as a cognitive tool can benefit immersion learners of mathematical concepts, and reminds us that all L2 teachers are language teachers.
Keywords: mathematics, French immersion, literacy-based, oral communication, L2
Procedia PDF Downloads 76
5657 Clinical Validation of an Automated Natural Language Processing Algorithm for Finding COVID-19 Symptoms and Complications in Patient Notes
Authors: Karolina Wieczorek, Sophie Wiliams
Abstract:
Introduction: Patient data are often collected in Electronic Health Record (EHR) systems for purposes such as providing care as well as reporting. This information can be re-used to validate data models in clinical trials or in epidemiological studies, and manual validation of automated tools is vital to pick up errors in processing and to provide confidence in the output. Mentioning a disease in a discharge letter does not necessarily mean that a patient suffers from it; many letters discuss a diagnostic process or different tests, or discuss whether a patient has a certain disease. The COVID-19 dataset in this study was produced by natural language processing (NLP), an automated algorithm which extracts information related to COVID-19 symptoms, complications, and medications prescribed within the hospital. Free-text clinical patient notes are rich sources of information containing patient data not captured in a structured form, hence the use of named entity recognition (NER) to capture additional information. Methods: Patient data (discharge summary letters) were exported and screened by the algorithm to pick up relevant terms related to COVID-19. A list of 124 Systematized Nomenclature of Medicine (SNOMED) Clinical Terms was provided in Excel with corresponding IDs. Two independent medical student researchers were given this dictionary of SNOMED terms to refer to when screening the notes; they worked on two separate datasets, called "A" and "B", respectively. Notes were screened to check that the correct term had been picked up by the algorithm and that negated terms were not picked up. Results: Implementation in the hospital began on March 31, 2020, and the first EHR-derived extract was generated for use in an audit study on June 04, 2020.
The dataset has contributed to large, priority clinical trials (including the International Severe Acute Respiratory and Emerging Infection Consortium (ISARIC), by bulk upload to REDCap research databases) and to local research and audit studies. Successful sharing of EHR-extracted datasets requires communicating the provenance and quality of the data, including its completeness and accuracy. The validation of the algorithm gave the following results: precision 0.907, recall 0.416, and F-score 0.570. The percentage enhancement from NLP-extracted terms compared to regular data extraction alone was low (0.3%) for relatively well-documented data such as previous medical history, but higher for complications (16.6%), presenting illness (29.53%), chronic procedures (30.3%), and acute procedures (45.1%). Conclusions: This automated NLP algorithm is shown to be useful in facilitating patient data analysis and has the potential to be used in larger-scale clinical trials to assess potential study exclusion criteria for participants in the development of vaccines.
Keywords: automated, algorithm, NLP, COVID-19
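The reported F-score follows directly from the precision and recall given above, as their harmonic mean; a quick check:

```python
def f_score(precision, recall):
    """F1 score: harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# With the reported precision 0.907 and recall 0.416:
reported_f = f_score(0.907, 0.416)   # ~0.570, matching the abstract
```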
Procedia PDF Downloads 102
5656 A Single Cell Omics Experiments as Tool for Benchmarking Bioinformatics Oncology Data Analysis Tools
Authors: Maddalena Arigoni, Maria Luisa Ratto, Raffaele A. Calogero, Luca Alessandri
Abstract:
The presence of tumor heterogeneity, where distinct cancer cells exhibit diverse morphological and phenotypic profiles, including gene expression, metabolism, and proliferation, poses challenges for molecular prognostic markers and for patient classification for targeted therapies. Understanding the causes and progression of cancer requires research efforts aimed at characterizing this heterogeneity, which can be facilitated by evolving single-cell sequencing technologies. However, analyzing single-cell data necessitates computational methods that often lack objective validation; the establishment of benchmarking datasets is therefore necessary to provide a controlled environment for validating bioinformatics tools in the field of single-cell oncology. Benchmarking bioinformatics tools for single-cell experiments can be costly due to the high expense involved, so datasets used for benchmarking are typically sourced from publicly available experiments, which often lack comprehensive cell annotation. This limitation can affect the accuracy and effectiveness of such experiments as benchmarking tools. To address this issue, we introduce omics benchmark experiments designed to evaluate bioinformatics tools that depict the heterogeneity in single-cell tumor experiments. We conducted single-cell RNA sequencing on six lung cancer tumor cell lines that display resistant clones upon treatment of EGFR-mutated tumors and are characterized by the driver genes ROS1, ALK, HER2, MET, KRAS, and BRAF. These driver genes are associated with downstream networks controlled by EGFR mutations, such as JAK-STAT, PI3K-AKT-mTOR, and MEK-ERK. The experiment also featured an EGFR-mutated cell line. Using the 10X Genomics platform with CellPlex technology, we analyzed the seven cell lines together with a pseudo-immunological microenvironment consisting of PBMC cells labeled with the BioLegend TotalSeq™-B Human Universal Cocktail (CITE-seq).
This technology allowed for independent labeling of each cell line and single-cell analysis of the pooled seven cell lines and the pseudo-microenvironment. The data generated from these experiments are available as part of an online tool, which allows users to define cell heterogeneity and generates count tables as output. The tool provides the cell-line derivation for each cell, and cell annotations for the pseudo-microenvironment based on the CITE-seq data, assigned by an experienced immunologist. Additionally, we created a range of pseudo-tumor tissues using different ratios of the aforementioned cells embedded in Matrigel. These tissues were analyzed using the 10X Genomics (FFPE samples) and Curio Bioscience (fresh-frozen samples) platforms for spatial transcriptomics, further expanding the scope of our benchmark experiments. The benchmark experiments we conducted provide a unique opportunity to evaluate the performance of bioinformatics tools for detecting and characterizing tumor heterogeneity at the single-cell level. Overall, our experiments provide a controlled and standardized environment for assessing the accuracy and robustness of bioinformatics tools for studying tumor heterogeneity at the single-cell level, which can ultimately lead to more precise and effective cancer diagnosis and treatment.
Keywords: single cell omics, benchmark, spatial transcriptomics, CITE-seq
Procedia PDF Downloads 117
5655 A Study to Explore the Views of Students regarding E-Learning as an Instructional Tool at University Level
Authors: Zafar Iqbal
Abstract:
This study involved 6th-semester students enrolled in a Bachelor of Computer Science program at university level. In this era of science and technology, e-learning can help provide grassroots access to education in less developed areas. It is a potential substitute for face-to-face teaching and is being used in different countries. The purpose of the study was to explore the views of students about e-learning (Facebook) as an instructional tool. Using a purposive sampling technique, an intact class of 30 students, including both males and females, was selected in which e-learning was used as an instructional tool. The students' views were explored through a qualitative approach using focus group interviews. The approach was helpful for developing a comprehensive understanding of students' views towards e-learning. In addition, probing questions were asked and recorded. Data were transcribed, nodes were generated, and text was coded against these nodes. For this purpose and further analysis, NVivo 10 software was used. Themes were generated and tangibly presented through cluster analysis. The findings provide sufficient evidence that Facebook is a viable e-learning source for students in higher education. Students acknowledged it as a good source of learning that was aligned with their academic and social behavior. It was not time-specific and was therefore feasible for students who work during the day and can get online access to the material in their free time. Students reported some distracters (time wasters), but these can be minimized with little effort. In short, e-learning is a need of the day and a potential learning source for every individual with internet access, anywhere in the globe.
Keywords: e-learning, facebook, instructional tool, higher education
Procedia PDF Downloads 375
5654 Investigation of Optimized Mechanical Properties on Friction Stir Welded Al6063 Alloy
Authors: Lingaraju Dumpala, Narasa Raju Gosangi
Abstract:
Friction Stir Welding (FSW) is a relatively new, environmentally friendly, versatile, and widely used joining technique for soft materials such as aluminum. FSW has received considerable attention as a solid-state joining method that avoids many common problems of fusion welding and provides an improved way of producing aluminum joints in a faster way. FSW can be used for various aerospace, defense, automotive, and transportation applications. It is necessary to understand friction stir welded joints and their characteristics before using this new joining technique in critical applications. This study investigated the mechanical properties of friction stir welded aluminum 6063 alloys. FSW was carried out based on a design of experiments using an L16 mixed-level array, considering tool rotational speed, tool feed rate, and tool tilt angle as process parameters. The optimization of the process parameters was carried out by Taguchi-based regression analysis, and the significance of the process parameters was analyzed using ANOVA. It is observed that the considered process parameters highly influence the mechanical properties of Al6063.
Keywords: FSW, aluminum alloy, mechanical properties, optimization, Taguchi, ANOVA
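The Taguchi ranking step described above can be sketched via signal-to-noise (S/N) ratios, which score each parameter level across trials; a minimal illustration with hypothetical response values, not the measured Al6063 data:

```python
import math

# Taguchi S/N ratios used to rank process-parameter levels in a
# design such as an L16 array. Response values here are hypothetical.

def sn_larger_is_better(responses):
    """S/N = -10*log10(mean(1/y^2)); for responses to maximize,
    e.g. tensile strength."""
    n = len(responses)
    return -10.0 * math.log10(sum(1.0 / (y * y) for y in responses) / n)

def sn_smaller_is_better(responses):
    """S/N = -10*log10(mean(y^2)); for responses to minimize,
    e.g. surface roughness."""
    n = len(responses)
    return -10.0 * math.log10(sum(y * y for y in responses) / n)

# Two hypothetical repeat trials of weld tensile strength (MPa).
print(round(sn_larger_is_better([100.0, 100.0]), 2))  # -> 40.0
```

The parameter level with the highest mean S/N ratio across its trials is taken as optimum, and the spread of S/N means across levels indicates each parameter's influence.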
Procedia PDF Downloads 133
5653 Experimental Characterization of Anti-Icing System and Accretion of Re-Emitted Droplets on Turbojet Engine Blades
Authors: Guillaume Linassier, Morgan Balland, Hugo Pervier, Marie Pervier, David Hammond
Abstract:
Atmospheric icing in turbojets is caused by the ingestion of super-cooled water droplets. To prevent operability risks, manufacturers can implement ice protection systems. Thermal systems are commonly used for this purpose, but their activation can cause the formation of a liquid water film that can freeze downstream of the heated surface or even on other components. In the framework of STORM, a European project dedicated to icing physics in turbojet engines, a cascade rig representative of engine inlet blades was built and tested in an icing wind tunnel. This mock-up integrates two rows of blades, the upstream one being anti-iced using an electro-thermal device, the downstream one being unheated. Under icing conditions, the anti-icing system is activated and set at a power level chosen to observe a liquid film on the surface and droplet re-emission at the trailing edge. These re-emitted droplets impinge on the downstream row and contribute to ice accretion. A complete experimental database was generated, including the characterization of ice accretion shapes and of the electro-thermal anti-icing system (power limit for the appearance of runback water or ice accretion). These data will be used for the validation of numerical tools for modeling thermal anti-icing systems in the scope of engine applications, as well as the validation of droplet re-emission models for stator parts.
Keywords: turbomachine, anti-icing, cascade rig, runback water
Procedia PDF Downloads 182
5652 Development of Methods for Plastic Injection Mold Weight Reduction
Authors: Bita Mohajernia, R. J. Urbanic
Abstract:
Mold making techniques have focused on meeting the customers' functional and process requirements; however, today, molds are increasing in size and sophistication, and are difficult to manufacture, transport, and set up due to their size and mass. Presently, mold weight saving techniques focus on pockets to reduce the mass of the mold, but the overall size is still large, which introduces costs related to stock material purchase, processing time for process planning, machining and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use Finite Element Analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project are reduced manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing the weight, using the OptiStruct software. The optimized CAD model is compared with the primary geometry of the mold from the NX software. Results of the optimization show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.
Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction
Procedia PDF Downloads 290
5651 A Digital Twin Approach for Sustainable Territories Planning: A Case Study on District Heating
Authors: Ahmed Amrani, Oussama Allali, Amira Ben Hamida, Felix Defrance, Stephanie Morland, Eva Pineau, Thomas Lacroix
Abstract:
The energy planning process is a very complex task that involves several stakeholders and requires the consideration of several local and global factors and constraints. In order to optimize and simplify this process, we propose a tool-based iterative approach applied to district heating planning. We built our tool in collaboration with a French territory, using actual district data and implementing the European incentives. We set up an iterative process including data visualization and analysis, identification and extraction of information related to the area concerned by the operation, design of sustainable planning scenarios leveraging local renewable and recoverable energy sources, and finally, the evaluation of scenarios. The last step is performed by a dynamic digital twin replica of the city. The territory's energy experts confirm that the tool provides them with valuable support towards sustainable energy planning.
Keywords: climate change, data management, decision support, digital twin, district heating, energy planning, renewables, smart city
Procedia PDF Downloads 171
5650 Evaluating Perceived Usability of ProxTalker App Using Arabic Standard Usability Scale: A Student's Perspective
Authors: S. AlBustan, B. AlGhannam
Abstract:
This oral presentation discusses a proposal for a study that evaluates the usability of an evidence-based application named ProxTalker App. The significance of this study is that it will inform administration and faculty staff at the Department of Communication Sciences Disorders (CDS), College of Life Sciences, Kuwait University whether the app is a suitable tool for CDS students. A case study will be used involving a sample of CDS students taking practicum and internship courses during the academic year 2018/2019. The study will follow a process used by a previous study. The process of calculating SUS is well documented and will be followed. ProxTalker App is an alternative and augmentative tool that speech-language pathologists (SLPs) can use to customize boards for their clients. SLPs can customize different boards using this app for various activities; for example, a board can be created by the SLP to improve and support receptive and expressive language. Using technology to support therapy can help SLPs integrate the ProxTalker App as part of their clients' therapy. Supporting tools, games, and motivation are some advantages of incorporating apps during therapy sessions. A quantitative methodology will be used, involving a standard tool that was adapted to the Arabic language to accommodate native Arabic users. The tool that will be utilized in this research is the Arabic Standard Usability Scale (A-SUS) questionnaire, an adaptation of the System Usability Scale (SUS). Standard usability questionnaires are reliable and valid, and their process is properly documented. This study builds upon the development of A-SUS, a psychometrically evaluated questionnaire that targets native Arabic speakers. The usability results will give a preliminary indication of whether the ProxTalker App under investigation is appropriate to be integrated within the practicum and internship curriculum of CDS. 
The results of this study will inform the CDS department whether this specific app is an appropriate tool for our specific students within our environment, since usability depends on the product, the environment, and the users.
Keywords: A-SUS, communication disorders practicum, evidence based app, Standard Usability Scale
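The SUS calculation the study will follow can be sketched as below, using the standard scoring rule the original SUS defines (odd items contribute response - 1, even items contribute 5 - response, and the sum is scaled by 2.5 to a 0-100 score); the responses shown are hypothetical:

```python
# Standard SUS scoring rule, which A-SUS inherits from the original
# System Usability Scale. Responses below are hypothetical.

def sus_score(responses):
    """responses: 10 Likert answers, each 1..5, in item order 1..10."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# All odd items answered 5 and all even items answered 1 (best case).
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # -> 100.0
```

Averaging these per-respondent scores across the student sample yields the overall usability score the study would report.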
Procedia PDF Downloads 156
5649 A Survey of WhatsApp as a Tool for Instructor-Learner Dialogue, Learner-Content Dialogue, and Learner-Learner Dialogue
Authors: Ebrahim Panah, Muhammad Yasir Babar
Abstract:
Thanks to the development of online technology and social networks, people are able to communicate as well as learn. WhatsApp is a social network that is steadily gaining popularity. This app can be used for communication as well as education: for instructor-learner, learner-learner, and learner-content interactions; however, very little knowledge is available on these potentials of WhatsApp. The current study was undertaken to investigate university students' perceptions of WhatsApp used as a tool for instructor-learner dialogue, learner-content dialogue, and learner-learner dialogue. The study adopted a survey approach and distributed a questionnaire developed with Google Forms to 54 university students (11 males and 43 females). The obtained data were analyzed using SPSS version 20. The results of the data analysis indicate that students have positive attitudes towards WhatsApp as a tool for instructor-learner dialogue: it is easy to reach the lecturer (4.07), the instructor gives me valuable feedback on my assignment (4.02), and the instructor is supportive during course discussion and offers continuous support with the class (4.00). Learner-content dialogue: WhatsApp allows me to academically engage with lecturers anytime, anywhere (4.00), it helps to send graphics such as pictures or charts directly to the students (3.98), and it also provides out-of-class, extra learning materials and homework (3.96). Learner-learner dialogue: WhatsApp is a good tool for sharing knowledge with others (4.09), WhatsApp allows me to academically engage with peers anytime, anywhere (4.07), and we can interact with others through the use of group discussion (4.02). It was also found that there are significant positive correlations between students' perceptions of Instructor-Learner Dialogue (ILD), Learner-Content Dialogue (LCD), Learner-Learner Dialogue (LLD) and WhatsApp application in the classroom. 
The findings of the study have implications for lecturers, policy makers, and curriculum developers.
Keywords: instructor-learner dialogue, learners-contents dialogue, learner-learner dialogue, whatsapp application
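The reported correlations between the dialogue dimensions can be illustrated with a plain Pearson correlation coefficient; a minimal sketch with hypothetical per-student scores, not the actual survey data:

```python
import math

# Pearson correlation between two survey dimensions, computed from
# per-student mean scores. The score lists below are hypothetical.

def pearson_r(x, y):
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

ild = [4.0, 3.5, 4.5, 3.0, 5.0]   # hypothetical instructor-learner scores
lld = [3.8, 3.6, 4.4, 3.2, 4.8]   # hypothetical learner-learner scores
print(round(pearson_r(ild, lld), 3))
```

A value near +1 would indicate that students who rate one dialogue dimension highly tend to rate the other highly as well, which is the pattern the study reports; significance testing would still be done in a package such as SPSS.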
Procedia PDF Downloads 158
5648 A Comparative Study of Three Major Performance Testing Tools
Authors: Abdulaziz Omar Alsadhan, Mohd Mudasir Shafi
Abstract:
Performance testing is done to prove the reliability of any software product. A number of tools are available in the market to perform performance testing. In this paper, we present a comparative study of the three most commonly used performance testing tools. These tools cover the major share of the performance testing market and are widely used. We compared the tools on five evaluation parameters: user friendliness, portability, tool support, compatibility, and cost. The conclusion provided at the end of the paper is based on our study and does not endorse any tool or company.
Keywords: software development, software testing, quality assurance, performance testing, load runner, rational testing, silk performer
Procedia PDF Downloads 608
5647 A Machine Learning-Assisted Crime and Threat Intelligence Hunter
Authors: Mohammad Shameel, Peter K. K. Loh, James H. Ng
Abstract:
Cybercrime is a new category of crime which poses a different challenge for crime investigators and incident responders. Attackers can mask their identities using a suite of tools and with the help of the deep web, which makes them difficult to track down. Scouring the deep web manually takes time and is inefficient. There is a growing need for a tool to scour the deep web to obtain useful evidence or intel automatically. In this paper, we will explain the background and motivation behind the research, present a survey of existing research on related tools, describe the design of our own crime/threat intelligence hunting tool prototype, demonstrate its capability with some test cases and lastly, conclude with proposals for future enhancements.
Keywords: cybercrime, deep web, threat intelligence, web crawler
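The crawl loop at the core of such a tool can be sketched as a breadth-first traversal that flags pages matching keyword indicators; to stay self-contained, this illustration walks an in-memory page graph where a real tool would fetch live pages (all names and data are hypothetical):

```python
from collections import deque

# Breadth-first crawl over a hypothetical in-memory page graph:
# url -> (page text, outgoing links). A real intelligence hunter
# would fetch and parse pages over the network instead.
PAGES = {
    "a": ("welcome page", ["b", "c"]),
    "b": ("stolen credential dump for sale", ["c"]),
    "c": ("forum index", ["a"]),
}
INDICATORS = ("credential", "exploit", "dump")  # hypothetical keywords

def crawl(start, max_pages=100):
    seen, hits = set(), []
    queue = deque([start])
    while queue and len(seen) < max_pages:
        url = queue.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text, links = PAGES[url]
        if any(k in text for k in INDICATORS):
            hits.append(url)       # candidate evidence page
        queue.extend(links)        # enqueue discovered links
    return hits

print(crawl("a"))  # -> ['b']
```

In the prototype described above, the keyword match would be replaced by a machine-learning classifier, but the visited-set and queue discipline of the crawl loop stays the same.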
Procedia PDF Downloads 173
5646 Using the Semantic Web Technologies to Bring Adaptability in E-Learning Systems
Authors: Fatima Faiza Ahmed, Syed Farrukh Hussain
Abstract:
The last few decades have seen a large proportion of our population turning towards e-learning technologies, from learning tools used in primary and elementary schools to competency-based e-learning systems specifically designed for applications like finance and marketing. The huge diversity in this crowd brings about a large number of challenges for the designers of these e-learning systems, one of which is the adaptability of such systems. This paper focuses on adaptability of the learning material in an e-learning course and on how artificial intelligence and the semantic web can be used as effective tools for this purpose. The study showed that the semantic web, still a hot topic in computer science, can prove to be a powerful tool in designing and implementing adaptable e-learning systems.
Keywords: adaptable e-learning, HTMLParser, information extraction, semantic web
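The HTMLParser-based information extraction the keywords point to can be sketched as below; a minimal illustration that pulls hyperlinks out of a made-up fragment of learning material, not the authors' actual extraction pipeline:

```python
from html.parser import HTMLParser

# Minimal information extraction with the stdlib HTMLParser: collect
# the href targets of anchor tags so an adaptive e-learning system
# could reorganize course material. The markup below is hypothetical.

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the start tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

parser = LinkExtractor()
parser.feed('<p>See <a href="lesson2.html">Lesson 2</a> and '
            '<a href="quiz.html">the quiz</a>.</p>')
print(parser.links)  # -> ['lesson2.html', 'quiz.html']
```

The extracted structure could then be mapped onto semantic-web annotations (e.g. RDF triples linking lessons to concepts) to drive the adaptation the paper discusses.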
Procedia PDF Downloads 339
5645 Computational Fluid Dynamics Modeling of Flow Properties Fluctuations in Slug-Churn Flow through Pipe Elbow
Authors: Nkemjika Chinenye-Kanu, Mamdud Hossain, Ghazi Droubi
Abstract:
Prediction of multiphase flow induced forces, void fraction, and pressure is crucial at both the design and operating stages of practical energy and process pipe systems. In this study, transient numerical simulations of upward slug-churn flow through a vertical 90-degree elbow have been conducted. The volume of fluid (VOF) method was used to model the two-phase flow, while the k-epsilon Reynolds-Averaged Navier-Stokes (RANS) equations were used to model turbulence. The simulation results were validated against experimental results. The void fraction signal, peak frequency, and maximum magnitude of void fraction fluctuation of the slug-churn flow validation case studies compared well with the experimental results. The x- and y-direction force fluctuation signals at the elbow control volume were obtained by carrying out force balance calculations using the time-domain signals of flow properties extracted directly through the control volume in the numerical simulation. The computed force signal compared well with experiments for the slug and churn flow validation case studies. Hence, the present numerical simulation technique was able to predict the behaviour of the one-way flow induced forces and void fraction fluctuations.
Keywords: computational fluid dynamics, flow induced vibration, slug-churn flow, void fraction and force fluctuation
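As background to the force balance, the steady single-phase momentum balance on a 90-degree elbow can be sketched as below; a deliberately simplified illustration with hypothetical values, not the transient two-phase calculation of the study:

```python
import math

# Steady, single-phase momentum balance for the reaction force on a
# 90-degree elbow turning flow from +x to +y, equal inlet/outlet areas
# and gauge pressures assumed. All numbers are hypothetical.

def elbow_reaction_force(rho, velocity, diameter, pressure):
    """Return (Fx, Fy) force components on the elbow, in newtons."""
    area = math.pi * diameter ** 2 / 4.0
    m_dot = rho * area * velocity          # mass flow rate, kg/s
    # Each leg contributes its pressure force plus momentum flux.
    fx = pressure * area + m_dot * velocity
    fy = pressure * area + m_dot * velocity
    return fx, fy

fx, fy = elbow_reaction_force(rho=998.0, velocity=2.0, diameter=0.05,
                              pressure=2.0e5)
print(round(math.hypot(fx, fy), 1))  # resultant force magnitude, N
```

In the transient two-phase case of the study, the density and velocity entering this balance fluctuate with the passing slugs, which is what produces the force fluctuation signals the simulations extract.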
Procedia PDF Downloads 156
5644 Medical Imaging Fusion: A Teaching-Learning Simulation Environment
Authors: Cristina Maria Ribeiro Martins Pereira Caridade, Ana Rita Ferreira Morais
Abstract:
The use of computational tools has become essential in the context of interactive learning, especially in engineering education. In the medical industry, teaching medical image processing techniques is a crucial part of training biomedical engineers, as it has integrated applications in healthcare facilities and hospitals. The aim of this article is to present a teaching-learning simulation tool, developed in MATLAB using a graphical user interface, for medical image fusion that explores different image fusion methodologies and processes in combination with image pre-processing techniques. The application applies different algorithms and medical fusion techniques in real time, allowing users to view original and fusion images, compare processed and original images, adjust parameters, and save images. The proposed tool offers an innovative teaching and learning environment: a dynamic and motivating simulation for biomedical engineering students to acquire knowledge about medical image fusion techniques and the skills necessary for the training of biomedical engineers. In conclusion, the developed simulation tool provides real-time visualization of the original and fusion images and the possibility to test, evaluate, and extend the student's knowledge about the fusion of medical images. It also facilitates the exploration of medical imaging applications, specifically image fusion, which is critical in the medical industry. Teachers and students can make adjustments and/or create new functions, making the simulation environment adaptable to new techniques and methodologies.
Keywords: image fusion, image processing, teaching-learning simulation tool, biomedical engineering education
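The simplest fusion rule such a teaching tool can demonstrate is pixel-wise weighted averaging of two co-registered images; a minimal sketch with hypothetical 2x2 grayscale arrays standing in for medical image slices (the actual tool is MATLAB-based; Python is used here only for illustration):

```python
# Pixel-wise weighted-average fusion of two co-registered grayscale
# images, represented as nested lists. All intensities are hypothetical.

def fuse_average(img_a, img_b, alpha=0.5):
    """Fused pixel = alpha*A + (1-alpha)*B, element by element."""
    return [
        [alpha * a + (1.0 - alpha) * b for a, b in zip(row_a, row_b)]
        for row_a, row_b in zip(img_a, img_b)
    ]

ct  = [[100, 120], [140, 160]]   # hypothetical CT intensities
mri = [[ 80, 100], [120, 140]]   # hypothetical MRI intensities
print(fuse_average(ct, mri))     # -> [[90.0, 110.0], [130.0, 150.0]]
```

Adjusting `alpha` is exactly the kind of parameter a student would vary interactively in the simulation environment, before moving on to more sophisticated fusion rules such as maximum-selection or multiresolution methods.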
Procedia PDF Downloads 132
5643 Towards a Sustainable Energy Future: Method Used in Existing Buildings to Implement Sustainable Energy Technologies
Authors: Georgi Vendramin, Aurea Lúcia, Yamamoto, Carlos Itsuo, Souza Melegari, N. Samuel
Abstract:
This article describes the development of a model that uses a method where openings are represented by single glazing and double glazing. The model is based on heat balance equations combining purely theoretical terms and empirical data. Simplified equations are derived through a synthesis of the measured data obtained from meteorological stations. The implementation of the model in an integrated building design tool is discussed in this article, to better address the requirements of comfort and energy efficiency in architecture and engineering. Sustainability, energy efficiency, and the integration of alternative energy systems and concepts are beginning to be incorporated into designs for new buildings and renovations to existing buildings. Few means have existed to effectively validate the potential performance benefits of the design concepts. A degree-days method was used for the assessment of the energy performance of a building; the assessment showed that the architectural design should always consider the materials used and the size of the openings. The energy performance was obtained through the model, considering the location of the Central Park Shopping Mall building in the city of Cascavel - PR. Climatic data for this location were obtained and, in a second step, the coefficient of total heat loss of the pre-established building was obtained, thus allowing the evaluation of thermal comfort and energy performance. This means that openings in buildings in Cascavel - PR installed on the east side may be larger, because the glass added to the geometry of the architectural spaces will help the environment conserve energy.
Keywords: sustainable design, energy modeling, design validation, degree-days methods
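The degree-days method referred to above can be sketched as follows: daily heating degree-days accumulate how far the mean outdoor temperature falls below a base temperature and, combined with the building's total heat-loss coefficient, scale the heating energy demand. All values below are hypothetical, not the Cascavel - PR data:

```python
# Degree-days estimate of heating energy demand. The base temperature,
# daily means and heat-loss coefficient are hypothetical illustration
# values, not the Cascavel - PR climate data or building.

def heating_degree_days(daily_mean_temps, base_temp=18.0):
    """Sum of max(0, base - T) over the period, in degC-days."""
    return sum(max(0.0, base_temp - t) for t in daily_mean_temps)

def heating_energy(ua, degree_days):
    """Energy demand in kWh from total heat-loss coefficient UA (W/K):
    UA * degree_days * 24 h/day, converted from Wh to kWh."""
    return ua * degree_days * 24.0 / 1000.0

temps = [12.0, 15.0, 20.0, 16.0]   # hypothetical daily means, degC
hdd = heating_degree_days(temps)   # (6 + 3 + 0 + 2) = 11.0 degC-days
print(heating_energy(ua=500.0, degree_days=hdd))  # -> 132.0 (kWh)
```

In the model described above, the glazing choice and opening sizes enter through the total heat-loss coefficient, so the same degree-day total yields different energy demands for different designs.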
Procedia PDF Downloads 419
5642 Dow Polyols near Infrared Chemometric Model Reduction Based on Clustering: Reducing Thirty Global Hydroxyl Number (OH) Models to Less Than Five
Authors: Wendy Flory, Kazi Czarnecki, Matthijs Mercy, Mark Joswiak, Mary Beth Seasholtz
Abstract:
Polyurethane materials are present in a wide range of industrial segments such as furniture, building and construction, composites, automotive, electronics, and more. Dow is one of the leaders in the manufacture of the two main raw materials, isocyanates and polyols, used to produce polyurethane products. Dow is also a key player in the manufacture of polyurethane systems/formulations designed for targeted applications. In 1990, the first analytical chemometric models were developed and deployed for use in the Dow QC labs of the polyols business for the quantification of OH, water, cloud point, and viscosity. Over the years many models have been added; there are now over 140 models for quantification and hundreds for product identification, too many to support reasonably. There are 29 global models alone for the quantification of OH across > 70 products at many sites. An attempt was made to consolidate these into a single model. While the consolidated model showed good statistics across the entire range of OH, several products showed a bias by ASTM E1655 during individual product validation. This project summary will show the strategy for global model updates for OH, to reduce the number of models for quantification from over 140 to 5 or fewer using chemometric methods. In order to gain an understanding of the best product groupings, we identify clusters by reducing spectra to a few dimensions via Principal Component Analysis (PCA) and Uniform Manifold Approximation and Projection (UMAP). Results from these cluster analyses and a separate validation set allowed Dow to reduce the number of models for predicting OH from 29 to 3 without loss of accuracy.
Keywords: hydroxyl, global model, model maintenance, near infrared, polyol
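Once spectra are reduced to a few dimensions, the grouping step can be illustrated with a plain k-means pass over the low-dimensional scores; a minimal sketch with hypothetical 2-D points, not the Dow polyol spectra:

```python
# Minimal k-means over 2-D score vectors (e.g. the first two PCA or
# UMAP components of each spectrum). Points and starting centers are
# hypothetical; each resulting cluster would get one calibration model.

def kmeans(points, centers, iterations=10):
    labels = [0] * len(points)
    for _ in range(iterations):
        # Assign each point to its nearest center (squared distance).
        labels = [
            min(range(len(centers)),
                key=lambda c: sum((p - q) ** 2
                                  for p, q in zip(pt, centers[c])))
            for pt in points
        ]
        # Recompute each center as the mean of its assigned points.
        for c in range(len(centers)):
            members = [pt for pt, lb in zip(points, labels) if lb == c]
            if members:
                centers[c] = [sum(col) / len(members)
                              for col in zip(*members)]
    return labels

scores = [[0.1, 0.2], [0.0, 0.1], [5.0, 5.1], [5.2, 4.9]]
print(kmeans(scores, centers=[[0.0, 0.0], [5.0, 5.0]]))  # -> [0, 0, 1, 1]
```

In practice a library implementation with better initialization would be used; the point here is only the grouping logic that maps many products onto a handful of shared models.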
Procedia PDF Downloads 135
5641 Influences of Plunge Speed on Axial Force and Temperature of Friction Stir Spot Welding in Thin Aluminum A1100
Authors: Suwarsono, Ario S. Baskoro, Gandjar Kiswanto, Budiono
Abstract:
Friction Stir Welding (FSW) is a relatively new technique for joining metals. In some cases of aluminum joining, FSW gives better results compared with arc welding processes, including better weld quality and less distortion. An FSW process for light structures and thin materials requires forces as small as possible, to avoid structure deflection. The joining process in FSW occurs because of temperature and compressive forces, the temperature generation being caused by material deformation and friction between the tool and the material. In this research, high-speed spindle rotation was expected to reduce the force required for deformation. The welding material was aluminum A1100, with a thickness of 0.4 mm. The tool was made of HSS material and shaped by a micro-grinding process. The tool shoulder diameter was 4 mm, and the pin length was 0.6 mm (pin diameter = 1.5 mm). The parameter that was varied was the plunge speed (2 mm/min, 3 mm/min, 4 mm/min). The tool speed was fixed at 33,000 rpm. The analyzed responses of the FSSW parameters were axial force (Z-force), temperature, and weld shear strength. From the optimum µFSSW parameters found, it can be concluded that the most important parameter in the µFSSW process was plunge speed. The lowest plunge speed (2 mm/min) caused the lowest axial force (110.40 N). Increasing the plunge speed increased the axial force (maximum Z-force = 236.03 N) and decreased the weld shear strength.
Keywords: friction stir spot welding, aluminum A1100, plunge speed, axial force, shear strength
Procedia PDF Downloads 310
5640 Literature as a Tool for Sustenance of Human Dignity in the 21st Century
Authors: Arubi Thompson Abari
Abstract:
Globally, a writer is absolutely necessary to society, for he mirrors and projects the society, and grumbles and protests against the ills that hinder its development. A writer is committed to the language and the socio-cultural, political, and economic factors that determine the sustenance of human dignity in the society in this 21st century. The literary artist holds literature as a tool for the restoration and sustenance of human dignity. In Nigeria, literature is politically committed because colonialism gave birth to modern Nigerian literature. Literature thus was regarded as one of the greatest weapons against colonialism in Nigeria, and Nigerian literature is aimed at the restoration and sustenance of the dignity of Nigerians in the 21st century. A literary writer is a member of the society, and his sensibility is conditioned by the socio-political situations around him. A writer cannot be excused from the task of regeneration and of restoring the lost glorious days of the past, a task that must be done. This academic paper therefore showcases the efficacy of literature in bringing about the sustenance of human dignity in the 21st century. Consequently, the paper in its introduction clarifies some vital concepts. It discusses the forms of literature, portrays the ability and capability of literature as a tool for the sustenance of human dignity globally, and makes useful recommendations for the growth of knowledge in the 21st century and beyond.
Keywords: literature, sustenance, human dignity, 21st century
Procedia PDF Downloads 91