Search results for: process modeling advancements
15211 Measurement and Analysis of Human Hand Kinematics
Authors: Tamara Grujic, Mirjana Bonkovic
Abstract:
Measurements and quantitative analysis of kinematic parameters of human hand movements play an important role in different areas such as hand function rehabilitation, modeling of multi-digit robotic hands, and the development of man-machine interfaces. In this paper, the assessment and evaluation of the reach-to-grasp movement using a computerized and robot-assisted method are described. The experiment involved measurements of the hand positions of seven healthy subjects while grasping three objects of different shapes and sizes. Results showed that three dominant phases of the reach-to-grasp movement could be clearly identified.
Keywords: human hand, kinematics, measurement and analysis, reach-to-grasp movement
Procedia PDF Downloads 464
15210 Design of 3D Bioprinted Scaffolds for Cartilage Regeneration
Authors: Gloria Pinilla, Jose Manuel Baena, Patricia Gálvez-Martín, Juan Antonio Marchad
Abstract:
Cartilage is a dense connective tissue with limited self-repair properties. Currently, the therapeutic use of autologous or allogenic chondrocytes constitutes an alternative to pharmacological treatment. The design of a bioprinted 3D cartilage with chondrocytes and biodegradable biomaterials offers a new therapeutic alternative capable of bridging the limitations of current therapies in the field. We have developed an enhanced printing process, Injection Volume Filling (IVF), to increase the viability and survival of the cells when working with high-temperature thermoplastics, without limitations on the geometry of the scaffold in contact with the cells. We have demonstrated the viability of the printing process using chondrocytes for cartilage regeneration. This development will accelerate the clinical uptake of the technology and overcome the current limitations of using thermoplastics as scaffolds. An alginate-based hydrogel combined with human chondrocytes (isolated from osteoarthritis patients) was formulated as bioink-A, and polylactic acid as bioink-B. The bioprinting process was carried out with the REGEMAT V1 bioprinter (Regemat 3D, Granada, Spain) using IVF. The printing capacity of the bioprinter, together with the viability and proliferation of the bioprinted chondrocytes, was evaluated after five weeks by confocal microscopy and the Alamar Blue assay (Biorad). Results showed that the IVF process does not decrease the viability of the chondrocytes during printing, as the cells do not come into contact with the thermoplastic at elevated temperatures. The viability and cellular proliferation of the bioprinted artificial 3D cartilage increased after 5 weeks. In conclusion, this study demonstrates the potential use of the Regemat V1 for 3D bioprinting of cartilage and the viability of the bioprinted chondrocytes in the scaffolds for application in regenerative medicine.
Keywords: cartilage regeneration, bioprinting, bioink, scaffold, chondrocyte
Procedia PDF Downloads 313
15209 Handloom Weaving Quality and Fashion Development Process for Traditional Costumes in the Contemporary Global Fashion Market in Ethiopia
Authors: Adiyam Amare
Abstract:
This research explores the handloom weaving quality and fashion development process for traditional Ethiopian costumes, particularly focusing on the challenges and opportunities within the contemporary global fashion market. Through a qualitative approach, including interviews and direct observations, the study identifies key factors affecting the handloom industry, such as quality improvement, market integration, and cultural preservation. The findings suggest that enhancing production quality, modernizing techniques, and fostering global market participation can significantly improve the competitiveness of Ethiopian traditional garments in the global fashion industry.
Keywords: fashion, culture, design, textile
Procedia PDF Downloads 25
15208 Development of PVA/polypyrrole Scaffolds by Supercritical CO₂ for Its Application in Biomedicine
Authors: Antonio Montes, Antonio Cozar, Clara Pereyra, Diego Valor, Enrique Martinez de la Ossa
Abstract:
Tissues and organs can be damaged because of trauma, congenital illnesses, or cancer, and traditional therapeutic alternatives, such as surgery, usually cannot completely repair the damaged tissues. Tissue engineering allows the regeneration of the patient's tissues, reducing the problems caused by the traditional methods. Scaffolds, polymeric structures with interconnected porosity, can promote the proliferation and adhesion of the patient's cells in the damaged area. Furthermore, by means of impregnation of the scaffold with beneficial active substances, tissue regeneration can be induced through a drug delivery process. The objective of the work is the fabrication of a PVA scaffold coated with gallic acid and polypyrrole through a one-step foaming and impregnation process using the SSI technique (Supercritical Solvent Impregnation). In this technique, supercritical CO₂ penetrates into the polymer chains, producing plasticization of the polymer. In the depressurization step, CO₂ cell nucleation and growth take place, leading to an interconnected porous structure in the polymer. The foaming process using supercritical CO₂ as solvent and expansion agent presents advantages compared to traditional scaffold fabrication methods, such as the polymer's high solubility in the solvent and the possibility of carrying out the process at a low temperature, avoiding the inactivation of the active substance. In this sense, supercritical CO₂ avoids the use of organic solvents and reduces the solvent residues in the final product. Moreover, this process does not require long processing times that could cause stratification of the substance inside the scaffold, reducing the therapeutic efficiency of the formulation. An experimental design has been carried out to optimize the operating conditions of the SSI technique, as well as a study of the morphological characteristics of the scaffold relevant to its use in tissue engineering, such as porosity, conductivity and the release profiles of the active substance. It has been proved that the obtained scaffolds are partially porous and electrically conductive and are able to release gallic acid in the long term.
Keywords: scaffold, foaming, supercritical, PVA, polypyrrole, gallic acid
Procedia PDF Downloads 182
15207 The Importance of Including All Data in a Linear Model for the Analysis of RNAseq Data
Authors: Roxane A. Legaie, Kjiana E. Schwab, Caroline E. Gargett
Abstract:
Studies looking at changes in gene expression from RNAseq data often make use of linear models. It is also common practice to focus on a subset of the data for a comparison of interest, leaving aside the samples not involved in this particular comparison. This work shows the importance of including all observations in the modeling process to better estimate variance parameters, even when the samples included are not directly used in the comparison under test. The human endometrium is a dynamic tissue, which undergoes cycles of growth and regression with each menstrual cycle. The mesenchymal stem cells (MSCs) present in the endometrium are likely responsible for this remarkable regenerative capacity. However, recent studies suggest that MSCs also play a role in the pathogenesis of endometriosis, one of the most common medical conditions affecting the lower abdomen in women, in which the endometrial tissue grows outside the womb. In this study, we compared gene expression profiles between MSCs and non-stem cell counterparts ('non-MSC') obtained from women with ('E') or without ('noE') endometriosis using RNAseq. Raw read counts were used for differential expression analysis using a linear model with the limma-voom R package, including either all samples in the study or only the samples belonging to the subset of interest (e.g., for the comparison 'E vs noE in MSC cells', including only MSC samples from E and noE patients but not the non-MSC ones). Using the full dataset, we identified about 100 differentially expressed (DE) genes between E and noE samples in MSC samples (adj. p-val < 0.05 and |logFC| > 1), while only 9 DE genes were identified when using only the subset of data (MSC samples only). Important genes known to be involved in endometriosis, such as KLF9 and RND3, were missed in the latter case. When looking at the MSC vs non-MSC comparison, the linear model including all samples identified 260 genes for noE samples (including the stem cell marker SUSD2), while the subset analysis did not identify any DE genes. When looking at E samples, 12 genes were identified with the first approach and only 1 with the subset approach. Although the stem cell marker RGS5 was found in both cases, the subset test missed important genes involved in stem cell differentiation, such as NOTCH3, and other potentially related genes to be used for further investigation and pathway analysis.
Keywords: differential expression, endometriosis, linear model, RNAseq
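The variance-pooling effect described above can be illustrated outside the RNAseq setting with an ordinary linear model. The sketch below (Python with statsmodels, standing in for the limma-voom R workflow; the group sizes and effect sizes are invented) fits one gene's expression either on all four sample groups or only on the two groups under comparison, and shows that the full model estimates the residual variance from more degrees of freedom.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
# Hypothetical log-expression for one gene in four groups
# (MSC/non-MSC crossed with endometriosis status E/noE).
groups = ["MSC_E", "MSC_noE", "nonMSC_E", "nonMSC_noE"]
df = pd.DataFrame({
    "group": np.repeat(groups, 6),
    "expr": np.concatenate([rng.normal(m, 1.0, 6) for m in (5.0, 4.0, 6.0, 6.2)]),
})

# Full model: all samples contribute to the residual variance estimate.
full = smf.ols("expr ~ C(group)", data=df).fit()
# Contrast of interest: E vs noE within MSC samples.
print(full.t_test("C(group)[T.MSC_noE] = 0"))

# Subset model: only the two groups under comparison.
sub = smf.ols("expr ~ C(group)", data=df[df.group.str.startswith("MSC")]).fit()
print(sub.t_test("C(group)[T.MSC_noE] = 0"))

print("residual df, full vs subset:", full.df_resid, sub.df_resid)
```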
Procedia PDF Downloads 432
15206 Multi-Criteria Inventory Classification Process Based on Logical Analysis of Data
Authors: Diana López-Soto, Soumaya Yacout, Francisco Ángel-Bello
Abstract:
Although inventories are considered stocks of money sitting on shelves, they are needed in order to secure constant and continuous production. Therefore, companies need to have control over the amount of inventory in order to find the balance between excess and shortage of inventory. The classification of items according to certain criteria, such as the price, the usage rate and the lead time before arrival, allows any company to concentrate its investment in inventory according to a certain ranking or priority of items. This makes the decision-making process for inventory management easier and more justifiable. The purpose of this paper is to present a new approach for the classification of new items based on already existing criteria. This approach is called Logical Analysis of Data (LAD). It is used in this paper to assist the process of ABC item classification based on multiple criteria. LAD is a data mining technique based on Boolean theory that is used for pattern recognition. This technique has been tested in medicine, industry, credit risk analysis, and engineering with remarkable results. An application to ABC inventory classification is presented for the first time, and the results are compared with those obtained when using the well-known AHP and ANN techniques. The results show that LAD presented very good classification accuracy.
Keywords: ABC multi-criteria inventory classification, inventory management, multi-class LAD model, multi-criteria classification
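As a point of reference for what a multi-criteria ABC classification computes, the sketch below (Python) ranks items by a weighted score over price, usage rate, and lead time and maps the ranking to A/B/C classes. The item data, criterion weights, and class cut-offs are all invented, and this simple scoring baseline is not the Boolean pattern-based LAD model described in the paper.

```python
import numpy as np

# Hypothetical items: columns are unit price, annual usage rate, lead time (days).
items = {
    "P1": (120.0, 900, 14),
    "P2": (15.0, 5000, 3),
    "P3": (480.0, 120, 30),
    "P4": (60.0, 2500, 7),
    "P5": (8.0, 800, 2),
}
weights = np.array([0.5, 0.3, 0.2])  # assumed criterion weights

data = np.array(list(items.values()))
# Min-max normalise each criterion so they are comparable.
norm = (data - data.min(0)) / (data.max(0) - data.min(0))
scores = norm @ weights

# Rank items by score; top 20% -> A, next 30% -> B, rest -> C (assumed cut-offs).
order = np.argsort(scores)[::-1]
names = list(items)
for rank, idx in enumerate(order):
    frac = (rank + 1) / len(order)
    cls = "A" if frac <= 0.2 else "B" if frac <= 0.5 else "C"
    print(f"{names[idx]}: score={scores[idx]:.2f} class={cls}")
```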
Procedia PDF Downloads 881
15205 Heart Murmurs and Heart Sounds Extraction Using an Algorithm Process Separation
Authors: Fatima Mokeddem
Abstract:
The phonocardiogram (PCG) signal is a physiological signal that reflects the mechanical activity of the heart and is a promising tool for researchers in this field because it is full of indications and useful information for medical diagnosis. PCG segmentation is a basic step in benefiting from this signal. Therefore, this paper presents an algorithm that serves to separate heart sounds and heart murmurs, in case the latter exist, in order to use them in several applications and in heart sound analysis. The separation process presented here is founded on three essential steps: filtering, envelope detection, and heart sound segmentation. The algorithm separates the PCG signal into S1 and S2 and extracts cardiac murmurs.
Keywords: phonocardiogram signal, filtering, envelope detection, murmurs, heart sounds
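A minimal sketch of that three-step pipeline is shown below (Python with NumPy/SciPy): band-pass filtering, Hilbert-based envelope detection, and thresholding of the envelope to segment candidate heart-sound events. The synthetic signal, band edges, and threshold are assumptions for illustration, not the authors' parameter choices.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 2000  # Hz, assumed sampling rate
t = np.arange(0, 3.0, 1 / fs)
# Synthetic PCG stand-in: two bursts per "beat" mimicking S1 and S2.
pcg = sum(np.exp(-((t - t0) ** 2) / 2e-4) * np.sin(2 * np.pi * 60 * t)
          for t0 in (0.2, 0.5, 1.0, 1.3, 1.8, 2.1))
pcg += 0.02 * np.random.default_rng(0).standard_normal(t.size)

# 1) Band-pass filter (25-400 Hz keeps most heart-sound energy).
b, a = butter(4, [25, 400], btype="band", fs=fs)
filtered = filtfilt(b, a, pcg)

# 2) Envelope via the analytic signal (Hilbert transform).
envelope = np.abs(hilbert(filtered))

# 3) Segment: contiguous runs where the envelope exceeds a threshold.
mask = envelope > 0.3 * envelope.max()
edges = np.flatnonzero(np.diff(mask.astype(int)))
events = edges.reshape(-1, 2) if edges.size % 2 == 0 else edges[:-1].reshape(-1, 2)
for start, stop in events:
    print(f"event: {start / fs:.3f}-{stop / fs:.3f} s")
```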
Procedia PDF Downloads 141
15204 Implementing a Database from a Requirement Specification
Abstract:
Creating a database schema is essentially a manual process. From a requirement specification, the information contained within has to be analyzed and reduced into a set of tables, attributes and relationships. This is a time-consuming process that has to go through several stages before an acceptable database schema is achieved. The purpose of this paper is to implement a Natural Language Processing (NLP) based tool to produce a database schema from a requirement specification. Stanford CoreNLP version 3.3.1 and the Java programming language were used to implement the proposed model. The outcome of this study indicates that the first draft of a relational database schema can be extracted from a requirement specification by using NLP tools and techniques with minimum user intervention. Therefore, this method is a step forward in finding a solution that requires little or no user intervention.
Keywords: information extraction, natural language processing, relation extraction
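The paper's pipeline uses Stanford CoreNLP in Java; purely as an illustration of the idea, the sketch below (plain Python, no NLP library, with an invented toy grammar and sample sentences) extracts candidate tables from the subjects and objects of simple sentences and candidate relationships from their verbs.

```python
import re

spec = ("A customer places one or more orders. "
        "An order contains several products.")

def singular(word):
    # Naive singularisation, good enough for the toy example.
    return word[:-1] if word.endswith("s") else word

tables, relations = set(), []
for sentence in re.split(r"\.\s*", spec.strip(". ")):
    words = sentence.lower().split()
    # Toy pattern: article, subject, verb, ..., object. A real system would
    # use a parser (the paper uses Stanford CoreNLP); this stands in for it.
    if len(words) >= 3 and words[0] in ("a", "an", "the"):
        subj, verb, obj = words[1], words[2], singular(words[-1])
        tables.update({subj, obj})
        relations.append((subj, verb, obj))

print("candidate tables:", sorted(tables))
print("candidate relationships:", relations)
```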
Procedia PDF Downloads 261
15203 Fabrication of Nanostructured Arrays Using Si-Containing Block Copolymer and Dually Responsive Photoresist
Authors: Kyoungok Jung, Chang Hong Bak, Gyeong Cheon Jo, Jin-Baek Kim
Abstract:
Nanostructured arrays have drawn extensive attention because of their unique properties resulting from nanoscale features. However, it is difficult to achieve uniform and freestanding 1D nanostructures over a large area. Here, a simple and novel method was developed for the fabrication of universal nanoporous templates for high-density nanostructure arrays by combining the self-assembly of a Si-containing block copolymer with a bilayer lithography system. We introduced a dually responsive photoresist bottom layer into which the nanopatterns of the block copolymer are transferred by oxygen reactive ion etching. Because the dually responsive layer becomes cross-linked by heating, it can be used as a hard template during the etching process. It becomes soluble again through chain scission upon exposure to light and can therefore be easily removed by a lift-off process. The template was applicable to various conducting substrates due to the compatibility of the photoresist with a wide range of substrates, and it was used in electrodeposition for well-aligned, high-density inorganic and organic nanoarrays. We successfully obtained vertically aligned and highly ordered gold nanorods and polypyrrole dots on the substrate without aggregation, and these arrays did not collapse after removal of the dually responsive templates by the simple lift-off process.
Keywords: block copolymer, dually responsive, nanostructure, photoresist
Procedia PDF Downloads 257
15202 Explosion Mechanics of Aluminum Plates Subjected to the Combined Effect of Blast Wave and Fragment Impact Loading: A Multicase Computational Modeling Study
Authors: Atoui Oussama, Maazoun Azer, Belkassem Bachir, Pyl Lincy, Lecompte David
Abstract:
For many decades, researchers have focused on understanding the dynamic behavior of different structures and materials subjected to fragment impact or blast loads separately. Explosion mechanics and impact physics studies dealing with the numerical modeling of the response of protective structures under the synergistic effect of a blast wave and the impact of fragments are quite limited in the literature. This article numerically evaluates the nonlinear dynamic behavior and damage mechanisms of EN AW-1050A-H24 aluminum plates under different combined loading scenarios, varied by the sequence of the applied loads, using the commercial software LS-DYNA. On the one hand, with respect to the terminal ballistics investigations, a Lagrangian (LAG) formulation is used to evaluate the different failure modes of the target material in the case of a fragment impact. On the other hand, with respect to the blast analysis, an Arbitrary Lagrangian-Eulerian (ALE) formulation is considered to study the fluid-structure interaction (FSI) of the shock wave and the plate in the case of blast loading. Four different loading scenarios are considered: (1) blast loading only, (2) fragment impact only, (3) blast loading followed by a fragment impact, and (4) a fragment impact followed by blast loading. From the numerical results, it was observed that when the impact load is applied to the plate prior to the blast load, the plate suffers more severe damage due to the hole enlargement phenomenon and the effects of crack propagation on the circumference of the damaged zone. Moreover, it was found that the hole from the fragment impact loading was enlarged to about three times the diameter of the projectile. The validation of the proposed computational model is based partly on previous experimental data obtained by the authors and partly on experimental data obtained from the literature. A good correspondence between the numerical and experimental results is found.
Keywords: computational analysis, combined loading, explosion mechanics, hole enlargement phenomenon, impact physics, synergistic effect, terminal ballistic
Procedia PDF Downloads 184
15201 Heavy Metal Contamination in Ship Breaking Yard: A Case Study in Bangladesh
Authors: Mohammad Mosaddik Rahman
Abstract:
This study embarks on an exploratory journey to assess the pervasive issue of heavy metal contamination in the water bodies along the Chittagong Coast, Bangladesh. Situated along the mesmerizing Bay of Bengal, a region known for its potential as an emerging tourist haven and economic zone, the ship breaking yard confronts significant environmental hurdles. The core of these challenges lies in contamination from heavy metals such as lead, cadmium, chromium, and mercury, which detrimentally impact both the ecological integrity and the public health of the region. This contamination primarily stems from industrial activities, particularly those involving metallurgical and chemical processes, which release these metals into the environment, leading to their accumulation in soil and water bodies. The study's primary aim is to conduct a thorough assessment of heavy metal pollution levels, alongside an analysis of nutrient variations, focusing on nitrates and nitrites. Methodologically, the study leverages systematic sampling and advanced analytical tools like the Hach 3900 spectrophotometer to ensure precise and reliable data collection. The implications of the heavy metal presence are multifaceted, affecting microbial and aquatic life and posing severe health risks to the local population, including respiratory problems, neurological disorders, and an increased risk of cancer. The results of this study highlight the urgent need for effective mitigation strategies and regulatory measures to address this critical issue. By providing a comprehensive understanding of the environmental and public health implications of heavy metal contamination along the Chittagong Coast, this research endeavours to serve as a catalyst for change, emphasising the need for pollution control and advancements in water management policies. It is envisioned that the outcomes of this study will guide stakeholders in collaborating to develop and implement sustainable solutions, ultimately safeguarding the region's environment and public health.
Keywords: heavy metal, environmental health, pollution control policies, shipbreaking yard
Procedia PDF Downloads 56
15200 Modeling and Prediction of Zinc Extraction Efficiency from Concentrate by Operating Condition and Using Artificial Neural Networks
Authors: S. Mousavian, D. Ashouri, F. Mousavian, V. Nikkhah Rashidabad, N. Ghazinia
Abstract:
The pH, temperature, and extraction time of each stage, the agitation speed, and the delay time between stages affect the efficiency of zinc extraction from concentrate. In this research, the efficiency of zinc extraction was predicted as a function of the mentioned variables by artificial neural networks (ANN). ANNs with different layers were employed, and the results show that the network with 8 neurons in the hidden layer has good agreement with the experimental data.
Keywords: zinc extraction, efficiency, neural networks, operating condition
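A minimal sketch of such a model is shown below (Python with scikit-learn): a feed-forward network with a single 8-neuron hidden layer regressing extraction efficiency on the five operating variables. The training data are randomly generated stand-ins, not the paper's measurements.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
# Columns: pH, temperature (C), extraction time (min), agitation speed (rpm),
# delay between stages (min): synthetic placeholders for the real data.
X = rng.uniform([1.0, 20, 10, 100, 0], [5.0, 90, 120, 600, 30], size=(200, 5))
# Invented smooth response standing in for measured extraction efficiency (%).
y = (60 + 5 * X[:, 0] - 0.005 * (X[:, 1] - 70) ** 2 + 0.1 * X[:, 2]
     + 0.01 * X[:, 3] - 0.2 * X[:, 4] + rng.normal(0, 1.0, 200))

Xs = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(8,), activation="tanh",
                     max_iter=5000, random_state=0)
model.fit(Xs, y)
print("R^2 on training data:", round(model.score(Xs, y), 3))
```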
Procedia PDF Downloads 545
15199 Integrating Nursing Informatics to Improve Patient-Centered Care: A Project to Reduce Patient Waiting Time at the Blood Pressure Counter
Authors: Pi-Chi Wu, Tsui-Ping Chu, Hsiu-Hung Wang
Abstract:
Background: The ability to provide immediate medical service in outpatient departments is one of the keys to patient satisfaction. Objectives: This project used electronic equipment to integrate nursing care information into patient care at a blood pressure diagnostic counter. Through process reengineering, the average patient waiting time decreased from 35 minutes to 5 minutes, while service satisfaction increased from a score of 2.7 to 4.6. Methods: Data were collected from a local hospital in Southern Taiwan with a daily average of 2,200 patients in the outpatient department. Previous waiting times were affected by (1) space limitations, (2) the need to help guide patient mobility, (3) the need for nurses to appease irate patients and give instructions, (4) the need for patients to replace lost counter tickets, (5) the need to re-enter information, and (6) the replacement of missing patient information. An ad hoc group was established to enhance patient satisfaction and shorten the waiting times for patients to see a doctor. A four-step strategy consisting of (1) counter relocation, (2) queue reorganization, (3) electronic information integration, and (4) process reengineering was implemented. Results: Implementation of the developed strategy decreased patient waiting time from 35 minutes to an average of 5 minutes and increased patient satisfaction scores from 2.7 to 6.4. Conclusion: Through the integration of information technology and process transformation, waiting times were drastically reduced, patient satisfaction increased, and nurses were allowed more time to engage in more cost-effective services. This strategy was simultaneously enacted in separate hospitals throughout Taiwan.
Keywords: process reengineering, electronic information integration, patient satisfaction, patient waiting time
Procedia PDF Downloads 378
15198 Sensor Validation Using Bottleneck Neural Network and Variable Reconstruction
Authors: Somia Bouzid, Messaoud Ramdani
Abstract:
The success of any diagnosis strategy critically depends on the sensors measuring process variables. This paper presents a sensor fault detection and diagnosis method based on a Bottleneck Neural Network (BNN). The BNN approach is used as a statistical process control tool for drinking water distribution (DWD) systems to detect and isolate sensor faults. The variable reconstruction approach is very useful for sensor fault isolation; the method is validated in simulation on a nonlinear system, an actual drinking water distribution system. Several results are presented.
Keywords: fault detection, localization, PCA, NLPCA, auto-associative neural network
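The sketch below (Python with scikit-learn; the synthetic two-variable process data, the network sizes, and the 3-sigma limit are assumptions) illustrates the bottleneck idea: an auto-associative network is trained to reconstruct its own inputs through a narrow hidden layer, and a sensor fault is flagged when the reconstruction residual of a new sample exceeds a limit learned from fault-free data.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
# Fault-free training data: two correlated "process variables"
# (e.g., flow and pressure in a water distribution network).
t = rng.uniform(0, 1, 500)
X = np.column_stack([t, 0.8 * t + 0.1]) + rng.normal(0, 0.01, (500, 2))

# Auto-associative network with a 1-neuron bottleneck: inputs == targets.
bnn = MLPRegressor(hidden_layer_sizes=(8, 1, 8), activation="tanh",
                   max_iter=5000, random_state=0)
bnn.fit(X, X)

resid = np.linalg.norm(X - bnn.predict(X), axis=1)
limit = resid.mean() + 3 * resid.std()  # simple 3-sigma control limit

# A faulty sample: sensor 2 is biased (stuck/offset fault).
x_new = np.array([[0.5, 0.9]])
r = np.linalg.norm(x_new - bnn.predict(x_new))
print(f"residual={r:.3f}, limit={limit:.3f}, fault={r > limit}")
```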
Procedia PDF Downloads 389
15197 High Heating Value Bio-Chars from a Bio-Oil Upgrading Process
Authors: Julius K. Gane, Mohamad N. Nahil, Paul T. Williams
Abstract:
In today's world of rapid population growth and a changing climate, one way to mitigate various negative effects is via renewable energy solutions. Energy and power, as basic requirements in almost all human endeavours, are also at the root of the changing climate and its impacts. Thus it is crucial to develop innovative and environmentally friendly energy options to ameliorate the various negative repercussions. Upgrading of fast pyrolysis bio-oil via hydrotreatment offers such opportunities, as quality renewable liquid transportation fuels can be produced. The process, however, is typically accompanied by bio-char formation as a by-product. The goal of this work was to study the yield and some properties of bio-chars formed in a hydrotreatment process, with the overall aim of promoting the valuable utilization of wastes or by-products from renewable energy technologies. It is assumed that bio-chars with energy contents comparable to coals will be more desirable as solid energy materials due to their renewability and environmental friendliness. The analytical work in this study therefore focused mainly on determining the higher heating value (HHV) of the chars. The method involved the reaction of bio-oil in an autoclave supplied by the Parr Instrument Company, IL, USA. Two main parameters (different temperatures and residence times) were investigated. The chars were characterized using a Thermo EA2000 CHNS analyser, and the oxygen contents and HHVs were then computed based on the literature. From the results, these bio-chars can readily serve as feedstocks for the production of renewable solid fuels. Their HHVs ranged between 29.26-39.18 MJ/kg, affected by the different temperatures and residence times. There was an inverse relationship between the oxygen content and the HHVs of the chars. It can, therefore, be concluded that it is possible to optimize the efficiency of the hydrotreatment process used through the production of renewable energy materials from the 'waste' char by-products. Future work should consider developing a suitable balance between the primary objective of bio-oil upgrading processes (which is to improve the quality of the liquid fuels) and the conversion of the solid wastes into value-added products such as smokeless briquettes.
Keywords: bio-char, renewable solid biofuels, valorisation, waste-to-energy
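Computing HHVs "based on the literature" typically means applying an empirical correlation to the CHNS(O) composition. The sketch below (Python) shows the arithmetic with oxygen obtained by difference; the elemental composition is an invented example, and the Dulong-type coefficients are one common literature variant, not necessarily the correlation used in this work.

```python
def hhv_dulong(c, h, o, s=0.0):
    """Dulong-type correlation, inputs in wt%, result in MJ/kg.
    Coefficients are one common literature variant (assumption)."""
    return 0.3383 * c + 1.443 * (h - o / 8.0) + 0.0942 * s

# Hypothetical bio-char analysis (wt%).
C, H, N, S, ash = 78.0, 4.5, 0.4, 0.1, 3.0
O = 100.0 - (C + H + N + S + ash)  # oxygen by difference

print(f"O = {O:.1f} wt%, HHV = {hhv_dulong(C, H, O, S):.2f} MJ/kg")
```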
Procedia PDF Downloads 128
15196 Free Shape Optimisation of Cold Formed Steel Sections
Authors: Mina Mortazavi, Pezhman Sharafi
Abstract:
Cold-formed steel sections are popular construction materials used as structural or non-structural elements. The objective of this paper is to propose an optimisation method for open cross sections targeting the maximum nominal axial strength. The cross sections considered in the optimisation process must all meet a determined critical global buckling load in order to qualify as candidates for optimisation. The maximum dimensions of the cross section are fixed and limited to a predefined rectangular area. The optimisation process is repeated for the different available coil thicknesses of 1 mm, 2.5 mm and 3 mm to determine the optimum thickness according to the cross-section buckling behaviour. Simple-simple end conditions are assumed. The number of folds is limited to 20 to prevent overly complicated sections. The global buckling load is taken as the Euler load and is determined according to the moment of inertia of the cross section for a constant length. The critical buckling loads are obtained using the Finite Strip Method. The results of the optimisation analysis are provided, and the optimum cross section within the considered range is determined.
Keywords: shape optimisation, buckling, cold formed steel, finite strip method
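For reference, the Euler global buckling load used as the screening criterion is P_cr = π²EI/(KL)²; the sketch below (Python) shows the computation, with the section properties and length invented and K = 1 assumed for the simple-simple end conditions.

```python
import math

E = 200e9   # Pa, Young's modulus of steel
I = 3.2e-8  # m^4, minor-axis second moment of area (invented section)
L = 2.5     # m, member length
K = 1.0     # effective-length factor for simple-simple ends

P_cr = math.pi ** 2 * E * I / (K * L) ** 2
print(f"Euler buckling load: {P_cr / 1e3:.1f} kN")
```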
Procedia PDF Downloads 399
15195 Long-Term Exposure Assessments for Cooking Workers Exposed to Polycyclic Aromatic Hydrocarbons and Aldehydes Contained in Cooking Fumes
Authors: Chun-Yu Chen, Kua-Rong Wu, Yu-Cheng Chen, Perng-Jy Tsai
Abstract:
Cooking fumes are known to contain polycyclic aromatic hydrocarbons (PAHs) and aldehydes, and some of them have been proven carcinogenic or possibly carcinogenic to humans. Considering their chronic health effects, long-term exposure data are required for assessing cooking workers' lifetime health risks. Previous exposure assessment studies, due to both time and cost constraints, were mostly based on cross-sectional data. Therefore, establishing long-term exposure data has become an important issue for conducting health risk assessments for cooking workers. An approach is proposed in this study. Here, the generation rates of both PAHs and aldehydes from a cooking process were determined by placing a sampling train exactly under the exhaust fan under the total enclosure condition and the normal operating condition, respectively. Subtracting the concentration collected under the latter condition (representing the hood-collected concentration) from that under the former (representing the total emitted concentration), the fugitive emitted concentration was determined. The above data were further converted to generation rates based on the flow rates specified for the exhaust fan. The determinations of the above generation rates were conducted in a testing chamber with a selected cooking process (deep-frying chicken nuggets in 3 L of peanut oil at 200 °C). The sampling train installed under the exhaust fan consisted of an IOM inhalable sampler with a glass fiber filter for collecting particle-phase PAHs, followed by an XAD-2 tube for gas-phase PAHs. The same train was also used to sample aldehydes, however installed with a DNPH-pre-coated filter followed by a 2,4-DNPH cartridge for collecting particle-phase and gas-phase aldehydes, respectively. PAH and aldehyde samples were analyzed by GC/MS-MS (Agilent 7890B) and HPLC-UV (HITACHI L-7100), respectively. The obtained generation rates of both PAHs and aldehydes were applied to the near-field/far-field exposure model to estimate the exposures of cooks (the estimated near-field concentration) and helpers (the estimated far-field concentration). For validation purposes, PAH and aldehyde samplings were conducted simultaneously using the same sampling train at both near-field and far-field sites of the testing chamber. The sampling results, together with the use of a mixed-effect model, were used to calibrate the estimated near-field/far-field exposures. In the present study, the obtained emission rates were further converted to emission factors for both PAHs and aldehydes according to the amount of cooking oil consumed. Applying long-term cooking oil consumption records, the emission rates for both PAHs and aldehydes were determined, and long-term exposure databanks for cooks (the estimated near-field concentration) and helpers (the estimated far-field concentration) were then established. Results show that the proposed approach was adequate for determining the generation rates of both PAHs and aldehydes under various fan exhaust flow rate conditions. The estimated near-field/far-field exposures, though significantly different from those obtained in the field, can be calibrated using the mixed-effect model. Finally, the established long-term databank could provide a useful basis for conducting long-term exposure assessments for cooking workers exposed to PAHs and aldehydes.
Keywords: aldehydes, cooking oil fumes, long-term exposure assessment, modeling, polycyclic aromatic hydrocarbons (PAHs)
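The near-field/far-field (two-zone) model referenced above has a simple steady-state form, C_FF = G/Q and C_NF = G/Q + G/β, where G is the contaminant generation rate, Q the room ventilation rate, and β the inter-zone airflow rate. The sketch below (Python) evaluates these expressions; all numerical inputs are invented placeholders, not the study's measured rates.

```python
def two_zone_steady_state(G, Q, beta):
    """Steady-state two-zone (near-field/far-field) concentrations.
    G: generation rate (mg/min), Q: room ventilation (m3/min),
    beta: inter-zone airflow (m3/min). Returns (C_NF, C_FF) in mg/m3."""
    C_FF = G / Q
    C_NF = C_FF + G / beta
    return C_NF, C_FF

# Hypothetical inputs: emission rate from the cooking process, kitchen
# ventilation, and air exchange between the cook's zone and the room.
G, Q, beta = 0.8, 30.0, 4.0
C_NF, C_FF = two_zone_steady_state(G, Q, beta)
print(f"cook (near-field): {C_NF:.3f} mg/m3, helper (far-field): {C_FF:.3f} mg/m3")
```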
Procedia PDF Downloads 142
15194 Educational Diagnosis and Evaluation Processes of Disabled Preschoolers in Turkey: Family Opinions
Authors: Şule Yanık, Hasan Gürgür
Abstract:
It is thought to be important for disabled children to have the opportunity to benefit from preschool education, which smooths the transition to formal education and constitutes a precondition for their success. Within this context, it is important for disabled children in Turkey to be evaluated first medically and then educationally in order for them to benefit from early inclusive education. Thus, disabled children are diagnosed both in hospitals and, educationally, at Guidance and Research Centers (GRC) attached to the Ministry of Education. Standard evaluation tools are used, and the evaluations for the educational diagnosis and evaluation (EDAE) process are carried out by special education teachers (SETs). The literature emphasizes the importance of informal evaluation tools as well as formal ones. Besides the students and the SETs, another party in the EDAE process is the family, because families are the primary caretakers of their children, and the most accurate and realistic information can be obtained from families alongside the results of the educational evaluation process (EEP). Obtaining the opinions of families during the EEP is important for depicting the present EDAE activities in Turkey, uncovering any existing problems, and increasing the quality of the process. Within this context, the purpose of this study is to present the experiences of 10 families of preschool children with hearing loss (CHL) regarding the EDAE process. The research is designed as a descriptive study based on qualitative research paradigms. Data were collected via semi-structured interview questions, and themes were obtained. As a result, it is seen that families, after realizing their children's hearing loss, do not have any information regarding the subject and consult an ear-nose-throat doctor or an audiologist for support. Families go to hospitals for the medical evaluation, which is a prerequisite for benefiting from early education opportunities. However, as some families have no prior experience of having a CHL, they are late for medical evaluation and hearing aids. Moreover, families stated that they were directed to GRC by audiologists for educational evaluation. Families stated that, after the medical diagnosis, their children were evaluated at GRC regarding language, academic and psychological development in proportion to their ages. However, families stated that the EEP realized at GRC was superficial, short and lacking in detail. Many families were not included in the EEP, whereas some families stated that they were asked questions because their children were too small to answer. Regarding the benefits of the EEP for themselves and their children, families stated that GRC had to issue them a report for benefiting from the free support of Special Education and Rehabilitation Centers, and that families had to be directed to inclusive education. As a result, the opinions of families regarding the EDAE process at GRC indicate the inefficiency of the process, as it is short and superficial, regardless of being to the point.
Keywords: children with hearing loss, educational diagnosis and evaluation, guidance and research center, inclusion
Procedia PDF Downloads 233
15193 Predictive Analytics for Theory Building
Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim
Abstract:
Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods in statistics, operations research, and machine learning to predict the future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can play vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where the items in a basket are fully connected. A cluster is a collection of fully connected items, where a specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics, such as node size (number of items), frequency, and surprise (observed vs. expected frequency), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For demonstration, a total of 13,254 metabolic syndrome training records are plugged into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, for example, associated with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. Results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective) in this study may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the hypotheses generated are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods utilizing a part of the observations, such as bootstrap resampling with an appropriate sample size.
Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building
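A minimal sketch of the co-occurrence structure described above is given below (Python with networkx; the toy transactions are invented, and clique-based clustering with a frequency metric is a simplified stand-in for the platform's actual cluster ranking): each transaction's items are fully connected, arc weights count co-occurrences, and maximal cliques serve as candidate clusters.

```python
import networkx as nx
from itertools import combinations

# Toy transactions (e.g., risk factors observed together in one person).
transactions = [
    {"obesity", "high_bp", "low_hdl"},
    {"obesity", "high_bp", "smoking"},
    {"obesity", "high_bp", "low_hdl"},
    {"smoking", "low_hdl"},
]

G = nx.Graph()
for basket in transactions:
    for a, b in combinations(sorted(basket), 2):  # fully connect the basket
        w = G.get_edge_data(a, b, {"weight": 0})["weight"]
        G.add_edge(a, b, weight=w + 1)

# Candidate clusters: maximal cliques, ranked by how often the whole
# clique co-occurs within a single transaction (a simple frequency metric).
def freq(clique):
    return sum(set(clique) <= t for t in transactions)

for clique in sorted(nx.find_cliques(G), key=freq, reverse=True):
    print(sorted(clique), "frequency:", freq(clique))
```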
Procedia PDF Downloads 276
15192 Steady State Analysis of Distribution System with Wind Generation Uncertainty
Authors: Zakir Husain, Neem Sagar, Neeraj Gupta
Abstract:
Due to the increased penetration of renewable energy resources in the distribution system, the system is no longer passive in nature. In this paper, a steady state analysis of the distribution system has been carried out with the inclusion of wind generation. The wind turbine generating system and the wind generator have been modeled to obtain the average active and reactive power injections into the system. The study has been conducted on an IEEE 33-bus system with two wind generators. The present research work is useful not only to utilities but also to customers.
Keywords: distributed generation, distribution network, radial network, wind turbine generating system
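One common way to obtain the average power injection of a wind turbine generating system is to weight a piecewise power curve by a Weibull wind-speed distribution; this generic approach is an assumption for illustration, not necessarily the model used in the paper. The sketch below (Python with NumPy; the Weibull parameters and turbine ratings are invented) computes the expected active power by numerical integration.

```python
import numpy as np

# Assumed Weibull wind-speed distribution (shape k, scale c in m/s).
k, c = 2.0, 7.5

# Assumed turbine power curve parameters.
v_ci, v_r, v_co = 3.0, 12.0, 25.0  # cut-in, rated, cut-out speeds (m/s)
P_r = 1.5                          # rated power (MW)

def power_curve(v):
    """Piecewise power curve: 0 below cut-in, linear rise, rated, 0 above cut-out."""
    p = np.where((v >= v_ci) & (v < v_r), P_r * (v - v_ci) / (v_r - v_ci), 0.0)
    return np.where((v >= v_r) & (v <= v_co), P_r, p)

v = np.linspace(0, 30, 3001)
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-((v / c) ** k))
P_avg = np.sum(power_curve(v) * pdf) * (v[1] - v[0])  # Riemann sum
print(f"expected active power injection: {P_avg:.2f} MW")
```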
Procedia PDF Downloads 407
15191 Adaptive Auth - Adaptive Authentication Based on User Attributes for Web Application
Authors: Senthuran Manoharan, Rathesan Sivagananalingam
Abstract:
One of the main issues in system security is authentication. Authentication can be defined as the process of recognizing a user's identity, and it is the most important step in the access control process to safeguard data and resources from being accessed by unauthorized users. Static methods of authentication cannot ensure the genuineness of the user. For this reason, more innovative authentication mechanisms came into play. At first, two-factor authentication was introduced, and later multi-factor authentication was introduced to enhance the security of the system. As these also had some issues, adaptive authentication was later introduced. In this research paper, the design of an adaptive authentication engine is put forward. The user's risk profile is calculated based on the user parameters, and the user is then challenged with a suitable authentication method.
Keywords: authentication, adaptive authentication, machine learning, security
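A minimal sketch of such an engine is given below (Python; the attribute weights, risk thresholds, and challenge mapping are invented for illustration): a risk score is accumulated from user and context attributes, and the score selects how strong a challenge to issue.

```python
from dataclasses import dataclass

@dataclass
class LoginContext:
    known_device: bool
    usual_country: bool
    usual_hours: bool
    failed_attempts: int

def risk_score(ctx: LoginContext) -> int:
    """Accumulate a simple additive risk score (weights are assumptions)."""
    score = 0
    score += 0 if ctx.known_device else 30
    score += 0 if ctx.usual_country else 40
    score += 0 if ctx.usual_hours else 10
    score += min(ctx.failed_attempts, 5) * 5
    return score

def choose_challenge(score: int) -> str:
    """Map the risk score to an authentication method (thresholds assumed)."""
    if score < 20:
        return "password only"
    if score < 60:
        return "password + one-time code"
    return "password + one-time code + manual review"

ctx = LoginContext(known_device=False, usual_country=True,
                   usual_hours=False, failed_attempts=2)
s = risk_score(ctx)
print(f"risk={s}, challenge: {choose_challenge(s)}")
```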
Procedia PDF Downloads 250
15190 Assessment of Students' Skills in Error Detection in SQL Classes Using Rubric Framework: An Empirical Study
Authors: Dirson Santos De Campos, Deller James Ferreira, Anderson Cavalcante Gonçalves, Uyara Ferreira Silva
Abstract:
Rubrics in learning research provide many evaluation criteria and expected performance standards linked to defined student activities for learning and pedagogical objectives. Despite rubrics being used in education at all levels, academic literature on rubrics as a tool to support research in SQL education is quite rare. There is a large class of SQL queries that are syntactically correct, but certainly not all of them are semantically correct. Detecting and correcting errors is a recurring problem in SQL education. In this paper, we use the Rubric Abstract Framework (RAF), which consists of steps that allow us to map the information to measure student performance, guided by didactic objectives defined by the teacher, with the domain modeling contextualized by the rubric. An empirical study was conducted that demonstrates how rubrics can mitigate student difficulties in finding logical errors and ease the teacher's workload in SQL education. Detecting and correcting logical errors is an important skill for students. Researchers have proposed several ways to improve SQL education because the skills to understand this paradigm are crucial in software engineering and computer science. The RAF instantiation was used in an empirical study developed during the COVID-19 pandemic in a database course. The pandemic transformed face-to-face education into remote education, without presential classes. The lab activities were conducted remotely, which hinders the teaching-learning process, in particular, for this research, in verifying the evidence or statements of knowledge, skills, and abilities (KSAs) of students. Much research in academia and industry involves databases. The innovation proposed in this paper is the approach used, where the results obtained when using rubrics to map logical errors in query formulation were analyzed, with the gains obtained by students empirically verified. The research approach can be used in the post-pandemic period in both classroom and distance learning.
Keywords: rubric, logical error, structured query language (SQL), empirical study, SQL education
Procedia PDF Downloads 190
15189 The Cost of Innovation in Software Development Projects
Authors: Mihai Liviu Despa
Abstract:
The paper tackles the topic of determining the cost of innovation in software development projects. Innovation can be achieved either in a planned or an unplanned manner. The paper approaches scenarios where innovation is planned for. As a starting point, an innovative software development project is analyzed. The project is depicted step by step as it was implemented, from inception to delivery. Costs that are specific to innovation in software development are isolated based on the author's personal experience in managing the above-mentioned project. The innovation cost components identified by the author are then validated through open discussions with software development professionals and project managers in LinkedIn groups. In order to receive relevant feedback, only groups that focus on software development and innovation management are targeted. Additional innovation cost components suggested by software development professionals and project managers are also considered. Based on the identified cost components, an indicator is built. The indicator is meant to formalize the process of determining the cost of innovation in a software development project. It aggregates all the innovation cost components identified in the research process. The process of calculating each cost component is also described. Conclusions are formulated, and new related research topics are submitted for debate.
Keywords: innovation cost, IT project management, software development, innovation management
Procedia PDF Downloads 460
15188 Epigenetic Modifying Potential of Dietary Spices: Link to Cure Complex Diseases
Authors: Jeena Gupta
Abstract:
In today's world of pharmaceutical products, one should not forget the healing properties of inexpensive food materials, especially spices. They are known to possess hidden pharmaceutical ingredients, imparting to them anti-microbial, anti-oxidant, anti-inflammatory and anti-carcinogenic qualities. Further, aberrant epigenetic regulatory mechanisms like DNA methylation, histone modifications or altered microRNA expression patterns, which regulate gene expression without changing the DNA sequence, contribute significantly to the development of various diseases. Changing lifestyles and diets exert their effect by influencing these epigenetic mechanisms, which are thus the target of dietary phytochemicals. Bioactive components of plants have been in use for ages, but their potential to reverse epigenetic alterations and prevent diseases is yet to be explored. Spices, being rich repositories of many bioactive constituents, owe their unique aroma and taste to these compounds. We have evaluated the biological activity of the phyto-active components of Fennel, Cardamom and Fenugreek by in silico molecular modeling, in vitro and in vivo studies. Ligand-based similarity studies were conducted to identify structurally similar compounds and understand their biological phenomena. The database searching was done by using fenchone from fennel, sabinene from cardamom and protodioscin from fenugreek as query molecules in different small-molecule databases. Moreover, the results of the database searching exhibited that these compounds have potential binding with different targets found in the Protein Data Bank. Further, in addition to their being epigenetic modifiers, in vitro studies have demonstrated the antimicrobial, antifungal, antioxidant and cytotoxicity-protective effects of fenchone, sabinene and protodioscin. To the best of our knowledge, such studies facilitate target fishing as well as provide a roadmap in the drug design and discovery process for the identification of novel therapeutics.
Keywords: epigenetics, spices, phytochemicals, fenchone
Procedia PDF Downloads 158
15187 Automatic Adjustment of Thresholds via Closed-Loop Feedback Mechanism for Solder Paste Inspection
Authors: Chia-Chen Wei, Pack Hsieh, Jeffrey Chen
Abstract:
Surface Mount Technology (SMT) is widely used in the area of electronic assembly, in which electronic components are mounted on the surface of a printed circuit board (PCB). Most of the defects in the SMT process are related to the quality of the solder paste printing. These defects lead to considerable manufacturing costs in the electronics assembly industry. Therefore, the solder paste inspection (SPI) machine for controlling and monitoring the amount of solder paste printed has become an important part of the production process. So far, the setting of SPI thresholds has been based on statistical analysis and experts' experience. Because the production data are not normally distributed and there are various sources of variation in the production processes, defects related to solder paste printing still occur. In order to solve this problem, this paper proposes an online machine learning algorithm, called the automatic threshold adjustment (ATA) algorithm, and a closed-loop architecture in the SMT process to determine the best threshold settings. Simulation experiments prove that our proposed threshold settings improve the accuracy from 99.85% to 100%.
Keywords: big data analytics, Industry 4.0, SPI threshold setting, surface mount technology
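The abstract does not give the ATA update rule; as a generic illustration of closed-loop threshold adjustment, the sketch below (Python; the EWMA update, the 3-sigma window, and all numbers are assumptions) keeps running estimates of the measured solder volume from recent inspections and feeds the updated accept window back into the inspection step.

```python
import numpy as np

class AdaptiveThreshold:
    """Closed-loop threshold: EWMA of the mean/spread of recent measurements
    sets the accept window (update rule and 3-sigma width are assumptions)."""
    def __init__(self, mu0, sigma0, alpha=0.05, width=3.0):
        self.mu, self.sigma = mu0, sigma0
        self.alpha, self.width = alpha, width

    def limits(self):
        return self.mu - self.width * self.sigma, self.mu + self.width * self.sigma

    def update(self, x):
        # Feedback step: fold the new measurement into the running estimates.
        self.mu = (1 - self.alpha) * self.mu + self.alpha * x
        self.sigma = (1 - self.alpha) * self.sigma + self.alpha * abs(x - self.mu)

rng = np.random.default_rng(3)
thr = AdaptiveThreshold(mu0=100.0, sigma0=5.0)  # nominal solder volume (%)
for t in range(200):
    x = rng.normal(100 + 0.05 * t, 4.0)  # process mean drifts slowly upward
    lo, hi = thr.limits()
    if not (lo <= x <= hi):
        print(f"t={t}: reject x={x:.1f}, window=({lo:.1f}, {hi:.1f})")
    thr.update(x)
```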
Procedia PDF Downloads 116
15186 Comparison of Hydrogen and Electrification Perspectives in Decarbonizing the Transport Sector
Authors: Matteo Nicoli, Gianvito Colucci, Valeria Di Cosmo, Daniele Lerede, Laura Savoldi
Abstract:
The transport sector is currently responsible for approximately 1/3 of greenhouse gas emissions in Europe. In the wider context of achieving carbon neutrality of the global energy system, different alternatives are available to decarbonize the transport sector. In particular, while electricity is already the most consumed energy commodity in rail transport, battery electric vehicles are one of the zero-emission options on the market for road transportation. On the other hand, hydrogen-based fuel cell vehicles are available for road and non-road vehicles. The European Commission is strongly pushing toward the integration of hydrogen in the energy systems of European countries and its widespread adoption as an energy vector to achieve the Green Deal targets. Furthermore, the Italian government is defining hydrogen-related objectives with the publication of a dedicated Hydrogen Strategy. The adoption of energy system optimization models to study the possible penetration of alternative zero-emission transport technologies gives the opportunity to perform an overall analysis of the effects that the development of innovative technologies has on the entire energy system and on the supply side, which is devoted to the production of energy carriers such as hydrogen and electricity. Using an open-source modeling framework such as TEMOA, this work aims to compare the role of hydrogen and electric vehicles in the decarbonization of the transport sector. The analysis investigates the advantages and disadvantages of adopting the two options from the economic point of view (the costs associated with the two options) and the environmental one (looking at the emission reduction perspectives). Moreover, an analysis of the profitability of the investments in hydrogen and electric vehicles is performed. The study investigates the evolution of energy consumption and greenhouse gas emissions in different transportation modes (road, rail, navigation, and aviation) through a detailed analysis of the full range of vehicles included in the techno-economic database used in the TEMOA model instance adopted for this work. The transparency of the analysis is guaranteed by the accessibility of the TEMOA models, which are based on open-access source code and databases.
Keywords: battery electric vehicles, decarbonization, energy system optimization models, fuel cell vehicles, hydrogen, open-source modeling, TEMOA, transport
Procedia PDF Downloads 112
15185 Numerical Simulation and Optimal Control in Gas Dynamic Lasers (GDLs)
Authors: Laggoun Chouki
Abstract:
In this paper, we present the design and mechanisms of the physical process and discuss the performance of continuous gas dynamic lasers based on the N2(v=1)→CO2(001) (ν3 mode) transition. The main objectives of work in this area are obtaining the high laser energies in short durations needed for feasibility studies and establishing the physical principles that can be used to make laser sources capable of delivering high average powers. We note that, in order to reach both objectives, one has to convert electrical or chemical energy into laser energy using gaseous media. The lasing process is generated on the basis of the excited vibrational level. Theoretical predictions are compared with experimental results. The feasibility and effectiveness of the proposed method are demonstrated by computer simulation.
Keywords: modelling, lasers, gas, numerical, nozzle
Procedia PDF Downloads 82
15184 The Impacts of Technology on Operations Costs: The Mediating Role of Operation Flexibility
Authors: Fazli Idris, Jihad Mohammad
Abstract:
The study aims to determine the impact of technology and service operations flexibility, which is divided into external flexibility and internal robustness, on operations costs. A mediation model is proposed that links technology to operations costs via operations flexibility. Drawing on a sample of 475 operations managers from various service sectors in Malaysia and South Africa, Structural Equation Modeling (SEM) was employed to test the relationships using Smart-PLS procedures. A significant relationship was established between technology and operations costs via both operations flexibility dimensions. Theoretical and managerial implications are offered to explain the results.
Keywords: operations flexibility, technology, costs, mediation
Procedia PDF Downloads 613
15183 Application of Medium High Hydrostatic Pressure in Preserving Textural Quality and Safety of Pineapple Compote
Authors: Nazim Uddin, Yohiko Nakaura, Kazutaka Yamamoto
Abstract:
Compote (fruit in syrup) made from pineapple (Ananas comosus L. Merrill) is expected to have high market potential as one of the convenient ready-to-eat (RTE) foods worldwide. High hydrostatic pressure (HHP) in combination with low temperature (LT) was applied to the processing of pineapple compote, as was medium HHP (MHHP) in combination with medium-high temperature (MHT), since both processes can enhance liquid impregnation and inactivate microbes. The MHHP+MHT (55 or 65 °C) process, as well as the HHP+LT process, successfully inactivated the microbes in the compote to a non-detectable level. Although the compotes processed by MHHP+MHT or HHP+LT lost their fresh texture in a similar manner to those processed solely by heat, it was indicated that the texture degradation by heat was suppressed under MHHP. The degassing process reduced the hardness, while calcium (Ca) contributed to retaining hardness in the MHT and MHHP+MHT processes. Electrical impedance measurements supported the damage due to degassing and heat. The color, Brix, and appearance were not significantly affected by the processing methods. MHHP+MHT and HHP+LT processes may be applicable to producing high-quality, safe RTE pineapple compotes. Further studies on the optimization of packaging and storage conditions will be indispensable for commercialization.
Keywords: compote of pineapple, RTE, medium high hydrostatic pressure, postharvest loss, texture
Procedia PDF Downloads 137
15182 A Combined AHP-GP Model for Selecting Knowledge Management Tool
Authors: Ahmad Sarfaraz, Raiyad Herwies
Abstract:
In this paper, a multi-criteria decision making analysis is used to help any organization select the best KM tool that fits and serves its needs. The AHP model is used, based on a previous study, to highlight and identify the main criteria and sub-criteria that are incorporated in the selection process. Different KM tool alternatives with different criteria are compared and weighted accurately to be incorporated in the GP model. The main goal is to combine the GP model with the AHP model to ensure that selecting the KM tool considers the resource constraints. Two important issues are discussed in this paper: how different factors could be taken into consideration in forming the AHP model, and how to incorporate the AHP results into the GP model for better results.
Keywords: knowledge management, analytical hierarchy process, goal programming, multi-criteria decision making
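For reference, the AHP step turns a pairwise comparison matrix into criterion weights via its principal eigenvector. The sketch below (Python with NumPy) computes the weights and the consistency ratio used to sanity-check the judgments; the 3x3 comparison matrix and the criterion names are invented examples, not values from the study.

```python
import numpy as np

# Hypothetical pairwise comparisons of three selection criteria
# (e.g., cost, usability, integration) on Saaty's 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, i].real)
w /= w.sum()                      # principal eigenvector -> criterion weights

n = A.shape[0]
CI = (eigvals.real[i] - n) / (n - 1)
RI = 0.58                         # Saaty's random index for n = 3
print("weights:", np.round(w, 3), "consistency ratio:", round(CI / RI, 3))
```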
Procedia PDF Downloads 385