Search results for: design validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13636

13486 Estimation of the Acute Toxicity of Halogenated Phenols Using Quantum Chemistry Descriptors

Authors: Khadidja Bellifa, Sidi Mohamed Mekelleche

Abstract:

Phenols, and especially halogenated phenols, represent a substantial part of the chemicals produced worldwide and are known aquatic pollutants. Quantitative structure–toxicity relationship (QSTR) models are useful for understanding how chemical structure relates to the toxicity of chemicals. In the present study, the acute toxicities of 45 halogenated phenols to Tetrahymena pyriformis are estimated using no-cost semi-empirical quantum chemistry methods. QSTR models were established using the multiple linear regression technique, and the predictive ability of the models was evaluated by internal cross-validation, Y-randomization, and external validation. Their structural chemical domain was defined by the leverage approach. The results show that the best model is obtained with the AM1 method (R² = 0.91, R²CV = 0.90, SD = 0.20 for the training set and R² = 0.96, SD = 0.11 for the test set). Moreover, all of Tropsha's criteria for a predictive QSTR model are verified.
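As an illustrative sketch of the modelling workflow described above (multiple linear regression plus internal cross-validation), the following Python snippet fits an ordinary-least-squares QSTR-style model on synthetic descriptor data and computes both the training R² and a leave-one-out R²CV. The descriptor values, coefficients, and noise level are invented for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for quantum-chemistry descriptors (e.g. hydrophobicity
# and electrophilicity index) of 45 phenols; the real study used AM1-derived
# descriptors for 45 halogenated phenols.
X = rng.normal(size=(45, 2))
toxicity = 0.6 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.2, size=45)

def fit_mlr(X, y):
    """Ordinary least squares with an intercept column."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

# Leave-one-out cross-validation (the "internal cross-validation" step),
# giving an R2_CV to report alongside the training R2.
preds = np.empty(len(toxicity))
for i in range(len(toxicity)):
    mask = np.arange(len(toxicity)) != i
    coef = fit_mlr(X[mask], toxicity[mask])
    preds[i] = coef[0] + X[i] @ coef[1:]

coef = fit_mlr(X, toxicity)
fitted = coef[0] + X @ coef[1:]
print(f"R2 = {r_squared(toxicity, fitted):.3f}, "
      f"R2_CV = {r_squared(toxicity, preds):.3f}")
```

The Y-randomization and external-validation steps mentioned in the abstract would repeat the same fit with shuffled toxicities and on a held-out test set, respectively.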

Keywords: halogenated phenols, toxicity mechanism, hydrophobicity, electrophilicity index, quantitative structure-toxicity relationships

Procedia PDF Downloads 301
13485 Toxicological Validation during the Development of New Catalytic Systems Using Air/Liquid Interface Cell Exposure

Authors: M. Al Zallouha, Y. Landkocz, J. Brunet, R. Cousin, J. M. Halket, E. Genty, P. J. Martin, A. Verdin, D. Courcot, S. Siffert, P. Shirali, S. Billet

Abstract:

Toluene is one of the most used Volatile Organic Compounds (VOCs) in industry. Amongst VOCs, Benzene, Toluene, Ethylbenzene and Xylenes (BTEX) emitted into the atmosphere have a major and direct impact on human health. It is, therefore, necessary to minimize emissions directly at the source. Catalytic oxidation is an industrial technique which provides efficient remediation in the treatment of these organic compounds. However, during operation, the catalysts can release some compounds, called byproducts, that are more toxic than the original VOCs. The catalytic oxidation of a gas stream containing 1000 ppm of toluene on Pd/α-Al2O3 can release a few ppm of benzene, depending on the operating temperature of the catalyst. The development of new catalysts must, therefore, include chemical and toxicological validation phases. In this project, A549 human lung cells were exposed at the air/liquid interface (Vitrocell®) to gas mixtures derived from the oxidation of toluene with a Pd/α-Al2O3 catalyst. Both exposure concentrations (i.e. 10 and 100% of the catalytic emission) resulted in increased gene expression of Xenobiotic Metabolising Enzymes (XMEs) (CYP2E1, CYP2S1, CYP1A1, CYP1B1, EPHX1, and NQO1). Some of these XMEs are known to be induced by polycyclic organic compounds that are conventionally not searched for during the development of catalysts for VOC degradation. The increase in gene expression suggests the presence of undetected compounds whose toxicity must be assessed before the adoption of a new catalyst. This reinforces the relevance of toxicological validation of such systems before scaling-up and marketing.

Keywords: BTEX toxicity, air/liquid interface cell exposure, Vitrocell®, catalytic oxidation

Procedia PDF Downloads 411
13484 Modeling of Sediment Yield and Streamflow of Watershed Basin in the Philippines Using the Soil Water Assessment Tool Model for Watershed Sustainability

Authors: Warda L. Panondi, Norihiro Izumi

Abstract:

Sedimentation is a significant threat to the sustainability of reservoirs and their watersheds. In the Philippines, the Pulangi watershed experienced high sediment loss, mainly due to land conversions and plantations, with critical erosion rates beyond the tolerable limit of 10 ton/ha/yr in all of its sub-basins. Given this, prediction of runoff volume and sediment yield is essential for realistically examining the country's soil conservation techniques. In this research, the Pulangi watershed was modeled using the Soil and Water Assessment Tool (SWAT) to predict the annual runoff and sediment yield of its watershed basin. For the calibration and validation of the model, SWAT-CUP was utilized. The model was calibrated with monthly discharge data for 1990-1993 and validated for 1994-1997. The sediment yield was calibrated for 2014 and validated for 2015 because of limited observed datasets. Uncertainty analysis and calculation of efficiency indexes were accomplished through the SUFI-2 algorithm. According to the coefficient of determination (R²), Nash-Sutcliffe efficiency (NSE), Kling-Gupta efficiency (KGE), and PBIAS, the calculation of streamflow indicates a good performance for both calibration and validation periods, while the sediment yield resulted in a satisfactory performance for both calibration and validation. Therefore, this study was able to identify the most critical sub-basin and its severe need for soil conservation. Furthermore, this study provides baseline information to prevent floods and landslides and serves as a useful reference for land-use policies and watershed management and sustainability in the Pulangi watershed.
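The efficiency indexes named above have standard definitions in hydrological model evaluation; a minimal Python sketch (with hypothetical monthly discharge values, not the Pulangi data) might look like:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, <0 is worse than the mean."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation, variability, and bias ratios."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]
    alpha = sim.std() / obs.std()   # variability ratio
    beta = sim.mean() / obs.mean()  # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def pbias(obs, sim):
    """Percent bias: positive values indicate model underestimation."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)

# Hypothetical monthly discharge (m^3/s) for one calibration year.
observed = np.array([12.0, 15.0, 30.0, 45.0, 60.0, 80.0,
                     95.0, 70.0, 50.0, 35.0, 20.0, 14.0])
simulated = np.array([10.0, 16.0, 28.0, 48.0, 58.0, 85.0,
                      90.0, 72.0, 47.0, 33.0, 22.0, 13.0])
print(nse(observed, simulated), kge(observed, simulated), pbias(observed, simulated))
```

In SWAT-CUP these statistics are computed automatically over the SUFI-2 iterations; the functions above only show what the reported numbers mean.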

Keywords: Pulangi watershed, sediment yield, streamflow, SWAT model

Procedia PDF Downloads 210
13483 Cascaded Neural Network for Internal Temperature Forecasting in Induction Motor

Authors: Hidir S. Nogay

Abstract:

In this study, two systems were created to predict the internal temperature of an induction motor. One consisted of a simple ANN model with two layers, ten input parameters, and one output parameter. The other consisted of eight ANN models connected to each other in cascade. The cascaded ANN system has 17 inputs. The main reason for using a cascaded system in this study is to achieve more accurate estimation by increasing the number of inputs to the ANN system. The cascaded ANN system is compared with the simple conventional ANN model to demonstrate these advantages. The dataset was obtained from experimental applications. A small part of the dataset was used to obtain more understandable graphs. The number of data points is 329. 30% of the data was used for testing and validation. Test and validation data were determined for each ANN model separately, and the reliability of each model was tested. As a result of this study, it was found that the cascaded ANN system produced more accurate estimates than the conventional ANN model.
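The cascading idea, feeding one model's output into the next as an extra input, can be sketched as follows. Linear least-squares models stand in for the ANN stages and all data are synthetic, so this only illustrates the data flow and the 70/30 split, not the paper's actual networks:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic motor data: 329 samples of 10 input parameters (e.g. current,
# voltage, ambient temperature), matching the dataset size in the abstract.
X = rng.normal(size=(329, 10))
true_w = rng.normal(size=10)
intermediate = X @ true_w + rng.normal(scale=0.1, size=329)   # stage-1 target
internal_temp = 0.8 * intermediate + 2.0 + rng.normal(scale=0.1, size=329)

def fit(X, y):
    """Least-squares 'model' standing in for one ANN stage."""
    A = np.column_stack([np.ones(len(X)), X])
    w, *_ = np.linalg.lstsq(A, y, rcond=None)
    return lambda X: np.column_stack([np.ones(len(X)), X]) @ w

split = int(0.7 * len(X))                       # 70% train, 30% test/validation
stage1 = fit(X[:split], intermediate[:split])   # first model in the cascade
X2 = np.column_stack([X, stage1(X)])            # its output becomes an extra input
stage2 = fit(X2[:split], internal_temp[:split]) # second (cascaded) model
simple = fit(X[:split], internal_temp[:split])  # conventional single-model baseline

def rmse(y, y_hat):
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))

print("simple :", rmse(internal_temp[split:], simple(X[split:])))
print("cascade:", rmse(internal_temp[split:], stage2(X2[split:])))
```

In the study itself, each of the eight cascaded stages is a trained ANN, and the accuracy gain comes from the enlarged 17-input feature set.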

Keywords: cascaded neural network, internal temperature, inverter, three-phase induction motor

Procedia PDF Downloads 345
13482 Using Machine Learning to Extract Patient Data from Non-standardized Sports Medicine Physician Notes

Authors: Thomas Q. Pan, Anika Basu, Chamith S. Rajapakse

Abstract:

Machine learning requires data that is categorized into features that models train on. This topic is important to the field of sports medicine due to the many tools it provides to physicians, such as diagnosis support and risk assessment. Physician notes taken by healthcare professionals are usually unclean and not suitable for model training. The objective of this study was to develop and evaluate an advanced approach for extracting key features from sports medicine data without the need for extensive model training or data labeling. An LLM (Large Language Model) was given a narrative (physician's notes) and prompted to extract four features (details about the patient). The narrative was found in a datasheet that contained six columns: Case Number, Validation Age, Validation Gender, Validation Diagnosis, Validation Body Part, and Narrative. The validation columns represent the accurate responses that the LLM attempts to output. Given the narrative, the LLM would output its response and extract the age, gender, diagnosis, and injured body part, with each category taking up one line. The output would then be cleaned, matched, and added to new columns containing the extracted responses. Five ways of checking the accuracy were used: unclear count, substring comparison, LLM comparison, LLM re-check, and hand-evaluation. The unclear count essentially represented the extractions the LLM missed. This can also be understood as the recall score ([total - false negatives] over total). The rest correspond to the precision score ([total - false positives] over total). Substring comparison evaluated the likeness of the validation (X) and extracted (Y) columns by checking whether X's result was a substring of Y's and vice versa. LLM comparison directly asked an LLM if the X and Y results were similar. LLM re-check prompted the LLM to check whether the extracted results could be found in the narrative.
Lastly, 1,000 narratives were randomly selected and hand-evaluated to estimate how well the LLM-based feature extraction model performed. Over a selection of 10,000 narratives, the LLM-based approach had a recall score of roughly 98%. However, the precision scores of the substring comparison and LLM comparison models were around 72% and 76%, respectively. The reason for these low figures is the minute differences between answers. For example, the 'chest' is a part of the 'upper trunk'; however, these models cannot detect that. On the other hand, the LLM re-check and the subset of hand-tested narratives showed precision scores of 96% and 95%. If this subset is used to extrapolate the possible outcome for all 10,000 narratives, the LLM-based approach would be strong in both precision and recall. These results indicate that an LLM-based feature extraction model could be a useful way for medical data in sports to be collected and analyzed by machine learning models. Wide use of this method could potentially increase the availability of data, thus improving machine learning algorithms and supporting doctors with more enhanced tools.
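The substring-comparison check described above can be sketched in a few lines of Python. The validation/extracted pairs below are hypothetical, chosen to show both matches and the 'chest' vs. 'upper trunk' failure mode noted in the abstract:

```python
def substring_match(validation: str, extracted: str) -> bool:
    """True if either value contains the other (case-insensitive)."""
    v, e = validation.strip().lower(), extracted.strip().lower()
    return bool(v) and bool(e) and (v in e or e in v)

# Hypothetical validation vs. LLM-extracted values for one feature column.
pairs = [
    ("concussion", "concussion"),     # exact match
    ("upper trunk", "chest"),         # related anatomy, but fails a substring test
    ("ankle", "ankle sprain"),        # partial match still counted
    ("knee", "left knee"),            # partial match still counted
    ("", "shoulder"),                 # missing validation value -> no match
]

extracted_total = sum(1 for v, e in pairs if e)
matches = sum(1 for v, e in pairs if substring_match(v, e))
precision = matches / extracted_total
print(f"substring precision: {precision:.2f}")
```

The 'upper trunk'/'chest' pair is exactly the kind of semantically correct answer that depresses substring precision, which is why the study also used LLM-based comparison and hand evaluation.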

Keywords: AI, LLM, ML, sports

Procedia PDF Downloads 12
13481 Comprehensive Validation of High-Performance Liquid Chromatography-Diode Array Detection (HPLC-DAD) for Quantitative Assessment of Caffeic Acid in Phenolic Extracts from Olive Mill Wastewater

Authors: Layla El Gaini, Majdouline Belaqziz, Meriem Outaki, Mariam Minhaj

Abstract:

In this study, we introduce and validate a high-performance liquid chromatography method with diode-array detection (HPLC-DAD) specifically designed for the accurate quantification of caffeic acid in phenolic extracts obtained from olive mill wastewater. The separation of caffeic acid was effectively achieved using an Acclaim Polar Advantage column (5 µm, 250 × 4.6 mm). A meticulous multi-step gradient mobile phase was employed, comprising water acidified with phosphoric acid (pH 2.3) and acetonitrile, to ensure optimal separation. Diode-array detection was conducted within the UV–VIS spectrum, spanning a range of 200–800 nm, which facilitated precise analytical results. The method underwent comprehensive validation, addressing several essential analytical parameters, including specificity, repeatability, and linearity, as well as the limits of detection and quantification, alongside measurement uncertainty. The generated linear standard curves displayed high correlation coefficients, underscoring the method's efficacy and consistency. This validated approach is not only robust but also demonstrates exceptional reliability for the focused analysis of caffeic acid within the intricate matrices of wastewater, thus offering significant potential for applications in environmental and analytical chemistry.

Keywords: high-performance liquid chromatography (HPLC-DAD), caffeic acid analysis, olive mill wastewater phenolics, analytical method validation

Procedia PDF Downloads 72
13480 New Method for the Determination of Montelukast in Human Plasma by Solid Phase Extraction Using Liquid Chromatography Tandem Mass Spectrometry

Authors: Vijayalakshmi Marella, Nageswara Rao Pilli

Abstract:

This paper describes a simple, rapid, and sensitive liquid chromatography/tandem mass spectrometry assay for the determination of montelukast in human plasma using montelukast-d6 as an internal standard. The analyte and the internal standard were extracted from 50 µL of human plasma via a solid phase extraction technique without evaporation, drying, or reconstitution steps. The chromatographic separation was achieved on a C18 column using a mixture of methanol and 5 mM ammonium acetate (80:20, v/v) as the mobile phase at a flow rate of 0.8 mL/min. Good linearity results were obtained during the entire course of validation. Method validation was performed as per FDA guidelines, and the results met the acceptance criteria. A run time of 2.5 min for each sample made it possible to analyze a larger number of samples in a short time, thus increasing productivity. The proposed method was found to be applicable to clinical studies.

Keywords: montelukast, tandem mass spectrometry, montelukast-d6, FDA guidelines

Procedia PDF Downloads 316
13479 Application of Axiomatic Design in Industrial Control and Automation Software

Authors: Aydin Homay, Mario de Sousa, Martin Wollschlaeger

Abstract:

Axiomatic design is a system design methodology that systematically analyses the transformation of customer needs into functional requirements, design parameters, and process variables. This approach aims to create high-quality product or system designs by adhering to specific design principles, or axioms, namely the independence axiom and the information axiom. The application of axiomatic design to industrial control and automation software systems can be challenging due to the high flexibility exposed by the software and the coupling enforced by the hardware. This paper presents how to use axiomatic design for designing industrial control and automation software systems and how to satisfy the independence axiom within these tightly coupled systems.
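In axiomatic design, the independence axiom is commonly checked on the design matrix mapping design parameters (DPs) to functional requirements (FRs): a diagonal matrix is uncoupled, a triangular one is decoupled, and anything else is coupled. A minimal sketch (with an invented 3×3 mapping, and checking triangularity only in the given ordering, not over permutations) might be:

```python
import numpy as np

def classify_design(A) -> str:
    """Classify a square design matrix (rows = FRs, columns = DPs).

    Uncoupled: diagonal; decoupled: triangular in the given ordering
    (a workable DP sequence exists); coupled: anything else, which
    violates the independence axiom.
    """
    A = np.asarray(A, dtype=bool)
    off_diag = A & ~np.eye(len(A), dtype=bool)
    if not off_diag.any():
        return "uncoupled"
    if not np.triu(A, 1).any() or not np.tril(A, -1).any():
        return "decoupled"
    return "coupled"

# Hypothetical FR/DP mappings for a small control-software design;
# a True (1) entry means that DP affects that FR.
print(classify_design([[1, 0, 0], [0, 1, 0], [0, 0, 1]]))  # uncoupled
print(classify_design([[1, 0, 0], [1, 1, 0], [0, 1, 1]]))  # decoupled
print(classify_design([[1, 1, 0], [1, 1, 0], [0, 1, 1]]))  # coupled
```

A fuller check would also search for a row/column permutation that makes the matrix triangular before declaring a design coupled.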

Keywords: axiomatic design, decoupling, uncoupling, automation

Procedia PDF Downloads 53
13478 Verification & Validation of MapReduce Program Model for Parallel K-Medoid Algorithm on Hadoop Cluster

Authors: Trapti Sharma, Devesh Kumar Srivastava

Abstract:

This paper is essentially an analysis study of a MapReduce implementation, aiming to verify and validate the MapReduce solution model for a parallel K-Medoid algorithm on a Hadoop cluster. MapReduce is a programming model which enables the processing of huge amounts of data in parallel on a large number of devices. It is especially well suited to static or moderately changing datasets, since the setup cost of a job is usually high. MapReduce has gradually become the framework of choice for "big data". The MapReduce model allows for systematic and rapid processing of large-scale data with a cluster of compute nodes. One of the primary concerns in Hadoop is how to minimize the completion time (i.e., makespan) of a set of MapReduce jobs. In this paper, we have verified and validated various MapReduce applications such as wordcount, grep, terasort, and the parallel K-Medoid clustering algorithm. We have found that as the number of nodes increases, the completion time decreases.
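The MapReduce programming model itself can be sketched in a few lines; the following single-process Python wordcount mimics the map, shuffle/sort, and reduce phases of the Hadoop application verified in the paper, without any cluster:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(document: str):
    """Mapper: emit a (word, 1) pair for every word, as Hadoop's wordcount does."""
    for word in document.lower().split():
        yield word, 1

def reduce_phase(pairs):
    """Reducer: sum the counts for each key after the shuffle/sort step."""
    for word, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield word, sum(count for _, count in group)

documents = ["big data on hadoop",
             "hadoop runs mapreduce",
             "mapreduce handles big data"]

# The shuffle step gathers all mapper output before reduction; on a real
# cluster this happens across nodes, keyed by hash partitioning.
intermediate = [pair for doc in documents for pair in map_phase(doc)]
counts = dict(reduce_phase(intermediate))
print(counts)
```

The parallel K-Medoid job follows the same pattern, with mappers assigning points to the nearest medoid and reducers recomputing medoids per cluster.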

Keywords: hadoop, mapreduce, k-medoid, validation, verification

Procedia PDF Downloads 370
13477 Numerical Calculation of Dynamic Response of Catamaran Vessels Based on 3D Green Function Method

Authors: Md. Moinul Islam, N. M. Golam Zakaria

Abstract:

Seakeeping analysis of catamaran vessels in the early stages of design has become an important issue, as it dictates the seakeeping characteristics and ensures safe navigation during the voyage. In the present paper, a 3D numerical method for the seakeeping prediction of catamaran vessels is presented using the 3D Green function method. Both the steady and unsteady potential flow problems are dealt with. Using 3D linearized potential theory, the dynamic wave loads and the subsequent response of the vessel are computed. For validation of the numerical procedure, a catamaran vessel composed of twin Wigley-form demi-hulls is used. The results of the present calculation are compared with the available experimental data and also with other calculations. The numerical procedure is also carried out for an NPL-based round-bilge catamaran, and hydrodynamic coefficients along with heave and pitch motion responses are presented for various Froude numbers. The results obtained by the present numerical method are found to be in fairly good agreement with the available data. The method can be used as a design tool for predicting the seakeeping behavior of catamaran ships in waves.

Keywords: catamaran, hydrodynamic coefficients, motion response, 3D Green function

Procedia PDF Downloads 222
13476 Method Validation for Heavy Metal Determination in Spring Water and Sediments

Authors: Habtamu Abdisa

Abstract:

Spring water is particularly valuable due to its high mineral content, which is beneficial for human health. However, anthropogenic activities often imbalance the natural levels of its composition, which can cause adverse health effects. Regular monitoring of naturally given environmental resources is of great concern in the world today. Spectrophotometric methods are among the best for qualifying and quantifying the mineral contents of environmental water samples. This research was conducted to evaluate the quality of spring water with respect to its heavy metal composition. A grab sampling technique was employed to collect representative samples, including duplicates. The samples were then treated with concentrated HNO3 to a pH level below 2 and stored at 4 °C. The samples were digested and analyzed for cadmium (Cd), chromium (Cr), manganese (Mn), copper (Cu), iron (Fe), and zinc (Zn) following method validation. Atomic Absorption Spectrometry (AAS) was utilized for the sample analysis. Quality control measures, including blanks, duplicates, and certified reference materials (CRMs), were implemented to ensure the accuracy and precision of the analytical results. Of the metals analyzed in the water samples, Cd and Cr were found to be below the detection limit. However, the mean concentrations of Mn, Cu, Fe, and Zn ranged from 0.119-0.227 mg/L, 0.142-0.166 mg/L, 0.183-0.267 mg/L, and 0.074-0.181 mg/L, respectively. Sediment analysis revealed mean concentration ranges of 348.31-429.21 mg/kg, 0.23-0.28 mg/kg, 18.73-22.84 mg/kg, 2.76-3.15 mg/kg, 941.84-1128.56 mg/kg, and 42.39-66.53 mg/kg for Mn, Cd, Cu, Cr, Fe, and Zn, respectively. The study results established that the evaluated spring water and its associated sediment met the regulatory standards and guidelines for heavy metal concentrations.
Furthermore, this research can enhance the quality assurance and control processes for environmental sample analysis, ensuring the generation of reliable data.
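One common way to establish a method detection limit, one of the validated parameters listed in the keywords, is from the standard deviation of low-level replicate spikes. A sketch with invented Zn replicate values follows; the t-value shown assumes seven replicates (6 degrees of freedom), the usual EPA-style setup, and is an illustration rather than the study's actual procedure:

```python
import statistics

def method_detection_limit(replicates, t_value=3.143):
    """MDL = t * s, with t the one-tailed 99% Student's t for n-1 df.

    The default 3.143 corresponds to n = 7 replicates; replace t_value
    for other replicate counts.
    """
    s = statistics.stdev(replicates)
    return t_value * s

# Hypothetical seven replicate low-level Zn spikes (mg/L) run on the AAS.
zn_spikes = [0.052, 0.048, 0.055, 0.050, 0.047, 0.053, 0.049]
print(f"Zn MDL ~ {method_detection_limit(zn_spikes):.4f} mg/L")
```

Concentrations reported as "below the detection limit", as for Cd and Cr here, mean the measured signal fell under this threshold.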

Keywords: method validation, heavy metal, spring water, sediment, method detection limit

Procedia PDF Downloads 68
13475 Applying Transformative Service Design to Develop Brand Community Service in Women, Children and Infants Retailing

Authors: Shian Wan, Yi-Chang Wang, Yu-Chien Lin

Abstract:

This research discusses various theories of service design, the importance of service design methodology, and the development of a transformative service design framework. In this study, transformative service design is applied to building a new brand community service for a women, children and infants retailing business. The goal is to enhance brand recognition and customer loyalty, effectively increase brand community engagement by embedding the brand community in social networks, and ultimately strengthen the impact and value of the company brand.

Keywords: service design, transformative service design, brand community, innovation

Procedia PDF Downloads 499
13474 Product Form Bionic Design Based on Eye Tracking Data: A Case Study of Desk Lamp

Authors: Huan Lin, Liwen Pang

Abstract:

In order to reduce the ambiguity and uncertainty of product form bionic design, a product form bionic design method based on eye tracking is proposed. An eye-tracking experiment is designed to calculate the average gaze-time ranking of the specific parts of the bionic shape that the subjects look at. The key bionic shape is identified through the experiment and then applied to a desk lamp bionic design. In the design case, the FAHP (Fuzzy Analytic Hierarchy Process) and SD (Semantic Differential) methods are first used to identify a consumer emotional perception model toward desk lamps before product design. By investigating different desk lamp design elements and consumer views, the form design factors of the desk lamp product are identified, and all design schemes are ranked after calculation. The desk lamp form bionic design method combines the key bionic shape extracted from the eye-tracking experiment with the priority ranking of the desk lamp design schemes. This study provides an objective and rational method for product form bionic design.

Keywords: bionic design, form, eye tracking, FAHP, desk lamp

Procedia PDF Downloads 229
13473 Towards a Systematic Evaluation of Web Design

Authors: Ivayla Trifonova, Naoum Jamous, Holger Schrödl

Abstract:

A good web design is a prerequisite for a successful business nowadays, especially since the internet is the most common way for people to inform themselves. Web design includes the optical composition, the structure, and the user guidance of websites. The importance of each website leads to the question of whether there is a way to measure its usefulness. The aim of this paper is to suggest a methodology for the evaluation of web design. The desired outcome is an evaluation that is focused on a specific website and its target group.

Keywords: evaluation methodology, factor analysis, target group, web design

Procedia PDF Downloads 638
13472 Computational Model for Predicting Effective siRNA Sequences Using Whole Stacking Energy (ΔG) for Gene Silencing

Authors: Reena Murali, David Peter S.

Abstract:

Small interfering RNA (siRNA) alters the regulatory role of mRNA during gene expression by translational inhibition. Recent studies show that up-regulation of mRNA can cause serious diseases such as cancer. Designing effective siRNAs with good knockdown effects therefore plays an important role in gene silencing. Various siRNA design tools have been developed earlier. In this work, we analyze existing well-scoring second-generation siRNA prediction tools and optimize the efficiency of siRNA prediction by designing a computational model using an Artificial Neural Network and the whole stacking energy (ΔG), which may help in gene silencing and drug design in cancer therapy. Our model is trained and tested against a large dataset of siRNA sequences. Validation of our results is done by finding the correlation coefficient of experimental versus observed inhibition efficacy of siRNA. We achieved a correlation coefficient of 0.727 in our previous computational model, and we could improve the correlation coefficient up to 0.753 when the threshold of the whole stacking energy is greater than or equal to -32.5 kcal/mol.
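The validation step, correlating experimental against predicted inhibition efficacies, reduces to a Pearson correlation coefficient; a sketch with invented efficacy values (not the study's dataset) is:

```python
import numpy as np

# Hypothetical experimental vs. ANN-predicted inhibition efficacies (%) for
# a handful of siRNA sequences; the study validated its model the same way,
# reporting r = 0.753 over its test set.
experimental = np.array([85.0, 62.0, 40.0, 90.0, 55.0, 72.0, 30.0])
predicted = np.array([80.0, 65.0, 45.0, 88.0, 50.0, 70.0, 38.0])

# Pearson correlation between measured and predicted knockdown.
r = np.corrcoef(experimental, predicted)[0, 1]
print(f"correlation coefficient r = {r:.3f}")
```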

Keywords: artificial neural network, double stranded RNA, RNA interference, short interfering RNA

Procedia PDF Downloads 526
13471 Evaluating the Validity of CFD Model of Dispersion in a Complex Urban Geometry Using Two Sets of Experimental Measurements

Authors: Mohammad R. Kavian Nezhad, Carlos F. Lange, Brian A. Fleck

Abstract:

This research presents the validation study of a computational fluid dynamics (CFD) model developed to simulate the scalar dispersion emitted from rooftop sources around the buildings at the University of Alberta North Campus. The ANSYS CFX code was used to perform the numerical simulation of the wind regime and pollutant dispersion by solving the 3D steady Reynolds-averaged Navier-Stokes (RANS) equations on a building-scale high-resolution grid. The validation study was performed in two steps. First, the CFD model performance in 24 cases (eight wind directions and three wind speeds) was evaluated by comparing the predicted flow fields with the available data from the previous measurement campaign designed at the North Campus, using the standard deviation method (SDM). The estimated results of the numerical model showed maximum average percent errors of approximately 53% and 37% for wind incidents from the North and Northwest, respectively. Good agreement with the measurements was observed for the other six directions, with an average error of less than 30%. In the second step, the reliability of the implemented turbulence model, numerical algorithm, modeling techniques, and grid generation scheme was further evaluated using the Mock Urban Setting Test (MUST) dispersion dataset. Different statistical measures, including the fractional bias (FB), the geometric mean bias (MG), and the normalized mean square error (NMSE), were used to assess the accuracy of the predicted dispersion field. Our CFD results are in very good agreement with the field measurements.
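The three statistical measures named above have standard definitions in dispersion-model evaluation; a minimal Python sketch with hypothetical paired concentrations (not the MUST data) might be:

```python
import numpy as np

def fractional_bias(obs, pred):
    """FB = 2*(mean_obs - mean_pred) / (mean_obs + mean_pred); 0 is ideal."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return 2.0 * (obs.mean() - pred.mean()) / (obs.mean() + pred.mean())

def geometric_mean_bias(obs, pred):
    """MG = exp(mean(ln obs) - mean(ln pred)); 1 is ideal (positive data only)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.exp(np.mean(np.log(obs)) - np.mean(np.log(pred)))

def nmse(obs, pred):
    """NMSE = mean((obs - pred)^2) / (mean_obs * mean_pred); 0 is ideal."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    return np.mean((obs - pred) ** 2) / (obs.mean() * pred.mean())

# Hypothetical paired concentrations (arbitrary units) at receptor points.
observed = np.array([1.2, 0.8, 2.5, 3.1, 0.4, 1.9])
predicted = np.array([1.0, 0.9, 2.2, 3.5, 0.5, 1.7])
print(fractional_bias(observed, predicted),
      geometric_mean_bias(observed, predicted),
      nmse(observed, predicted))
```

In practice these are computed over all receptor/case pairs and compared against commonly cited acceptability ranges (e.g. |FB| near zero, MG near one).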

Keywords: CFD, plume dispersion, complex urban geometry, validation study, wind flow

Procedia PDF Downloads 137
13470 Modeling of Nanocomposite Films Made of Cloisite 30B-Metal Nanoparticle in Packaging of Soy Burger

Authors: Faranak Beigmohammadi, Seyed Hadi Peighambardoust, Seyed Jamaledin Peighambardoust

Abstract:

This study investigates the ability of different kinds of nanocomposite films made of Cloisite 30B with different percentages of silver and copper oxide nanoparticles, incorporated into a low-density polyethylene (LDPE) polymeric matrix by a melt mixing method, to inhibit the growth of microorganisms in soy burger. The number of surviving cells in the total count was decreased by 3.61 log, and molds and yeasts were diminished by 2.01 log, after 8 weeks of storage at -18 ± 0.5 °C, whilst pure LDPE did not have any antimicrobial effect. A composition of 1.3% Cloisite 30B-Ag and 2.7% Cloisite 30B-CuO for total count, and 0% Cloisite 30B-Ag and 4% Cloisite 30B-CuO for yeasts and molds, gave optimum points in the combined design test in Design Expert 7.1.5. Suitable microbial models were suggested for retarding the growth of the above microorganisms in soy burger. To validate the optimum point, the difference between the optimum point of the nanocomposite film and its repeat was tested by one-way ANOVA using SPSS 17.0 software; the difference was not significant at the 0.05 level, while it was significant for the pure film. Migration of metallic nanoparticles into a food simulant was within the accepted safe level.

Keywords: modeling, nanocomposite film, packaging, soy burger

Procedia PDF Downloads 303
13469 Simultaneous Determination of Proposed Anti-HIV Combination Comprising of Elvitegravir and Quercetin in Rat Plasma Using the HPLC–ESI-MS/MS Method: Drug Interaction Study

Authors: Lubna Azmi, Ila Shukla, Shyam Sundar Gupta, Padam Kant, C. V. Rao

Abstract:

Elvitegravir is presently a mainstay of anti-HIV combination therapy in most endemic countries. However, it cannot be used alone owing to its long onset time of action. 2-(3,4-dihydroxyphenyl)-3,5,7-trihydroxychromen-4-one (quercetin: QU) is a polyphenolic compound obtained from Argyreia speciosa Linn (Family: Convolvulaceae), an anti-HIV candidate. In the present study, a sensitive, simple, and rapid high-performance liquid chromatography method coupled with positive ion electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) was developed for the simultaneous determination of elvitegravir and quercetin in rat plasma. The method was linear over a range of 0.2–500 ng/ml. All validation parameters met the acceptance criteria according to regulatory guidelines. The LC-MS/MS method for the determination of elvitegravir and quercetin was thus developed and validated. Results show the potential of a drug-drug interaction upon co-administration of this marketed drug and a plant-derived secondary metabolite.

Keywords: anti-HIV resistance, extraction, HPLC-ESI-MS-MS, validation

Procedia PDF Downloads 345
13468 A Sustainable Design Model by Integrated Evaluation of Closed-loop Design and Supply Chain Using a Mathematical Model

Authors: Yuan-Jye Tseng, Yi-Shiuan Chen

Abstract:

The paper presents a sustainable design model for the integrated evaluation of the design and supply chain of a product against sustainability objectives. To design a product, there can be alternative ways to assign the detailed specifications that fulfill the same design objectives. In these design alternatives, different materials and manufacturing processes, with various supply chain activities, may be required for production. Therefore, the different design cases need to be evaluated against the sustainability objectives. In this research, a closed-loop design model is developed by integrating a forward design model and a reverse design model. From the supply chain point of view, the decisions in the forward design model are connected with the forward supply chain, and the decisions in the reverse design model are connected with the reverse supply chain, considering the sustainability objectives. The purpose of this research is to develop a mathematical model for analyzing the design cases by integrated evaluation of the criteria in the closed-loop design and the closed-loop supply chain. Decision variables are built to represent the design cases of the forward and reverse designs. The cost parameters in a forward design include the costs of materials and manufacturing processes. The cost parameters in a reverse design include the costs of recycling, disassembly, reusing, remanufacturing, and disposal. The mathematical model is formulated to minimize the total cost under the design constraints. In practical applications, the decisions of the mathematical model can be used for selecting a design case for the purpose of the sustainable design of a product. An example product is demonstrated in the paper. The test result shows that the sustainable design model is useful for the integrated evaluation of the design and the supply chain to achieve the sustainability objectives.
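The selection logic of such a cost-minimizing model can be sketched by simple enumeration over candidate design cases. The cost figures below are invented for illustration; a real formulation would use the paper's decision variables and constraints, typically as a mixed-integer program rather than enumeration:

```python
# Hypothetical per-unit cost parameters for three candidate design cases of
# one product: forward costs (material, manufacturing) and reverse costs
# (recycling, disassembly, remanufacturing, disposal), in the same currency.
design_cases = {
    "case_A": {"material": 40, "manufacturing": 25, "recycling": 6,
               "disassembly": 4, "remanufacturing": 8, "disposal": 3},
    "case_B": {"material": 35, "manufacturing": 30, "recycling": 5,
               "disassembly": 6, "remanufacturing": 7, "disposal": 4},
    "case_C": {"material": 45, "manufacturing": 20, "recycling": 7,
               "disassembly": 3, "remanufacturing": 10, "disposal": 2},
}

def total_cost(costs):
    """Closed-loop objective: forward plus reverse supply chain costs."""
    return sum(costs.values())

# Select the design case that minimizes the closed-loop total cost.
best = min(design_cases, key=lambda c: total_cost(design_cases[c]))
print(best, total_cost(design_cases[best]))
```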

Keywords: closed-loop design, closed-loop supply chain, design evaluation, supply chain management, sustainable design model

Procedia PDF Downloads 426
13467 Seamless MATLAB® to Register-Transfer Level Design Methodology Using High-Level Synthesis

Authors: Petri Solanti, Russell Klein

Abstract:

Many designers are asking for an automated path from an abstract mathematical MATLAB model to a high-quality Register-Transfer Level (RTL) hardware description. Manual transformations of MATLAB or intermediate code are needed when the design abstraction is changed. Design conversion is problematic, as it is multidimensional and requires many different design steps to translate the mathematical representation of the desired functionality into an efficient hardware description with the same behavior and configurability. Yet, a manual model conversion is not an insurmountable task. Using currently available design tools and an appropriate design methodology, converting a MATLAB model to efficient hardware is a reasonable effort. This paper describes a simple and flexible design methodology that was developed together with several design teams.

Keywords: design methodology, high-level synthesis, MATLAB, verification

Procedia PDF Downloads 140
13466 A Human Centered Design of an Exoskeleton Using Multibody Simulation

Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann

Abstract:

Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. During preliminary design, however, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multibody simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. To this end, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data are ground reaction forces, electromyography (EMG) data, and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, which are compared with admissible values with regard to human physiology. Furthermore, the platform provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.

Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation

Procedia PDF Downloads 164
13465 Strategic Shear Wall Arrangement in Buildings under Seismic Loads

Authors: Akram Khelaifia, Salah Guettala, Nesreddine Djafar Henni, Rachid Chebili

Abstract:

Reinforced concrete shear walls are pivotal in protecting buildings from seismic forces by providing strength and stiffness. This study highlights the importance of strategically placing shear walls and optimizing the shear wall-to-floor area ratio in building design. Nonlinear analyses were conducted on an eight-story building situated in a high seismic zone, exploring various scenarios of shear wall positioning and ratios to floor area. Employing the performance-based seismic design (PBSD) approach, the study aims to meet acceptance criteria such as inter-story drift ratio and damage levels. The results indicate that concentrating shear walls in the middle of the structure during the design phase yields superior performance compared to peripheral distributions. Utilizing shear walls that fully infill the frame and adopting compound shapes (e.g., Box, U, and L) enhances reliability in terms of inter-story drift. Conversely, the absence of complete shear walls within the frame leads to decreased stiffness and degradation of shorter beams. Increasing the shear wall-to-floor area ratio in building design enhances structural rigidity and reliability regarding inter-story drift, facilitating the attainment of desired performance levels. The study suggests that a shear wall ratio of 1.0% is necessary to meet validation criteria for inter-story drift and structural damage, as exceeding this percentage leads to excessive performance levels, proving uneconomical as structural elements operate near the elastic range.
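The inter-story drift ratio used as an acceptance criterion above is straightforward to compute from a lateral displacement profile. A minimal Python sketch follows; the displacement values, story height and the 2% limit are illustrative assumptions, not the paper's data:

```python
def inter_story_drift_ratios(displacements, story_height=3.0):
    """Drift ratio of each story from lateral floor displacements (m),
    listed from the first floor upward; story_height in m."""
    disp = [0.0] + list(displacements)  # ground level is fixed
    return [(disp[i + 1] - disp[i]) / story_height
            for i in range(len(displacements))]

# Illustrative pushover displacements for an 8-story frame:
d = [0.010, 0.022, 0.036, 0.051, 0.065, 0.078, 0.089, 0.097]
ratios = inter_story_drift_ratios(d)
print(max(ratios) <= 0.02)  # True -> within an assumed 2% drift limit
```

A performance-based check would compare each story's ratio against the limit for the target performance level rather than only the maximum.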

Keywords: nonlinear analyses, pushover analysis, shear wall, plastic hinge, performance level

Procedia PDF Downloads 50
13464 Principles of Editing and Storytelling in Relation to Editorial Graphic Design

Authors: Melike Tascioglu

Abstract:

This paper aims to combine film editing principles with basic design principles to explore what graphic designers do in terms of storytelling. The sequential aspect of film is designed and examined through the art of editing. Examining the rules, principles and formulas of film editing can be a method for graphic designers to further practice the art of storytelling. Although there is much research and many publications on design basics, time, pace, dramatic structure and choreography are not well defined in the area of graphic design. In this era of creative storytelling and interdisciplinary collaboration, not only film editors but also graphic designers and students in the arts and design should understand the theory and practice of editing to be able to create a strong mise-en-scène, not merely a mise-en-page.

Keywords: design principles, editing principles, editorial design, film editing, graphic design, storytelling

Procedia PDF Downloads 333
13463 Axial Flux Permanent Magnet Motor Design and Optimization by Using Artificial Neural Networks

Authors: Tugce Talay, Kadir Erkan

Abstract:

In this study, the necessary steps for the design of axial flux permanent magnet motors are shown. The design and analysis of the motor were carried out with the ANSYS Maxwell program. An artificial neural network was set up in MATLAB, fed with design parameters from ANSYS Maxwell, and the most efficient design parameters were found with the trained network. The results of the Maxwell program and of the artificial neural network are compared, and optimal working design parameters are identified. These parameters were then applied in an ANSYS Maxwell 3D design, where the cogging torque was examined and design studies were carried out to reduce it.
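The optimization loop described above (train a surrogate on FEA results, then search it for the most efficient parameter set) can be sketched as follows. The closed-form `surrogate_efficiency` is a hypothetical stand-in for the trained MATLAB neural network, and the parameter names and candidate ranges are illustrative:

```python
import itertools

def surrogate_efficiency(magnet_thickness, air_gap):
    """Stand-in for the trained neural network: maps design parameters
    (mm) to predicted efficiency. The real surrogate would be fitted to
    ANSYS Maxwell FEA results; this closed form is purely illustrative."""
    return 0.95 - 0.004 * (magnet_thickness - 5.0) ** 2 \
                - 0.02 * (air_gap - 1.0) ** 2

# Grid search over candidate designs, keeping the most efficient one
thicknesses = [3.0, 4.0, 5.0, 6.0, 7.0]
air_gaps = [0.5, 1.0, 1.5]
best = max(itertools.product(thicknesses, air_gaps),
           key=lambda p: surrogate_efficiency(*p))
print(best)  # (5.0, 1.0)
```

The payoff of the surrogate is that each evaluation is microseconds instead of a full FEA run, so the whole parameter space can be searched before a final verification in Maxwell 3D.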

Keywords: AFPM, ANSYS Maxwell, cogging torque, design optimisation, efficiency, NNTOOL

Procedia PDF Downloads 221
13462 Materials for Sustainability

Authors: Qiuying Li

Abstract:

It is a shared opinion that sustainable development requires a system discontinuity, meaning that radical changes in the way we produce and consume are needed. Within this framework, there is an emerging understanding that an important contribution to this change can be directly linked to decisions taken in the design phase of products, services and systems. Design schools therefore have to provide design students with broad knowledge and effective Design for Sustainability tools, in order to enable a new generation of designers to play an active role in reorienting our consumption and production patterns.

Keywords: design for sustainability, services, systems, materials, ecomaterials

Procedia PDF Downloads 447
13461 Comparison of Allowable Stress Method and Time History Response Analysis for Seismic Design of Buildings

Authors: Sayuri Inoue, Naohiro Nakamura, Tsubasa Hamada

Abstract:

The seismic design method of buildings is classified into two types: static design and dynamic design. Static design exerts a static force as the seismic force and is a relatively simple design method, created based on the experience of seismic motion over the past 100 years. At present, static design is used for most Japanese buildings. Dynamic design mainly refers to time history response analysis. It is a comparatively difficult design method in which the assumed earthquake motion is input into the building model and the response is examined. Currently, it is only used for skyscrapers and specific buildings. Under the present design standard in Japan, either the static or the dynamic design method may be used for medium- and high-rise buildings. However, when middle- and high-rise buildings are actually designed by both methods, the relatively simple static design method satisfies the criteria, whereas the more difficult dynamic design method often does not. This is because the dynamic design method was built with the intention of designing super high-rise buildings; in short, higher safety is required compared with general buildings, and the criteria become stricter. The authors consider applying the dynamic design method to general buildings that have so far been designed by the static design method. The reason is that applying the dynamic design method is reasonable for buildings that fall outside conventional standard structural forms, such as buildings that emphasize architectural design. For this purpose, it is important to compare the design results when the criteria of both design methods are set side by side. In this study, we performed time history response analysis on medium-rise buildings that were actually designed with the allowable stress method. A quantitative comparison between static design and dynamic design was conducted, and the characteristics of both design methods were examined.
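Time history response analysis boils down to stepping an equation of motion through the ground-acceleration record. The following single-degree-of-freedom sketch uses the average-acceleration Newmark method; it is illustrative only, since a real design check would use a multi-story (possibly nonlinear) model and recorded or code-compatible ground motions:

```python
import math

def newmark_sdof(accel_g, dt, mass, stiffness, damping_ratio=0.05,
                 beta=0.25, gamma=0.5):
    """Linear time history response of a single-DOF oscillator to ground
    acceleration accel_g (m/s^2, sampled every dt seconds) using the
    average-acceleration Newmark method; returns displacements (m)."""
    wn = math.sqrt(stiffness / mass)
    c = 2.0 * damping_ratio * wn * mass
    u, v, a = 0.0, 0.0, -accel_g[0]          # at-rest initial conditions
    k_eff = stiffness + gamma / (beta * dt) * c + mass / (beta * dt ** 2)
    history = [u]
    for ag in accel_g[1:]:
        # effective load: external force plus terms from the previous state
        p_eff = (-mass * ag
                 + mass * (u / (beta * dt ** 2) + v / (beta * dt)
                           + (1 / (2 * beta) - 1) * a)
                 + c * (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                        + dt * (gamma / (2 * beta) - 1) * a))
        u_new = p_eff / k_eff
        v_new = (gamma / (beta * dt) * (u_new - u)
                 + (1 - gamma / beta) * v + dt * (1 - gamma / (2 * beta)) * a)
        a_new = ((u_new - u) / (beta * dt ** 2) - v / (beta * dt)
                 - (1 / (2 * beta) - 1) * a)
        u, v, a = u_new, v_new, a_new
        history.append(u)
    return history

# Sanity check: a constant ground acceleration of 1 m/s^2 on a unit mass
# with k = 100 N/m settles at the static offset -m*ag/k = -0.01 m.
u = newmark_sdof([1.0] * 2001, 0.01, mass=1.0, stiffness=100.0)
print(round(u[-1], 4))  # ~ -0.01
```

The average-acceleration variant (beta = 1/4, gamma = 1/2) is unconditionally stable for linear systems, which is why it is a common default for this kind of analysis.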

Keywords: buildings, seismic design, allowable stress design, time history response analysis, Japanese seismic code

Procedia PDF Downloads 157
13460 Development and Validation of an Instrument Measuring the Coping Strategies in Situations of Stress

Authors: Lucie Côté, Martin Lauzier, Guy Beauchamp, France Guertin

Abstract:

Stress causes deleterious effects at the physical, psychological and organizational levels, which highlights the need for effective coping strategies to deal with it. Several coping models exist, but they neither integrate the different strategies in a coherent way nor take into account recent research on emotional coping and acceptance of the stressful situation. To fill these gaps, an integrative model incorporating the main coping strategies was developed. This model arises from a review of the scientific literature on coping, from a qualitative study carried out among workers with low or high levels of stress, and from an analysis of clinical cases. The model allows one to understand under what circumstances the strategies are effective or ineffective and to learn how one might use them more wisely. It includes specific strategies for controllable situations (Modification of the Situation and Resignation-Disempowerment), specific strategies for non-controllable situations (Acceptance and Stubborn Relentlessness), as well as so-called general strategies (Wellbeing and Avoidance). This study presents the development and validation of an instrument measuring coping strategies based on this model. An initial pool of items was generated from the conceptual definitions, and three expert judges validated the content. Of these, 18 items were selected for a short-form questionnaire. A sample of 300 students and employees from a Quebec university was used for the validation of the questionnaire. Concerning the reliability of the instrument, the indices observed for inter-rater agreement (Krippendorff's alpha) and internal consistency (Cronbach's alpha) are satisfactory. To evaluate construct validity, a confirmatory factor analysis using Mplus supports the existence of a model with six factors. The results of this analysis also suggest that this configuration is superior to alternative models. The correlations show that the factors are only loosely related to each other. Overall, the analyses suggest that the instrument has good psychometric qualities and demonstrate the relevance of further work to establish predictive validity and reconfirm its structure. This instrument will help researchers and clinicians better understand and assess coping strategies and thus prevent mental health issues.
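Cronbach's alpha, the internal-consistency index mentioned above, is easy to compute directly from item scores. A minimal Python sketch with made-up data (not the study's responses):

```python
def cronbach_alpha(items):
    """Internal-consistency estimate for a scale. `items` is a list of
    item-score lists, one inner list per item, all over the same respondents:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))."""
    k = len(items)
    n = len(items[0])

    def variance(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(item[j] for item in items) for j in range(n)]
    item_var = sum(variance(item) for item in items)
    return k / (k - 1) * (1 - item_var / variance(totals))

# Three items scored by five respondents (made-up Likert data):
scores = [[3, 4, 2, 5, 4], [3, 5, 1, 5, 4], [2, 4, 2, 4, 5]]
print(round(cronbach_alpha(scores), 2))  # ~0.92
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency, though the threshold depends on the scale's purpose.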

Keywords: acceptance, coping strategies, stress, validation process

Procedia PDF Downloads 339
13459 Pharmacokinetic Monitoring of Glimepiride and Ilaprazole in Rat Plasma by High Performance Liquid Chromatography with Diode Array Detection

Authors: Anil P. Dewani, Alok S. Tripathi, Anil V. Chandewar

Abstract:

The present manuscript reports the development and validation of a quantitative high performance liquid chromatography (HPLC) method for the pharmacokinetic evaluation of glimepiride (GLM) and ilaprazole (ILA) in rat plasma. The plasma samples were processed by solid-phase extraction (SPE). The analytes were resolved on a Phenomenex C18 column (4.6 mm × 250 mm; 5 µm particle size) in isocratic elution mode with methanol:water (80:20 % v/v), the pH of the aqueous phase adjusted to 3 with formic acid; the total run time was 10 min at 225 nm as the common wavelength, and the flow rate was 1 mL/min throughout. The method was validated over the concentration range from 10 to 600 ng/mL for GLM and ILA in rat plasma, with metformin (MET) used as the internal standard. Validation data demonstrated the method to be selective, sensitive, accurate and precise. The limits of detection were 1.54 and 4.08 and the limits of quantification 5.15 and 13.62 for GLM and ILA, respectively; the method demonstrated excellent linearity with correlation coefficients (r²) of 0.999. The intra- and inter-day precision (RSD%) values were < 2.0% for both ILA and GLM. The method was successfully applied in pharmacokinetic studies following oral administration in rats.
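Detection and quantification limits of this kind are commonly derived from the calibration line, e.g. via the ICH formulas LOD = 3.3·σ/S and LOQ = 10·σ/S, where σ is the residual standard deviation and S the slope. The sketch below uses made-up calibration data, since the abstract does not state the study's exact computation:

```python
def lod_loq(concentrations, responses):
    """ICH-style limits from a calibration line: fit response = a + b*conc
    by least squares, then LOD = 3.3*sd/b and LOQ = 10*sd/b, with sd the
    residual standard deviation of the fit."""
    n = len(concentrations)
    mx = sum(concentrations) / n
    my = sum(responses) / n
    sxx = sum((x - mx) ** 2 for x in concentrations)
    b = sum((x - mx) * (y - my)
            for x, y in zip(concentrations, responses)) / sxx
    a = my - b * mx
    resid = [y - (a + b * x) for x, y in zip(concentrations, responses)]
    sd = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return 3.3 * sd / b, 10 * sd / b

# Hypothetical calibration points (concentration in ng/mL, peak areas):
conc = [10, 50, 100, 200, 400, 600]
resp = [21.0, 99.0, 203.0, 398.0, 802.0, 1199.0]
lod, loq = lod_loq(conc, resp)
print(lod < loq)  # True; LOQ is always 10/3.3 times the LOD by construction
```

Other acceptable approaches exist (signal-to-noise, standard deviation of the blank); which one a given study used should always be stated explicitly.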

Keywords: pharmacokinetics, glimepiride, ilaprazole, HPLC, SPE

Procedia PDF Downloads 369
13458 Design of Torque Actuator in Hybrid Multi-DOF System with Taking into Account Magnetic Saturation

Authors: Hyun-Seok Hong, Tae-Chul Jeong, Huai-Cong Liu, Ju Lee

Abstract:

This paper proposes replacing the three-phase SPM tilting motor of a hybrid multi-DOF system with a single-phase torque actuator. A three-phase SPM motor used for tilting acts only momentarily and has the disadvantages of low electrical efficiency and poor controllability. A single-phase torque actuator, by comparison, offers higher electrical efficiency and better controllability, which should have a great influence on the development and practical use of the system. This study designed a single-phase torque actuator taking magnetic saturation into account, compared it with the SPM through FEM analysis, and validated the design by testing a production model.

Keywords: hybrid multi-DOF system, SPM, torque actuator, UAV, drone

Procedia PDF Downloads 613
13457 FT-NIR Method to Determine Moisture in Gluten Free Rice-Based Pasta during Drying

Authors: Navneet Singh Deora, Aastha Deswal, H. N. Mishra

Abstract:

Pasta is one of the most widely consumed food products around the world. Rapid determination of the moisture content in pasta will assist food processors in providing online quality control during large-scale production. A rapid Fourier transform near-infrared (FT-NIR) method was developed for determining moisture content in pasta. A calibration set of 150 samples, a validation set of 30 samples and a prediction set of 25 samples of pasta were used. The diffuse reflection spectra of different types of pasta were measured with an FT-NIR analyzer in the 4,000-12,000 cm⁻¹ spectral range. Calibration and validation sets were designed for the conception and evaluation of the method's adequacy over the moisture content range of 10 to 15 percent (w.b.) of the pasta. The prediction models, based on partial least squares (PLS) regression, were developed in the near-infrared region. Conventional criteria such as R², the root mean square error of cross validation (RMSECV), the root mean square error of estimation (RMSEE) and the number of PLS factors were considered for the selection among three pre-processing methods (vector normalization, minimum-maximum normalization and multiplicative scatter correction). Spectra of pasta samples were treated with the different mathematical pre-treatments before being used to build models between the spectral information and moisture content. The moisture content in pasta predicted by the FT-NIR method correlated very well with the values determined via traditional methods (R² = 0.983), which clearly indicates that FT-NIR methods can be used as an effective tool for rapid determination of moisture content in pasta. The best calibration model was developed with min-max normalization (MMN) spectral pre-processing (R² = 0.9775); the MMN pre-processing method was found most suitable, and a maximum coefficient of determination (R²) value of 0.9875 was obtained for the calibration model developed.
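Of the three pre-processing methods compared, minimum-maximum normalization is the simplest to illustrate: each spectrum is rescaled so its minimum becomes 0 and its maximum 1, removing offset and scale differences between scans. A minimal Python sketch with made-up absorbance values (not the study's spectra):

```python
def min_max_normalize(spectrum):
    """Min-max (MMN) spectral pre-processing: rescale a spectrum so its
    minimum maps to 0 and its maximum to 1, before PLS modelling."""
    lo, hi = min(spectrum), max(spectrum)
    return [(v - lo) / (hi - lo) for v in spectrum]

# Two scans of the same sample with different baseline and path length
# collapse onto the same curve after MMN:
a = [0.20, 0.45, 0.90, 0.55, 0.25]
b = [0.40, 0.90, 1.80, 1.10, 0.50]  # same shape, offset and doubled scale
na, nb = min_max_normalize(a), min_max_normalize(b)
print(all(abs(x - y) < 1e-12 for x, y in zip(na, nb)))  # True
```

This is why such pre-treatments improve the PLS fit: the model then learns the moisture-related band shapes rather than instrument- or sample-presentation effects.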

Keywords: FT-NIR, pasta, moisture determination, food engineering

Procedia PDF Downloads 258