Search results for: tool validation
5549 Development of an Instrument Assessing Participants’ Motivation on Assigning Monetary Value to Quality of Life
Authors: Afentoula Mavrodi, Andreas Georgiou, Georgios Tsiotras, Vassilis Aletras
Abstract:
Placing a monetary value on a quality-adjusted life-year (QALY) is of utmost importance in economic evaluation. Identifying the population’s preferences is critical in order to understand some of the reasons driving variations in the assigned monetary value. Yet, evidence of the motives behind value assignment to a QALY by the general public is limited. Developing an instrument that would capture the population’s motives could prove valuable to policy-makers, guiding them in allocating different values to a QALY based on users’ motivations. The aim of this study was to identify the most relevant motives and develop an appropriate instrument to assess them. To design the instrument, we employed: a) the EQ-5D-3L tool to assess participants’ current health status, and b) the Willingness-to-Pay (WTP) approach, within the Contingent Valuation (CV) Method framework, to elicit the monetary value. Advancing beyond the open-ended approach adopted to assess solely protest bidders’ motives, a variety of follow-up item-specific statements was designed (deductive approach), aiming to evaluate the motives of both protest bidders and participants willing to pay for the hypothetical treatment under consideration. The initial design of the survey instrument was the outcome of an extensive literature review. This instrument was revised based on 15 semi-structured interviews that took place in September 2018 and a pilot study conducted over two months (October-November 2018). Individuals with different educational, occupational and economic backgrounds and adequate verbal skills were recruited to complete the semi-structured interviews. The follow-up motivation statements of both protest bidders and those willing to pay were revised and rephrased after the semi-structured interviews. In total, 4 statements for protest bidders and 3 statements for those willing to pay for the treatment were chosen to be included in the survey tool. Using the CATI (Computer Assisted Telephone Interview) method, a randomly selected sample of 97 persons living in Thessaloniki, Greece, completed the questionnaire on two occasions over a period of 4 weeks. Based on the pilot study results, a test-retest reliability assessment was performed using the intra-class correlation coefficient (ICC). All statements formulated for protest bidders showed acceptable reliability (ICC values of 0.84 (95% CI: 0.67, 0.92) and above). Similarly, all statements for those willing to pay for the treatment showed high reliability (ICC values of 0.86 (95% CI: 0.78, 0.91) and above). Overall, the instrument designed in this study was reliable with regard to the item-specific statements assessing participants’ motivation. Validation of the instrument will take place in a future study. For a holistic WTP per QALY instrument, participants’ motivation must be addressed broadly. The instrument developed in this study captured a variety of motives and provided insight with regard to the method through which the latter are evaluated. Last but not least, it extended motive assessment to all study participants and not only protest bidders.
Keywords: contingent valuation method, instrument, motives, quality-adjusted life-year, willingness-to-pay
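A minimal sketch of the test-retest computation described above, for readers who want to reproduce it: it assumes long-format pilot data with hypothetical column names and uses the pingouin library's intraclass_corr, which reports several ICC forms with 95% confidence intervals. The specific ICC form used in the study is not stated, so selecting among the reported estimates is left to the analyst.

```python
# Hedged sketch: hypothetical pilot data in long format; pingouin's
# intraclass_corr returns a table of ICC estimates with 95% CIs.
import pandas as pd
import pingouin as pg

# Each participant answers the same motivation statement on two
# occasions, four weeks apart (illustrative scores only).
data = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3, 4, 4],
    "occasion": ["t1", "t2"] * 4,
    "score": [4, 4, 2, 3, 5, 5, 3, 3],
})

icc = pg.intraclass_corr(data=data, targets="participant",
                         raters="occasion", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])
```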
Procedia PDF Downloads 136
5548 Innovative Tool for Improving Teaching and Learning
Authors: Izharul Haq
Abstract:
Every one of us aspires to gain a quality education. The biggest stakeholders are students, who labor through years of acquiring the knowledge and skill that help them prepare for their careers. Parents spend a fortune on their children’s education. Companies spend billions of dollars to enhance standards by developing new education products and services. Quality education is the golden key to long-lasting prosperity for the individual and the nation. Unfortunately, education standards are continuously deteriorating, and this has become a global phenomenon. Teaching is often described as a ‘popularity contest’, and those teachers who are popular with students are often those who compromise teaching to appease them. Such teachers also ‘teach-to-the-test’, ensuring high test scores, and hence receive good student ratings. Teachers who are conscientious, rigorous and thorough are often the victims of such appraisals. Government and private organizations are spending billions of dollars trying to capture the characteristics of a good teacher, but the results are still vague and inconclusive. At present there is no objective way to measure teaching effectiveness. In this paper we present an innovative method to objectively measure teaching effectiveness using a new teaching tool (TSquare). The TSquare tool used in the study is practical, easy to use, cost effective and requires no special equipment to implement. Hence it has a global appeal for poor and rich countries alike.
Keywords: measuring teaching effectiveness, quality in education, student learning, teaching styles
Procedia PDF Downloads 296
5547 An Assessment of Finite Element Computations in the Structural Analysis of Diverse Coronary Stent Types: Identifying Prerequisites for Advancement
Authors: Amir Reza Heydari, Yaser Jenab
Abstract:
Coronary artery disease, a common cardiovascular disease, is attributed to the accumulation of cholesterol-based plaques in the coronary arteries, leading to atherosclerosis. This disease is associated with risk factors such as smoking, hypertension, diabetes, and elevated cholesterol levels, contributing to severe clinical consequences, including acute coronary syndromes and myocardial infarction. Treatment approaches range from lifestyle interventions to surgical procedures such as percutaneous coronary intervention and coronary artery bypass surgery. These interventions often employ stents, including bare-metal stents (BMS), drug-eluting stents (DES), and bioresorbable vascular scaffolds (BVS), each with its advantages and limitations. Computational tools have emerged as critical in optimizing stent designs and assessing their performance. The aim of this study is to provide an overview of the computational methods of studies based on the finite element (FE) method in the field of coronary stenting and to discuss the potential for development and clinical application of stent devices. Additionally, the importance of assessing the ability of computational models to represent real-world phenomena is emphasized, supported by recent guidelines from the American Society of Mechanical Engineers (ASME). Proposed validation processes include comparing model performance with in vivo, ex vivo, or in vitro data, alongside uncertainty quantification and sensitivity analysis. These methods can enhance the credibility and reliability of in silico simulations, ultimately aiding in the assessment of coronary stent designs in various clinical contexts.
Keywords: atherosclerosis, materials, restenosis, review, validation
Procedia PDF Downloads 91
5546 Potential Effects of Climate Change on Streamflow, Based on the Occurrence of Severe Floods in Kelantan, East Coasts of Peninsular Malaysia River Basin
Authors: Muhd. Barzani Gasim, Mohd. Ekhwan Toriman, Mohd. Khairul Amri Kamarudin, Azman Azid, Siti Humaira Haron, Muhammad Hafiz Md. Saad
Abstract:
Malaysia is a country in Southeast Asia that is constantly exposed to flooding and landslides. These disasters have caused problems such as loss of property, loss of life and hardship for the people involved. This problem occurs as a result of climate change, which increases streamflow rates by disrupting regional hydrological cycles. The aim of the study is to determine the hydrologic processes of the east coast of Peninsular Malaysia, especially the Kelantan Basin, parameterized to account for the spatial and temporal variability of basin characteristics and their responses to climate variability. For hydrological modeling of the basin, the Soil and Water Assessment Tool (SWAT) model is applied, using inputs such as relief, soil type and land use, together with historical daily time series of climate and river flow rates. Interpretation of Landsat land-use maps will also be applied in this study. By combining the SWAT and climate models, the system is predicted to show increases in future-scenario precipitation, surface runoff, recharge and total water yield. As a result, this model has successfully supported the basin analysis, demonstrated by visual analysis of hydrographs and good estimates of the minimum and maximum flows and severe floods observed during the calibration and validation periods.
Keywords: east coasts of Peninsular Malaysia, Kelantan river basin, minimum and maximum flows, severe floods, SWAT model
Procedia PDF Downloads 262
5545 Improving Neonatal Abstinence Syndrome Assessments
Authors: Nancy Wilson
Abstract:
In utero fetal drug exposure is prevalent in birthing facilities. Assessment tools for neonatal abstinence syndrome (NAS) are often cumbersome and ill-fitting, harboring immense subjectivity. This subjectivity often leads the clinical assessor to be hypervigilant when assessing the newborn for subtle symptoms of NAS, which are easily mistaken for normal newborn behaviors. As a quality improvement initiative, this project led to a more adaptable NAS tool termed eat, sleep, console (ESC). This function-based NAS assessment scores the infant on the ability to accomplish three basic newborn necessities: to sleep, to eat, and to be consoled. The literature supports that the ESC methodology improves patient and family outcomes while providing more cost-effective care.
Keywords: neonatal abstinence syndrome, neonatal opioid withdrawal, maternal substance abuse, pregnancy and addiction, Finnegan neonatal abstinence syndrome tool, eat, sleep, console
Procedia PDF Downloads 152
5544 Thermal Evaluation of Printed Circuit Board Design Options and Voids in Solder Interface by a Simulation Tool
Authors: B. Arzhanov, A. Correia, P. Delgado, J. Meireles
Abstract:
Quad Flat No-Lead (QFN) packages have become very popular for tuners, converters and audio amplifiers, among other applications needing efficient power dissipation in small footprints. Since the semiconductor junction temperature (TJ) is a critical parameter for product quality, and to ensure that the die temperature does not exceed the maximum allowable TJ, a thermal analysis conducted in an early development phase is essential to avoid a repeated re-design process with huge losses in cost and time. A simulation tool capable of estimating the die temperature of components with a QFN package was developed. It allows establishing a non-empirical way to define an acceptance criterion for the amount of voids in the solder interface between the exposed pad and the Printed Circuit Board (PCB), to be applied during the industrialization process, and evaluating the impact of PCB design parameters. Targeting the PCB layout designer as the end user of the application, a user-friendly graphical interface (GUI) was implemented, allowing the user to introduce design parameters in a convenient and secure way while hiding all the complexity of the finite element simulation process. This cost-effective tool makes the simulation process transparent and provides useful outputs within acceptable time; it can be adopted by PCB designers, preventing potential risks during the design stage and making the product economically efficient by not oversizing it. This article gathers relevant information related to the design and implementation of the developed tool and presents a parametric study conducted with it. The simulation tool was experimentally validated using a Thermal-Test-Chip (TTC) in a QFN open cavity, in order to measure the junction temperature (TJ) directly on the die under controlled and known conditions. The article provides a short overview of standard thermal solutions and their impact on exposed-pad packages (i.e., QFN), accurately describes the methods and techniques that the system designer should use to achieve optimum thermal performance, and demonstrates the effect of system-level constraints on the thermal performance of the design.
Keywords: QFN packages, exposed pads, junction temperature, thermal management and measurements
Procedia PDF Downloads 256
5543 Balanced Score Card a Tool to Improve Naac Accreditation – a Case Study in Indian Higher Education
Authors: CA Kishore S. Peshori
Abstract:
Introduction: India, a country with vast diversity and a huge population, is going to have the largest young population by 2020. Higher education has been, and will always be, the basic requirement for turning a developing nation into a developed nation. To improve any system, it needs to be benchmarked, and there have been various tools for benchmarking systems. Education in India is delivered by universities, which are mainly funded by the government. These universities, to deliver the education, set up colleges, which are again funded mainly by the government. Recently, however, autonomy has also been given to universities and colleges. Moreover, foreign universities are waiting to enter Indian boundaries. With a large number of universities and colleges, it has become more and more necessary to measure these institutes for benchmarking, and there have been various tools for doing so. In India, college assessments have been made compulsory by the UGC, and NAAC has been officially recognised as the accreditation criterion. The NAAC assessment is based on seven criteria, namely: 1. Curricular assessments, 2. Teaching, learning and evaluation, 3. Research, consultancy and extension, 4. Infrastructure and learning resources, 5. Student support and progression, 6. Governance, leadership and management, 7. Innovation and best practices. The NAAC tries to benchmark the institution for identification, sustainability, dissemination and adaptation of best practices. It grades the institution according to these seven criteria, and the funding of the institution is based on these grades. Many of the colleges are struggling to get the best grades but have not come across a systematic tool to achieve those results. The Balanced Scorecard developed by Kaplan has been a successful tool for corporates to develop best practices so as to increase their financial performance and also retain and increase their customers, growing the organization to the next level. It is time to test this tool for an educational institute. Methodology: The paper tries to develop a prototype for a college based on secondary data. Once a prototype is developed, the researcher, based on a questionnaire, will try to test this tool for successful implementation. The success of this research will depend on the implementation of the BSC at an institute and on its grading improving due to this successful implementation. Limited time is a major constraint in this research, as the NAAC cycle takes a minimum of 4 years for accreditation and reaccreditation; the methodology will therefore limit itself to secondary data and a questionnaire to be circulated to colleges along with the prototype model of the BSC. Conclusion: The BSC is a successful tool for enhancing the growth of an organization, and educational institutes are no exception. The BSC will only have to be realigned to suit the NAAC criteria. Once this prototype is developed, its success will be tested only on its implementation, but this research paper will be the first step towards developing this tool; it will also initiate that success by developing a questionnaire and getting and evaluating the responses for moving to the next level of actual implementation.
Keywords: balanced scorecard, bench marking, Naac, UGC
Procedia PDF Downloads 272
5542 Storage System Validation Study for Raw Cocoa Beans Using Minitab® 17 and R (R-3.3.1)
Authors: Anthony Oppong Kyekyeku, Sussana Antwi-Boasiako, Emmanuel De-Graft Johnson Owusu Ansah
Abstract:
In this observational study, the performance of a known conventional storage system was tested and evaluated for fitness for its intended purpose. The system has a scope extended to the storage of dry cocoa beans; its sensitivity, reproducibility and uncertainties are not known in detail. This study discusses the system performance in the context of the existing literature on factors that influence the quality of cocoa beans during storage. Controlled conditions were defined precisely for the system to give a reliable baseline within specific established procedures. Minitab® 17 and the R statistical software (R-3.3.1) were used for the statistical analyses. The approach to the storage system testing was to observe and compare, through laboratory test methods, the quality of the cocoa bean samples before and after storage. The samples were kept in Kilner jars, and the temperature of the storage environment was controlled and monitored over a period of 408 days. Standard test methods used in the international cocoa trade, such as cut test analysis, moisture determination with the Aqua boy KAM III model, and bean count determination, were used for quality assessment. The data analysis treated the entire population as a sample in order to establish a reliable baseline for the data collected. The study found a statistically significant mean value at the 95% Confidence Interval (CI) for the performance data analysed before and after storage for all variables observed. Correlational graphs showed a strong positive correlation for all variables investigated, with the exception of All Other Defects (AOD). The weak relationship between the before and after data for AOD had an explained variability of 51.8%, with the unexplained variability attributable to the uncontrolled condition of hidden infestation before storage. The current study concluded with a high-performance criterion for the storage system.
Keywords: benchmarking performance data, cocoa beans, hidden infestation, storage system validation
Procedia PDF Downloads 174
5541 Performance Enhancement of Autopart Manufacturing Industry Using Lean Manufacturing Strategies: A Case Study
Authors: Raman Kumar, Jasgurpreet Singh Chohan, Chander Shekhar Verma
Abstract:
Today, manufacturing industries respond rapidly to new demands and compete in a continuously changing environment, thus seeking out new methods that allow them to remain competitive and flexible simultaneously. The aim of manufacturing organizations is to reduce manufacturing costs and wastes through system simplification, organizational potential, and proper infrastructural planning by using modern techniques like lean manufacturing. In India, a large number of medium and large scale manufacturing industries have successfully implemented lean manufacturing techniques. Keeping in view the above-mentioned facts, different tools are involved in the successful implementation of the lean approach. The present work is focused on the auto part manufacturing industry, to improve the performance of a recliner assembly line. There are a number of lean manufacturing tools available, but experience and complete knowledge of the manufacturing processes are required to select an appropriate tool for a specific process. Fishbone diagrams (scrap, inventory, and waiting) have been drawn to identify the root causes of the different wastes. The effect of cycle time reduction on scrap and inventory is analyzed thoroughly in the case company. Results have shown a decrease in inventory cost of 7 percent after the successful implementation of the lean tool.
Keywords: lean tool, fish-bone diagram, cycle time reduction, case study
Procedia PDF Downloads 127
5540 AI-Driven Solutions for Optimizing Master Data Management
Authors: Srinivas Vangari
Abstract:
In the era of big data, ensuring the accuracy, consistency, and reliability of critical data assets is crucial for data-driven enterprises, and Master Data Management (MDM) plays a central role in this endeavor. This paper investigates the role of Artificial Intelligence (AI) in enhancing MDM, focusing on how AI-driven solutions can automate and optimize various stages of the master data lifecycle. By integrating AI (quantitative and qualitative analysis) into processes such as data creation, maintenance, enrichment, and usage, organizations can achieve significant improvements in data quality and operational efficiency. Quantitative analysis is employed to measure the impact of AI on key metrics, including data accuracy, processing speed, and error reduction. For instance, our study demonstrates an 18% improvement in data accuracy and a 75% reduction in duplicate records across multiple systems post-AI implementation. Furthermore, AI’s predictive maintenance capabilities reduced data obsolescence by 22%, as indicated by statistical analyses of data usage patterns over a 12-month period. Complementing this, a qualitative analysis delves into the specific AI-driven strategies that enhance MDM practices, such as automating data entry and validation, which resulted in a 28% decrease in manual errors. Insights from case studies highlight how AI-driven data cleansing processes reduced inconsistencies by 25% and how AI-powered enrichment strategies improved data relevance by 24%, thus boosting decision-making accuracy. The findings demonstrate that AI significantly enhances data quality and integrity, leading to improved enterprise performance through cost reduction, increased compliance, and more accurate, real-time decision-making. These insights underscore the value of AI as a critical tool in modern data management strategies, offering a competitive edge to organizations that leverage its capabilities.
Keywords: artificial intelligence, master data management, data governance, data quality
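As an illustration of one MDM task mentioned above, duplicate-record detection, the sketch below flags likely duplicates with fuzzy string matching from the Python standard library. The record fields and the 0.8 similarity threshold are assumptions for the example, not the system the paper describes.

```python
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Corporation", "city": "Berlin"},
    {"id": 2, "name": "ACME Corp.", "city": "Berlin"},
    {"id": 3, "name": "Globex GmbH", "city": "Munich"},
]

def similarity(a: dict, b: dict) -> float:
    """Average fuzzy similarity over the matching fields."""
    fields = ("name", "city")
    return sum(
        SequenceMatcher(None, a[f].lower(), b[f].lower()).ratio()
        for f in fields
    ) / len(fields)

# Flag candidate duplicates above the threshold for steward review.
for i, a in enumerate(records):
    for b in records[i + 1:]:
        score = similarity(a, b)
        if score >= 0.8:
            print(f"possible duplicate: {a['id']} ~ {b['id']} ({score:.2f})")
```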
Procedia PDF Downloads 17
5539 Numerical Analysis of NOₓ Emission in Staged Combustion for the Optimization of Once-Through-Steam-Generators
Authors: Adrien Chatel, Ehsan Askari Mahvelati, Laurent Fitschy
Abstract:
Once-Through-Steam-Generators (OTSGs) are commonly used in the oil-sand industry in the heavy fuel oil extraction process. They are composed of three main parts: the burner and the radiant and convective sections. Natural gas is burned through staged diffusive flames stabilized by the burner. The heat generated by the combustion is transferred to the water flowing through the piping system in the radiant and convective sections. The steam produced within the pipes is then directed to the ground to reduce the oil viscosity and allow its pumping. With the rapid development of the oil-sand industry, the number of OTSGs in operation has increased, as have the associated emissions of environmental pollutants, especially the nitrogen oxides (NOₓ). To limit the environmental degradation, various international environmental agencies have established regulations on pollutant discharge and pushed to reduce NOₓ release. To meet these constraints, OTSG constructors have to rely on more and more advanced tools to study and predict NOₓ emission. With the increase in computational resources, Computational Fluid Dynamics (CFD) has emerged as a flexible tool to analyze the combustion and pollutant formation process. Moreover, to optimize the burner operating conditions with regard to NOₓ emission, field characterization and measurements are usually carried out. However, these kinds of experimental campaigns are particularly time-consuming and sometimes even impossible for industrial plants with strict operation schedule constraints. Therefore, the application of CFD seems more adequate for providing an insight into the NOₓ emission and reduction problem. In the present work, two different software packages are employed to simulate the combustion process in an OTSG, namely the commercial software ANSYS Fluent and the open source software OpenFOAM. RANS (Reynolds-Averaged Navier–Stokes) equations, combined with the Eddy Dissipation Concept to model the combustion and closed by the k-epsilon model, are solved. A mesh sensitivity analysis is performed to assess the independence of the solution from the mesh. In the first part, the results given by the two software packages are compared and confronted with experimental data as a means to assess the numerical modelling. Flame temperatures and chemical composition are used as reference fields to perform this validation. Results show a fair agreement between experimental and numerical data. In the last part, OpenFOAM is employed to simulate several operating conditions, and an Emission Characteristic Map of the combustion system is generated. The sources of high NOₓ production inside the OTSG are identified and correlated to the physics of the flow. CFD is, therefore, a useful tool for providing insight into the NOₓ emission phenomena in OTSGs. Sources of high NOₓ production can be identified, and operating conditions can be adjusted accordingly. With the help of RANS simulations, an Emission Characteristics Map can be produced and then used as a guide for a field tune-up.
Keywords: combustion, computational fluid dynamics, nitrogen oxides emission, once-through-steam-generators
Procedia PDF Downloads 113
5538 Practice of Social Audit in Hotel Companies: Case Study of Agadir, Morocco
Authors: M. El Mousadik, F. Elkandoussi
Abstract:
The concern for increased rigor in social management has led more and more Moroccan business leaders to question the value of applying social audit as an essential tool in the management of human resources. Hotel companies are not excluded; in fact, they are expected to implement such an audit to develop sound and credible human resources management (HRM) policies. The main objective of this paper is to establish the relationship between the practice of social audit as a tool and its impact on the tourism sector, especially on hotels in Agadir, one of Morocco’s foremost and most popular tourist cities. This exploratory study of properties in Agadir has revealed that hotel executives are aware of the importance of social auditing in honing their decisions in the field of HRM.
Keywords: social audit, hotel companies, human resources management, social piloting
Procedia PDF Downloads 278
5537 Educatronic Prototype for Learning Geometry, Based on a Multitouch Surface
Authors: Vicario Marina, Bustos Freddy, Olivares Jesús, Gómez Pilar
Abstract:
This paper presents a didactic model and a tool as educational resources to support the learning of geometry, focusing on topics that are difficult to understand. The target population is elementary school students. The tool is based on a collaborative educational approach using multi-touch devices. The proposal is based on the challenges found in the instructional design and prototype implementation. Traditionally, elementary students have had many problems assimilating mathematical topics; this new Educatronic prototype facilitates the learning experience through exercises that were tested with different children, demonstrating the benefits of the prototype by improving their mathematical skills.
Keywords: educatronic prototype, geometry, multitouch surface, educational computing, primary school, mathematics, educational informatics
Procedia PDF Downloads 319
5536 External Validation of Risk Prediction Score for Candidemia in Critically Ill Patients: A Retrospective Observational Study
Authors: Nurul Mazni Abdullah, Saw Kian Cheah, Raha Abdul Rahman, Qurratu 'Aini Musthafa
Abstract:
Purpose: Candidemia is associated with high mortality in critically ill patients, and early candidemia prediction is imperative for preemptive antifungal treatment. This study aimed to externally validate the candidemia risk prediction score by Jameran et al. (2021), which is built on the risk factors of acute kidney injury, renal replacement therapy, parenteral nutrition, and multifocal candida colonization. Methods: This single-center, retrospective observational study included all critically ill patients admitted to the intensive care unit (ICU) of a tertiary referral center from January 2018 to December 2023. The study evaluated the performance of the candidemia risk prediction score by analysing the occurrence of candidemia within the study period. Patients’ demographic characteristics, comorbidities, SOFA scores, and ICU outcomes were analysed. Patients who were diagnosed with candidemia prior to ICU admission were excluded. Results: A total of 500 patients were analysed, with 2 dropouts due to incomplete data. The validation analysis showed that the candidemia risk prediction score has a sensitivity of 75.00% (95% CI: 59.66-86.81), a specificity of 65.35% (95% CI: 60.78-69.72), a positive predictive value of 17.28%, and a negative predictive value of 96.44%. The incidence of candidemia was 8.86%, with no significant differences in demographics or comorbidities except for higher SOFA scores in the candidemia group. The candidemia group showed significantly longer ICU and hospital lengths of stay and higher ICU and in-hospital mortality. Conclusion: This study concluded that the candidemia risk prediction score by Jameran et al. (2021) had good sensitivity and a high negative predictive value. Thus, the risk prediction score was validated for candidemia prediction in critically ill patients.
Keywords: Candidemia, intensive care, acute kidney injury, clinical prediction rule, incidence
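The validation metrics quoted above all follow from a 2x2 confusion matrix; the sketch below shows the arithmetic with illustrative counts, not the study's actual data.

```python
def diagnostic_metrics(tp: int, fn: int, fp: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # positives caught among the diseased
        "specificity": tn / (tn + fp),  # negatives caught among the healthy
        "ppv": tp / (tp + fp),          # how trustworthy a positive score is
        "npv": tn / (tn + fn),          # how reassuring a negative score is
    }

# Hypothetical counts for a cohort screened with the risk score.
for name, value in diagnostic_metrics(tp=33, fn=11, fp=157, tn=297).items():
    print(f"{name}: {value:.2%}")
```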
Procedia PDF Downloads 7
5535 Implications of Optimisation Algorithm on the Forecast Performance of Artificial Neural Network for Streamflow Modelling
Authors: Martins Y. Otache, John J. Musa, Abayomi I. Kuti, Mustapha Mohammed
Abstract:
The performance of an artificial neural network (ANN) is contingent on a host of factors, for instance, the network optimisation scheme. In view of this, the study examined the general implications of the ANN training optimisation algorithm for its forecast performance. To this end, the Bayesian regularisation (Br), Levenberg-Marquardt (LM), and adaptive-learning gradient descent with momentum (GDM) algorithms were employed under different ANN structural configurations: (1) single-hidden-layer, and (2) double-hidden-layer feedforward back propagation networks. Results obtained revealed, generally, that the GDM optimisation algorithm, with its adaptive learning capability, used a relatively shorter time in both the training and validation phases than the LM and Br algorithms, though learning may not be consummated; this held in all instances, including the prediction of extreme flow conditions 1 day and 5 days ahead. In specific statistical terms, average model performance efficiencies using the coefficient of efficiency (CE) statistic were Br: 98%, 94%; LM: 98%, 95%; and GDM: 96%, 96%, respectively, for the training and validation phases. However, on the basis of relative error distribution statistics (MAE, MAPE, and MSRE), GDM performed better than the others overall. Based on the findings, it is imperative to state that the adoption of ANNs for real-time forecasting should employ training algorithms that do not have computational overhead like LM, which requires computation of the Hessian matrix and suffers from protracted time and sensitivity to initial conditions; to this end, Br and other forms of gradient descent with momentum should be adopted, considering overall time expenditure and quality of the forecast as well as mitigation of network overfitting. On the whole, it is recommended that evaluation should consider the implications of (i) data quality and quantity and (ii) transfer functions on the overall network forecast performance.
Keywords: streamflow, neural network, optimisation, algorithm
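A minimal sketch of the evaluation statistics referred to above, the coefficient of efficiency (CE, the Nash-Sutcliffe statistic) and the relative error measures, assuming illustrative observed and simulated streamflow arrays:

```python
import numpy as np

def ce(obs: np.ndarray, sim: np.ndarray) -> float:
    """Coefficient of efficiency: 1 minus residual over total variance."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def error_stats(obs: np.ndarray, sim: np.ndarray) -> dict:
    """Relative error distribution statistics used in the study."""
    err = obs - sim
    return {
        "MAE": np.mean(np.abs(err)),
        "MAPE": np.mean(np.abs(err / obs)) * 100.0,
        "MSRE": np.mean((err / obs) ** 2),
    }

obs = np.array([12.1, 15.3, 9.8, 22.4, 30.7])   # observed flows (placeholder)
sim = np.array([11.5, 16.0, 10.2, 21.1, 29.9])  # ANN forecasts (placeholder)
print(f"CE = {ce(obs, sim):.3f}", error_stats(obs, sim))
```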
Procedia PDF Downloads 152
5534 Development and Total Error Concept Validation of Common Analytical Method for Quantification of All Residual Solvents Present in Amino Acids by Gas Chromatography-Head Space
Authors: A. Ramachandra Reddy, V. Murugan, Prema Kumari
Abstract:
Residual solvents in pharmaceutical samples are monitored using gas chromatography with headspace (GC-HS). Based on current regulatory and compendial requirements, measuring the residual solvents is mandatory for all release testing of active pharmaceutical ingredients (API). Generally, isopropyl alcohol is used as the residual solvent in proline and tryptophan; methanol in cysteine monohydrate hydrochloride, glycine, methionine and serine; ethanol in glycine and lysine monohydrate; and acetic acid in methionine. In order to have a single method for determining these residual solvents (isopropyl alcohol, ethanol, methanol and acetic acid) in all these 7 amino acids, a sensitive and simple method was developed using the gas chromatography headspace technique with flame ionization detection. During development, poor reproducibility, retention time variation and bad peak shape of the acetic acid peaks were identified, due to the reaction of acetic acid with the stationary phase (cyanopropyl dimethyl polysiloxane phase) of the column and the dissociation of acetic acid in water (if used as diluent) while applying the temperature gradient. Therefore, dimethyl sulfoxide was used as diluent to avoid these issues, whereas most methods published for acetic acid quantification by GC-HS use a derivatisation technique to protect the acetic acid. As per the compendia, a risk-based approach was selected as appropriate to determine the degree and extent of the validation process to assure the fitness of the procedure. Therefore, the total error concept was selected to validate the analytical procedure. An accuracy profile of ±40% was selected for the lower level (quantitation limit level) and ±30% for the other levels, with a 95% confidence interval (risk profile 5%). The method was developed using a DB-WAXetr column manufactured by Agilent (internal diameter: 530 µm, film thickness: 2.0 µm, length: 30 m). A constant flow of 6.0 mL/min of helium in constant make-up mode was selected as the carrier gas. The present method is simple, rapid, and accurate, and is suitable for the rapid analysis of isopropyl alcohol, ethanol, methanol and acetic acid in amino acids. The range of the method is 50 ppm to 200 ppm for isopropyl alcohol, 50 ppm to 3000 ppm for ethanol, 50 ppm to 400 ppm for methanol, and 100 ppm to 400 ppm for acetic acid, which covers the specification limits provided in the European Pharmacopoeia. The accuracy profile and risk profile generated as part of the validation were found to be satisfactory. Therefore, this method can be used for testing residual solvents in amino acid drug substances.
Keywords: amino acid, head space, gas chromatography, total error
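A minimal sketch of the total-error acceptance decision described above: at each concentration level, the relative bias and a 95% beta-expectation tolerance interval of the recoveries are compared with the acceptance limits (±30%, widened to ±40% at the quantitation limit). The recovery values and the simplified interval formula are assumptions for illustration, not the validated method's data.

```python
import numpy as np
from scipy import stats

def accuracy_profile(recoveries_pct, limit_pct, beta=0.95):
    """Accept a level if the tolerance interval of the relative bias
    stays inside the +/- limit_pct acceptance limits."""
    r = np.asarray(recoveries_pct, dtype=float)
    n = len(r)
    bias = r.mean() - 100.0                 # relative bias, %
    s = r.std(ddof=1)
    k = stats.t.ppf((1 + beta) / 2, n - 1) * np.sqrt(1 + 1 / n)
    low, high = bias - k * s, bias + k * s  # beta-expectation interval, %
    return bias, (low, high), (-limit_pct < low and high < limit_pct)

# Hypothetical recoveries (%) for methanol at a mid-range level.
print(accuracy_profile([97.2, 101.5, 99.8, 96.4, 102.1, 98.7], limit_pct=30))
```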
Procedia PDF Downloads 148
5533 Computational Aided Approach for Strut and Tie Model for Non-Flexural Elements
Authors: Mihaja Razafimbelo, Guillaume Herve-Secourgeon, Fabrice Gatuingt, Marina Bottoni, Tulio Honorio-De-Faria
Abstract:
The challenge of this research is to provide engineers with a robust, semi-automatic method for calculating optimal reinforcement for massive structural elements. In the absence of such a digital post-processing tool, design office engineers make intensive use of plate modelling, for which automatic post-processing is available. Plate models in massive areas, on the other hand, produce conservative results. In addition, the theoretical foundations of automatic post-processing tools for reinforcement are those of reinforced concrete beam sections. As long as there is no suitable alternative for the automatic post-processing of plates, optimal modelling and a significant improvement in the constructability of massive areas cannot be expected. The strut-and-tie method is commonly used in civil engineering, but its result remains highly dependent on the judgment of the calculation engineer. The tool developed will support engineers in their choice of structure. The method implemented consists of defining a ground structure built on the basis of the principal stresses resulting from an elastic analysis of the structure, and then optimizing this structure according to the fully stressed design method (a sketch of this iteration is given below). The first results yield a coherent initial network of connecting struts and ties, consistent with the cases encountered in the literature. The evolution of the tool will then make it possible to adapt the obtained latticework to the cracking states resulting from the loads applied during the life of the structure, including cyclic or dynamic loads. In addition, under the constructability constraint, a final reinforcement layout with an orthogonal arrangement and regulated spacing will be implemented in the tool.
Keywords: strut and tie, optimization, reinforcement, massive structure
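A minimal sketch of the fully stressed design loop referenced in the abstract: each member of the ground structure is resized so that its stress approaches the allowable value. The solver stub and its force values are placeholders; in a real tool, the member forces would come from re-running the elastic analysis on the updated areas.

```python
import numpy as np

def solve_member_forces(areas: np.ndarray) -> np.ndarray:
    """Stand-in elastic analysis: fixed axial forces in kN (as if the
    ground structure were statically determinate)."""
    return np.array([120.0, -85.0, 60.0])

def fully_stressed_design(areas, sigma_allow, a_min=1e-6, tol=1e-3,
                          max_iter=50):
    """Resize each member until |stress| equals the allowable stress."""
    areas = np.asarray(areas, dtype=float)
    for _ in range(max_iter):
        forces = solve_member_forces(areas)
        new_areas = np.maximum(np.abs(forces) / sigma_allow, a_min)
        if np.max(np.abs(new_areas - areas) / areas) < tol:
            return new_areas  # converged: struts and ties at capacity
        areas = new_areas
    return areas

# Allowable stress 235 MPa = 235e3 kN/m^2; starting areas in m^2.
print(fully_stressed_design([1e-3, 1e-3, 1e-3], sigma_allow=235e3))
```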
Procedia PDF Downloads 141
5532 Comparison of Different Artificial Intelligence-Based Protein Secondary Structure Prediction Methods
Authors: Jamerson Felipe Pereira Lima, Jeane Cecília Bezerra de Melo
Abstract:
The difficulty and cost of obtaining protein tertiary structure information through experimental methods, such as X-ray crystallography or NMR spectroscopy, have driven the development of computational methods for doing so. One approach used in the latter is the prediction of the tridimensional structure based on the residue chain; however, this has been proven an NP-hard problem, due to the complexity of the process, as explained by the Levinthal paradox. An alternative solution is the prediction of intermediary structures, such as the secondary structure of the protein. Artificial Intelligence methods, such as Bayesian statistics, artificial neural networks (ANN), and support vector machines (SVM), among others, have been used to predict protein secondary structure. Due to their good results, artificial neural networks have been used as a standard method to predict protein secondary structure. Recently published methods that use this technique have, in general, achieved a Q3 accuracy between 75% and 83%, whereas the theoretical accuracy limit for protein secondary structure prediction is 88%. Alternatively, to achieve better results, support vector machine prediction methods have been developed. The statistical evaluation of methods that use different AI techniques, such as ANNs and SVMs, is not a trivial problem, since different training sets and validation techniques, as well as other variables, can influence the behavior of a prediction method. In this study, we propose a prediction method based on artificial neural networks, which is then compared with a selected SVM method. The chosen SVM protein secondary structure prediction method is the one proposed by Huang in his work Extracting Physicochemical Features to Predict Protein Secondary Structure (2013). The developed ANN method uses the same training and testing process that was used by Huang to validate his method, comprising the CB513 protein data set and three-fold cross-validation, so that the comparative analysis can directly compare the statistical results of each method.
Keywords: artificial neural networks, protein secondary structure, protein structure prediction, support vector machines
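The Q3 accuracy quoted above is simply the fraction of residues whose predicted state (helix, strand or coil) matches the observed one; a minimal sketch with two illustrative strings:

```python
def q3_accuracy(observed: str, predicted: str) -> float:
    """Per-residue agreement over the H/E/C states."""
    assert len(observed) == len(predicted), "sequences must align"
    correct = sum(o == p for o, p in zip(observed, predicted))
    return correct / len(observed)

# Illustrative strings, not CB513 data (H = helix, E = strand, C = coil).
obs = "CCHHHHHCCEEEECC"
pred = "CCHHHHCCCEEEECC"
print(f"Q3 = {q3_accuracy(obs, pred):.1%}")  # 14/15, about 93.3%
```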
Procedia PDF Downloads 621
5531 Development of a Culturally Safe Wellbeing Intervention Tool for and with the Inuit in Quebec
Authors: Liliana Gomez Cardona, Echo Parent-Racine, Joy Outerbridge, Arlene Laliberté, Outi Linnaranta
Abstract:
Suicide rates among Inuit in Nunavik are six to eleven times higher than the Canadian average. Colonization, religious missions, residential schools, as well as economic and political marginalization, are factors that have challenged the wellbeing and mental health of these populations. In psychiatry, screening for mental illness is often done using questionnaires in which the patient is expected to report how often he/she has certain symptoms. However, the Indigenous view of mental wellbeing may not fit well with this approach. Moreover, biomedical treatments do not always meet the needs of Indigenous peoples because they do not take into account the culture and traditional healing methods that persist in many communities. The aims were to assess whether the symptom questionnaires commonly used in psychiatry are appropriate and culturally safe for the Inuit in Quebec, and to identify the most appropriate tool to assess and promote wellbeing, following the process necessary to improve its cultural sensitivity and safety for the Inuit population. This is a qualitative, collaborative, and participatory action research project which respects First Nations and Inuit protocols and the principles of ownership, control, access, and possession (OCAP). Data collection was based on five focus groups with stakeholders working with these populations and members of Indigenous communities. A thematic analysis of the collected data was carried out through an advisory group that led a revision of the content, use, and cultural and conceptual relevance of the instruments. The questionnaires measuring psychiatric symptoms face significant limitations in the local Indigenous context, and we present the factors that make these tools not relevant among Inuit. Although the Growth and Empowerment Measure (GEM) scale was originally developed among Indigenous Australians, the Inuit in Quebec found that this tool captures critical aspects of their mental health and wellbeing more respectfully and accurately than questionnaires focused on measuring symptoms. We document the process of cultural adaptation of this tool, which was supported by community members to create a culturally safe tool that helps in resilience and empowerment. The cultural adaptation of the GEM provides valuable information about the factors affecting wellbeing and contributes to mental health promotion. This process improves mental health services by giving health care providers useful information about the Inuit population and their clients. We believe that integrating this tool into interventions can help create a bridge to improve communication between the Indigenous cultural perspective of the patient and the biomedical view of health care providers. Further work is needed to confirm the clinical utility of this tool in psychological and psychiatric intervention along with social and community services.
Keywords: cultural adaptation, cultural safety, empowerment, Inuit, mental health, Nunavik, resiliency
Procedia PDF Downloads 118
5530 Investigation of Design Process of an Impedance Matching in the Specific Frequency for Radio Frequency Application
Authors: H. Nabaei, M. Joghataie
Abstract:
In this article, we study the design of a matched filter with commercial software, including CST Studio and ADS, at a specific frequency: 900 MHz. First, we select two impedance values and study the matching between them. Then, using the matched filter utility tool in the ADS software, we simulate and derive the elements of the matched filters. Next, we implement the matched filter in the CST Studio software; the simulated results show great conformity between the two tools. We also examine the scattering and impedance parameters of the derived structure. Finally, the layout of the matched filter is obtained with the schematic tool of CST Studio. In short, we present here the design process of matched filters at a specific frequency.
Keywords: impedance matching, lumped element, transmission line, maximum power transmission, 3D layout
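A minimal sketch of the L-network calculation that matching utilities like the one in ADS automate, assuming two illustrative resistive impedances (25 Ω and 50 Ω, not the values used in the article) at the 900 MHz design frequency:

```python
import math

f = 900e6                      # design frequency, Hz
r_low, r_high = 25.0, 50.0     # purely resistive terminations (ohms)

q = math.sqrt(r_high / r_low - 1)   # loaded Q of the L-section
x_series = q * r_low                # series reactance next to r_low
x_shunt = r_high / q                # shunt reactance across r_high

# One low-pass realization: series inductor, shunt capacitor.
L = x_series / (2 * math.pi * f)
C = 1 / (2 * math.pi * f * x_shunt)
print(f"Q = {q:.3f}, L = {L * 1e9:.2f} nH, C = {C * 1e12:.2f} pF")
```

The dual high-pass realization (series capacitor, shunt inductor) follows from the same Q, which is one reason matching utilities offer several equivalent topologies.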
Procedia PDF Downloads 502
5529 Territorial Marketing as a Tool to Overcome the "Underdevelopment Whirlpools": Prospective Directions and Experiences of Developing Countries
Authors: E. G. Popkova, I. A. Morozova, T. N. Litvinova
Abstract:
As a result of numerous studies of economic systems, the authors have identified and substantiated the existence of the “underdevelopment whirlpool”, a phenomenon of considerable differentiation in the level of economic development between developed and developing countries. This article reflects on the relationship between “underdevelopment whirlpools” and the marketing of territories as a tool to overcome them. The article presents the authors’ recommendations for dealing with “underdevelopment whirlpools”. Based on the experience of successful developing countries showing strong economic growth, the authors analyze possible future directions for overcoming the “underdevelopment whirlpools”. The authors detail the aspect of increasing production through the positioning of the territory as a way out of the “underdevelopment whirlpools”.
Keywords: underdevelopment whirlpool, developed countries, developing countries, disparities of economic growth, marketing territories
Procedia PDF Downloads 446
5528 Effect on the Performance of the Nano-Particulate Graphite Lubricant in the Turning of AISI 1040 Steel under Variable Machining Conditions
Authors: S. Srikiran, Dharmala Venkata Padmaja, P. N. L. Pavani, R. Pola Rao, K. Ramji
Abstract:
Technological advancements in the development of cutting tools and coolant/lubricant chemistry have enhanced the machining capabilities of hard materials under demanding machining conditions. The generation of high temperatures in the cutting zone during machining is one of the most important and pertinent problems, as it adversely affects the tool life and the surface finish of the machined components. Generally, cutting fluids and solid lubricants are used to overcome the problem of heat generation, but these do not address the problem effectively. With technological advancements in the field of tribology, nano-level particulate solid lubricants are nowadays being used in machining operations, especially in turning and grinding. The present investigation analyses the effect of using nano-particulate graphite powder as a lubricant in the turning of AISI 1040 steel under variable machining conditions and studies its effect on the cutting forces, tool temperature and surface roughness of the machined component. Experiments revealed an increase in cutting forces and tool temperature, resulting in a decrease in surface quality, with decreasing size of the nano-particulate graphite powder used as lubricant.
Keywords: solid lubricant, graphite, minimum quantity lubrication (MQL), nano–particles
Procedia PDF Downloads 270
5527 Performance Comparison and Visualization of COMSOL Multiphysics, Matlab, and Fortran for Predicting the Reservoir Pressure on Oil Production in a Multiple Leases Reservoir with Boundary Element Method
Authors: N. Alias, W. Z. W. Muhammad, M. N. M. Ibrahim, M. Mohamed, H. F. S. Saipol, U. N. Z. Ariffin, N. A. Zakaria, M. S. Z. Suardi
Abstract:
This paper presents a performance comparison of several computational software packages for solving problems with the boundary element method (BEM). The BEM formulation is a numerical technique with high potential for solving the advanced mathematical models that predict the production of oil wells in an arbitrarily shaped, multiple-leases reservoir. The limitation of data validation for ensuring that a program meets the accuracy of the mathematical model is the research motivation of this paper. Based on this limitation, three steps are involved in validating the accuracy of the oil production simulation process. In the first step, the mathematical model, a partial differential equation (PDE) of Poisson-elliptic type, is identified to perform the BEM discretization. In the second step, the simulation of the 2D BEM discretization is implemented using COMSOL Multiphysics and the MATLAB programming language. In the last step, the numerical performance indicators for both are analyzed against a validating Fortran implementation. The performance comparison of the numerical analysis is investigated in terms of percentage error, comparison graphs, and 2D visualization of the pressure on the oil production of the multiple-leases reservoir. According to the performance comparison, structured programming in Fortran is the preferred alternative for implementing an accurate numerical simulation of the BEM. In conclusion, the numerical computation and performance evaluation show that Fortran, as a high-level language for numerical computation, is well suited for capturing and visualizing the production of oil wells in arbitrarily shaped reservoirs.
Keywords: performance comparison, 2D visualization, COMSOL multiphysic, MATLAB, Fortran, modelling and simulation, boundary element method, reservoir pressure
Procedia PDF Downloads 491
5526 Entrepreneurship and the Growth of Small and Medium Enterprises in the Kwara state, Nigeria
Authors: Salman Abdulrasaq
Abstract:
Small and Medium Enterprises (SMEs) have been considered indices of economic development in a country’s economy; the development of entrepreneurship skills is therefore necessary. This study seeks to examine the impact of entrepreneurship on the growth of small businesses in Kwara State, Nigeria. The data used were primarily obtained from a questionnaire administered in randomly selected areas of the state. A regression statistical tool was employed, with the aid of SPSS, to test the validity of the hypothesis formulated in the study. The study therefore concludes that the qualities of the entrepreneur impact the growth of small businesses in the selected areas of the state. In view of this, the study recommends that entrepreneurship development serve as a tool for the growth of small business enterprises.
Keywords: entrepreneurship, growth, development, Nigeria
Procedia PDF Downloads 407
5525 In situ Real-Time Multivariate Analysis of Methanolysis Monitoring of Sunflower Oil Using FTIR
Authors: Pascal Mwenge, Tumisang Seodigeng
Abstract:
The combination of world population growth and the third industrial revolution has led to a high demand for fuels. On the other hand, the decrease in global fossil fuel deposits and the environmental air pollution caused by these fuels have compounded the challenges the world faces due to its need for energy. Therefore, new forms of environmentally friendly and renewable fuels, such as biodiesel, are needed. The primary analytical techniques for methanolysis yield monitoring have been chromatography and spectroscopy; these methods have proven reliable but are more demanding, costly, and do not provide real-time monitoring. In this work, the in situ monitoring of biodiesel production from sunflower oil using FTIR (Fourier Transform Infrared) spectroscopy has been studied; the study was performed using an EasyMax Mettler Toledo reactor equipped with a DiComp (diamond) probe. The quantitative monitoring of methanolysis was performed by building a quantitative model with multivariate calibration using the iC Quant module from the iC IR 7.0 software. Fifteen samples of known concentrations were used for the modelling; these were taken in duplicate for model calibration and cross-validation, and the data were pre-processed using mean centering and variance scaling, a spectrum math square root, and solvent subtraction. These pre-processing methods improved the performance indexes from 7.98 to 0.0096, 11.2 to 3.41, 6.32 to 2.72, and 0.9416 to 0.9999 for RMSEC, RMSECV, RMSEP and R²cum, respectively. The R² values of 1 (training), 0.9918 (test), and 0.9946 (cross-validation) indicated the fitness of the model built. The model was tested against a univariate model; small discrepancies were observed at low concentrations due to unmodelled intermediates, but the two were quite close at concentrations above 18%. The software eliminated the complexity of the Partial Least Squares (PLS) chemometrics. It was concluded that the model obtained could be used to monitor the methanolysis of sunflower oil at industrial and lab scale.
Keywords: biodiesel, calibration, chemometrics, methanolysis, multivariate analysis, transesterification, FTIR
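A minimal sketch of the PLS calibration workflow that the iC Quant module wraps, using scikit-learn with random arrays standing in for the FTIR spectra and known concentrations; RMSEC comes from the fit and RMSECV from cross-validation. The component count and fold count are illustrative assumptions.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 200))    # 30 spectra x 200 wavenumber points
y = rng.uniform(0, 100, size=30)  # known concentrations (%)

pls = PLSRegression(n_components=5, scale=False)  # mean-centering only
pls.fit(X, y)

rmsec = np.sqrt(np.mean((y - pls.predict(X).ravel()) ** 2))
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSEC = {rmsec:.3f}, RMSECV = {rmsecv:.3f}")
```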
Procedia PDF Downloads 148
5524 Management Workspaces to Create Value
Authors: Nevruz Zogu, Shpetim Rezniqi
Abstract:
It is very important that a new work environment be constructed in a way that makes it creative and suitable for workers; without this, there can be no success in the workplace. But is it possible to design interiors that inspire people to create and collaborate? By watching and analyzing examples of creativity in business, construction managers can learn ways to encourage imagination inside buildings. Companies struggle to find and retain talented employees, and a skilled work environment is becoming an ever more important tool for recruiting and retaining them. Managers who recognize its importance are gaining an edge over their competitors. The physical work environment is important, and its quality is often used as a recruiting tool. The relationship between the company and the employees, between strategy and behavior, and between the product and the customer can be reincorporated in the light of the symbolic mediation of space, as instrument and interpreter of the core values and identity of the organization.
Keywords: strategy, business, quality, productivity, space, offices, assets
Procedia PDF Downloads 389
5523 Key Findings on Rapid Syntax Screening Test for Children
Authors: Shyamani Hettiarachchi, Thilini Lokubalasuriya, Shakeela Saleem, Dinusha Nonis, Isuru Dharmaratne, Lakshika Udugama
Abstract:
Introduction: Late identification of language difficulties in children could result in long-term negative consequences for communication, literacy and self-esteem. This highlights the need for early identification of, and intervention for, speech, language and communication difficulties. Speech and language therapy is a relatively new profession in Sri Lanka, and at present, there are no formal standardized screening tools to assess language skills in Sinhala-speaking children. The development and validation of a short, accurate screening tool to enable the identification of children with syntactic difficulties in Sinhala is a current need. Aims: 1) To develop test items for a Sinhala Syntactic Structures (S3 Short Form) test for children aged between 3;0 to 5;0 years; 2) To validate the test of Sinhala Syntactic Structures (S3 Short Form) on children aged between 3;0 to 5;0 years. Methods: The Sinhala Syntactic Structures (S3 Short Form) was devised based on the Renfrew Action Picture Test. As Sinhala contains post-positions, in contrast to English, the principles of the Renfrew Action Picture Test were followed to gain an information score and a grammar score, but the test devised reflected the linguistic specificity and complexity of Sinhala, and the pictures were in keeping with the culture of the country. This included the dative case marker ‘to give something to her’ (/ejɑ:ʈə/ meaning ‘to her’), the instrumental case marker ‘to get something from’ (/ejɑ:gən/ meaning ‘from him’ or /gɑhən/ meaning ‘from the tree’), the possessive noun (/ɑmmɑge:/ meaning ‘mother’s’ or /gɑhe:/ meaning ‘of the tree’ or /male:/ meaning ‘of the flower’) and plural markers (/bɑllɑ:/bɑllo:/ meaning ‘dog/dogs’, /mɑlə/mɑl/ meaning ‘flower/flowers’, /gɑsə/gɑs/ meaning ‘tree/trees’ and /wɑlɑ:kulə/wɑlɑ:kulu/ meaning ‘cloud/clouds’). The picture targets included socio-culturally appropriate scenes of the Sri Lankan New Year celebration, an elephant procession and the Buddhist ‘Wesak’ ceremony. The test was piloted with a group of 60 participants and necessary changes made. In phase 1, the test was administered to 100 Sinhala-speaking children aged between 3;0 and 5;0 years in one district. In phase 2, reported in this presentation, the test was administered to another 100 Sinhala-speaking children aged between 3;0 to 5;0 in three districts, and the selection of the test items was assessed via measures of content validity, test-retest reliability and inter-rater reliability. The age of acquisition of each syntactic structure was determined using content and grammar scores, which were statistically analysed using t-tests and one-way ANOVAs. Results: High percentage agreement was found for test-retest reliability, content validity (Pearson correlation measures) and inter-rater reliability. As predicted, there was a statistically significant influence of age on the production of syntactic structures at p<0.05. Conclusions: As the target test items generated the information and the syntactic structures expected, the test could be used as a quick syntactic screening tool with preschool children.
Keywords: Sinhala, screening, syntax, language
Procedia PDF Downloads 340
5522 Effect of Color on Anagram Solving Ability
Authors: Khushi Chhajed
Abstract:
Context: Color has been found to have an impact on cognitive performance. Due to the negative connotation associated with red, it has been found to impair performance on intellectual tasks. Aim: This study aims to assess the effect of color on individuals’ anagram-solving ability. Methodology: An experimental study was conducted on 66 participants in the age group of 18–24 years. A self-made anagram assessment tool was administered, which participants were expected to solve in three colors: red, blue and grey. Results: A lower score was found when the anagrams were presented in blue as compared to red. The study also found that participants took relatively longer to solve the red-colored sheet. However, these results are inconsistent with the pre-existing literature. Conclusion: Hence, an association between color and performance on cognitive tasks can be seen. Future directions and potential limitations are discussed.
Keywords: color psychology, experiment, anagram, performance
Procedia PDF Downloads 88
5521 Social Networks in a Communication Strategy of a Large Company
Authors: Kherbache Mehdi
Abstract:
Within the framework of the validation of the Master in Business Administration (Marketing and Sales) at the INSIM international institute in management in Blida, we had the opportunity to do a professional internship at the Sonelgaz Enterprise and to write a thesis. The thesis deals with the integration of social networking into the communication strategy of a company, and its central question is: how can communicating through social networks be a solution for companies? The challenge addressed by this thesis was to suggest limits and recommendations to the Sonelgaz Enterprise concerning social networks. Social networks as a whole represent more than a billion people as a potential target for companies. Through research and a qualitative approach, we identified three valid hypotheses. The first hypothesis confirms that the use of social networks cannot be ignored by any company in its communication strategy. The second hypothesis demonstrates that it is necessary to prepare a strategy that integrates social networks into the communication plan of the company. The risk of this strategy is very limited, because failure on social networks is not a major setback for the enterprise, social networking is not expensive, and any bad image that could result from it is not as important in the long term. Furthermore, the return on investment is difficult to evaluate. Finally, the last hypothesis shows that firms establish a new relationship between consumers and brands thanks to the proximity allowed by social networks. After validating the hypotheses, we made some recommendations to the Sonelgaz Enterprise regarding communication through social networks. Firstly, the company must use the interactivity of social networks in order to have fruitful exchanges with the community. We also recommended having a strategy for handling negative comments. The company should also deliver resources to the community through a community manager, in order to maintain a good relationship with the community. Furthermore, we advised using social networks for business intelligence. The Sonelgaz Enterprise can publish creative and interactive content through applications on Facebook, for example. Finally, we recommended that the company not be intrusive with “fans” or “followers” and be open to all the platforms: Twitter, Facebook, and LinkedIn, for example.
Keywords: social network, buzz, communication, consumer, return on investment, internet users, web 2.0, Facebook, Twitter, interaction
Procedia PDF Downloads 422
5520 In Silico Exploration of Quinazoline Derivatives as EGFR Inhibitors for Lung Cancer: A Multi-Modal Approach Integrating QSAR-3D, ADMET, Molecular Docking, and Molecular Dynamics Analyses
Authors: Mohamed Moussaoui
Abstract:
A series of thirty-one potential inhibitors targeting the epidermal growth factor receptor (EGFR) kinase, derived from quinazoline, underwent 3D-QSAR analysis using the CoMFA and CoMSIA methodologies. Training and test sets of quinazoline derivatives were used to construct and validate the QSAR models, respectively, with dataset alignment performed using the lowest-energy conformer of the most active compound. The best-performing CoMFA and CoMSIA models demonstrated impressive determination coefficients, with R² values of 0.981 and 0.978, respectively, and leave-one-out cross-validation determination coefficients, Q², of 0.645 and 0.729, respectively. Furthermore, external validation using a test set of five compounds yielded predicted determination coefficients, R² test, of 0.929 and 0.909 for CoMFA and CoMSIA, respectively. Building upon these promising results, eighteen new compounds were designed and assessed for drug-likeness and ADMET properties through in silico methods. Additionally, molecular docking studies were conducted to elucidate the binding interactions between the selected compounds and the enzyme. Detailed molecular dynamics simulations were performed to analyze the stability, conformational changes, and binding interactions of the quinazoline derivatives with the EGFR kinase. These simulations provided deeper insights into the dynamic behavior of the compounds within the active site. This comprehensive analysis enhances the understanding of quinazoline derivatives as potential anti-cancer agents and provides valuable insights for lead optimization in the early stages of drug discovery, particularly for developing highly potent anticancer therapeutics.
Keywords: 3D-QSAR, CoMFA, CoMSIA, ADMET, molecular docking, quinazoline, molecular dynamics, EGFR inhibitors, lung cancer, anticancer
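A minimal sketch of the leave-one-out Q² statistic reported above: each compound is predicted by a model trained on the remaining ones, and Q² is computed like R² over the held-out predictions. The random descriptors and the ridge regressor are illustrative stand-ins for the CoMFA/CoMSIA field descriptors and their PLS step.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(1)
X = rng.normal(size=(31, 50))   # 31 compounds x 50 descriptors (placeholder)
y = X[:, 0] * 0.8 + rng.normal(scale=0.3, size=31)  # pIC50-like activity

model = Ridge(alpha=1.0)        # stand-in for the PLS regression step
y_loo = cross_val_predict(model, X, y, cv=LeaveOneOut())
q2 = 1 - np.sum((y - y_loo) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"LOO Q^2 = {q2:.3f}")
```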
Procedia PDF Downloads 48