Search results for: clinical assessment tools
9865 Impact of Overall Teaching Program of Anatomy in Learning: A Student's Perspective
Authors: Mamatha Hosapatna, Anne D. Souza, Antony Sylvan Dsouza, Vrinda Hari Ankolekar
Abstract:
Our study intends to determine the effect of the overall teaching program of Anatomy on a student's learning. The advancement of various teaching methodologies in the present era has led to progressive changes in education. A student should be able to correlate well between the theory and practical knowledge attained even in the early years of their education in medicine and should be able to implement the same in patient care. The present study therefore aims to assess the impact the current anatomy teaching program has on a student's learning and to what extent it is successful in making the learning program effective. The specific objective of our study is to assess the impact of the overall teaching program of Anatomy on students' learning. Description of the proposed process: a questionnaire will be constructed, and the students will be asked to put forth their views regarding the Anatomy teaching program and its method of assessment. Suggestions, if any, will also be encouraged. The type of study is a cross-sectional observational study. The target population is first-year MBBS students, and the sample size is 250. The assessment plan is to obtain students' responses using the questionnaire, calculate the percentages of the responses obtained, and tabulate the results.
Keywords: anatomy, questionnaire, observational study, MBBS students
Procedia PDF Downloads 499
9864 Lifetime Assessment for Test Strips of POCT Device through Accelerated Degradation Test
Authors: Jinyoung Choi, Sunmook Lee
Abstract:
In general, a single parameter, i.e., temperature, is used as the accelerating stress to assess the accelerated stability of Point-of-Care Testing (POCT) diagnostic devices. However, humidity also plays an important role in deteriorating strip performance, since the major components of test strips are proteins such as enzymes. Four different temperature/humidity conditions were used to assess the lifetime of the strips. Degradation of the test strips was studied through the accelerated stability test, and the lifetime was assessed using commercial POCT products. The life distribution of the strips, obtained by monitoring the failure time of the test strips under each stress condition, revealed that the Weibull distribution was the most appropriate distribution for describing the life distribution of the strips used in the present study. The shape parameters were calculated to be 0.9395 and 0.9132 for low and high concentrations, respectively. The lifetime prediction was made by adopting the Peck equation model for the stress-life relationship, and the B10 life was calculated to be 70.09 and 46.65 hrs for low and high concentrations, respectively.
Keywords: accelerated degradation, diagnostic device, lifetime assessment, POCT
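As a rough illustration of the calculation described above, the sketch below computes a B10 life from Weibull parameters and extrapolates it to use conditions with a Peck-type acceleration factor. The shape parameter 0.9395 is taken from the abstract; the scale parameter, activation energy, humidity exponent and the stress/use conditions are assumptions for demonstration only.

```python
import math

def weibull_b10(shape, scale):
    """Time by which 10% of units are expected to fail: t = eta * (-ln(0.9))^(1/beta)."""
    return scale * (-math.log(0.90)) ** (1.0 / shape)

def peck_acceleration_factor(t_stress_c, rh_stress, t_use_c, rh_use,
                             ea_ev=0.7, n=2.7):
    """Peck model: life ~ RH^(-n) * exp(Ea/(kT)), so
    AF = (RH_use/RH_stress)^(-n) * exp(Ea/k * (1/T_use - 1/T_stress))."""
    k = 8.617e-5  # Boltzmann constant, eV/K
    t_stress, t_use = t_stress_c + 273.15, t_use_c + 273.15
    return (rh_use / rh_stress) ** (-n) * math.exp(ea_ev / k * (1 / t_use - 1 / t_stress))

shape = 0.9395          # from the abstract (low-concentration strips)
scale = 30.0            # assumed Weibull scale (hrs) under the stress condition
b10_stress = weibull_b10(shape, scale)
af = peck_acceleration_factor(t_stress_c=60, rh_stress=90, t_use_c=25, rh_use=50)
print(f"B10 under stress: {b10_stress:.2f} hrs; extrapolated to use: {b10_stress * af:.1f} hrs")
```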
Procedia PDF Downloads 415
9863 Effective Validation Model and Use of Mobile-Health Apps for Elderly People
Authors: Leonardo Ramirez Lopez, Edward Guillen Pinto, Carlos Ramos Linares
Abstract:
The controversy brought about by the increasing use of mHealth apps and their effectiveness for disease prevention and diagnosis calls for immediate control. Although a critical topic in research areas such as medicine, engineering and economics, among others, this issue lacks reliable implementation models. However, projects such as the Open Web Application Security Project (OWASP) and various studies have helped to create useful and reliable apps. This research is conducted under a quality model to optimize two mHealth apps for older adults. The analysis of results on the use of two physical activity monitoring apps - AcTiv (physical activity) and SMCa (energy expenditure) - is positive and ideal. Through a theoretical and practical analysis, precision calculations and personal information control of older adults for disease prevention and diagnosis were performed. Finally, the apps are validated by a physician and, as a result, they may be used as health monitoring tools in physical performance centers or any other physical activity. The results obtained provide an effective validation model for this type of mobile app, which, in turn, may be applied by other software developers who, along with medical staff, would offer digital healthcare tools for elderly people.
Keywords: model, validation, effective, healthcare, elderly people, mobile app
Procedia PDF Downloads 218
9862 Experiments on Weakly-Supervised Learning on Imperfect Data
Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler
Abstract:
Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation
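The following is an illustrative simulation in the spirit of the experiments described: training labels are corrupted at a 40% error rate, a linear-kernel SVM is trained on the noisy labels, and performance is evaluated against clean test labels. The data generator, noise model and parameters are assumptions for demonstration, not the authors' actual setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score, accuracy_score

rng = np.random.default_rng(0)

# Synthetic, roughly linear classification problem (a stand-in for clinical features).
X, y = make_classification(n_samples=5000, n_features=20, n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Corrupt 40% of the training labels to mimic an imperfect gold standard.
flip = rng.random(len(y_train)) < 0.40
y_noisy = np.where(flip, 1 - y_train, y_train)

model = SVC(kernel="linear").fit(X_train, y_noisy)

scores = model.decision_function(X_test)
print("agreement of noisy labels with true training labels:", accuracy_score(y_train, y_noisy))
print("test AUC measured against clean labels:             ", roc_auc_score(y_test, scores))
```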
Procedia PDF Downloads 199
9861 The Environmental Impacts of Textiles Reuse and Recycling: A Review on Life-Cycle-Assessment Publications
Authors: Samuele Abagnato, Lucia Rigamonti
Abstract:
Life-Cycle-Assessment (LCA) is an effective tool to quantify the environmental impacts of reuse models and recycling technologies for textiles. In this work, publications from the last ten years about LCA of textile waste are classified according to location, goal and scope, functional unit, waste composition, impact assessment method, impact categories, and sensitivity analysis. Twenty papers have been selected: 50% focus only on recycling, 30% only on reuse, 15% on both, while only one paper considers only the final disposal of the waste. It is found that reuse is generally the best way to decrease the environmental impacts of textile waste management because of the avoided impacts of manufacturing a new item. In the comparison between a product made with recycled yarns and a product from virgin materials, the first option is in general less impactful, especially for the categories of climate change, water depletion, and land occupation, while for other categories, such as eutrophication or ecotoxicity, under certain conditions the impacts of the recycled fibres can be higher. Cultivation seems to have quite high impacts when natural fibres are involved, especially in the land use and water depletion categories, while manufacturing requires a remarkable amount of electricity, with its associated impact on climate change. In the analysis of the reuse processes, the laundry phase is of considerable importance, with water consumption and impacts related to the use of detergents. Regarding the sensitivity analysis, one of the main variables that influences the LCA results, and that needs to be further investigated in the modeling of the LCA system on this topic, is the substitution rate between recycled and virgin fibres, that is, the amount of recycled material that can be used in place of virgin material. Relatedly, the yield of the recycling processes also has a strong influence on the impact results. The substitution rate is also important in the modeling of the reuse processes because it represents the number of avoided new items bought in place of the reused ones. Another aspect that appears to have a large influence on the impacts is consumer behaviour during the use phase (for example, the number of uses between two laundry cycles). In conclusion, to gain a deeper knowledge of the life-cycle impacts of textile waste, further data and research are needed on the modeling of the substitution rate and of the use-phase habits of consumers.
Keywords: environmental impacts, life-cycle-assessment, textiles recycling, textiles reuse, textiles waste management
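To make the role of the substitution rate concrete, the short sketch below computes the net climate-change impact of a recycling route as the recycling burdens minus the credited avoided virgin-fibre production, scaled by the substitution rate and the process yield. All impact factors are placeholder values, not figures from the reviewed studies.

```python
def net_recycling_impact(recycling_impact, virgin_impact, substitution_rate, yield_rate):
    """Net impact per kg of collected textile waste (kg CO2e):
    burdens of the recycling route minus the avoided virgin-fibre production,
    credited only for the recycled output that actually substitutes virgin fibre."""
    recycled_output = yield_rate * 1.0                      # kg recycled fibre per kg waste
    avoided = substitution_rate * recycled_output * virgin_impact
    return recycling_impact - avoided

# Placeholder factors (kg CO2e per kg), purely for illustration.
for sub_rate in (0.5, 0.8, 1.0):
    net = net_recycling_impact(recycling_impact=1.2, virgin_impact=5.0,
                               substitution_rate=sub_rate, yield_rate=0.75)
    print(f"substitution rate {sub_rate:.1f} -> net impact {net:+.2f} kg CO2e/kg waste")
```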
Procedia PDF Downloads 89
9860 Somatic Delusional Disorder Subsequent to Phantogeusia: A Case Report
Authors: Pedro Felgueiras, Ana Miguel, Nélson Almeida, Raquel Silva
Abstract:
Objective: Through the study of a clinical case of somatic delusional disorder secondary to phantogeusia, we aim to highlight the importance of considering psychosomatic conditions in differential diagnosis, as well as to emphasize the complexity of its comprehension, treatment, and respective impact on patients’ functioning. Methods: Bearing this in mind, we conducted a critical analysis of a case series based on patient observations, clinical data, and complementary diagnostic methods, as well as a non-systematic review of the literature on the subject. Results: A 61-year-old female patient with no history of psychiatric conditions. Family psychiatric history of mood disorder (depression) with psychotic features in her mother. Medical history of many comorbidities affecting different organ systems (endocrine, gastrointestinal, genitourinary, ophthalmological). Documented neuroticism traits of personality. The patient’s family described a persistent concern about several physical symptoms across her life, with a continuous effort to obtain explanations about any sensation outside her normal perception. Since being subjected to endoscopy in 2018, she started complaints of persistent phantogeusia (acid taste) and developed excessive thoughts, feelings, and behaviors associated with this somatic symptom. The patient was evaluated by several medical specialties, and an extensive panel of medical exams was carried out, excluding any disease. Despite all the investigation and with no evidence of disease signs, acute anxiety and the time and energy expended on this symptom culminated in severe psychosocial impairment. The patient was admitted to a psychiatric ward for investigation and treatment of this clinical picture, leading to the diagnosis of somatic delusional disorder. In order to exclude an acute organic etiology of this psychotic disorder, an analytic panel was carried out with no abnormal results. In the context of a psychotic clinical picture, a CT scan was performed, which revealed a right cortical vascular lesion. A neuropsychological evaluation was made, with the description of cognitive functioning being globally normative. During treatment with an antipsychotic (pimozide), complete remission of the somatic delusion was associated with the disappearance of the gustatory perception disturbance. In follow-up, a relapse of the gustatory sensation was documented, and her thoughts and speech were dominated by concerns about multiple somatic symptoms. Conclusion: In terms of abnormal bodily sensations, the oral cavity is one of the frequent sites of delusional disorder. Patients with these gustatory perception distortions complain about unusual sensations without corresponding abnormal findings in the oral area. Its pathophysiology has not been fully elucidated yet. In terms of its comprehensive psychopathology, this case was hypothesized as a paranoid development of a somatic delusional disorder triggered by post-invasive-procedure phantogeusia (which is described as a possible side effect of an endoscopy) in a patient with an anankastic personality. This case presents interesting psychopathology, reinforcing the complexity of psychosomatic disorders in terms of their etiopathogenesis, clinical treatment, and long-term prognosis.
Keywords: psychosomatics, delusional somatic disorder, phantogeusia, paranoid development
Procedia PDF Downloads 129
9859 School-Based Oral Assessment in Malaysian Schools
Authors: Sedigheh Abbasnasab Sardareh
Abstract:
The current study investigates ESL teachers' voices in order to formulate further research on the effectiveness of SBOA practices. It is an attempt to find out (1) what experienced ESL teachers’ perceptions, experiences, attitudes, and beliefs about SBOA are; (2) what teaching and learning aspects of SBOA need focus to enhance its effectiveness; (3) external issues related to the implementation of SBOA; (4) internal issues related to the implementation of SBOA; and also (5) perceived recommendations on SBOA. The study utilized focus group discussion sessions. Nine experienced ESL teachers (5 females and 4 males) were selected based on the consent letters sent to them. These teachers had over 20 years of experience in both traditional and SBOA-type assessment and were the train-the-trainer experts recommended by the Ministry of Education. Respondents were guided with open-ended questions to extract their perceived experiences of implementing SBOA, structurally guided by the author as the moderator. Data were first discussed with the respondents for further clarification and only then analyzed and re-confirmed, with some recommendations, before the preliminary results presented here were finalized. The focus group discussions yielded some important perceived views on the SBOA implementation. Some of the themes were discussed, and some recommendations were proposed for further in-depth study by the Ministry of Education. Some future directions based on the results were also put forward. Some external and internal variables were found to be important for successful implementation of SBOA. Merely implementing a policy without taking these into consideration might impede some of the teaching and learning processes for classroom stakeholders such as teachers and students. Additional research methods, such as the use of questionnaires, could be utilized to further investigate larger populations of teacher educators in Malaysia.
Keywords: school based oral assessment, Malaysia, ESL, focus group discussion
Procedia PDF Downloads 325
9858 Continuous Improvement of Teaching Quality through Course Evaluation by the Students
Authors: Valerie Follonier, Henrike Hamelmann, Jean-Michel Jullien
Abstract:
The Distance Learning University in Switzerland (UniDistance) offers bachelor and master courses as well as further education programs. The professors and their assistants work at traditional Swiss universities and give their courses at UniDistance following a blended learning and flipped classroom approach. A standardized course evaluation by the students has been established as a component of a quality improvement process. The students’ feedback enables the stakeholders to identify areas of improvement, initiate professional development for the teaching teams and thus continuously augment the quality of instruction. This paper describes the evaluation process, the tools involved and how the approach involving all stakeholders helps form a culture of quality in teaching. Additionally, it presents the first evaluation results following the new process. Two software tools have been developed to support all stakeholders in the process of the semi-annual formative evaluation. The first tool allows the creation of the survey and its assignment to the relevant courses and students. The second tool presents the results of the evaluation to the stakeholders, providing specific features for the teaching teams, the dean, the directorate and EDUDL+ (Educational development unit distance learning). The survey items were selected in accordance with the e-learning strategy of the institution and are formulated to support the professional development of the teaching teams. By reviewing the results, the teaching teams become aware of the opinion of the students and are asked to write feedback for the attention of their dean. The dean reviews the results of the faculty and writes a general report about the situation of the faculty and the improvements intended. Finally, EDUDL+ writes a final report summarising the evaluation results. A mechanism of adjustable warnings allows quality indicators to be generated for each module. These are summarised for each faculty and globally for the whole institution in order to increase the vigilance of those responsible. The quality process involves changing the indicators regularly to focus on different areas each semester, to facilitate the professional development of the teaching teams and to progressively augment the overall teaching quality of the institution.
Keywords: continuous improvement process, course evaluation, distance learning, software tools, teaching quality
Procedia PDF Downloads 259
9857 Sensing of Cancer DNA Using Resonance Frequency
Authors: Sungsoo Na, Chanho Park
Abstract:
Lung cancer is one of the most common severe diseases leading to human death. Lung cancer can be divided into two types, small-cell lung cancer (SCLC) and non-SCLC (NSCLC), and about 80% of lung cancers belong to NSCLC. In several studies, the correlation between the epidermal growth factor receptor (EGFR) and NSCLC has been investigated. Therefore, EGFR inhibitor drugs such as gefitinib and erlotinib have been used as lung cancer treatments. However, the treatment results showed a low response rate (10-20%) in clinical trials due to EGFR mutations that cause drug resistance. Patients with resistance to EGFR inhibitor drugs are usually positive for KRAS mutation. Therefore, assessment of EGFR and KRAS mutations is essential for targeted therapies of NSCLC patients. In order to overcome the limitation of conventional therapies, overall EGFR and KRAS mutations have to be monitored. In this work, only the detection of EGFR will be presented. A variety of techniques has been presented for the detection of EGFR mutations. The standard detection method for EGFR mutation in ctDNA relies on real-time polymerase chain reaction (PCR). The real-time PCR method provides highly sensitive detection performance. However, as the amplification steps increase, cost and complexity increase as well. Other types of technology, such as BEAMing, next generation sequencing (NGS), electrochemical sensors and silicon nanowire field-effect transistors, have been presented. However, those technologies have limitations of low sensitivity, high cost and complexity of data analysis. In this report, we propose a label-free and highly sensitive detection method for lung cancer using a quartz crystal microbalance based platform. The proposed platform is able to sense lung cancer mutant DNA with a limit of detection of 1 nM.
Keywords: cancer DNA, resonance frequency, quartz crystal microbalance, lung cancer
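The abstract does not spell out how resonance frequency shifts are translated into bound DNA mass, but quartz crystal microbalance platforms are commonly interpreted through the Sauerbrey relation; the sketch below is an illustrative calculation under that assumption, with a generic 5 MHz AT-cut quartz crystal and a made-up frequency shift.

```python
import math

def sauerbrey_mass_change(delta_f_hz, f0_hz=5e6, area_cm2=1.0,
                          rho_q=2.648, mu_q=2.947e11):
    """Sauerbrey relation: delta_f = -2 f0^2 * delta_m / (A * sqrt(rho_q * mu_q)),
    solved for the adsorbed mass delta_m (grams); rho_q in g/cm^3, mu_q in g/(cm*s^2)."""
    return -delta_f_hz * area_cm2 * math.sqrt(rho_q * mu_q) / (2.0 * f0_hz ** 2)

# Hypothetical 25 Hz frequency decrease after DNA hybridization on the sensor surface.
dm = sauerbrey_mass_change(delta_f_hz=-25.0)
print(f"estimated adsorbed mass: {dm * 1e9:.1f} ng")
```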
Procedia PDF Downloads 233
9856 Role of Direct Immunofluorescence in Diagnosing Vesiculobullous Lesions
Authors: Mitakshara Sharma, Sonal Sharma
Abstract:
Vesiculobullous diseases are a heterogeneous group of dermatological disorders with protean manifestations. The most important techniques for patients with vesiculobullous diseases are conventional histopathology and confirmatory tests such as direct immunofluorescence (DIF) and indirect immunofluorescence (IIF). DIF has been used for decades to investigate pathophysiology and in diagnosis. It detects molecules such as immunoglobulins and complement components. It is done on the perilesional skin. The diagnosis from the DIF test depends on features such as the primary site of the immune deposits, the class of immunoglobulin, the number of immune deposits and deposition at other sites. The aim of the study is to correlate DIF with clinical and histopathological findings and to analyze the utility of DIF in the diagnosis of these disorders. It is a retrospective descriptive study conducted over 2 years, from 2015 to 2017, in the Department of Pathology, GTB Hospital, on perilesional punch biopsies of vesiculobullous lesions. Biopsies were sent in Michel's medium. The specimens were washed, frozen and incubated with fluorescein isothiocyanate (FITC) tagged antihuman antibodies IgA, IgG, IgM, C3 and F and were viewed under a fluorescent microscope. Out of 401 skin biopsies submitted for DIF, 285 were vesiculobullous diseases, in which the most common was Pemphigus vulgaris (34%), followed by Bullous pemphigoid (21.5%), Dermatitis herpetiformis (16%), Pemphigus foliaceus (11.9%), Linear IgA disease (11.9%), Epidermolysis bullosa (2.39%) and Pemphigus herpetiformis (1.7%). We will be presenting the DIF findings in all these vesiculobullous diseases. DIF in conjunction with histopathology gives the best diagnostic yield in these lesions. It also helps in the diagnosis whenever there is a clinical and histopathological overlap.
Keywords: antibodies, direct immunofluorescence, pemphigus, vesiculobullous
Procedia PDF Downloads 363
9855 Mapping Iron Content in the Brain with Magnetic Resonance Imaging and Machine Learning
Authors: Gabrielle Robertson, Matthew Downs, Joseph Dagher
Abstract:
Iron deposition in the brain has been linked with a host of neurological disorders such as Alzheimer’s, Parkinson’s, and Multiple Sclerosis. While some treatment options exist, there are no objective measurement tools that allow for the monitoring of iron levels in the brain in vivo. An emerging Magnetic Resonance Imaging (MRI) method has been recently proposed to deduce iron concentration through quantitative measurement of magnetic susceptibility. This is a multi-step process that involves repeated modeling of physical processes via approximate numerical solutions. For example, the last two steps of this Quantitative Susceptibility Mapping (QSM) method involve I) mapping magnetic field into magnetic susceptibility and II) mapping magnetic susceptibility into iron concentration. Process I involves solving an ill-posed inverse problem by using regularization via injection of prior belief. The end result from Process II highly depends on the model used to describe the molecular content of each voxel (type of iron, water fraction, etc.). Due to these factors, the accuracy and repeatability of QSM have been an active area of research in the MRI and medical imaging community. This work aims to estimate iron concentration in the brain via a single step. A synthetic numerical model of the human head was created by automatically and manually segmenting the human head on a high-resolution grid (640x640x640, 0.4 mm³), yielding detailed structures such as microvasculature and subcortical regions as well as bone, soft tissue, cerebrospinal fluid, sinuses, arteries, and eyes. Each segmented region was then assigned tissue properties such as relaxation rates, proton density, electromagnetic tissue properties and iron concentration. These tissue property values were randomly selected from a Probability Distribution Function derived from a thorough literature review. In addition to having unique tissue property values, different synthetic head realizations also possess unique structural geometry created by morphing the boundary regions of different areas within normal physical constraints. This model of the human brain is then used to create synthetic MRI measurements. This is repeated thousands of times, for different head shapes, volumes, tissue properties and noise realizations. Collectively, this constitutes a training set that is similar to in vivo data, but larger than datasets available from clinical measurements. A 3D convolutional U-Net neural network architecture was used to train data-driven Deep Learning models to solve for iron concentrations from raw MRI measurements. The performance was then tested on both synthetic data not used in training as well as real in vivo data. Results showed that the model trained on synthetic MRI measurements is able to directly learn iron concentrations in areas of interest more effectively than other existing QSM reconstruction methods. For comparison, models trained on random geometric shapes (as proposed in the Deep QSM method) are less effective than models trained on realistic synthetic head models. Such an accurate method for the quantitative measurement of iron deposits in the brain would be of important value in clinical studies aiming to understand the role of iron in neurological disease.
Keywords: magnetic resonance imaging, MRI, iron deposition, machine learning, quantitative susceptibility mapping
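For readers unfamiliar with the architecture mentioned, the sketch below is a minimal 3D convolutional U-Net in PyTorch that maps a single-channel MRI-like volume to a voxel-wise iron-concentration estimate. The depth, channel counts and single-level skip connection are illustrative assumptions; the authors' actual network is not specified beyond "3D convolutional U-Net".

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    """Two 3x3x3 convolutions with ReLU, the basic U-Net building block."""
    return nn.Sequential(
        nn.Conv3d(in_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        nn.Conv3d(out_ch, out_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet3D(nn.Module):
    """One-level 3D U-Net: encoder, bottleneck, decoder with a skip connection."""
    def __init__(self):
        super().__init__()
        self.enc = conv_block(1, 16)
        self.pool = nn.MaxPool3d(2)
        self.bottleneck = conv_block(16, 32)
        self.up = nn.ConvTranspose3d(32, 16, kernel_size=2, stride=2)
        self.dec = conv_block(32, 16)
        self.head = nn.Conv3d(16, 1, kernel_size=1)  # voxel-wise iron estimate

    def forward(self, x):
        e = self.enc(x)
        b = self.bottleneck(self.pool(e))
        d = self.dec(torch.cat([self.up(b), e], dim=1))
        return self.head(d)

# Toy forward pass on a small synthetic volume (batch, channel, depth, height, width).
model = TinyUNet3D()
volume = torch.randn(1, 1, 32, 32, 32)
print(model(volume).shape)  # torch.Size([1, 1, 32, 32, 32])
```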
Procedia PDF Downloads 136
9854 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures
Authors: Réka Sárközi, Péter Iványi, Attila B. Széll
Abstract:
The aim of this paper is to present the adaptation of the dome construction tool for formex algebra to the parametric design software Grasshopper. Formex algebra is a mathematical system, primarily used for planning structural systems such as truss-grid domes and vaults, together with the programming language Formian. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and gain special forms. The paper covers the basic dome types, and also additional dome-based structures using special coordinate-system solutions based on spherical coordinate systems. It also contains additional structural possibilities, such as making double-layer grids in all geometry forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together give the possibility of quick and easy design and optimization of special truss-grid domes.
Keywords: parametric design, structural morphology, space structures, spherical coordinate system
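As a plain-code illustration of the spherical-coordinate approach described above (outside Grasshopper), the sketch below generates the node grid of a simple truss-grid dome by sweeping azimuth and elevation and converting to Cartesian coordinates. The ring count, segment count and radius are arbitrary values chosen for demonstration.

```python
import math

def dome_nodes(radius=10.0, n_rings=4, n_segments=12, max_elevation_deg=80.0):
    """Nodes of a simple truss-grid dome defined in spherical coordinates
    (r fixed, phi = azimuth, theta = elevation above the horizontal plane)."""
    nodes = [(0.0, 0.0, radius)]                      # apex
    for i in range(1, n_rings + 1):
        theta = math.radians(max_elevation_deg) * (1 - i / n_rings)
        for j in range(n_segments):
            phi = 2 * math.pi * j / n_segments
            x = radius * math.cos(theta) * math.cos(phi)
            y = radius * math.cos(theta) * math.sin(phi)
            z = radius * math.sin(theta)
            nodes.append((x, y, z))
    return nodes

nodes = dome_nodes()
print(len(nodes), "nodes; first ring sample:", tuple(round(c, 2) for c in nodes[1]))
```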
Procedia PDF Downloads 254
9853 Digital Library Evaluation by SWARA-WASPAS Method
Authors: Mehmet Yörükoğlu, Serhat Aydın
Abstract:
Since the discovery of the manuscript, mechanical methods for storing, transferring and using information have evolved into digital methods over time. In this process, libraries, which are the center of information, have also become digitized and accessible from anywhere and at any time in the world by taking on a structure that has no physical boundaries. In this context, some criteria for information obtained from digital libraries have become more important for users. This paper evaluates, from different perspectives, the user criteria that make a digital library more useful. The Step-Wise Weight Assessment Ratio Analysis-Weighted Aggregated Sum Product Assessment (SWARA-WASPAS) method, with its flexibility and easy calculation steps, is used for the evaluation of digital library criteria. Three different digital libraries are evaluated by information technology experts according to five conflicting main criteria: ‘interface design’, ‘effects on users’, ‘services’, ‘user engagement’ and ‘context’. Finally, the alternatives are ranked in descending order.
Keywords: digital library, multi criteria decision making, SWARA-WASPAS method
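A compact illustration of the WASPAS ranking step follows: three alternatives are scored on the five criteria, normalized, and combined through the weighted sum and weighted product models with the commonly used joint parameter lambda = 0.5. The scores and criterion weights are invented placeholders, not the experts' judgments from the study.

```python
import numpy as np

# Rows: digital libraries A, B, C; columns: interface design, effects on users,
# services, user engagement, context (all treated as benefit criteria).
scores = np.array([[7.0, 8.0, 6.0, 7.0, 8.0],
                   [8.0, 6.0, 7.0, 8.0, 7.0],
                   [6.0, 7.0, 8.0, 6.0, 6.0]])
weights = np.array([0.25, 0.20, 0.20, 0.20, 0.15])   # e.g. obtained from a SWARA step
lam = 0.5

norm = scores / scores.max(axis=0)                    # linear normalization (benefit criteria)
wsm = (norm * weights).sum(axis=1)                    # weighted sum model
wpm = np.prod(norm ** weights, axis=1)                # weighted product model
q = lam * wsm + (1 - lam) * wpm                       # joint WASPAS score

for name, qi in sorted(zip("ABC", q), key=lambda t: -t[1]):
    print(f"library {name}: Q = {qi:.4f}")
```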
Procedia PDF Downloads 151
9852 Targeted Delivery of Sustained Release Polymeric Nanoparticles for Cancer Therapy
Authors: Jamboor K. Vishwanatha
Abstract:
Among the potent anti-cancer agents, curcumin has been found to be very efficacious against various cancer cells. Despite the multiple medicinal benefits of curcumin, poor water solubility, poor physiochemical properties and low bioavailability continue to pose major challenges in developing a formulation for clinical efficacy. To improve its potential application in the clinical area, we formulated poly(lactic-co-glycolic acid) (PLGA) nanoparticles. The PLGA nanoparticles were formulated using the solid-oil/water emulsion solvent evaporation method and then characterized for percent yield, encapsulation efficiency, surface morphology, particle size, drug distribution within nanoparticles and drug-polymer interaction. Our studies showed the successful formation of smooth and spherical curcumin-loaded PLGA nanoparticles with a high percent yield of about 92.01±0.13% and an encapsulation efficiency of 90.88±0.14%. The mean particle size of the nanoparticles was found to be 145 nm. The in vitro drug release profile showed 55-60% drug release from the nanoparticles over a period of 24 hours, with continued sustained release over a period of 8 days. Exposure to curcumin-loaded nanoparticles resulted in reduced cell viability of cancer cells compared to normal cells. We used a novel non-covalent insertion of a homo-bifunctional spacer for targeted delivery of curcumin to various cancer cells. Functionalized nanoparticles for antibody/targeting agent conjugation were prepared using a cross-linking ligand, bis(sulfosuccinimidyl) suberate (BS3), which has a reactive carboxyl group to conjugate efficiently to the primary amino groups of the targeting agents. In our studies, we demonstrated successful conjugation of antibodies, Annexin A2 or prostate specific membrane antigen (PSMA), to curcumin-loaded PLGA nanoparticles for targeting to prostate and breast cancer cells. The percent antibody attachment to PLGA nanoparticles was found to be 92.8%. Efficient intra-cellular uptake of the targeted nanoparticles was observed in the cancer cells. These results have emphasized the potential of our multifunctional curcumin nanoparticles to improve the clinical efficacy of curcumin therapy in patients with cancer.
Keywords: polymeric nanoparticles, cancer therapy, sustained release, curcumin
Procedia PDF Downloads 325
9851 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation
Authors: Rashmi Malik, Videep Mishra
Abstract:
The practice of generative design has become a transformative approach for efficiently generating multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool like 3ds Max, Blender, etc., is traditionally used to create the game library, which takes its stipulated time to model. The study is focused on using generative design tools to increase efficiency in game development at the stage of prop and environment generation. This will involve procedural level generation and customized, regulated or randomized asset generation. The paper will present the system design approach using generative tools like Grasshopper (visual scripting) and other scripting tools to automate the process of game library modeling. The script will enable the generation of multiple products from a single definition, thus creating a system that lets designers/artists customize props and environments. The main goal is to measure the efficacy of the automated system in generating a wide variety of game elements, further reducing the need for manual content creation and integrating it into the workflow of AAA and indie games.
Keywords: iterative game design, generative design, gaming asset automation, generative game design
Procedia PDF Downloads 70
9850 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines
Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl
Abstract:
Large-scale machine tools for the manufacturing of large workpieces, e.g. blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass. That is counterproductive to the needs of sustainable manufacturing, as it leads to higher resource consumption both in material and in energy. Recent research activities have led to higher resource efficiency by radical mass reduction that relies on control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction method parameters according to the current state of the machine. The paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. The paper starts with a theoretical introduction of the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of the pose-dependent dynamic behavior is corroborated by the results of the experimental modal analysis of a lightweight test structure. Afterwards, the consequences of the pose-dependent dynamic behavior of lightweight machine tool structures for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE-model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived. The description of the approach for a general pose-dependent model of the dynamic behavior of large lightweight machine tools, which provides the necessary input to the aforementioned vibration avoidance and reduction methods to properly tackle machine vibrations, is the outlook of the paper.
Keywords: dynamic behavior, lightweight, machine tool, pose-dependency
Procedia PDF Downloads 459
9849 Functional Outcome of Femoral Neck System (FNS) in the Management of Neck of Femur Fractures
Authors: Ronak Mishra, Sachin Kale
Abstract:
Background: The clinical outcome of a new fixation device (femoral neck system, FNS) for femoral neck fractures has not been properly described. The main purpose of this study was to evaluate the functional outcome of patients with femoral neck fractures treated with FNS. Methods: A retrospective study was done among patients aged 60 years or less. On the basis of inclusion and exclusion criteria, a final sample size of 30 was considered. Blood loss, type of fracture internal fixation, and length of clinical follow-up were all acquired from patient records. The volume of blood loss was calculated. The mean and standard deviation of continuous variables were reported (with range). The Harris Hip Score (HHS) and postoperative X-rays at intervals (6 weeks, 6 months, 12 months) were used to clinically assess the patients. Results: Of all patients, 60% were female and 40% were male. The mean age of the patients was 44.12 (±) years. The functional outcomes of the patients treated with FNS were compared using the Harris Hip Score, which showed a highly significant difference between the postoperative, 6-week, 3-month and 12-month assessments. There were no postoperative complications among the patients. Conclusion: FNS offers superior biomechanical qualities and greatly improved overall construct stability. It allows for a significant reduction in operation time, potentially lowering risks and consequences associated with surgery.
Keywords: FNS, trauma, hip, neck femur fracture, minimally invasive surgery
Procedia PDF Downloads 88
9848 Post-Pandemic Public Space, Case Study of Public Parks in Kerala
Authors: Nirupama Sam
Abstract:
COVID-19, the greatest pandemic since the turn of the century, presents several issues for urban planners, the most significant of which is determining appropriate mitigation techniques for creating pandemic-friendly and resilient public spaces. The study is conducted in four stages. The first stage consisted of literature reviews to examine the evolution and transformation of public spaces during pandemics throughout history and the role of public spaces during pandemic outbreaks. The second stage is to determine the factors that influence the success of public spaces, which was accomplished through an analysis of current literature and case studies. The influencing factors are categorized under comfort and image, uses and activities, access and linkages, and sociability. The third stage is to establish the priority of the identified factors, for which a questionnaire survey of stakeholders is conducted, along with analysis of certain factors with the help of GIS tools. COVID-19 has been in effect in India for the last two years. Kerala has the highest daily COVID-19 prevalence due to its high population density, making it more susceptible to viral outbreaks. Despite all preventive measures taken against COVID-19, Kerala remains the worst-affected state in the country. Finally, two live case studies of the hardest-hit localities, namely Subhash Bose Park and Napier Museum Park in the Ernakulam and Trivandrum districts of Kerala, respectively, were chosen as study areas for the survey. The responses to the questionnaire were analyzed using SPSS to determine the weights of the influencing factors. The spatial success of the selected case studies was examined using a GIS interpolation model. Following the overall assessment, the fourth stage is to develop strategies and guidelines for planning public spaces to make them more efficient and robust, which further leads to improved quality, safety and resilience to future pandemics.
Keywords: urban design, public space, COVID-19, post-pandemic, public spaces
Procedia PDF Downloads 137
9847 Prediction of Cardiovascular Markers Associated With Aromatase Inhibitors Side Effects Among Breast Cancer Women in Africa
Authors: Jean Paul M. Milambo
Abstract:
Purpose: Aromatase inhibitors (AIs) are indicated in the treatment of hormone-receptive breast cancer in postmenopausal women in various settings. Studies have shown cardiovascular events in some developed countries. To date, the data are sparse for evidence-based recommendations in African clinical settings due to the lack of cancer registries, capacity building and surveillance systems. Therefore, this study was conducted to assess the feasibility of HyBeacon® probe genotyping adjunctive to standard care for timely prediction and diagnosis of aromatase inhibitor (AI) associated adverse events in breast cancer survivors in Africa. Methods: A cross-sectional study was conducted to assess the knowledge of POCT across six African countries using an online survey and telephone contact. The incremental cost-effectiveness ratio (ICER) was calculated using a diagnostic accuracy study, based on mathematical modeling. Results: One hundred twenty-six participants were considered for analysis (mean age = 61 years; SD = 7.11 years; 95% CI: 60-62 years). Comparison of genotyping from HyBeacon® probe technology to Sanger sequencing showed that sensitivity was 99% (95% CI: 94.55% to 99.97%), specificity 89.44% (95% CI: 87.25 to 91.38%), PPV 51% (95% CI: 43.77 to 58.26%), and NPV 99.88% (95% CI: 99.31 to 100.00%). Based on the mathematical model, the assumptions revealed that the ICER was R7 044.55. Conclusion: POCT using HyBeacon® probe genotyping for AI-associated adverse events may be cost-effective in many African clinical settings. Integration of preventive measures for early detection and prevention, guided by subtype-specific breast cancer diagnosis with specific clinical, biomedical and genetic screenings, may improve cancer survivorship. The feasibility of POCT was demonstrated, but implementation could be achieved by improving the integration of POCT within primary health care and referral cancer hospitals, with capacity-building activities at different levels of the health system. This finding is pertinent for a future envisioned implementation and global scale-up of POCT-based initiatives as part of risk communication strategies with clear management pathways.
Keywords: breast cancer, diagnosis, point of care, South Africa, aromatase inhibitors
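The sketch below reproduces the kind of arithmetic behind the reported diagnostic accuracy metrics and the incremental cost-effectiveness ratio. The 2x2 counts and the cost/effect figures are made-up placeholders, not data from the study; only the formulas mirror the analysis described.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic accuracy metrics against a reference standard."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical counts for HyBeacon genotyping evaluated against Sanger sequencing.
print(diagnostic_metrics(tp=99, fp=95, fn=1, tn=805))
# Hypothetical costs (Rand) and effects (QALYs) for POCT vs. standard care.
print(f"ICER = R{icer(cost_new=9500, effect_new=2.1, cost_old=8200, effect_old=1.9):,.2f} per QALY")
```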
Procedia PDF Downloads 78
9846 Vulnerability Assessment of Vertically Irregular Structures during Earthquake
Authors: Pranab Kumar Das
Abstract:
A vulnerability assessment of buildings with irregularity in the vertical direction has been carried out in this study. The construction of vertically irregular buildings is increasing in the context of fast urbanization in developing countries, including India. During two reconnaissance-based surveys performed after the Nepal earthquake of 2015 and the Imphal (India) earthquake of 2016, it was observed that many structures were damaged due to their vertically irregular configuration. These irregular buildings need to perform safely during seismic excitation. Therefore, there is an urgent need to determine the actual vulnerability of irregular structures so that remedial measures can be taken to protect them during natural hazards such as earthquakes. This assessment will be very helpful for India as well as for other developing countries. A sufficient amount of research has been contributed to the vulnerability of plan-asymmetric buildings; in the field of vertically irregular buildings, much less effort has gone into finding out their vulnerability during an earthquake. Irregularity in the vertical direction may be caused by an irregular distribution of mass or stiffness or by a geometrically irregular configuration. Detailed analysis of such structures, particularly non-linear/pushover analysis for performance-based design, is challenging. The present paper considers a number of models of irregular structures. Building models made of both reinforced concrete and brick masonry are considered for the sake of generality. The analyses are performed with the help of both the finite element method and computational methods. The study, as a whole, may help to arrive at reasonably good estimates of, and insight into, the fundamental and other natural periods of such vertically irregular structures. The ductility demand, storey drift, and seismic response study help to identify the location of critical stress concentration. In summary, this paper is a humble step toward understanding the vulnerability of vertically irregular structures and framing guidelines for them.
Keywords: ductility, stress concentration, vertically irregular structure, vulnerability
Procedia PDF Downloads 229
9845 Transcendence, Spirituality and Well-Being: A Cognitive-Theological Perspective
Authors: Monir Ahmed
Abstract:
This paper aims to discuss transcendence, spirituality, and well-being in light of the psychology of religion and spirituality. The main purpose of this paper is i) to demonstrate the importance of cognitive psychological processes (thoughts, faith, and beliefs) and the doctrine of creation (‘creatio ex nihilo’) in transcendence, spirituality, and well-being; and ii) to discuss the relationships among transcendence, spirituality, and well-being. Psychological studies of spiritual and religious phenomena have advanced in the past decade, mainly to understand how faith and spiritual and religious rituals influence or contribute to well-being. Psychologists of religion and spirituality have put forward methods, tools, and approaches necessary for promoting well-being. For instance, Kenneth I. Pargament, an American psychologist of religion and spirituality, developed spiritually integrated psychotherapy for clinical practice in dealing with the spiritual and religious issues affecting well-being. However, not much progress has been made in understanding the capacity for transcendence and how such capacity influences spirituality and religion as well as well-being. A possible reason could be that well-being has only been understood in a spiritual and religious context. It appears that transcendence, the core element of spirituality and religion, has not been explored adequately in relation to well-being. In other words, the approaches that have been used so far for spirituality, religion, and well-being lack an integrated approach combining theology and psychology. The author of this paper proposes that a cognitive-theological understanding involving faith and belief about the creation and the creator, the transcendent God, is likely to offer a comprehensive understanding of transcendence as well as of spirituality, religion, and their relationships with well-being. The importance of transcendence and the integration of psychology and theology can advance our knowledge of transcendence, spirituality, and well-being. It is evident that the creation is contingent and that the ultimate origin and source of the contingent physical reality is a non-contingent being, the divine creator. As such, it is not unreasonable for many individuals to believe that the non-contingent source of existence, although undiscoverable in physical reality, exists transcendentally. ‘Creatio ex nihilo’ is the most fundamental doctrine in the Abrahamic faiths, i.e., Judaism, Christianity and Islam, and is the widely accepted scriptural and philosophical background on the creation and the creator: that God created the universe out of nothing. Therefore, it is crucial to integrate theology, i.e., the ‘creatio ex nihilo’ doctrine, and psychology for a comprehensive understanding of transcendence and spirituality and their relationships with well-being.
Keywords: transcendence, spirituality, well-being, ‘creatio ex nihilo’ doctrine
Procedia PDF Downloads 139
9844 A Multi Sensor Monochrome Video Fusion Using Image Quality Assessment
Authors: M. Prema Kumar, P. Rajesh Kumar
Abstract:
The increasing interest in image fusion (combining images of two or more modalities, such as infrared and visible light radiation) has led to a need for accurate and reliable image assessment methods. This paper gives a novel approach of merging the information content from several videos taken of the same scene in order to build up a combined video that contains the finest information coming from the different source videos. This process is known as video fusion, which helps provide an image of superior quality (here, quality connotes a measure specific to the particular application) compared with the source images. In this technique, different sensors (whose redundant information can be reduced) are used for the various cameras that are imperative for capturing the required images and also help in reducing redundancy. In this paper, an image fusion technique based on multi-resolution singular value decomposition (MSVD) has been used. Image fusion by MSVD is very similar to wavelet-based fusion. The idea behind MSVD is to replace the FIR filters in the wavelet transform with singular value decomposition (SVD). It is computationally very simple and is well suited for real-time applications such as remote sensing and astronomy.
Keywords: multi sensor image fusion, MSVD, image processing, monochrome video
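To make the MSVD idea concrete, the following is a simplified single-level sketch of SVD-based fusion of two registered frames: each frame's 2x2 blocks are projected onto an SVD-derived basis, the approximation band is averaged, the detail coefficients are fused by maximum absolute value, and the result is reconstructed. The fusion rules and the averaged basis are illustrative simplifications, not necessarily the exact scheme used by the authors.

```python
import numpy as np

def msvd_decompose(img):
    """One-level MSVD: rearrange 2x2 blocks into 4-element columns and project
    them onto the eigenvectors of X X^T (the left singular vectors)."""
    m, n = img.shape
    x = img.reshape(m // 2, 2, n // 2, 2).transpose(1, 3, 0, 2).reshape(4, -1)
    u, _, _ = np.linalg.svd(x @ x.T)       # 4x4 orthonormal basis
    t = u.T @ x                            # row 0 ~ approximation, rows 1-3 ~ details
    return u, t

def msvd_fuse(img_a, img_b):
    """Simplified single-level MSVD fusion of two registered, equal-size frames."""
    ua, ta = msvd_decompose(img_a)
    ub, tb = msvd_decompose(img_b)
    tf = np.where(np.abs(ta) >= np.abs(tb), ta, tb)   # detail rule: max-absolute
    tf[0] = 0.5 * (ta[0] + tb[0])                     # approximation rule: average
    uf = 0.5 * (ua + ub)                              # simplification: averaged basis
    xf = uf @ tf
    m, n = img_a.shape
    return xf.reshape(2, 2, m // 2, n // 2).transpose(2, 0, 3, 1).reshape(m, n)

# Toy usage with two 4x4 "frames".
a = np.arange(16, dtype=float).reshape(4, 4)
b = a[::-1, ::-1].copy()
fused = msvd_fuse(a, b)
print(fused.shape)  # (4, 4)
```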
Procedia PDF Downloads 572
9843 A Framework on Data and Remote Sensing for Humanitarian Logistics
Authors: Vishnu Nagendra, Marten Van Der Veen, Stefania Giodini
Abstract:
Effective humanitarian logistics operations are a cornerstone in the success of disaster relief operations. However, to be effective, they need to be demand driven and supported by adequate data for prioritization. Without this data, operations are carried out in an ad hoc manner and eventually become chaotic. The current availability of geospatial data helps in creating models for predictive damage and vulnerability assessment, which can be of great advantage to logisticians in gaining an understanding of the nature and extent of the disaster damage. This translates into actionable information on the demand for relief goods, the state of the transport infrastructure and, subsequently, the priority areas for relief delivery. However, due to the unpredictable nature of disasters, the accuracy of the models needs improvement, which can be done using remote sensing data from UAVs (Unmanned Aerial Vehicles) or satellite imagery, which again come with certain limitations. This research addresses the need for a framework to combine data from different sources to support humanitarian logistics operations and prediction models. The focus is on developing a workflow to combine data from satellites and UAVs after a disaster strikes. A three-step approach is followed: first, the data requirements for logistics activities are made explicit, which is done by carrying out semi-structured interviews with field logistics workers. Second, the limitations in current data collection tools are analyzed to develop workaround solutions by following a systems design approach. Third, the data requirements and the developed workaround solutions are fit together into a coherent workflow. The outcome of this research will provide a new method for logisticians to have immediate, accurate and reliable data to support data-driven decision making.
Keywords: unmanned aerial vehicles, damage prediction models, remote sensing, data driven decision making
Procedia PDF Downloads 378
9842 Assessment of HIV/Hepatitis B Virus Co-Infection among Patients Living with HIV in Northern and Southern Region of Nigeria
Authors: Folajinmi Oluwasina, Greg Abiaziem, Moses Luke, Mobolaji Kolawole, Nancy Yibowei, Anne Taiwo
Abstract:
Background: The occurrence of HIV infection has an adverse effect on the natural course of Hepatitis B virus (HBV) infection, with faster progression of hepatic fibrosis demonstrated in patients with co-infection. This study was carried out to determine the incidence of HBV infection among HIV-positive patients and to retrospectively evaluate the laboratory characteristics of patients with HIV/HBV co-infection. Methods: A retrospective analysis was performed of patient files for all HIV-infected cases followed up and treated at 52 health facilities. Among HIV-infected cases, those with HBsAg positivity and HIV/Hepatitis B co-infection were determined. Sociodemographic characteristics, alcohol or substance use, ART, CD4 counts, viral load levels and treatment durations were retrospectively evaluated. Results: Of the 125 HIV-infected patients evaluated retrospectively, 17 (13.6%) had HBsAg positivity. Of these 17 cases, 11 (64.7%) were male and 6 (35.3%) female, with a mean age of 48.7 years. No patients had a history of alcohol or substance use. The mean duration of follow-up was 28 months. Nine (52.9%) patients had negative HBV DNA at presentation while 8 (47%) had positive HBV DNA, with normal ALT levels in all subjects. The 9 cases with negative HBV DNA had no indication for the treatment of chronic hepatitis B. In five cases, treatment was commenced since HBV DNA was elevated in conjunction with low CD4. One patient, in whom treatment was not indicated based on HBV DNA and CD4 levels in conjunction with the absence of an AIDS-defining clinical picture, was being followed up without treatment. Of the patients receiving HAART therapy, the average CD4 count at presentation was 278 cells/mm3 vs. 466 cells/mm3 at the end of 12 months. In three subjects with positive HBV DNA, a decrease in HBV DNA was noted after initiation of treatment. In four patients with negative DNA who received treatment, the HBV DNA-negative status was found to remain, while one patient who did not receive treatment had elevated HBV DNA and decreased CD4 levels. Conclusion: It was shown that, in this group of patients with HIV/HBV co-infection, HAART was associated with a decrease in HBV DNA in HBV DNA-positive cases, absence of transition to positivity among those with negative HBV DNA, and increased CD4 in all subjects.
Keywords: Hepatitis B, DNA, anti retroviral therapy, co-infection
Procedia PDF Downloads 270
9841 Telehealth Psychotherapy: A Comparison of Two Swedish Randomized Clinical Trials
Authors: Madeline Foster
Abstract:
Since the COVID-19 pandemic, telehealth usage for the delivery of psychotherapy has surged. The evidence base evaluating the success of telehealth interventions continues to grow, with both benefits and potential risks identified. This study compared two recent randomized clinical trials (RCTs) from Sweden that looked at the effectiveness of Cognitive Behavioral Therapy (CBT) delivered via telehealth (TH) versus face-to-face (FTF) for individuals with Obsessive Compulsive Disorder (OCD). The papers had mixed results. The first paper, by Aspvall and colleagues (2021), compared the effect of a therapist-supported, internet-delivered stepped-care CBT program for children and adolescents aged 7 to 17 with face-to-face CBT. In Aspvall’s study, the control group scored a mean Y-BOCS of 10.57 and the TH intervention group scored a mean Y-BOCS of 11.57. The mean difference (0.91) met the criteria for noninferiority (p = 0.03). The second study, by Lundström and colleagues, also compared therapist-supported, internet-based CBT with FTF CBT for the treatment of those with DSM-5-diagnosed OCD. Conversely, while Lundström’s study reported improved symptoms across all groups, at follow-up the difference in symptom severity between FTF and TH was clinically significant, with 77% of FTF participants responding to treatment compared to only 45% of TH participants. Due to the methodological limitations of Lundström’s study, it was concluded that Aspvall’s paper made a stronger scientific argument.
Keywords: telehealth, Sweden, RCT, cognitive-behavioral therapy, obsessive-compulsive disorder
Procedia PDF Downloads 61
9840 Conceptualizing IoT Based Framework for Enhancing Environmental Accounting By ERP Systems
Authors: Amin Ebrahimi Ghadi, Morteza Moalagh
Abstract:
This research is carried out to find out how a perfect combination of IoT (Internet of Things) architecture and an ERP system can strengthen environmental accounting to incorporate both economic and environmental information. IoT (e.g., sensors, software, and other technologies) can be used along the company’s value chain from raw material extraction through materials processing, manufacturing, distribution, use, repair, maintenance, and disposal or recycling of products (cradle-to-grave model). The ERP software will then have the capability to track both midpoint and endpoint environmental impacts across a green supply chain system for the whole life cycle of a product. All of this enables environmental accounting to calculate and analyze operational environmental impacts in real time, control costs, prepare for environmental legislation and enhance the decision-making process. In this study, we have developed a model of how to use IoT devices in life cycle assessment (LCA) to gather information on emissions, energy consumption, hazards, and wastes to be processed in different modules of ERP systems in an integrated way for use in environmental accounting to achieve sustainability.
Keywords: ERP, environmental accounting, green supply chain, IOT, life cycle assessment, sustainability
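As a toy illustration of how sensor streams could feed a midpoint indicator in an ERP environmental-accounting module, the sketch below aggregates hypothetical energy readings per life-cycle stage into kg CO2e using an emission factor. The stage names, readings and the factor are all assumptions for demonstration, not part of the proposed framework's actual data model.

```python
from collections import defaultdict

# Hypothetical IoT readings: (life-cycle stage, energy consumed in kWh)
sensor_readings = [
    ("raw_material_extraction", 120.0),
    ("materials_processing", 310.5),
    ("manufacturing", 480.2),
    ("distribution", 95.0),
    ("manufacturing", 150.8),
]

GRID_EMISSION_FACTOR = 0.45  # assumed kg CO2e per kWh

def midpoint_climate_impact(readings, factor):
    """Aggregate energy use per stage and convert it to a climate-change midpoint (kg CO2e)."""
    totals = defaultdict(float)
    for stage, kwh in readings:
        totals[stage] += kwh * factor
    return dict(totals)

for stage, kg_co2e in midpoint_climate_impact(sensor_readings, GRID_EMISSION_FACTOR).items():
    print(f"{stage:26s} {kg_co2e:8.1f} kg CO2e")
```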
Procedia PDF Downloads 172
9839 Role of Estrogen Receptor-alpha in Mammary Carcinoma by Single Nucleotide Polymorphisms and Molecular Docking: An In-silico Analysis
Authors: Asif Bilal, Fouzia Tanvir, Sibtain Ahmad
Abstract:
Estrogen receptor alpha, also known as estrogen receptor 1 (ESR1), is highly involved in the risk of mammary carcinoma. The objectives of this study were to identify non-synonymous SNPs of the estrogen receptor and their association with breast cancer, and to identify the chemotherapeutic responses of phytochemicals against it via an in-silico study design. For this purpose, different online tools were used: to identify pathogenic SNPs, SIFT, PolyPhen, PolyPhen-2, fuNTRp and SNAP2; to find disease-associated SNPs, SNP&GO, PhD-SNP, PredictSNP, MAPP, SNAP, Meta-SNP and PANTHER; and to check protein stability, Mu-Pro, I-Mutant and CONSURF. Post-translational modifications (PTMs) were detected by MusiteDeep, protein secondary structure by SOPMA, protein-protein interaction by STRING, and molecular docking was performed with PyRx. Seven SNPs with rsIDs (rs760766066, rs779180038, rs956399300, rs773683317, rs397509428, rs755020320, and rs1131692059), corresponding to the mutations I229T, R243C, Y246H, P336R, Q375H, R394S, and R394H, respectively, were found to be completely deleterious. The PTMs found were glycosylation (96 sites), ubiquitination (30 sites) and acetylation (1 site); no hydroxylation or phosphorylation sites were found. The protein secondary structure consisted of alpha helix (Hh, 28%), extended strand (Ee, 21%), beta turn (Tt, 7.89%) and random coil (Cc, 44.11%). Protein-protein interaction analysis revealed strong interactions with myeloperoxidase, xanthine dehydrogenase, carboxylesterase 1, glutathione S-transferase Mu 1, and estrogen receptors. For molecular docking, we used Asiaticoside, Ilekudinuside, Robustoflavone, Irinotecan, Withanolides, and 9-amino-5 as ligands extracted from phytochemicals and docked them with this protein. We found strong interactions (docking scores from -8.6 to -9.7) of these phytochemical ligands with wild-type ESR1 and two mutants (I229T and R394S). It is concluded that these SNPs found in ESR1 are involved in breast cancer and that the given phytochemicals are highly helpful against breast cancer as chemotherapeutic agents. Further in vitro and in vivo analyses should be performed to confirm these interactions.
Keywords: breast cancer, ESR1, phytochemicals, molecular docking
9838 Risk Assessment on New Bio-Composite Materials Made from Water Resource Recovery
Authors: Arianna Nativio, Zoran Kapelan, Jan Peter van der Hoek
Abstract:
Bio-composite materials are becoming increasingly popular in various applications, such as the automotive industry. Bio-composite materials are usually made from natural resources recovered from plants; a new type of bio-composite material, however, has begun to be produced in the Netherlands. This material is made from resources recovered from drinking water treatment (calcite), wastewater treatment (cellulose), and surface water management (aquatic plants). Surface water, raw drinking water, and wastewater can be contaminated with pathogens and chemical compounds, so it would be valuable to develop a framework to assess, monitor, and control the potential risks. The goal is to define the major risks to human health, material quality, and the environment associated with the production and application of these new materials. This study describes the general risk assessment framework, starting with a qualitative risk assessment. The qualitative risk analysis was carried out using the HAZOP methodology for the hazard identification phase. HAZOP is a logical and structured methodology that can identify hazards at the first stage of design, when hazards and associated risks are not yet well known. The identified hazards were analyzed to define the associated risks, which were then evaluated using qualitative Event Tree Analysis (ETA). ETA is a logical methodology used to define the consequences of a specific hazardous incident by evaluating the failure modes of safety barriers and the dangerous intermediate events that lead to the final scenario (risk). This paper shows the effectiveness of combining the HAZOP and qualitative ETA methodologies for hazard identification and risk mapping. The key risks were then identified, and a quantitative framework was developed based on the types of risks identified, such as QMRA and QCRA. These two models were applied to assess human health risks due to the presence of pathogens and chemical compounds, such as heavy metals, in the bio-composite materials. Due to these contaminations, the bio-composite product might release toxic substances into the environment during its application, leading to a negative environmental impact. Leaching tests are therefore planned to simulate the application of these materials in the environment and to evaluate the potential leaching of inorganic substances, assessing the environmental risk.
Keywords: bio-composite, risk assessment, water reuse, resource recovery
Procedia PDF Downloads 109
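To illustrate the qualitative ETA step, the sketch below enumerates the end scenarios of a toy event tree in which each safety barrier either works or fails. The initiating event and the two barriers are hypothetical placeholders, not the event trees actually built in the study.

```python
# Minimal sketch of a qualitative Event Tree Analysis: every success/failure
# combination of the safety barriers defines one path and one end scenario.
# Barrier names and the initiating event are hypothetical.
from itertools import product

def enumerate_scenarios(initiating_event, barriers):
    """Yield (event, path, severity) for all success/failure combinations of the barriers."""
    for outcomes in product(("works", "fails"), repeat=len(barriers)):
        path = [f"{b} {o}" for b, o in zip(barriers, outcomes)]
        severity = "safe" if "fails" not in outcomes else f"{outcomes.count('fails')} barrier(s) failed"
        yield initiating_event, path, severity

if __name__ == "__main__":
    barriers = ["pathogen inactivation step", "leaching control coating"]
    for event, path, severity in enumerate_scenarios("contaminated feedstock enters production", barriers):
        print(event, "->", " -> ".join(path), "=>", severity)
```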
9837 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks
Authors: Raphael Tuor, Denis Lalanne
Abstract:
The air traffic management (ATM) research community lacks suitable tools to design, test, and validate new UI prototypes. Much is at stake in integrating both decision support systems (DSS) and explainable artificial intelligence (XAI) methods into current systems. ML-based DSS are gaining relevance as air traffic flow management (ATFM) becomes increasingly complex. However, these systems are only useful if a human can understand them, so new XAI methods are needed. Human and machine should work as a team and understand each other. We present xSky, a configurable benchmark tool that allows different versions of an air traffic control (ATC) interface to be compared on conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows visual prototypes to be tested on scenarios of varying difficulty while outputting relevant operational metrics, and (2) a theoretical approach to the explanation of AI-driven trajectory predictions. xSky addresses several issues identified in the available research tools: researchers can configure the dimensions affecting scenario difficulty with a simple CSV file, and both the content and appearance of the XAI elements can be customized in a few steps. As a proof of concept, we implemented an XAI prototype inspired by the maritime field.
Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction
Procedia PDF Downloads 160
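The sketch below shows the kind of CSV-driven scenario configuration the abstract describes, where each row defines the dimensions of one difficulty level. The column names (traffic_density, conflict_count, time_pressure_s) are assumptions for illustration and are not xSky's actual schema.

```python
# Minimal sketch of parsing a scenario-difficulty CSV into configuration records.
# Column names and values are hypothetical, not taken from xSky.
import csv
import io

SAMPLE_CSV = """scenario,traffic_density,conflict_count,time_pressure_s
easy,8,1,120
medium,15,2,90
hard,25,4,60
"""

def load_scenarios(csv_text):
    """Parse scenario rows into dictionaries with numeric difficulty dimensions."""
    scenarios = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        scenarios.append({
            "name": row["scenario"],
            "traffic_density": int(row["traffic_density"]),
            "conflict_count": int(row["conflict_count"]),
            "time_pressure_s": int(row["time_pressure_s"]),
        })
    return scenarios

if __name__ == "__main__":
    for s in load_scenarios(SAMPLE_CSV):
        print(s)
```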
9836 Performance Enhancement of Autopart Manufacturing Industry Using Lean Manufacturing Strategies: A Case Study
Authors: Raman Kumar, Jasgurpreet Singh Chohan, Chander Shekhar Verma
Abstract:
Today, manufacturing industries must respond rapidly to new demands and compete in a continuously changing environment, and they therefore seek out new methods that allow them to remain competitive and flexible at the same time. The aim of manufacturing organizations is to reduce manufacturing costs and wastes through system simplification, organizational potential, and proper infrastructural planning, using modern techniques such as lean manufacturing. In India, a large number of medium and large scale manufacturing industries have successfully implemented lean manufacturing techniques. Keeping the above facts in view, different tools are involved in the successful implementation of the lean approach. The present work focuses on an auto part manufacturing industry and aims to improve the performance of its recliner assembly line. A number of lean manufacturing tools are available, but experience and complete knowledge of the manufacturing processes are required to select an appropriate tool for a specific process. Fishbone diagrams (for scrap, inventory, and waiting) were drawn to identify the root causes of the different wastes. The effect of cycle time reduction on scrap and inventory was analyzed thoroughly in the case company. Results show a 7 percent decrease in inventory cost after the successful implementation of the lean tool.
Keywords: lean tool, fish-bone diagram, cycle time reduction, case study
Procedia PDF Downloads 127
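For orientation only, the short sketch below works through the arithmetic behind a percentage reduction of the kind the case study reports (a 7 percent decrease in inventory cost). The before/after cost figures are assumed for illustration and are not the case company's data.

```python
# Minimal sketch (hypothetical figures) of computing a percentage change in
# inventory cost after a lean improvement such as cycle-time reduction.
def percent_change(before, after):
    """Percentage change from `before` to `after` (negative means a reduction)."""
    return (after - before) / before * 100.0

if __name__ == "__main__":
    inventory_cost_before = 100_000  # assumed cost before the lean tools were applied
    inventory_cost_after = 93_000    # assumed cost after implementation
    print(f"Inventory cost change: {percent_change(inventory_cost_before, inventory_cost_after):.1f}%")
    # -> Inventory cost change: -7.0%
```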