Search results for: analytical tools
5149 Development of Open Source Geospatial Certification Model Based on Geospatial Technology Competency Model
Authors: Tanzeel Ur Rehman Khan, Franz Josef Behr, Phillip Davis
Abstract:
Open source geospatial certifications are needed in the geospatial technology education and industry sectors. In parallel with proprietary software, free and open source software solutions have become important in geospatial technology research and play an important role in the growth of the geospatial industry. ESRI, GISCI (GIS Certification Institute), ASPRS (American Society for Photogrammetry and Remote Sensing), and Metaspatial are offering certifications on proprietary and open source software. These are portfolio- and competency-based certifications depending on the GIS Body of Knowledge (BoK). The analysis of these certification approaches might lead to the discovery of some gaps in them and will open a new way to develop certifications related to geospatial open source (OS) software. This new certification will investigate the different geospatial competencies according to open source tools that help to identify geospatial professionals and strengthen geospatial academic content. The goal of this research is to introduce a geospatial certification model based on the geospatial technology competency model (GTCM). The developed certification will not only incorporate the importance of geospatial education and the production of a geospatial competency-based workforce in universities and companies (private or public) but also describe open source solutions with tools and technology. Job analysis, market analysis, and survey analysis of this certification open a new horizon for business as well.
Keywords: geospatial certification, open source, geospatial technology competency model, geoscience
Procedia PDF Downloads 566
5148 Hard Coatings Characterization Based on Chromium Nitrides: Applications for Wood Machining
Authors: B. Chemani, H. Aknouche, A. Zerizer, R. Marchal
Abstract:
The phenomena occurring during machining are related to the internal friction of the material being deformed and the friction of the chip on the rake face of the tool. Various studies have been conducted to improve the wear resistance of the tool by thin film deposition. This work presents an experimental approach related to wood machining to evaluate tool wear for the case of ripping Aleppo pine, a species well established in the Mediterranean in general and in Algeria in particular. The study was carried out on tungsten carbide cutting tools widely used in woodworking, coated with chromium nitride (CrN) and aluminium-enriched chromium nitride (CrAlN) with different percentages of aluminium, sputtered with a Nordiko 3500 magnetron system. The deposition conditions had already been optimized in previous studies. The wear tests were performed in the laboratory of ENSAM Cluny (France) on a numerically controlled ripper of the recordi type. This comparative study of the behavior of coated and uncoated tools showed that the addition of aluminium to the chromium nitride films does not improve the tool's ability to resist the abrasive wear that is predominant when ripping Aleppo pine. In contrast, the aluminium addition improves the crystallization of the chromium nitride films.
Keywords: Aleppo pine, PVD, coatings, CrAlN, wear
Procedia PDF Downloads 568
5147 Electromyography Controlled Robotic Toys for Autistic Children
Authors: Uvais Qidwai, Mohamed Shakir
Abstract:
This paper presents an initial study related to the use of robotic toys as teaching and therapeutic aid tools for teachers and care-givers as well as parents of children with various levels of autism spectrum disorder (ASD). Some of the most common features related to the behavior of a child with ASD are his/her social isolation, living in their own world, not being physically active, and not willing to learn new things. While the teachers, parents, and all other related care-givers do their best to improve the condition of these kids, it is usually quite an uphill task. However, one remarkable observation that has been reported by several teachers dealing with ASD children is the fact that the same children do get attracted to toys with lights and sounds. Hence, this project targets the development/modifications of such existing toys into appropriate behavior training tools which the care-givers can use as they would desire. Initially, the remote control is in hand of the trainer, but after some time, the child is entrusted with the control of the robotic toy to test for the level of interest. It has been found during the course of this study that children with quite low learning activity got extremely interested in the robot and even advanced to controlling the robot with the Electromyography (EMG). It has been observed that the children did show some hesitation in the beginning 5 minutes of the very first sessions of such interaction but were very comfortable afterwards which has been considered as a very strong indicator of the potential of this technique in teaching and rehabilitation of children with ASD or similar brain disorders.Keywords: Autism Spectrum Disorder (ASD), robotic toys, IR control, electromyography, LabVIEW based remote control
Procedia PDF Downloads 444
5146 Behaviour of RC Column under Biaxial Cyclic Loading: State of the Art
Authors: L. Pavithra, R. Sharmila, Shivani Sridhar
Abstract:
To avoid severe structural damage in columns, members should be proportioned so that a significant portion of the earthquake energy is dissipated by yielding in the beams. The presence of axial load along with cyclic loading has a significant influence on column behaviour. The objective of this paper is to present the analytical results of columns subjected to biaxial cyclic loading.
Keywords: RC column, seismic behaviour, cyclic behaviour, biaxial testing, ductile behaviour
Procedia PDF Downloads 366
5145 Effective Validation Model and Use of Mobile-Health Apps for Elderly People
Authors: Leonardo Ramirez Lopez, Edward Guillen Pinto, Carlos Ramos Linares
Abstract:
The controversy brought about by the increasing use of mHealth apps and their effectiveness for disease prevention and diagnosis calls for immediate control. Although a critical topic in research areas such as medicine, engineering, economics, among others, this issue lacks reliable implementation models. However, projects such as Open Web Application Security Project (OWASP) and various studies have helped to create useful and reliable apps. This research is conducted under a quality model to optimize two mHealth apps for older adults. Results analysis on the use of two physical activity monitoring apps - AcTiv (physical activity) and SMCa (energy expenditure) - is positive and ideal. Through a theoretical and practical analysis, precision calculations and personal information control of older adults for disease prevention and diagnosis were performed. Finally, apps are validated by a physician and, as a result, they may be used as health monitoring tools in physical performance centers or any other physical activity. The results obtained provide an effective validation model for this type of mobile apps, which, in turn, may be applied by other software developers that along with medical staff would offer digital healthcare tools for elderly people.Keywords: model, validation, effective, healthcare, elderly people, mobile app
Procedia PDF Downloads 218
5144 Continuous Improvement of Teaching Quality through Course Evaluation by the Students
Authors: Valerie Follonier, Henrike Hamelmann, Jean-Michel Jullien
Abstract:
The Distance Learning University in Switzerland (UniDistance) is offering bachelor and master courses as well as further education programs. The professors and their assistants work at traditional Swiss universities and are giving their courses at UniDistance following a blended learning and flipped classroom approach. A standardized course evaluation by the students has been established as a component of a quality improvement process. The students’ feedback enables the stakeholders to identify areas of improvement, initiate professional development for the teaching teams and thus continuously augment the quality of instruction. This paper describes the evaluation process, the tools involved and how the approach involving all stakeholders helps forming a culture of quality in teaching. Additionally, it will present the first evaluation results following the new process. Two software tools have been developed to support all stakeholders in the process of the semi-annual formative evaluation. The first tool allows to create the survey and to assign it to the relevant courses and students. The second tool presents the results of the evaluation to the stakeholders, providing specific features for the teaching teams, the dean, the directorate and EDUDL+ (Educational development unit distance learning). The survey items were selected in accordance with the e-learning strategy of the institution and are formulated to support the professional development of the teaching teams. By reviewing the results the teaching teams become aware of the opinion of the students and are asked to write a feedback for the attention of their dean. The dean reviews the results of the faculty and writes a general report about the situation of the faculty and the possible improvements intended. Finally, EDUDL+ writes a final report summarising the evaluation results. A mechanism of adjustable warnings allows it to generate quality indicators for each module. These are summarised for each faculty and globally for the whole institution in order to increase the vigilance of the responsible. The quality process involves changing the indicators regularly to focus on different areas each semester, to facilitate the professional development of the teaching teams and to progressively augment the overall teaching quality of the institution.Keywords: continuous improvement process, course evaluation, distance learning, software tools, teaching quality
Procedia PDF Downloads 259
5143 Formex Algebra Adaptation into Parametric Design Tools: Dome Structures
Authors: Réka Sárközi, Péter Iványi, Attila B. Széll
Abstract:
The aim of this paper is to present the adaptation of the dome construction tool for formex algebra to the parametric design software Grasshopper. Formex algebra is a mathematical system, primarily used for planning structural systems such as truss-grid domes and vaults, together with the programming language Formian. The goal of the research is to allow architects to plan truss-grid structures easily with parametric design tools based on the versatile formex algebra mathematical system. To produce regular structures, coordinate system transformations are used, and the dome structures are defined in a spherical coordinate system. Owing to the abilities of the parametric design software, it is possible to apply further modifications to the structures and gain special forms. The paper covers the basic dome types, as well as additional dome-based structures using special coordinate-system solutions based on spherical coordinate systems. It also contains additional structural possibilities, such as making double-layer grids in all geometric forms. The adaptation of formex algebra and the parametric workflow of Grasshopper together give the possibility of quick and easy design and optimization of special truss-grid domes.
Keywords: parametric design, structural morphology, space structures, spherical coordinate system
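As an illustration of the underlying idea (defining dome nodes in a spherical coordinate system and converting them to Cartesian coordinates), a minimal Python sketch is given below; the radius, ring count, segment count and sweep angle are assumed values, and the actual work is implemented in Grasshopper rather than standalone Python.

```python
import math

def dome_nodes(radius=10.0, n_rings=6, n_segments=12, max_polar_deg=80.0):
    """Generate node coordinates of a simple single-layer truss-grid dome.

    Nodes are defined in spherical coordinates (r, theta, phi) and converted
    to Cartesian (x, y, z); theta is the polar angle measured from the zenith,
    phi the azimuth.
    """
    nodes = [(0.0, 0.0, radius)]  # apex node
    for i in range(1, n_rings + 1):
        theta = math.radians(max_polar_deg) * i / n_rings
        for j in range(n_segments):
            phi = 2.0 * math.pi * j / n_segments
            x = radius * math.sin(theta) * math.cos(phi)
            y = radius * math.sin(theta) * math.sin(phi)
            z = radius * math.cos(theta)
            nodes.append((x, y, z))
    return nodes

if __name__ == "__main__":
    pts = dome_nodes()
    print(f"{len(pts)} nodes generated")  # 1 apex + 6 rings x 12 segments = 73
```

Changing the ring/segment counts or the sweep angle regenerates the whole node set, which is the same "one definition, many variants" behaviour the Grasshopper adaptation exploits.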
Procedia PDF Downloads 254
5142 Linguistic Analysis of the Concept ‘Relation’ in Russian and English Languages
Authors: Nadezhda Obvintceva
Abstract:
The article gives an analysis of the concept ‘relation’ from the point of view of its realization in the Russian and English languages on the basis of dictionary articles. The analysis reveals the main difference in the representation of this concept in the two languages: the number of lexemes that express its general meanings. At the end of the article the author explains possible causes of the difference and touches upon the issue of analytical phenomena in the vocabulary.
Keywords: concept, comparison, lexeme, meaning, relation, semantics
Procedia PDF Downloads 498
5141 Failure Probability Assessment of Concrete Spherical Domes Subjected to Ventilation Controlled Fires Using BIM Tools
Authors: A. T. Kassem
Abstract:
Fires are considered a common hazardous action that any building may face. Most buildings' structural elements are designed taking into consideration precautions for fire safety, using deterministic design approaches. Public and highly important buildings are commonly designed considering a standard fire rating and, in many cases, contain large compartments with central domes. Real fire scenarios are not commonly brought into action in the structural design of buildings because of complexities in both the scenarios and the analysis tools. This paper presents a modern approach towards the analysis of spherical domes in real fire conditions via the implementation of building information modelling and the adoption of a probabilistic approach. BIM has been implemented to bridge the gap between various software packages, enabling them to function interactively to model both the real fire and the corresponding structural response. Ventilation-controlled fire scenarios have been modelled using both Revit and Pyrosim. Monte Carlo simulation has been adopted to engage the probabilistic analysis approach in dealing with various parameters. Conclusions regarding failure probability and fire endurance, in addition to the effects of various parameters, have been extracted.
Keywords: concrete, spherical domes, ventilation controlled fires, BIM, Monte Carlo simulation, Pyrosim, Revit
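The abstract couples BIM fire modelling with Monte Carlo simulation but does not show the sampling step itself. A minimal Python sketch of that probabilistic step follows; the limit-state expression, distributions and parameter values are placeholders, not the structural or fire model used in the study.

```python
import random

def limit_state(fire_load, concrete_strength, dome_thickness):
    """Hypothetical limit-state margin: positive means the dome survives.

    This is a placeholder expression, not the coupled Pyrosim/FE model
    referred to in the abstract.
    """
    demand = 0.01 * fire_load / dome_thickness
    capacity = concrete_strength
    return capacity - demand

def failure_probability(n_samples=100_000, seed=1):
    """Crude Monte Carlo estimate of P(limit_state < 0) under assumed inputs."""
    random.seed(seed)
    failures = 0
    for _ in range(n_samples):
        fire_load = random.gauss(600.0, 150.0)   # fire load density, assumed
        strength = random.gauss(40.0, 5.0)       # concrete strength, assumed
        thickness = random.uniform(0.15, 0.25)   # shell thickness, assumed
        if limit_state(fire_load, strength, thickness) < 0.0:
            failures += 1
    return failures / n_samples

if __name__ == "__main__":
    print(f"Estimated failure probability: {failure_probability():.4f}")
```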
Procedia PDF Downloads 95
5140 Examining the Coverage of CO2-Related Indicators in a Sample of Sustainable Rating Systems
Authors: Wesam Rababa, Jamal Al-Qawasmi
Abstract:
The global climate is negatively impacted by CO2 emissions, a large share of which is produced by buildings. Several green building rating systems (GBRS) have been proposed to impose low-carbon criteria in order to address this problem. The Green Globes certification is one such system that evaluates a building's sustainability level by assessing different categories of environmental impact and emerging concepts aimed at reducing environmental harm. Assessment tools at the national level are therefore crucial in the developing world, where specific local conditions require a more precise evaluation. This study analyzed eight sustainable building assessment systems from different regions of the world, comparing a comprehensive list of CO2-related indicators with the various assessment systems in order to conduct a coverage analysis. The results show that GBRS include both direct and indirect indicators in this regard. They reveal deep variation between the examined practices and a lack of consensus not only on the type and the optimal number of indicators used in a system, but also on the depth and breadth of coverage of various sustainable building (SB) attributes. Generally, the results show that most of the examined systems reflect a low comprehensive coverage, the highest of which is found in the materials category. On the other hand, most of the examined systems reveal a very low representative coverage.
Keywords: assessment tools, CO2-related indicators, comparative study, green building rating systems
Procedia PDF Downloads 58
5139 Revolutionizing Gaming Setup Design: Utilizing Generative and Iterative Methods to Prop and Environment Design, Transforming the Landscape of Game Development Through Automation and Innovation
Authors: Rashmi Malik, Videep Mishra
Abstract:
The practice of generative design has become a transformative approach for efficiently generating multiple iterations of any design project. The conventional way of modeling game elements is very time-consuming and requires skilled artists. A 3D modeling tool like 3ds Max, Blender, etc., is traditionally used to create the game library, which takes a considerable amount of time to model. The study is focused on using generative design tools to increase efficiency in game development at the stage of prop and environment generation. This will involve procedural level generation and customized, regulated or randomized asset generation. The paper will present the system design approach using generative tools like Grasshopper (visual scripting) and other scripting tools to automate the process of game library modeling. The script will enable the generation of multiple products from a single script, thus creating a system that lets designers/artists customize props and environments. The main goal is to measure the efficacy of the automated system in creating a wide variety of game elements, further reducing the need for manual content creation and integrating it into the workflow of AAA and indie games.
Keywords: iterative game design, generative design, gaming asset automation, generative game design
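A minimal Python sketch of the "many variants from one script" idea follows; the prop type, parameter names and ranges are assumptions for illustration, since the actual system is built with Grasshopper visual scripting.

```python
import random
from dataclasses import dataclass

@dataclass
class CrateProp:
    width: float
    height: float
    depth: float
    weathering: float     # 0 = clean, 1 = heavily worn
    has_metal_trim: bool

def generate_crates(n, seed=42, regulated=False):
    """Generate n crate variants from a single parametric definition.

    'regulated' constrains proportions to fixed ratios, mimicking a
    rule-driven (rather than fully randomized) generative setup.
    """
    rng = random.Random(seed)
    props = []
    for _ in range(n):
        w = rng.uniform(0.6, 1.4)
        h = w * 0.75 if regulated else rng.uniform(0.5, 1.2)
        d = w if regulated else rng.uniform(0.6, 1.4)
        props.append(CrateProp(w, h, d, rng.random(), rng.random() < 0.3))
    return props

if __name__ == "__main__":
    for crate in generate_crates(3):
        print(crate)
```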
Procedia PDF Downloads 70
5138 Pose-Dependency of Machine Tool Structures: Appearance, Consequences, and Challenges for Lightweight Large-Scale Machines
Authors: S. Apprich, F. Wulle, A. Lechler, A. Pott, A. Verl
Abstract:
Large-scale machine tools for the manufacturing of large work pieces, e.g. blades, casings or gears for wind turbines, feature pose-dependent dynamic behavior. Small structural damping coefficients lead to long decay times for structural vibrations that have negative impacts on the production process. Typically, these vibrations are handled by increasing the stiffness of the structure by adding mass. That is counterproductive to the needs of sustainable manufacturing as it leads to higher resource consumption both in material and in energy. Recent research activities have led to higher resource efficiency by radical mass reduction that rely on control-integrated active vibration avoidance and damping methods. These control methods depend on information describing the dynamic behavior of the controlled machine tools in order to tune the avoidance or reduction method parameters according to the current state of the machine. The paper presents the appearance, consequences and challenges of the pose-dependent dynamic behavior of lightweight large-scale machine tool structures in production. The paper starts with the theoretical introduction of the challenges of lightweight machine tool structures resulting from reduced stiffness. The statement of the pose-dependent dynamic behavior is corroborated by the results of the experimental modal analysis of a lightweight test structure. Afterwards, the consequences of the pose-dependent dynamic behavior of lightweight machine tool structures for the use of active control and vibration reduction methods are explained. Based on the state of the art on pose-dependent dynamic machine tool models and the modal investigation of an FE-model of the lightweight test structure, the criteria for a pose-dependent model for use in vibration reduction are derived. The description of the approach for a general pose-dependent model of the dynamic behavior of large lightweight machine tools that provides the necessary input to the aforementioned vibration avoidance and reduction methods to properly tackle machine vibrations is the outlook of the paper.Keywords: dynamic behavior, lightweight, machine tool, pose-dependency
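The pose dependency described here can be summarised compactly: the structural matrices, and hence the eigenfrequencies and mode shapes, become functions of the machine pose. The following generic equations are a sketch of that idea under the usual linear structural-dynamics assumptions; they are not the specific model derived in the paper.

```latex
% Generic pose-dependent structural model (a sketch; q is the machine pose,
% x the vector of structural displacements, f the excitation forces).
\begin{equation}
  \mathbf{M}(\mathbf{q})\,\ddot{\mathbf{x}}(t)
  + \mathbf{D}(\mathbf{q})\,\dot{\mathbf{x}}(t)
  + \mathbf{K}(\mathbf{q})\,\mathbf{x}(t)
  = \mathbf{f}(t)
\end{equation}
% The modal parameters therefore depend on the pose as well:
\begin{equation}
  \left[\mathbf{K}(\mathbf{q}) - \omega_i^{2}(\mathbf{q})\,\mathbf{M}(\mathbf{q})\right]
  \boldsymbol{\varphi}_i(\mathbf{q}) = \mathbf{0}
\end{equation}
```

The pose-varying eigenfrequencies and mode shapes are exactly what the experimental modal analysis of the lightweight test structure demonstrates, and what a control-integrated vibration reduction method needs as input.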
Procedia PDF Downloads 459
5137 On-Line Super Critical Fluid Extraction, Supercritical Fluid Chromatography, Mass Spectrometry, a Technique in Pharmaceutical Analysis
Authors: Narayana Murthy Akurathi, Vijaya Lakshmi Marella
Abstract:
The literature is reviewed with regard to online supercritical fluid extraction (SFE) coupled directly with supercritical fluid chromatography (SFC) and mass spectrometry, which is typically more sensitive than conventional LC-MS/MS and GC-MS/MS. It is becoming increasingly interesting to use on-line techniques that combine sample preparation, separation and detection in one analytical set-up. This requires less human intervention, uses small amounts of sample and organic solvent, and yields enhanced analyte enrichment in a shorter time. The sample extraction is performed under light shielding and anaerobic conditions, preventing the degradation of thermolabile analytes. The technique may be able to analyze compounds over a wide polarity range, as SFC generally uses carbon dioxide that is collected as a by-product of other chemical reactions or from the atmosphere, and thus contributes no new chemicals to the environment. The diffusion of solutes in supercritical fluids is about ten times greater than that in liquids and about three times less than in gases, which results in a decrease in resistance to mass transfer in the column and allows for fast, high-resolution separations. The drawback of SFC when using carbon dioxide as the mobile phase is that the direct introduction of water samples poses a series of problems; water must therefore be eliminated before it reaches the analytical column. Hundreds of compounds can be analysed simultaneously by simply enclosing the sample in an extraction vessel. This is mainly applicable to the pharmaceutical industry, where the technique can analyse fatty acids and phospholipids that have many analogues with very similar UV spectra, trace additives in polymers, and hundreds of pesticides with good resolution; cleaning validation can also be conducted by placing a swab sample in an extraction vessel.
Keywords: supercritical fluid extraction (SFE), supercritical fluid chromatography (SFC), LC-MS/MS, GC-MS/MS
Procedia PDF Downloads 391
5136 Gas Flow, Time, Distance Dynamic Modelling
Authors: A. Abdul-Ameer
Abstract:
The equations governing the distance, pressure-volume flow relationships for the pipeline transportation of gaseous mixtures are considered. A derivation based on differential calculus, for an element of this system model, is addressed. Solutions yielding the input-output response following pressure changes are reviewed. The technical problems associated with these analytical results are identified. Procedures resolving these difficulties, thereby providing an attractive, simple analysis route, are outlined. Computed responses, thereby validating the calculated predictions, are presented.
Keywords: pressure, distance, flow, dissipation, models
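The governing equations themselves are not reproduced in the abstract. As an assumed reference, a standard one-dimensional isothermal pipeline element is often written as below, with mass flow rate, cross-section A, diameter D, Darcy friction factor f, inclination angle and isothermal wave speed c; the paper's own derivation may differ in detail.

```latex
% Standard 1-D isothermal pipeline element (an assumed form, not taken
% verbatim from the paper): continuity and momentum, with density
% eliminated via the isothermal relation rho = p / c^2.
\begin{align}
  \frac{\partial p}{\partial t} + \frac{c^{2}}{A}\,\frac{\partial \dot{m}}{\partial x} &= 0, \\
  \frac{1}{A}\,\frac{\partial \dot{m}}{\partial t} + \frac{\partial p}{\partial x}
  &= -\,\frac{f\,c^{2}\,\dot{m}\,\lvert\dot{m}\rvert}{2\,D\,A^{2}\,p}
     \;-\; \frac{g\,p\,\sin\alpha}{c^{2}}.
\end{align}
```

Linearising these relations about an operating point for a pipe element is what yields the input-output (pressure-to-flow) responses reviewed in the paper.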
Procedia PDF Downloads 473
5135 Exploring Gender-Based Salary Disparities and Equities Among University Presidents
Authors: Daniel Barkley, Jianyi Zhu
Abstract:
This study investigates base salary differentials and gender equity among university presidents across 427 U.S. colleges and universities. While endowments typically do not directly determine university presidents' base salaries, our analysis reveals a noteworthy pattern: endowments explain more than half of the variance in female university presidents' base salaries, compared to a mere 0.69 percent for males. Moreover, female presidents' base salaries tend to rise much faster than male base salaries with increasing university endowments. This disparate impact of endowments on base salaries implies an endowment threshold for achieving gender pay equity. We develop an analytical model predicting an endowment threshold for achieving gender equality and empirically estimate this equity threshold using data from over 427 institutions. Surprisingly, the fields of science and athletics have emerged as sources of gender-neutral base pay. Both male and female university presidents with STEM backgrounds command higher base salaries than those without such qualifications. Additionally, presidents of universities affiliated with Power 5 conferences consistently receive higher base salaries regardless of gender. Consistent with the theory of human capital accumulation, the duration of the university presidency incrementally raises base salaries for both genders but at a diminishing rate. Curiously, prior administrative leadership experience as a vice president, provost, dean, or department chair does not significantly influence base salaries for either gender. By providing empirical evidence and analytical models predicting an endowment threshold for achieving gender equality in base salaries, the study offers valuable insights for policymakers, university administrators, and other stakeholders. These findings hold crucial policy implications, informing strategies to promote gender equality in executive compensation within higher education institutions.Keywords: higher education, endowments, base salaries, university presidents
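The analytical model is not spelled out in the abstract; as an illustration of how an endowment threshold arises, assume base salaries are linear in endowment E with gender-specific intercepts and slopes (an assumption, not the authors' specification):

```latex
% Illustrative linear specification (an assumption, not the authors' model):
\begin{align}
  S_f(E) &= \alpha_f + \beta_f E,
  \qquad
  S_m(E) = \alpha_m + \beta_m E, \\
  S_f(E^{*}) &= S_m(E^{*})
  \;\;\Longrightarrow\;\;
  E^{*} = \frac{\alpha_m - \alpha_f}{\beta_f - \beta_m}.
\end{align}
```

Because the female slope exceeds the male slope in the reported data, the two salary lines converge as endowments grow, and E* marks the endowment at which parity would be reached under this linear approximation.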
Procedia PDF Downloads 57
5134 Role of Estrogen Receptor-alpha in Mammary Carcinoma by Single Nucleotide Polymorphisms and Molecular Docking: An In-silico Analysis
Authors: Asif Bilal, Fouzia Tanvir, Sibtain Ahmad
Abstract:
Estrogen receptor alpha, also known as estrogen receptor-1, is highly involved in the risk of mammary carcinoma. The objectives of this study were to identify non-synonymous SNPs of the estrogen receptor and their association with breast cancer, and to identify the chemotherapeutic responses of phytochemicals against it via an in-silico study design. For this purpose, different online tools were used: to identify pathogenic SNPs, SIFT, Polyphen, Polyphen-2, fuNTRp and SNAP2; to find disease-associated SNPs, SNP&GO, PhD-SNP, PredictSNP, MAPP, SNAP, MetaSNP and PANTHER; and to check protein stability, Mu-Pro, I-Mutant and CONSURF. Post-translational modifications (PTMs) were detected by Musitedeep, protein secondary structure by SOPMA, and protein-protein interaction by STRING; molecular docking was performed with PyRx. Seven SNPs with rsIDs (rs760766066, rs779180038, rs956399300, rs773683317, rs397509428, rs755020320, and rs1131692059), causing the mutations I229T, R243C, Y246H, P336R, Q375H, R394S, and R394H, respectively, were found to be completely deleterious. Among the PTMs, glycosylation was found 96 times, ubiquitination 30 times, and acetylation a single time; no hydroxylation or phosphorylation was found. The protein secondary structure consisted of alpha helix (Hh) 28%, extended strand (Ee) 21%, beta turn (Tt) 7.89% and random coil (Cc) 44.11%. Protein-protein interaction analysis revealed strong interactions with myeloperoxidase, xanthine dehydrogenase, carboxylesterase 1, glutathione S-transferase Mu 1, and the estrogen receptors. For molecular docking, Asiaticoside, Ilekudinuside, Robustoflavone, Irinotecan, Withanolides, and 9-amino-5, extracted from phytochemicals, were used as ligands and docked with this protein. Strong interactions (from -8.6 to -9.7) were found between these phytochemical ligands and wild-type ESR1 and two mutants (I229T and R394S). It is concluded that these SNPs found in ESR1 are involved in breast cancer and that the given phytochemicals are potentially useful against breast cancer as chemotherapeutic agents. Further in vitro and in vivo analyses should be performed to confirm these interactions.
Keywords: breast cancer, ESR1, phytochemicals, molecular docking
Procedia PDF Downloads 69
5133 A Tool to Measure Efficiency and Trust Towards eXplainable Artificial Intelligence in Conflict Detection Tasks
Authors: Raphael Tuor, Denis Lalanne
Abstract:
The ATM research community is missing suitable tools to design, test, and validate new UI prototypes. Important stakes underline the implementation of both DSS and XAI methods into current systems. ML-based DSS are gaining in relevance as ATFM becomes increasingly complex. However, these systems only prove useful if a human can understand them, and thus new XAI methods are needed. The human-machine dyad should work as a team and should understand each other. We present xSky, a configurable benchmark tool that allows us to compare different versions of an ATC interface in conflict detection tasks. Our main contributions to the ATC research community are (1) a conflict detection task simulator (xSky) that allows to test the applicability of visual prototypes on scenarios of varying difficulty and outputting relevant operational metrics (2) a theoretical approach to the explanations of AI-driven trajectory predictions. xSky addresses several issues that were identified within available research tools. Researchers can configure the dimensions affecting scenario difficulty with a simple CSV file. Both the content and appearance of the XAI elements can be customized in a few steps. As a proof-of-concept, we implemented an XAI prototype inspired by the maritime field.Keywords: air traffic control, air traffic simulation, conflict detection, explainable artificial intelligence, explainability, human-automation collaboration, human factors, information visualization, interpretability, trajectory prediction
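The abstract notes that scenario difficulty is configured through a simple CSV file. A hypothetical example of such a configuration and a loader is sketched below in Python; the column names and values are assumptions, not the real xSky format.

```python
import csv
from io import StringIO

# Hypothetical scenario-configuration format; the real xSky column names and
# difficulty dimensions are not given in the abstract.
EXAMPLE_CSV = """scenario_id,n_aircraft,conflict_pairs,time_to_conflict_s,xai_explanations
easy_01,6,1,240,off
medium_03,12,2,150,on
hard_02,20,4,90,on
"""

def load_scenarios(text):
    """Parse scenario rows and coerce the numeric difficulty dimensions."""
    rows = []
    for row in csv.DictReader(StringIO(text)):
        row["n_aircraft"] = int(row["n_aircraft"])
        row["conflict_pairs"] = int(row["conflict_pairs"])
        row["time_to_conflict_s"] = int(row["time_to_conflict_s"])
        rows.append(row)
    return rows

if __name__ == "__main__":
    for s in load_scenarios(EXAMPLE_CSV):
        print(s["scenario_id"], "->", s["conflict_pairs"], "conflict pair(s)")
```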
Procedia PDF Downloads 160
5132 Explanatory Variables for Crash Injury Risk Analysis
Authors: Guilhermina Torrao
Abstract:
An extensive number of studies have been conducted to determine the factors which influence crash injury risk (CIR); however, uncertainties inherent to selected variables have been neglected. A review of existing literature is required to not only obtain an overview of the variables and measures but also ascertain the implications when comparing studies without a systematic view of variable taxonomy. Therefore, the aim of this literature review is to examine and report on peer-reviewed studies in the field of crash analysis and to understand the implications of broad variations in variable selection in CIR analysis. The objective of this study is to demonstrate the variance in variable selection and classification when modeling injury risk involving occupants of light vehicles by presenting an analytical review of the literature. Based on data collected from 64 journal publications reported over the past 21 years, the analytical review discusses the variables selected by each study across an organized list of predictors for CIR analysis and provides a better understanding of the contribution of accident and vehicle factors to injuries acquired by occupants of light vehicles. A cross-comparison analysis demonstrates that almost half the studies (48%) did not consider vehicle design specifications (e.g., vehicle weight), whereas, for those that did, the vehicle age/model year was the most selected explanatory variable used by 41% of the literature studies. For those studies that included speed risk factor in their analyses, the majority (64%) used the legal speed limit data as a ‘proxy’ of vehicle speed at the moment of a crash, imposing limitations for CIR analysis and modeling. Despite the proven efficiency of airbags in minimizing injury impact following a crash, only 22% of studies included airbag deployment data. A major contribution of this study is to highlight the uncertainty linked to explanatory variable selection and identify opportunities for improvements when performing future studies in the field of road injuries.Keywords: crash, exploratory, injury, risk, variables, vehicle
Procedia PDF Downloads 135
5131 Performance Enhancement of Autopart Manufacturing Industry Using Lean Manufacturing Strategies: A Case Study
Authors: Raman Kumar, Jasgurpreet Singh Chohan, Chander Shekhar Verma
Abstract:
Today, manufacturing industries respond rapidly to new demands and compete in a continuously changing environment, thus seeking out new methods that allow them to remain competitive and flexible at the same time. The aim of manufacturing organizations is to reduce manufacturing costs and wastes through system simplification, organizational potential, and proper infrastructural planning by using modern techniques like lean manufacturing. In India, a large number of medium and large scale manufacturing industries have successfully implemented lean manufacturing techniques. Keeping in view the above-mentioned facts, different tools are involved in the successful implementation of the lean approach. The present work is focused on the auto part manufacturing industry, with the aim of improving the performance of a recliner assembly line. A number of lean manufacturing tools are available, but experience and complete knowledge of the manufacturing processes are required to select an appropriate tool for a specific process. Fishbone diagrams have been drawn to identify the root causes of the different wastes (scrap, inventory, and waiting). The effect of cycle time reduction on scrap and inventory is analyzed thoroughly in the case company. Results have shown a decrease in inventory cost of 7 percent after the successful implementation of the lean tool.
Keywords: lean tool, fish-bone diagram, cycle time reduction, case study
Procedia PDF Downloads 127
5130 Colloquialism in Audiovisual Translation: English Subtitling of the Lebanese Film Capernaum as a Case Study
Authors: Fatima Saab
Abstract:
This paper attempts to study colloquialism in audio-visual translation, with particular emphasis given to investigating the difficulties and challenges encountered by subtitlers in translating Lebanese colloquial into English. To achieve the main objectives of this study, ample and thorough cultural and translational analysis of examples drawn from the subtitled movie Capernaum are presented in order to identify the strategies used to overcome cultural barriers and differences and to show the process of decision-making by the translator. Also, special attention is given to explain the technicalities in translating subtitles and how they affect the translation process. The research is a descriptive analytical study whereby the writer sets out empirical observations, consisting of descriptive and analytical examination of the difficulties and problems associated with translating Arabic colloquialisms, specifically Lebanese, into English in the subtitled film, Capernaum. The research methodology utilizes a qualitative approach to group the selected data into the subtitling strategies presented by Gottlieb under the domesticating or foreignizing strategies according to Venuti's Model. It is shown that producing the same meanings to a foreign audience is not an easy task. The background of cultural elements and the stories that make up the history and mindset of the Lebanese and Arabic peoples leads to the use of the transfer and paraphrase methodologies most of the time (81% of the sample used for analysis). The research shows that translating and subtitling colloquialism needs special skills by the translators to overcome the challenges imposed by the limited presentation space as well as cultural differences. Translation of colloquial Arabic/Lebanese can be achieved to a certain extent and delivering the meaning and effect of the source language culture is accomplished in as much as the translator investigates and relates to the target culture.Keywords: Lebanese colloquial, audio-visual translation, subtitling, Capernaum
Procedia PDF Downloads 148
5129 The Effects of Adlerian Supervision on Enhancing Career Consultants’ Case Conceptualization
Authors: Lin Shang Neng
Abstract:
Due to rapid changes in the societal environment, career development and planning have become increasingly crucial, leading more individuals to seek the assistance of career consultations. However, the training process for career consultants often emphasizes the application of assessment tools and guidance in job-seeking behavior, while the abilities of case conceptualization and consulting skills require further in-service supervision. This study aims to inquire into the supervision experiences of employment specialists at the Employment Service Center of the Taiwan Ministry of Labor or career consultants who have held private practices for at least three years. The research participants were continuously supervised with the Adlerian approach twice a month for at least one year, helping them integrate the whole picture of the client through the Lifestyle Assessment (the qualitative way, specific diagnosis) and other Adlerian assessment tools (the quantitative way, general diagnosis). The supervisor was familiar with Adlerian Psychology and certified by the North American Society of Adlerian Psychology. The research method involves semi-structured interviews and qualitative analysis. For ethical considerations, the participants were invited to interview only after the supervision sessions had finished. The findings of this research are discussed with possible implications, such as how the participants applied Adlerian Psychology to their career consultations, especially to case conceptualizations and consulting skills. Recommendations for further research and training for career consultants are also discussed.
Keywords: supervision, Adlerian psychology, case conceptualization, career consultant
Procedia PDF Downloads 78
5128 Challenges in Achieving Profitability for MRO Companies in the Aviation Industry: An Analytical Approach
Authors: Nur Sahver Uslu, Ali̇ Hakan Büyüklü
Abstract:
Maintenance, Repair, and Overhaul (MRO) costs are significant in the aviation industry. On the other hand, companies that provide MRO services to the aviation industry but are not dominant in the sector, need to determine the right strategies for sustainable profitability in a competitive environment. This study examined the operational real data of a small medium enterprise (SME) MRO company where analytical methods are not widely applied. The company's customers were divided into two categories: airline companies and non-airline companies, and the variables that best explained profitability were analyzed with Logistic Regression for each category and the results were compared. First, data reduction was applied to the transformed variables that went through the data cleaning and preparation stages, and the variables to be included in the model were decided. The misclassification rates for the logistic regression results concerning both customer categories are similar, indicating consistent model performance across different segments. Less profit margin is obtained from airline customers, which can be explained by the variables part description, time to quotation (TTQ), turnaround time (TAT), manager, part cost, and labour cost. The higher profit margin obtained from non-airline customers is explained only by the variables part description, part cost, and labour cost. Based on the two models, it can be stated that it is significantly more challenging for the MRO company, which is the subject of our study, to achieve profitability from Airline customers. While operational processes and organizational structure also affect the profit from airline customers, only the type of parts and costs determine the profit for non-airlines.Keywords: aircraft, aircraft components, aviation, data analytics, data science, gini index, maintenance, repair, and overhaul, MRO, logistic regression, profit, variable clustering, variable reduction
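As a sketch of the segment-wise modelling described here (a logistic regression per customer category, compared by misclassification rate), the following Python outline uses scikit-learn; the column names follow the abstract, but the data layout, encoding and train/test split are assumptions rather than the company's actual workflow.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

def fit_profitability_model(df):
    """Fit a logistic regression for one customer segment.

    'profitable' is a binary target; categorical predictors are one-hot
    encoded. Column names and preprocessing are assumptions mirroring the
    variables listed in the abstract.
    """
    X = pd.get_dummies(
        df[["part_description", "ttq_days", "tat_days", "manager",
            "part_cost", "labour_cost"]],
        drop_first=True,
    )
    y = df["profitable"]
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    misclassification = 1.0 - accuracy_score(y_te, model.predict(X_te))
    return model, misclassification

# Usage (assuming one prepared DataFrame per customer segment):
# airline_model, airline_err = fit_profitability_model(df[df.segment == "airline"])
# other_model, other_err = fit_profitability_model(df[df.segment == "non_airline"])
```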
Procedia PDF Downloads 33
5127 Measuring the Resilience of e-Governments Using an Ontology
Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips
Abstract:
The variability that exists across governments, their departments and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risks and threats. There is also the need for assessment, prevention, preparation, response and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage reuse- and integration-induced risks or threats to governments, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face with the provisioning of services as well as the reuse of components across departments. Therefore, it can be said that resilience is responsible for the reduction in a government's vulnerability to changes. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is made up of a well-defined construct for the taxonomy of resilience. A specific class known as 'Resilience Requirements' is added to the ontology. This class embraces the concept of resilience in the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, the reliability and resilience of the E-Government domain have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face. A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for emergencies and risks that a government may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended by rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target and the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on the defined taxonomy. The ontology is constructed on formal theory, and it provides a semantic reference framework for the concept of resilience. Key terms which fall under the purview of resilience with respect to E-Governments are defined. Terms are made explicit, and the relationships that exist between risks and resilience are made explicit. The overall aim of the ontology is to use it within standards that would be followed by all governments for government-based resilience measures.
Keywords: E-Government, ontology, relationships, resilience, risks, threats
Procedia PDF Downloads 337
5126 Seismic Response of Viscoelastic Dampers for Steel Structures
Authors: Ali Khoshraftar, S. A. Hashemi
Abstract:
This paper is focused on the advantages of viscoelastic dampers (VED) used as energy-absorbing devices in buildings. The properties of VED are briefly described. Analytical studies of model structures exhibiting reduced structural response due to these viscoelastic devices are presented. Computer simulation of the damped response of a multi-storey steel frame structure shows a significant reduction in floor displacement levels.
Keywords: dampers, seismic evaluation, steel frames, viscoelastic
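The VED properties referred to here are commonly summarised with a Kelvin-Voigt idealisation; the following is a general sketch of that representation, not necessarily the device model used in the study. Here k' is the storage stiffness, c the equivalent damping coefficient, the loss factor is the ratio of loss to storage modulus of the viscoelastic material, and omega is the excitation frequency.

```latex
% Kelvin-Voigt sketch of a viscoelastic damper (general form, assumed here):
\begin{equation}
  F(t) = k'\,x(t) + c\,\dot{x}(t),
  \qquad
  c = \frac{\eta\,k'}{\omega},
  \qquad
  \eta = \frac{G''}{G'},
  \qquad
  W_{\mathrm{cycle}} = \pi\,\eta\,k'\,x_0^{2}.
\end{equation}
```

The energy dissipated per cycle of amplitude x0 grows with the loss factor, which is the mechanism behind the reduced floor displacement levels reported in the simulations.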
Procedia PDF Downloads 484
5125 The Power of Geography in the Multipolar World Order
Authors: Norbert Csizmadia
Abstract:
The paper is based on a thorough investigation regarding the recent global, social and geographical processes. The ‘Geofusion’ book series by the author guides the readers with the help of newly illustrated “associative” geographic maps of the global world in the 21st century through the quest for the winning nations, communities, leaders and powers of this age. Hence, the above mentioned represent the research objectives, the preliminary findings of which are presented in this paper. The most significant recognition is that scientists who are recognized as explorers, geostrategists of this century, in this case, are expected to present guidelines for our new world full of global social and economic challenges. To do so, new maps are needed which do not miss the wisdom and tools of the old but complement them with the new structure of knowledge. Using the lately discovered geographic and economic interrelations, the study behind this presentation tries to give a prognosis of the global processes. The methodology applied contains the survey and analysis of many recent publications worldwide regarding geostrategic, cultural, geographical, social, and economic surveys structured into global networks. In conclusion, the author presents the result of the study, which is a collage of the global map of the 21st century as mentioned above, and it can be considered as a potential contribution to the recent scientific literature on the topic. In summary, this paper displays the results of several-year-long research giving the audience an image of how economic navigation tools can help investors, politicians and travelers to get along in the changing new world.Keywords: geography, economic geography, geo-fusion, geostrategy
Procedia PDF Downloads 131
5124 Strategies to Combat the Covid-19 Epidemic
Authors: Marziye Hadian, Alireza Jabbari
Abstract:
Background: The World Health Organization has identified COVID-19 as a public health emergency and is urging governments to stop the transmission of the virus by adopting appropriate policies. In this regard, countries have taken different approaches to cutting the chain of transmission or controlling the spread of the disease. Methods: The present study was a systematized review of publications relating to prevention strategies for COVID-19. The study was carried out based on the PRISMA guidelines, using CASP for articles and AACODS for grey literature. Findings: The study findings showed that, in order to confront the COVID-19 epidemic, there are in general three approaches of "mitigation", "active control" and "suppression" and four strategies of "quarantine", "isolation", "social distancing" and "lockdown", in both individual and social dimensions, for dealing with epidemics; the choice of each approach requires specific strategies and has different effects when it comes to controlling and inhibiting the disease. Conclusion: The only way to control the disease is to change behavior and lifestyle. The use of masks, observance of personal hygiene principles such as regular hand washing and not touching the face with contaminated hands, as well as observance of public health principles such as controlling sneezing and coughing and the safe disposal of personal protective equipment, have not been included in the category of prevention tools; however, they have a great impact on controlling the epidemic, especially the new coronavirus epidemic.
Keywords: novel coronavirus, COVID-19, prevention tools, prevention strategies
Procedia PDF Downloads 141
5123 Conduction Transfer Functions for the Calculation of Heat Demands in Heavyweight Facade Systems
Authors: Mergim Gasia, Bojan Milovanovica, Sanjin Gumbarevic
Abstract:
Better energy performance of the building envelope is one of the most important aspects of energy savings if the goals set by the European Union are to be achieved in the future. Dynamic heat transfer simulations are used for the calculation of building energy consumption because they give more realistic energy demands compared to stationary calculations, which do not take the building's thermal mass into account. The software used for these dynamic simulations relies on methods based on analytical models, since purely numerical models are unsuitable for longer periods. The analytical models used in this research fall into the category of conduction transfer functions (CTFs). The two methods for calculating the CTFs covered by this research are the Laplace method and the State-Space method. The literature review showed that the main disadvantage of these methods is that they are inadequate for heavyweight façade elements and the shorter time steps used for the calculation. The algorithms for both the Laplace and State-Space methods are implemented in Mathematica, and the results are compared to the results from EnergyPlus and TRNSYS, since these software packages use similar algorithms for the calculation of the building's energy demand. This research aims to check the efficiency of the Laplace and State-Space methods for calculating the building's energy demand for heavyweight building elements and shorter sampling times, and it also provides the means for improving the algorithms used by these methods. As the reference point for the boundary heat flux density, the finite difference method (FDM) is used. Even though dynamic heat transfer simulations are superior to calculations based on stationary boundary conditions, they have their limitations and will give unsatisfactory results if not properly used.
Keywords: Laplace method, state-space method, conduction transfer functions, finite difference method
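Since the finite difference method serves as the reference for the boundary heat flux density, a minimal explicit FDM sketch for one-dimensional transient conduction through a heavyweight layer is shown below in Python; the material properties, discretisation and boundary temperatures are assumed values, not those of the studied facade systems.

```python
import numpy as np

def explicit_fdm_wall(k=2.0, rho=2400.0, cp=1000.0, thickness=0.3,
                      n_nodes=31, dt=10.0, t_end=3600.0,
                      t_inside=20.0, t_outside=0.0, t_init=10.0):
    """Explicit 1-D transient conduction through a heavyweight wall layer.

    Returns the temperature profile and the boundary heat flux density on
    the inside surface. Property values are assumptions for illustration.
    """
    alpha = k / (rho * cp)              # thermal diffusivity, m2/s
    dx = thickness / (n_nodes - 1)
    fo = alpha * dt / dx**2             # mesh Fourier number
    assert fo <= 0.5, "explicit scheme unstable: reduce dt or coarsen mesh"

    T = np.full(n_nodes, t_init)
    T[0], T[-1] = t_inside, t_outside   # prescribed surface temperatures
    for _ in range(int(t_end / dt)):
        T[1:-1] = T[1:-1] + fo * (T[2:] - 2.0 * T[1:-1] + T[:-2])
    q_inside = -k * (T[1] - T[0]) / dx  # boundary heat flux density, W/m2
    return T, q_inside

if __name__ == "__main__":
    T, q = explicit_fdm_wall()
    print(f"Inside-surface heat flux after 1 h: {q:.1f} W/m2")
```

The stability restriction on the Fourier number is one reason explicit FDM is used here only as a reference for short periods, while CTF-based methods carry the long-term simulations.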
Procedia PDF Downloads 133
5122 An Engaged Approach to Developing Tools for Measuring Caregiver Knowledge and Caregiver Engagement in Juvenile Type 1 Diabetes
Authors: V. Howard, R. Maguire, S. Corrigan
Abstract:
Background: Type 1 Diabetes (T1D) is a chronic autoimmune disease, typically diagnosed in childhood. T1D puts an enormous strain on families; controlling blood-glucose in children is difficult, and the consequences of poor control for patient health are significant. Successful illness management and better health outcomes can depend on the quality of caregiving. On diagnosis, parent-caregivers face a steep learning curve, as T1D care requires a significant level of knowledge to inform complex decision making throughout the day. The majority of illness management is carried out in the home setting, independent of clinical health providers. Parent-caregivers vary in their level of knowledge and their level of engagement in applying this knowledge in the practice of illness management. Enabling researchers to quantify these aspects of the caregiver experience is key to identifying targets for psychosocial support interventions, which are desirable for reducing stress and anxiety in this highly burdened cohort and for supporting better health outcomes in children. Currently, there are limited tools available that are designed to capture this information. Where tools do exist, they are not comprehensive and do not adequately capture the lived experience. Objectives: Development of quantitative tools, informed by lived experience, to enable researchers to gather data on parent-caregiver knowledge and engagement that accurately represents the experience and cohort and enables exploration of questions that are of real-world value to the cohort themselves. Methods: This research employed an engaged approach to address the problem of quantifying two key aspects of caregiver diabetes management: knowledge and engagement. The research process was multi-staged and iterative. Stage 1: Working from a constructivist standpoint, literature was reviewed to identify relevant questionnaires, scales and single-item measures of T1D caregiver knowledge and engagement, and to harvest candidate questionnaire items. Stage 2: Aggregated findings from the review were circulated among a PPI (patient and public involvement) expert panel of caregivers (n=6) for discussion and feedback. Stage 3: In collaboration with the expert panel, data were interpreted through the lens of lived experience to create a long-list of candidate items for novel questionnaires. Items were categorized as either 'knowledge' or 'engagement'. Stage 4: A Delphi-method process (iterative surveys) was used to prioritize question items and generate novel questions that further captured the lived experience. Stage 5: Both questionnaires were piloted to refine the wording, increase accessibility and limit socially desirable responding. Stage 6: The tools were piloted using an online survey deployed through an online peer-support group for caregivers of juveniles with T1D. Ongoing Research: 123 parent-caregivers completed the survey. Data analysis is ongoing to establish face and content validity qualitatively and through exploratory factor analysis. Reliability will be established using an alternative-form method, and Cronbach's alpha will assess internal consistency. Work will be completed by early 2024. Conclusion: These tools will enable researchers to gain deeper insights into caregiving practices among parents of juveniles with T1D. Development was driven by lived experience, illustrating the value of engaged research at all levels of the research process.
Keywords: caregiving, engaged research, juvenile type 1 diabetes, quantified engagement and knowledge
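For the internal-consistency step mentioned above, Cronbach's alpha can be computed directly from the item-score matrix; the Python sketch below implements the standard formula, with placeholder responses rather than study data.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix.

    alpha = k/(k-1) * (1 - sum(item variances) / variance of total score)
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

if __name__ == "__main__":
    # Placeholder responses (5 caregivers x 4 knowledge items), not study data.
    scores = [[4, 5, 4, 3], [3, 4, 4, 3], [5, 5, 5, 4], [2, 3, 2, 2], [4, 4, 5, 4]]
    print(f"alpha = {cronbach_alpha(scores):.2f}")
```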
Procedia PDF Downloads 55
5121 An Analytical Study on the Politics of Defection in India
Authors: Diya Sarkar, Prafulla C. Mishra
Abstract:
In a parliamentary system, party discipline is the impulse; when it falls short, the government usually falls. Conceivably, the platform of Indian politics suffers from innumerable practical disorders. The politics of defection is one such species, entailing a gross miscarriage of fair conduct and turning politics into a game of thrones (powers). This practice of political nomadism can trace its seed to the British House of Commons. Therein, if a legislator was found to cross the floor, the party considered him disloyal. In other words, the legislator lost his allegiance to his former party by joining another party. This phenomenon is, in practice, a two-way traffic, i.e. from the ruling party to the opposition party or vice versa. Democracies like the USA, Australia and Canada were also aware of this fashion of swapping loyalties. There have been several instances of great politicians changing party allegiance, for example Winston Churchill, Ramsay MacDonald and William Gladstone. Nevertheless, it is interesting to note that, irrespective of such practices of changing party allegiance, none of the democracies in the West ever desired or felt the need to legislatively ban defections. Exceptionally, however, India has passed anti-defection laws. The politics of defection has been a uniquely popular phenomenon on the floor of the Indian parliamentary system, gradually eroding the democratic essence and synchronization of the Federation. This study is both analytical and doctrinal, and it examines whether representative democracy has lost its essence due to political nomadism. The present study also analyzes the classical as well as the contemporary pulse of floor crossing amidst dynastic politics in a representative democracy. It briefly discusses the panorama of defections under the Indian federal structure in the light of the anti-defection law, and an attempt has been made to add valuable suggestions to streamline the remedy for the still prevalent political defections.
Keywords: constitutional law, defection, democracy, polarization, political anti-trust
Procedia PDF Downloads 376
5120 Predicting and Obtaining New Solvates of Curcumin, Demethoxycurcumin and Bisdemethoxycurcumin Based on the CCDC Statistical Tools and Hansen Solubility Parameters
Authors: J. Ticona Chambi, E. A. De Almeida, C. A. Andrade Raymundo Gaiotto, A. M. Do Espírito Santo, L. Infantes, S. L. Cuffini
Abstract:
The solubility of active pharmaceutical ingredients (APIs) is challenging for the pharmaceutical industry. New multicomponent crystalline forms, such as cocrystals and solvates, present an opportunity to improve the solubility of APIs. Commonly, the procedure to obtain multicomponent crystalline forms of a drug starts by screening the drug molecule with different coformers/solvents. However, it is necessary to develop methods to obtain multicomponent forms in an efficient way and with the least possible environmental impact. The Hansen Solubility Parameters (HSPs) are considered a tool to obtain theoretical knowledge of the solubility of the target compound in the chosen solvent. H-Bond Propensity (HBP), Molecular Complementarity (MC) and Coordination Values (CV) are tools used for the statistical prediction of cocrystals developed by the Cambridge Crystallographic Data Centre (CCDC). The HSPs and the CCDC tools are based on inter- and intra-molecular interactions. Curcumin (Cur), the target molecule, is commonly used as an anti-inflammatory. Demethoxycurcumin (Demcur) and bisdemethoxycurcumin (Biscur) are natural analogues of Cur from turmeric. These target molecules have differences in their solubilities. In this way, the work aimed to analyze and compare different tools for the prediction of multicomponent forms (solvates) of Cur, Demcur and Biscur. The HSP values were calculated for Cur, Demcur and Biscur using chemical group contribution methods and statistical optimization from experimental data, with the HSPmol software. From the HSPs of the target molecules and fifty solvents (listed in the HSP books), the relative energy difference (RED) was determined. The probability that the target molecules would interact with the solvent molecules was determined using the CCDC tools. A dataset of fifty molecules of different organic solvents was ranked for each prediction method and by a consensus ranking of different combinations of HSP, CV, HBP and MC values. Based on the prediction, 15 solvents were selected, such as dimethyl sulfoxide (DMSO), tetrahydrofuran (THF), acetonitrile (ACN), 1,4-dioxane (DOX) and others. In an initial analysis, the slow evaporation technique from 50 °C to room temperature and 4 °C was used to obtain solvates. Single crystals were collected using a Bruker D8 Venture diffractometer with a Photon 100 detector. Data processing and crystal structure determination were performed using APEX3 and Olex2-1.5 software. According to the results, the HSPs (theoretical and optimized) and the Hansen solubility spheres for Cur, Demcur and Biscur were obtained. With respect to the prediction analyses, one way to evaluate the prediction methods was through the ranking and the consensus ranking position of solvates already reported in the literature. It was observed that the combination HSP-CV obtained the best results when compared to the other methods. Furthermore, as a result of the solvents selected, six new solvates (Cur-DOX, Cur-DMSO, Biscur-DOX, Biscur-THF, Demcur-DOX and Demcur-ACN) and a new Biscur hydrate were obtained. Crystal structures were determined for Cur-DOX, Biscur-DOX, Demcur-DOX and Biscur-water. Moreover, unit-cell parameter information for Cur-DMSO, Biscur-THF and Demcur-ACN was obtained. The preliminary results showed that the prediction approach is a promising strategy to evaluate the possibility of forming multicomponent crystals. Work is currently ongoing to obtain multicomponent single crystals.
Keywords: curcumin, HSPs, prediction, solvates, solubility
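The RED ranking step can be sketched compactly: the Hansen distance Ra between the API and each solvent is computed from the three partial parameters and divided by the interaction radius R0. The Python sketch below uses the standard formula; the HSP values and radius shown are illustrative assumptions, not the values determined in the study.

```python
import math

def hansen_ra(solute, solvent):
    """Hansen distance Ra between two (deltaD, deltaP, deltaH) triplets, MPa^0.5."""
    dD1, dP1, dH1 = solute
    dD2, dP2, dH2 = solvent
    return math.sqrt(4.0 * (dD1 - dD2) ** 2 + (dP1 - dP2) ** 2 + (dH1 - dH2) ** 2)

def rank_solvents(solute_hsp, r0, solvents):
    """Rank solvents by RED = Ra / R0; RED < 1 suggests good affinity."""
    scored = [(name, hansen_ra(solute_hsp, hsp) / r0) for name, hsp in solvents.items()]
    return sorted(scored, key=lambda pair: pair[1])

if __name__ == "__main__":
    # Illustrative numbers only; not the HSPs or interaction radius from the study.
    curcumin_hsp, r0 = (19.0, 10.0, 12.0), 8.0
    solvents = {
        "DMSO": (18.4, 16.4, 10.2),
        "THF": (16.8, 5.7, 8.0),
        "1,4-Dioxane": (17.5, 1.8, 9.0),
        "Acetonitrile": (15.3, 18.0, 6.1),
    }
    for name, red in rank_solvents(curcumin_hsp, r0, solvents):
        print(f"{name:12s} RED = {red:.2f}")
```

The same distance calculation, repeated over the fifty-solvent dataset and combined with the CCDC scores, is what produces the consensus ranking described above.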
Procedia PDF Downloads 63