Search results for: quantitative methods
16373 Regression Model Evaluation on Depth Camera Data for Gaze Estimation
Authors: James Purnama, Riri Fitri Sari
Abstract:
We investigate the machine learning algorithm selection problem in the context of depth-image-based eye gaze estimation, with respect to the essential difficulty of reducing the number of required training samples and the duration of training. Statistics-based prediction accuracy measures are increasingly used to assess and evaluate prediction or estimation in gaze estimation. This article uses Root Mean Squared Error (RMSE) and R-squared statistical analysis to assess machine learning methods on depth camera data for gaze estimation. Four machine learning methods were evaluated: Random Forest Regression, Regression Tree, Support Vector Machine (SVM), and Linear Regression. The experimental results show that Random Forest Regression has the lowest RMSE and the highest R-squared, making it the best of the evaluated methods.
Keywords: gaze estimation, gaze tracking, eye tracking, kinect, regression model, orange python
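The abstract leaves the comparison procedure implicit; a minimal sketch of evaluating the same four regressors by RMSE and R-squared can be written with scikit-learn. The synthetic feature matrix below is an assumption standing in for the Kinect depth features used in the paper, and the Orange toolkit mentioned in the keywords is not required:

```python
# Compare four regressors by RMSE and R-squared, as in the study.
# The data here are synthetic stand-ins, not the authors' depth-camera set.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 6))                            # stand-in depth features
y = X @ rng.uniform(size=6) + 0.1 * rng.normal(size=500)  # stand-in gaze target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
models = {
    "Random Forest": RandomForestRegressor(random_state=0),
    "Regression Tree": DecisionTreeRegressor(random_state=0),
    "SVM": SVR(),
    "Linear Regression": LinearRegression(),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(f"{name}: RMSE={rmse:.3f}, R2={r2_score(y_te, pred):.3f}")
```

The method with the lowest RMSE and highest R-squared on held-out data would be preferred, which is the criterion the paper applies.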
Procedia PDF Downloads 536
16372 Clinical and Epidemiological Profile of Patients with Chronic Obstructive Pulmonary Disease in a Medical Institution from the City of Medellin, Colombia
Authors: Camilo Andres Agudelo-Velez, Lina María Martinez-Sanchez, Natalia Perilla-Hernandez, Maria De Los Angeles Rodriguez-Gazquez, Felipe Hernandez-Restrepo, Dayana Andrea Quintero-Moreno, Camilo Ruiz-Mejia, Isabel Cristina Ortiz-Trujillo, Monica Maria Zuluaga-Quintero
Abstract:
Chronic obstructive pulmonary disease is a common condition, characterized by a persistent, partially reversible, and progressive blockage of airflow, that represents 5% of total deaths around the world and is expected to become the third leading cause of death by 2030. Objective: To establish the clinical and epidemiological profile of patients with chronic obstructive pulmonary disease in a medical institution in the city of Medellin, Colombia. Methods: A cross-sectional study was performed with a sample of 50 patients with a diagnosis of chronic obstructive pulmonary disease in a private institution in Medellin during 2015. The software SPSS vr. 20 was used for the statistical analysis. For the quantitative variables, averages, standard deviations, and maximum and minimum values were calculated, while for ordinal and nominal qualitative variables, proportions were estimated. Results: The average age was 73.5±9.3 years, 52% of the patients were women, 50% of them were retired, 46% were married, and 80% lived in the city of Medellín. The mean time since diagnosis was 7.8±1.3 years, and 100% of the patients were treated at the internal medicine service. The most common clinical features were: 36% were classified as class D for the disease, 34% had an FEV1 <30%, 88% had a history of smoking, and 52% had oxygen therapy at home. Conclusion: It was found that class D was the most common, and the majority of the patients had a history of smoking, indicating the need to strengthen promotion and prevention strategies in this regard.
Keywords: pulmonary disease, chronic obstructive, pulmonary medicine, oxygen inhalation therapy
Procedia PDF Downloads 441
16371 Project Progress Prediction in Software Development Integrating Time Prediction Algorithms and Large Language Modeling
Authors: Dong Wu, Michael Grenn
Abstract:
Managing software projects effectively is crucial for meeting deadlines, ensuring quality, and managing resources well. Traditional methods often struggle with predicting project timelines accurately due to uncertain schedules and complex data. This study addresses these challenges by combining time prediction algorithms with Large Language Models (LLMs). It makes use of real-world software project data to construct and validate a model. The model takes detailed project progress data, such as task completion dynamics, team interaction, and development metrics, as its input and outputs predictions of project timelines. To evaluate the effectiveness of this model, a comprehensive methodology is employed, involving simulations and practical applications in a variety of real-world software project scenarios. This multifaceted evaluation strategy is designed to validate the model's significant role in enhancing forecast accuracy and elevating overall management efficiency, particularly in complex software project environments. The results indicate that the integration of time prediction algorithms with LLMs has the potential to optimize software project progress management. These quantitative results suggest the effectiveness of the method in practical applications. In conclusion, this study demonstrates that integrating time prediction algorithms with LLMs can significantly improve the predictive accuracy and efficiency of software project management. This offers an advanced project management tool for the industry, with the potential to improve operational efficiency, optimize resource allocation, and ensure timely project completion.
Keywords: software project management, time prediction algorithms, large language models (LLMs), forecast accuracy, project progress prediction
Procedia PDF Downloads 76
16370 Performance Evaluation of Various Segmentation Techniques on MRI of Brain Tissue
Authors: U.V. Suryawanshi, S.S. Chowhan, U.V. Kulkarni
Abstract:
Accuracy of segmentation methods is of great importance in brain image analysis. Tissue classification in Magnetic Resonance brain images (MRI) is an important issue in the analysis of several brain dementias. This paper portrays the performance of segmentation techniques that are used on brain MRI. A large variety of algorithms for segmentation of brain MRI has been developed. The objective of this paper is to perform a segmentation process on MR images of the human brain using Fuzzy c-means (FCM), Kernel-based Fuzzy c-means clustering (KFCM), Spatial Fuzzy c-means (SFCM), and Improved Fuzzy c-means (IFCM). The review covers imaging modalities, MRI, methods for noise reduction, and segmentation approaches. All methods are applied to MRI brain images degraded by salt-and-pepper noise; the results demonstrate that the IFCM algorithm is more robust to noise than the standard FCM algorithm. We conclude with a discussion of the trend of future research in brain segmentation and changing norms in IFCM for better results.
Keywords: image segmentation, preprocessing, MRI, FCM, KFCM, SFCM, IFCM
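For readers unfamiliar with the baseline method, a minimal sketch of standard FCM (the algorithm the paper's variants extend) can be written in a few lines of NumPy. The 1-D toy "intensity" array below is an assumption standing in for real MRI voxel intensities:

```python
# Minimal standard Fuzzy c-means on 1-D intensities (toy stand-in for MRI).
import numpy as np

def fcm(x, c=3, m=2.0, n_iter=100):
    """Cluster 1-D intensities x into c fuzzy clusters with fuzzifier m."""
    centers = np.quantile(x, np.linspace(0.1, 0.9, c))  # spread initial centroids
    for _ in range(n_iter):
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12  # point-center distances
        u = 1.0 / d ** (2 / (m - 1))                       # unnormalized memberships
        u /= u.sum(axis=1, keepdims=True)                  # rows sum to 1
        um = u ** m
        centers = um.T @ x / um.sum(axis=0)                # membership-weighted means
    return centers, u

# Toy "image" drawn from three tissue-like intensity modes
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(mu, 0.05, 200) for mu in (0.2, 0.5, 0.8)])
centers, u = fcm(x)
print(np.sort(centers))  # three centroids near 0.2, 0.5, 0.8
```

KFCM, SFCM, and IFCM modify the distance term or membership update (kernelized distances, spatial neighborhood weighting, noise compensation), which is where the robustness to salt-and-pepper noise reported above comes from.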
Procedia PDF Downloads 330
16369 Chromatographic Preparation and Performance on Zinc Ion Imprinted Monolithic Column and Its Adsorption Property
Authors: X. Han, S. Duan, C. Liu, C. Zhou, W. Zhu, L. Kong
Abstract:
The ionic imprinting technique refers to a three-dimensional rigid structure with fixed pore sizes, formed by the binding interactions of ions and functional monomers, with the ions used as the template; it has a high level of recognition for the ionic template. To prepare a monolithic column by in-situ polymerization, the template, functional monomers, cross-linking agent, and initiating agent are dissolved in solution and injected into the column tube; the mixture then polymerizes at a certain temperature, and after the synthetic reaction the unreacted template and solvent are washed out. Monolithic columns are easy to prepare, low in consumption, and cost-effective, with fast mass transfer; besides, they have many chemical functions. But monolithic columns have some problems in practical application, such as low efficiency; quantitative analysis cannot be performed accurately because the peak shape is wide and shows tailing; the choice of polymerization systems is limited; and theoretical foundations are lacking. Thus the optimization of components and preparation methods is an important research direction. During the preparation of ion-imprinted monolithic columns, the pore-forming agent makes the polymer generate a porous structure, which influences the physical properties of the polymer; moreover, it directly decides the stability and selectivity of the polymerization reaction. The compounds generated in the pre-polymerization reaction directly decide the identification and screening capabilities of the imprinted polymer; thus the choice of pore-forming agent is critical in the preparation of imprinted monolithic columns. This article mainly focuses on how different pore-forming agents affect the zinc ion enrichment performance of a zinc ion imprinted monolithic column.
Keywords: high performance liquid chromatography (HPLC), ionic imprinting, monolithic column, pore-forming agent
Procedia PDF Downloads 212
16368 Evaluation of Public Library Adult Programs: Use of Servqual and Nippa Assessment Standards
Authors: Anna Ching-Yu Wong
Abstract:
This study aims to identify the quality and effectiveness of the adult programs provided by a public library using the ServQual method and the National Library Public Programs Assessment guidelines (NIPPA, June 2019). ServQual covers several variables, namely: tangibles, reliability, responsiveness, assurance, and empathy. NIPPA guidelines focus on program characteristics, particularly on the outcomes, i.e., the level of satisfaction of program participants. The population reached consisted of adults who participated in library adult programs at a small-town public library in Kansas. This study was designed as quantitative evaluative research which analyzed the quality and effectiveness of the library adult programs by analyzing the role of each factor based on ServQual and the NIPPA library program assessment guidelines. Data were collected from November 2019 to January 2020 using a questionnaire with a Likert scale, and the data obtained were analyzed in a descriptive quantitative manner. This research provides information about the quality and effectiveness of existing programs and can be used as input for developing strategies for future adult programs. Overall, the ServQual measurement indicates very good quality, but each variable still has areas that need improvement and emphasis: the tangibles variable in the indicators of the temperature and space of the meeting room; the reliability variable in the timely delivery of the programs; the responsiveness variable in the ability of the presenters to convey trust and confidence to participants; the assurance variable in the indicator of knowledge and skills of program presenters; and the empathy variable in the presenters' willingness to provide extra assistance. The result of the program outcomes measurement based on NIPPA guidelines is very positive. Over 96% of participants indicated that the programs were informative and fun. They learned new knowledge and new skills and would recommend the programs to their friends and families. They believed that, together, the library and participants build stronger and healthier communities.
Keywords: ServQual model, ServQual in public libraries, library program assessment, NIPPA library programs assessment
Procedia PDF Downloads 95
16367 Utilizing Extended Reality in Disaster Risk Reduction Education: A Scoping Review
Authors: Stefano Scippo, Damiana Luzzi, Stefano Cuomo, Maria Ranieri
Abstract:
Background: In response to the rise in natural disasters linked to climate change, numerous studies on Disaster Risk Reduction Education (DRRE) have emerged since the '90s, mainly using a didactic transmission-based approach. Effective DRRE should align with an interactive, experiential, and participatory educational model, which can be costly and risky. A potential solution is using simulations facilitated by eXtended Reality (XR). Research Question: This study aims to conduct a scoping review to explore educational methodologies that use XR to enhance knowledge among teachers, students, and citizens about environmental risks, natural disasters (including climate-related ones), and their management. Method: A search string of 66 keywords was formulated, spanning three domains: 1) education and target audience, 2) environment and natural hazards, and 3) technologies. On June 21st, 2023, the search string was used across five databases: EBSCOhost, IEEE Xplore, PubMed, Scopus, and Web of Science. After deduplication and removing papers without abstracts, 2,152 abstracts (published between 2013 and 2023) were analyzed and 2,062 papers were excluded, followed by the exclusion of 56 papers after full-text scrutiny. Excluded studies focused on unrelated technologies, non-environmental risks, and lacked educational outcomes or accessible texts. Main Results: The 34 reviewed papers were analyzed for context, risk type, research methodology, learning objectives, XR technology use, outcomes, and educational affordances of XR. Notably, since 2016, there has been a rise in scientific publications, focusing mainly on seismic events (12 studies) and floods (9), with a significant contribution from Asia (18 publications), particularly Japan (7 studies). Methodologically, the studies were categorized into empirical (26) and non-empirical (8). 
Empirical studies involved user or expert validation of XR tools, while non-empirical studies included systematic reviews and theoretical proposals without experimental validation. Empirical studies were further classified into quantitative, qualitative, or mixed-method approaches. Six qualitative studies involved small groups of users or experts, while 20 quantitative or mixed-method studies used seven different research designs, with most (17) employing a quasi-experimental, one-group post-test design, focusing on XR technology usability over educational effectiveness. Non-experimental studies had methodological limitations, making their results hypothetical and in need of further empirical validation. Educationally, the learning objectives centered on knowledge and skills for surviving natural disaster emergencies. All studies recommended XR technologies for simulations or serious games but did not develop comprehensive educational frameworks around these tools. XR-based tools showed potential superiority over traditional methods in teaching risk and emergency management skills. However, conclusions were more valid in studies with experimental designs; otherwise, they remained hypothetical without empirical evidence. The educational affordances of XR, mainly user engagement, were confirmed by the studies. Authors' Conclusions: The analyzed literature lacks specific educational frameworks for XR in DRRE, focusing mainly on survival knowledge and skills. There is a need to expand educational approaches to include uncertainty education, developing competencies that encompass knowledge, skills, and attitudes such as risk perception.
Keywords: disaster risk reduction education, educational technologies, scoping review, XR technologies
Procedia PDF Downloads 23
16366 Estimation and Comparison of Delay at Signalized Intersections Based on Existing Methods
Authors: Arpita Saha, Satish Chandra, Indrajit Ghosh
Abstract:
Delay refers to the time lost by a traveler while crossing an intersection. Efficiency of traffic operation at signalized intersections is assessed in terms of the delay caused to an individual vehicle. The Highway Capacity Manual (HCM) method and Webster's method are the most widely used in India for delay estimation. However, in India, traffic is highly heterogeneous in nature with extremely poor lane discipline. Therefore, to explore the best delay estimation technique for Indian conditions, a comparison was made. In this study, seven signalized intersections from three different cities were chosen. Data were collected during both morning and evening peak hours. Only undersaturated cycles were considered for this study. Delay was estimated based on the field data. With the help of Simpson's 1/3rd rule, the delay of undersaturated cycles was estimated by measuring the area under the curve of queue length versus cycle time. Moreover, the field-observed delay was compared with the delay estimated using the HCM, Webster, probabilistic, Taylor's expansion, and regression methods. The drawbacks of the existing delay estimation methods for use in Indian heterogeneous traffic conditions were figured out, and the best method was proposed. It was observed that direct estimation of delay using field-measured data is more accurate than the existing conventional and modified methods.
Keywords: delay estimation technique, field delay, heterogeneous traffic, signalised intersection
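The field-delay computation described above — integrating the queue-length curve over a cycle with Simpson's 1/3rd rule — can be sketched as follows. The sample queue counts below are invented for illustration; the paper's actual queue-length observations are not reproduced in the abstract:

```python
# Total vehicle delay over one undersaturated cycle as the area under the
# queue-length curve, integrated numerically with Simpson's 1/3rd rule.
def simpson_one_third(y, h):
    """Integrate equally spaced samples y with spacing h (len(y) must be odd)."""
    if len(y) % 2 == 0:
        raise ValueError("Simpson's 1/3rd rule needs an odd number of samples")
    total = y[0] + y[-1]
    total += 4 * sum(y[1:-1:2])   # odd-index ordinates, weight 4
    total += 2 * sum(y[2:-1:2])   # even-index interior ordinates, weight 2
    return h * total / 3

# Illustrative queue length (vehicles) sampled every 5 s across a 60 s cycle
queue = [0, 3, 6, 8, 9, 8, 6, 4, 2, 1, 0, 0, 0]
total_delay = simpson_one_third(queue, h=5)   # vehicle-seconds of delay
print(total_delay)
```

Dividing the integrated area by the number of vehicles served in the cycle would give the average delay per vehicle, the quantity the HCM and Webster formulas estimate analytically.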
Procedia PDF Downloads 298
16365 Enhancement of Mulberry Leaf Yield and Water Productivity in Eastern Dry Zone of Karnataka, India
Authors: Narayanappa Devakumar, Chengalappa Seenappa
Abstract:
Field experiments were conducted during rabi 2013 and summer 2014 at the College of Sericulture, Chintamani, Chickaballapur district, Karnataka, India, to find out the response of mulberry to different methods and levels of irrigation and mulching. The results showed that the leaf yield and water productivity of mulberry were significantly influenced by the different methods and levels of irrigation and mulching. Subsurface drip with a lower level of irrigation at 0.8 CPE (Cumulative Pan Evaporation) recorded higher leaf yield and water productivity (42857 kg ha-1 yr-1 and 364.41 kg ha-cm-1) than surface drip with a higher level of irrigation at 1.0 CPE (38809 kg ha-1 yr-1 and 264.10 kg ha-cm-1) and micro spray jet (39931 kg ha-1 yr-1 and 271.83 kg ha-cm-1). Further, subsurface drip required the minimum water to produce one kg of leaf and to earn one rupee of profit (283 L and 113 L) compared to the surface drip (390 L and 156 L) and micro spray jet (379 L and 152 L) irrigation methods. Mulberry leaf yield increased and water productivity decreased with increased levels of irrigation. However, these results indicated that irrigation of mulberry with subsurface drip increased leaf yield and water productivity while saving 20% of irrigation water compared to the surface drip and micro spray jet irrigation methods in the Eastern Dry Zone (EDZ) of Karnataka.
Keywords: cumulative pan evaporation, mulberry, subsurface drip irrigation, water productivity
Procedia PDF Downloads 279
16364 Capacity Enhancement for Agricultural Workers in Mangosteen Product
Authors: Cholpassorn Sitthiwarongchai, Chutikarn Sriviboon
Abstract:
The two primary objectives of this research were (1) to examine the current knowledge and actual circumstances of agricultural workers regarding mangosteen product processing; and (2) to analyze and evaluate ways to develop the capacity for mangosteen product processing. The population of this study was 15,125 people who work in the agricultural sector, in this context mangosteen production, in the eastern part of Thailand, comprising Chantaburi Province, Rayong Province, Trad Province, and Pracheenburi Province. The sample size, based on Yamane's calculation with 95% reliability, was therefore 392 samples. A mixed method was employed, including a questionnaire and a focus group discussion using the Connoisseurship Model, in order to collect quantitative and qualitative data. Key informants in the focus group included agricultural business owners, academics in agro-food processing, local academics, local community development staff, the OTOP subcommittee, and representatives of agro-processing industry professional organizations. The study found that the majority of the respondents agreed at a high level (on a five-point rating scale) with most of the variables of knowledge management in agro-food processing. The results on the current knowledge and actual circumstances of agricultural workers in mangosteen product processing revealed that, mostly, the respondents agreed at a high level on establishing 7 variables. The guideline for developing the body of knowledge in order to enhance the capacity of the agricultural workers in mangosteen product processing was delivered in the focus group discussion. The discussion finally contributed to an idea to produce manuals for mangosteen product processing methods, with 4 products chosen: (1) mangosteen soap, (2) mangosteen juice, (3) mangosteen toffee, and (4) mangosteen preserves or jam.
Keywords: capacity enhancement, agricultural workers, mangosteen product processing, marketing management
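Yamane's sample-size formula mentioned above is n = N / (1 + N·e²), with population N and margin of error e (e = 0.05 at 95% confidence). A quick sketch — note the formula yields approximately 390 for N = 15,125, close to the 392 samples the study reports (researchers often round up or add a small buffer):

```python
# Yamane's simplified sample-size formula: n = N / (1 + N * e^2).
import math

def yamane(N, e=0.05):
    """Sample size for population N at margin of error e, rounded up."""
    return math.ceil(N / (1 + N * e ** 2))

print(yamane(15125))  # approximately 390 for this study's population
```

As N grows, the formula approaches 1/e² (= 400 at e = 0.05), which is why large agricultural populations like this one all yield samples of roughly that size.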
Procedia PDF Downloads 211
16363 Socio-Economic and Environmental Impact of Urban Sprawl: A Case Study Adigrat City, Tigray, Ethiopia
Authors: Fikre Belay Tekulu
Abstract:
This thesis presents the socio-economic and environmental impacts of urban sprawl in the case of Adigrat city, Tigray Region, Ethiopia. The main objective of this research is to assess the major causes, trends, and socio-economic and environmental impacts of the urban sprawl of Adigrat city. The study employed both quantitative and qualitative methods, with questionnaires, interviews, and observation used for data collection. Simple random sampling was used to select the participants. The land use and land cover change analysis for agricultural land, forest, and grassland resources was done with the aid of GIS. Urban sprawl is mainly caused by rapid population growth; the increase in living and property costs in the core of the city; land demand and land speculation; the growth of transport; and an increase in people's income and demand for more living space. The study indicates that 15726.24 hectares (515.49 per cent) of new land were added to the city jurisdiction from the adjacent Gantafeshum Wereda between 1986 and 2018. The population of Adigrat city increased by 9.045 per cent per year, while the city expanded by 16.01 per cent per annum, and the LCR was 0.0233 hectares per person between 1986 and 2018. The built-up area increased by 35.27 per cent per annum, while agricultural land and forest and grassland cover decreased by 1.68 per cent and 1.26 per cent per annum, respectively, over the last thirty-three years. This rapid growth of urban sprawl brought socio-economic and environmental change in the city that has been observed by the city residents. Therefore, the city administration needs strong, integrated, effective, and efficient work with its neighboring rural area, as well as timely preparation, implementation, supervision, and evaluation of the structural plan of the city, to bring about sustainable development of the city.
Keywords: causes, trends, urban sprawl, land use land cover, GIS
Procedia PDF Downloads 134
16362 The Use of Software and Internet Search Engines to Develop the Encoding and Decoding Skills of a Dyslexic Learner: A Case Study
Authors: Rabih Joseph Nabhan
Abstract:
This case study explores the impact of two major computer software programs, Learn to Speak English and Learn English Spelling and Pronunciation, and some Internet search engines such as Google, on mending the decoding and spelling deficiency of Simon X, a dyslexic student. The improvement in decoding and spelling may result in better reading comprehension and composition writing. Some computer programs and Internet materials can help regain the missing awareness and consequently restore his self-confidence and self-esteem. In addition, this study provides a systematic plan comprising a set of activities (four computer programs and Internet materials) which address the problem from the lowest to the highest levels of phoneme and phonological awareness. Four methods of data collection (accounts, observations, published tests, and interviews) create the triangulation needed to validly and reliably collect data before the plan, during the plan, and after the plan. The data collected are analyzed quantitatively, qualitatively, or through a combination of both. Tables and figures are utilized to provide a clear and uncomplicated illustration of some data. The improvement in the decoding, spelling, reading comprehension, and composition writing skills that occurred is demonstrated through authentic materials produced by the student under study. Such materials are a comparison between two sample passages written by the learner before and after the plan, a genuine computer chat conversation, and the scores of the academic year that followed the execution of the plan. Based on these results, the researcher recommends further studies on other Lebanese dyslexic learners using the computer to mend their language problems, in order to design a highly reliable software program that can address this disability more efficiently and successfully.
Keywords: analysis, awareness, dyslexic, software
Procedia PDF Downloads 22216361 Identity of Cultural Food: A Case Study of Traditional Mon Cuisine in Bangkok, Thailand
Authors: Saruda Nitiworakarn
Abstract:
This research aims to identify traditional Mon cuisines as well as gather and classify traditional cuisines of Mon communities in Bangkok. The research employs a quantitative methodology, using a questionnaire to collect information from a sample of 450 persons; the data were analyzed via frequency, percentage, and mean value. The results showed that the variety of traditional Mon cuisines of Bangkok could be split into 6 categories of meat dishes with 54 items and 6 categories of desserts with 19 items.
Keywords: cultural identity, traditional food, Mon cuisine, Thailand
Procedia PDF Downloads 30516360 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the most widely used methods of authenticating legal users of computers and defending against attackers. There are many different ways to authenticate users of a system, and many password cracking methods have also been developed. This paper mainly proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
Keywords: GPGPU, password cracking, secret key, user authentication
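The core brute-force idea can be illustrated with a tiny CPU-only sketch: enumerate candidate strings and compare their hashes against a target hash. A GPGPU implementation parallelizes this same loop across thousands of threads; the tiny alphabet and SHA-256 choice below are assumptions for illustration, not the paper's configuration:

```python
# Sequential brute-force password recovery from a hash (CPU illustration
# of the search a GPGPU system would run in parallel).
import hashlib
from itertools import product

def crack(target_hash, alphabet="abc", max_len=4):
    """Try every string over `alphabet` up to max_len; return the match or None."""
    for length in range(1, max_len + 1):
        for combo in product(alphabet, repeat=length):
            candidate = "".join(combo)
            if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None

secret = hashlib.sha256(b"cab").hexdigest()
print(crack(secret))  # recovers "cab"
```

The search space grows as |alphabet|^length, which is exactly why GPU parallelism (and, defensively, slow salted hashes) matters: each extra character multiplies the work by the alphabet size.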
Procedia PDF Downloads 28916359 Influence of Non-Carcinogenic Risk on Public Health
Authors: Gulmira Umarova
Abstract:
Data on the assessment of the influence of environmental risk on the health of the population of Uralsk, in the western region of Kazakhstan, are presented. Calculation of non-carcinogenic risks was performed for air pollutants such as sulfur dioxide, nitrogen oxides, hydrogen sulfide, and carbon monoxide, taking into account the critical organs and systems affected by the above-mentioned substances. Indicators of primary and general morbidity by disease class among the population were also considered. The quantitative risk of the influence of the substances on organs and systems was established from the calculation results.
Keywords: environment, health, morbidity, non-carcinogenic risk
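The abstract does not give its formulas, but the conventional approach to non-carcinogenic risk screening is the hazard quotient HQ = C / RfC (exposure concentration over reference concentration) and its sum across pollutants, the hazard index HI — so the following is a hedged sketch of that standard method, with invented concentrations and reference values:

```python
# Conventional non-carcinogenic risk screening: hazard quotients and index.
# All concentrations below are illustrative, not the study's Uralsk data.
def hazard_quotient(concentration, reference_concentration):
    """HQ = exposure concentration / reference concentration (same units)."""
    return concentration / reference_concentration

# (ambient concentration, reference concentration), both in mg/m^3 (invented)
pollutants = {
    "SO2": (0.04, 0.05),
    "NO2": (0.06, 0.04),
    "H2S": (0.004, 0.002),
    "CO":  (2.0, 3.0),
}
hq = {name: hazard_quotient(c, rfc) for name, (c, rfc) in pollutants.items()}
hi = sum(hq.values())   # hazard index: HI > 1 flags potential concern
for name, q in hq.items():
    print(f"{name}: HQ = {q:.2f}")
print(f"HI = {hi:.2f}")
```

A pollutant with HQ above 1 (here the invented H2S value) would be the one driving attention toward its critical organ system, which matches the abstract's emphasis on linking substances to affected organs.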
Procedia PDF Downloads 11916358 The Role of Principals’ Emotional Intelligence on School Leadership Effectiveness
Authors: Daniel Gebreslassie Mekonnen
Abstract:
Effective leadership has a crucial role in excelling in the overall success of a school. Today there is much attention given to school leadership, without which schools can never be successful. Therefore, the study was aimed at investigating the role of principals’ leadership styles and their emotional intelligence on the work motivation and job performance of teachers in Addis Ababa, Ethiopia. The study, thus, first examined the relationship between work motivation and job performance of the teachers in relation to the perceived leadership styles and emotional intelligence of principals. Second, it assessed the mean differences and the interaction effects of the principals’ leadership styles and emotional intelligence on the work motivation and job performance of the teachers. Finally, the study investigated whether principals’ leadership styles and emotional intelligence variables had significantly predicted the work motivation and job performance of teachers. As a means, a quantitative approach and descriptive research design were employed to conduct the study. Three hundred sixteen teachers were selected using multistage sampling techniques as participants of the study from the eight sub-cities in Addis Ababa. The main data-gathering instruments used in this study were the path-goal leadership questionnaire, emotional competence inventory, multidimensional work motivation scale, and job performance appraisal scale. The quantitative data were analyzed by using the statistical techniques of Pearson–product-moment correlation analysis, two-way analysis of variance, and stepwise multiple regression analysis. Major findings of the study have revealed that the work motivation and job performance of the teachers were significantly correlated with the perceived participative leadership style, achievement-oriented leadership style, and emotional intelligence of principals. 
Moreover, the emotional intelligence of the principals was found to be the best predictor of the teachers’ work motivation, whereas the achievement-oriented leadership style of the principals was identified as the best predictor of the job performance of the teachers. Furthermore, the interaction effects of all four path-goal leadership styles vis-à-vis the emotional intelligence of the principals have shown differential effects on the work motivation and job performance of teachers. Thus, it is reasonable to conclude that emotional intelligence is the sine qua non of effective school leadership. Hence, this study would be useful for policymakers and educational leaders to come up with policies that would enhance the role of emotional intelligence in school leadership effectiveness. Finally, pertinent recommendations were drawn from the findings and the conclusions of the study.
Keywords: emotional intelligence, leadership style, job performance, work motivation
Procedia PDF Downloads 95
16357 Finite Element and Split Bregman Methods for Solving a Family of Optimal Control Problem with Partial Differential Equation Constraint
Authors: Mahmoud Lot
Abstract:
In this article, we discuss the solution of an elliptic optimal control problem. First, by using the finite element method, we obtain the discrete form of the problem. The obtained discrete problem is actually a large-scale constrained optimization problem. Solving this optimization problem with traditional methods is difficult and requires a lot of CPU time and memory. But the split Bregman method converts the constrained problem to an unconstrained one, and hence it saves time and memory. We then use the split Bregman method for solving this problem, and examples show the speed and accuracy of split Bregman methods for solving these types of problems. We also use the SQP method for solving the examples and compare it with the split Bregman method.
Keywords: split Bregman method, optimal control with elliptic partial differential equation constraint, finite element method
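The abstract does not reproduce the update formulas; for reference, the standard split Bregman iteration for a problem of the form min_u ||Φ(u)||₁ + H(u) is sketched below. This is the generic form of the method, not necessarily the authors' exact formulation for the elliptic control problem:

```latex
\begin{align*}
u^{k+1} &= \arg\min_{u}\; H(u) + \frac{\lambda}{2}\,\bigl\lVert d^{k} - \Phi(u) - b^{k} \bigr\rVert_2^2 \\
d^{k+1} &= \operatorname{shrink}\!\left(\Phi(u^{k+1}) + b^{k},\; 1/\lambda\right) \\
b^{k+1} &= b^{k} + \Phi(u^{k+1}) - d^{k+1}
\end{align*}
```

Here shrink(x, γ) = sign(x)·max(|x| − γ, 0) is applied componentwise; the auxiliary variable d decouples the nonsmooth term, which is why each subproblem is unconstrained and cheap compared with solving the coupled constrained problem directly.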
Procedia PDF Downloads 150
16356 Understanding Tacit Knowledge and Its Role in Military Organizations: Methods of Managing Tacit Knowledge
Authors: M. Erhan Orhan, Onur Ozdemir
Abstract:
The expansion of the area of operations and the increasing diversity of threats have forced military organizations to change in many ways. However, tacit knowledge is still the most fundamental component of organizational knowledge, since it is human-oriented and in warfare the human stands at the core of the organization. Therefore, military organizations should find effective ways of systematically utilizing tacit knowledge. In this context, this article suggests some methods for turning tacit knowledge into explicit knowledge in military organizations.
Keywords: tacit knowledge, military, knowledge management, warfare, technology
Procedia PDF Downloads 486
16355 Assessments of Some Environment Variables on Fisheries at Two Levels: Global and FAO Major Fishing Areas
Authors: Hyelim Park, Juan Martin Zorrilla
Abstract:
Climate change influences ocean ecosystem functioning widely and in various ways. Its consequences for marine ecosystems include an increase in temperature and irregular behavior of some solute concentrations, and these changes can affect fisheries catches in several ways. Our aim is to assess the quantitative change in the contribution of fishery catches over time and to express it through four environment variables: Sea Surface Temperature (SST4) and the concentrations of Chlorophyll (CHL), Particulate Inorganic Carbon (PIC), and Particulate Organic Carbon (POC), at two spatial scales: Global and the nineteen FAO Major Fishing Areas divisions. Data collection was based on the FAO FishStatJ 2014 database as well as MODIS Aqua satellite observations from 2002 to 2012; some data had to be corrected and interpolated using existing methods. As a result, a multivariable regression model for average global fisheries captures contained the temporal mean of SST4 and the standard deviations of SST4, CHL, and PIC. A global vector auto-regressive (VAR) model showed that SST4 was a statistical (Granger) cause of global fishery capture. To accommodate varying fishery conditions and the influence of climate change variables, a model was constructed for each FAO Major Fishing Area. From the management perspective, some limitations of the FAO marine area division should be recognized, which opens the possibility of discussing the subdivision of the areas into smaller units. Furthermore, the contribution changes of fishery species, and the possible environmental factors for specific species, should be treated at various scale levels.
Keywords: fisheries-catch, FAO FishStatJ, MODIS Aqua, sea surface temperature (SST), chlorophyll, particulate inorganic carbon (PIC), particulate organic carbon (POC), VAR, granger causality
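The VAR-based causality finding rests on a standard lag-regression comparison: a restricted autoregression of the target series is tested against an unrestricted one that adds lags of the candidate cause. This minimal numpy sketch uses synthetic data; the variable names and the data-generating process are assumptions for illustration, not the FishStatJ data:

```python
import numpy as np

def lagged(series, p):
    # Lag matrix [x_{t-1}, ..., x_{t-p}] aligned with targets series[p:].
    return np.column_stack(
        [series[p - k - 1:len(series) - k - 1] for k in range(p)])

def granger_f(y, x, p=2):
    """F statistic for 'x Granger-causes y' with p lags."""
    Y = y[p:]
    X_r = np.column_stack([np.ones(len(Y)), lagged(y, p)])  # restricted
    X_u = np.column_stack([X_r, lagged(x, p)])              # + lags of x
    rss = lambda X: np.sum(
        (Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(X_r), rss(X_u)
    df = len(Y) - X_u.shape[1]
    return ((rss_r - rss_u) / p) / (rss_u / df)

rng = np.random.default_rng(1)
T = 300
sst = rng.standard_normal(T)        # stand-in temperature anomaly
catch = np.zeros(T)
for t in range(2, T):               # catch driven by lagged SST plus noise
    catch[t] = 0.6 * sst[t - 1] + 0.3 * rng.standard_normal()

f_forward = granger_f(catch, sst)   # should be large
f_backward = granger_f(sst, catch)  # should be near 1
```

A large forward F with a small backward F is the pattern behind the "SST4 Granger-causes global capture" conclusion.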
Procedia PDF Downloads 482
16354 Studying Second Language Development from a Complex Dynamic Systems Perspective
Authors: L. Freeborn
Abstract:
This paper discusses the application of complex dynamic systems theory (DST) to the study of individual differences in second language development. This transdisciplinary framework allows researchers to view the trajectory of language development as a dynamic, non-linear process. A DST approach views language as multi-componential, consisting of multiple complex systems and nested layers. These multiple components and systems continuously interact and influence each other at both the macro- and micro-levels. Dynamic systems theory aims to explain and describe the development of the language system rather than make predictions about its trajectory. Such a holistic and ecological approach to second language development allows researchers to include various research methods from neurological, cognitive, and social perspectives. A DST perspective would involve in-depth analyses as well as mixed-methods research. To illustrate, a neurobiological approach to second language development could include non-invasive neuroimaging techniques such as electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to investigate areas of brain activation during language-related tasks. A cognitive framework would further include behavioural research methods to assess the influence of intelligence and personality traits, as well as individual differences in foreign language aptitude, such as phonetic coding ability and working memory capacity. Exploring second language development from a DST approach would also benefit from including perspectives from the field of applied linguistics regarding the teaching context, second language input, and the role of affective factors such as motivation. In this way, applying mixed research methods from neurobiological, cognitive, and social approaches would enable researchers to have a more holistic view of the dynamic and complex processes of second language development.
Keywords: dynamic systems theory, mixed methods, research design, second language development
Procedia PDF Downloads 134
16353 A Comparative Study on Creep Modeling in Composites
Authors: Roham Rafiee, Behzad Mazhari
Abstract:
Composite structures, with their remarkable properties, have gained considerable popularity in the last few decades. Among all types, polymer matrix composites (PMCs) are used extensively due to their unique characteristics, including low weight, a convenient fabrication process, and low cost. Having a polymer as the matrix, these composites show different creep behavior compared to metals and even other types of composites, since most polymers undergo creep even at room temperature. One of the most challenging topics in creep is introducing new techniques for predicting the long-term creep behavior of materials; the appropriate method differs depending on the material being studied. Methods already proposed for predicting the long-term creep behavior of polymer matrix composites can be divided into five categories: (1) analytical modeling, (2) empirical modeling, (3) superposition-based (semi-empirical) modeling, (4) rheological modeling, and (5) finite element modeling. Each of these methods has individual characteristics. Studies have shown that none of the mentioned methods can predict the long-term creep behavior of all PMCs in all circumstances (loading, temperature, etc.), but each has its own priority in different situations; the reason lies in the theoretical basis of the methods. In this study, after a brief review of the background theory of each method, they are compared in terms of their applicability in predicting the long-term behavior of composite structures. Finally, the methods discussed are examined against experimental studies carried out by other researchers.
Keywords: creep, comparative study, modeling, composite materials
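The empirical modeling category can be illustrated with a Findley power-law fit, a common way to extrapolate long-term creep from short-term test data. This is a minimal sketch; the model constants and strain data below are synthetic assumptions, not results from the study:

```python
import numpy as np

# Findley power law: eps(t) = eps0 + m * t**n, an empirical model often
# used for long-term creep of polymer matrix composites.
t = np.array([1.0, 10.0, 100.0, 1000.0, 10000.0])   # test times, hours
eps0, m_true, n_true = 0.004, 5e-4, 0.25            # assumed constants
eps = eps0 + m_true * t ** n_true                   # synthetic strain data

# Linearize the time-dependent part:
#   log(eps - eps0) = log(m) + n * log(t)
# and fit it with ordinary least squares on the logs.
n_fit, log_m_fit = np.polyfit(np.log(t), np.log(eps - eps0), 1)
m_fit = np.exp(log_m_fit)

# Extrapolate to a service life far beyond the test window (30 years).
eps_30yr = eps0 + m_fit * (30 * 365 * 24) ** n_fit
```

The log-log linearization is what makes this a "hand method": two fitted constants summarize the whole short-term test, at the cost of assuming the power law stays valid far outside the measured range.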
Procedia PDF Downloads 440
16352 Solid Waste Management through Mushroom Cultivation: An Eco Friendly Approach
Authors: Mary Josephine
Abstract:
The waste of one process can be the input of another sector, reducing environmental pollution. Today, more and more solid waste is generated, but only a very small amount of it is recycled, so the threat that this environmental pressure poses to public health is very serious. The methods considered for the treatment of solid waste, such as biogas tanks or processing into animal feed and fertilizer, have not performed well. An alternative approach is growing mushrooms on waste residues. This is regarded as an environmentally friendly solution with potential economic benefit. Substrate producers do their best to produce quality substrate at low cost; apart from other methods, this can be achieved by employing biologically degradable wastes as the resource material component of the substrate. Mushroom growing is a significant tool for the restoration, replenishment, and remediation of Earth's overburdened ecosphere. One of the rational methods of waste utilization involves locally available wastes. The present study aims to determine the yield of mushrooms grown on freely obtained, locally available waste and to conserve our environment by recycling wastes.
Keywords: biodegradable, environment, mushroom, remediation
Procedia PDF Downloads 395
16351 Domain-Specific Languages Evaluation: A Literature Review and Experience Report
Authors: Sofia Meacham
Abstract:
In this paper, the evaluation of Domain-Specific Languages (DSLs) is presented, based on the existing literature and on years of experience developing DSLs for several domains. The domains we worked on ranged from AI, business applications, and finances/accounting to health. In general, DSLs have been utilised in many domains to provide tailored and efficient solutions to specific problems. Although they are a reputable method in highly technical circles and have also been used successfully by non-technical experts, to our knowledge there is no commonly accepted method for evaluating them. Some methods define criteria that are adaptations of general software engineering quality criteria. Other literature focuses on the usability aspect of DSL evaluation and applies methods such as Human-Computer Interaction (HCI) and goal modeling. These approaches are either hard to introduce, such as goal modeling, or seem to ignore the domain-specific focus of DSLs. From our experience, DSLs have domain-specificity at their core, and consequently the methods used to evaluate them should also have domain-specific criteria at their core. Such criteria require synergy between the domain experts and the DSL developers, in the same way that DSLs cannot be developed without the involvement of domain experts. Methods from agile and other software engineering practices, such as co-creation workshops, should be further emphasised and explored to facilitate this direction. Concluding, our latest experience and plans for DSL evaluation will be presented and opened for discussion.
Keywords: domain-specific languages, DSL evaluation, DSL usability, DSL quality metrics
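One way to make domain-specific criteria first-class in an evaluation is a weighted scoring scheme agreed with domain experts, in which domain criteria carry heavier weights than generic software-quality ones. The criteria, weights, and scores below are hypothetical examples sketched for illustration, not a method proposed in the paper:

```python
def weighted_score(scores, weights):
    """Weighted average of per-criterion scores (each on a 0-5 scale)."""
    total_weight = sum(weights[c] for c in scores)
    return sum(scores[c] * weights[c] for c in scores) / total_weight

# Hypothetical criteria: generic ones weighted 1.0, domain-specific
# ones weighted 2.0 and scored jointly with domain experts.
weights = {
    "learnability": 1.0,        # generic usability criterion
    "expressiveness": 1.0,      # generic language criterion
    "domain_coverage": 2.0,     # domain-specific criterion
    "expert_readability": 2.0,  # domain-specific criterion
}
scores = {"learnability": 4, "expressiveness": 3,
          "domain_coverage": 5, "expert_readability": 4}

overall = weighted_score(scores, weights)
```

The point of the weighting is procedural as much as numerical: choosing the domain criteria and their weights is itself the co-creation step between DSL developers and domain experts.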
Procedia PDF Downloads 101
16350 Continuous Improvement Model for Creative Industries Development
Authors: Rolandas Strazdas, Jurate Cerneviciute
Abstract:
Creative industries are defined as those industries which produce tangible or intangible artistic and creative output and have a potential for income generation by exploiting cultural assets and producing knowledge-based goods and services (both traditional and contemporary). With the emergence of an entire sector of creative industries, triggered by the development of creative products, managing creativity-based business processes becomes a critical issue. Diverse managerial practices and models on the effective management of creativity have been examined in scholarly literature. Even though these studies suggest how creativity in organisations can be nourished, they do not sufficiently relate the proposed practices to the underlying business processes. The article analyses a range of business process improvement methods such as PDCA, DMAIC, DMADV and TOC. The strengths and weaknesses of these methods, aimed to improve the innovation development process, are identified. Based on the analysis of the existing improvement methods, a continuous improvement model was developed and presented in the article.
Keywords: continuous improvement, creative industries, improvement model, process mapping
Procedia PDF Downloads 465
16349 Prediction of the Lateral Bearing Capacity of Short Piles in Clayey Soils Using Imperialist Competitive Algorithm-Based Artificial Neural Networks
Authors: Reza Dinarvand, Mahdi Sadeghian, Somaye Sadeghian
Abstract:
Prediction of the ultimate bearing capacity of piles (Qu) is one of the basic issues in geotechnical engineering. So far, several methods have been used to estimate Qu, including the recently developed artificial intelligence methods. In recent years, optimization algorithms such as ant colony algorithms, genetic algorithms, and imperialist competitive algorithms have been used to minimize artificial neural network errors. In the present research, artificial neural networks based on the imperialist competitive algorithm (ANN-ICA) were used, and their results were compared with other methods. The results of laboratory tests of short piles in clayey soils, with parameters such as pile diameter, buried pile length, load eccentricity, and undrained shear resistance of the soil, were used for modeling and evaluation. The results showed that ICA-based artificial neural networks predicted the lateral bearing capacity of short piles with a correlation coefficient of 0.9865 for training data and 0.975 for test data. Furthermore, the results of the model indicated the superiority of ICA-based artificial neural networks over back-propagation artificial neural networks as well as the Broms and Hansen methods.
Keywords: artificial neural network, clayey soil, imperialist competition algorithm, lateral bearing capacity, short pile
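The optimizer itself can be sketched in a few lines: the best "countries" rule empires, colonies are assimilated toward their imperialist, and occasional revolutions preserve diversity. This is a toy illustration on a simple quadratic cost; the population sizes, bounds, and cost function are assumptions, not the study's ANN-ICA configuration:

```python
import numpy as np

def ica_minimize(cost, dim, n_countries=60, n_imperialists=6,
                 n_iter=200, beta=2.0, seed=0):
    """Minimal imperialist competitive algorithm (ICA) sketch.

    Minimizes `cost` over the box [-5, 5]^dim. The best countries are
    kept unchanged each iteration (elitism), so the best-so-far solution
    never gets worse.
    """
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5, 5, (n_countries, dim))
    for _ in range(n_iter):
        costs = np.array([cost(c) for c in pop])
        order = np.argsort(costs)
        imps = pop[order[:n_imperialists]]   # best countries rule empires
        cols = pop[order[n_imperialists:]]   # the rest become colonies
        # Assimilation: each colony moves toward a randomly assigned
        # imperialist by a random fraction of the distance (up to beta).
        owners = rng.integers(0, n_imperialists, len(cols))
        step = beta * rng.random((len(cols), 1)) * (imps[owners] - cols)
        cols = np.clip(cols + step, -5, 5)
        # Revolution: a few colonies are relocated at random
        # (diversification, the ICA analogue of mutation).
        revolt = rng.random(len(cols)) < 0.1
        cols[revolt] = rng.uniform(-5, 5, (int(revolt.sum()), dim))
        pop = np.vstack([imps, cols])
    costs = np.array([cost(c) for c in pop])
    return pop[np.argmin(costs)]

# Toy cost with known minimum at (1, 1, 1).
best = ica_minimize(lambda x: np.sum((x - 1.0) ** 2), dim=3)
```

In the ANN-ICA setting of the abstract, the cost function would be the network's prediction error and each "country" a candidate weight vector, rather than this toy quadratic.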
Procedia PDF Downloads 151
16348 Pharmacophore-Based Modeling of a Series of Human Glutaminyl Cyclase Inhibitors to Identify Lead Molecules by Virtual Screening, Molecular Docking and Molecular Dynamics Simulation Study
Authors: Ankur Chaudhuri, Sibani Sen Chakraborty
Abstract:
In humans, glutaminyl cyclase activity is highly abundant in neuronal and secretory tissues and is preferentially restricted to the hypothalamus and pituitary. The N-terminal modification of β-amyloid (Aβ) peptides through the generation of pyroglutamate (pGlu)-modified Aβs (pE-Aβs) is an important process in the initiation of the formation of neurotoxic plaques in Alzheimer's disease (AD). This process is catalyzed by glutaminyl cyclase (QC). The expression of QC is characteristically up-regulated in the early stage of AD, and the hallmark of QC inhibition is the prevention of the formation of pE-Aβs and plaques. A computer-aided drug design (CADD) process was employed to guide the design of potentially active compounds and to understand their inhibitory potency against human glutaminyl cyclase (QC). This work elaborates the ligand-based and structure-based pharmacophore exploration of QC using the known inhibitors. Three-dimensional (3D) quantitative structure-activity relationship (QSAR) methods were applied to 154 compounds with known IC50 values. All the inhibitors were divided into two sets: a training set and a test set. The training set was used to build the quantitative pharmacophore model based on the principle of structural diversity, whereas the test set was employed to evaluate the predictive ability of the pharmacophore hypotheses. A chemical feature-based pharmacophore model was generated from the 92 training-set compounds with the HypoGen module implemented in the Discovery Studio 2017 R2 software package. The best hypothesis (Hypo1) was selected based on the highest correlation coefficient (0.8906), the lowest total cost (463.72), and the lowest root mean square deviation (2.24 Å). A higher correlation coefficient indicates greater predictive ability of the hypothesis, whereas a lower root mean square deviation signifies a smaller deviation of experimental activity from the predicted one.
The best pharmacophore model (Hypo1) of the candidate inhibitors comprised four features: two hydrogen bond acceptors, one hydrogen bond donor, and one hydrophobic feature. Hypo1 was validated by several parameters, such as test-set activity prediction, cost analysis, Fischer's randomization test, the leave-one-out method, and a ligand-profiler heat map. The predicted features were then used for virtual screening of potential compounds from the NCI, ASINEX, Maybridge, and Chembridge databases; more than seven million compounds were used for this purpose. The hit compounds were filtered by drug-likeness and pharmacokinetic properties, and the selected hits were docked to the high-resolution three-dimensional structure of the target protein glutaminyl cyclase (PDB ID: 2AFU/2AFW) to filter them further. To validate the molecular docking results, the most active compound from the dataset was selected as a reference molecule. From the density functional theory (DFT) study, ten molecules were selected based on their highest HOMO (highest occupied molecular orbital) energies and lowest band-gap values. Molecular dynamics simulations with explicit solvation of the final ten hit compounds revealed that a large number of non-covalent interactions were formed with the binding site of human glutaminyl cyclase. It is suggested that the hit compounds reported in this study could help in the future design of potent lead inhibitors against human glutaminyl cyclase.
Keywords: glutaminyl cyclase, hit lead, pharmacophore model, simulation
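The hypothesis-ranking statistics reported above, the correlation coefficient between experimental and predicted activities and their root mean square deviation, are straightforward to compute. A minimal sketch with synthetic activity values, not the study's dataset:

```python
import numpy as np

# Experimental vs. model-predicted activities (pIC50 scale); the values
# below are synthetic, for illustration only.
exp_pic50 = np.array([5.2, 6.1, 7.4, 4.8, 6.9, 5.5])
pred_pic50 = np.array([5.0, 6.3, 7.1, 5.1, 6.6, 5.8])

# Correlation coefficient between experiment and prediction.
r = np.corrcoef(exp_pic50, pred_pic50)[0, 1]

# Root mean square deviation of prediction from experiment.
rmsd = np.sqrt(np.mean((exp_pic50 - pred_pic50) ** 2))

print(f"r = {r:.3f}, RMSD = {rmsd:.3f}")
```

A hypothesis is preferred when r is close to 1 and the RMSD is small, which is exactly the selection rule applied to Hypo1 above.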
Procedia PDF Downloads 130
16347 Tourism as Benefactor to Peace amidst the Structural Conflict: An Exploratory Case Study of Nepal
Authors: Pranil Kumar Upadhayaya
Abstract:
While peace is a dividend to tourism, tourism can also be a vital force for world peace. The existing body of knowledge on the tripartite, complex nexus between tourism, peace, and conflict reveals that tourism is a benefactor to peace and sensitive to conflict. By contextualizing the ongoing sporadic structural conflict of the transitional phase in the aftermath of the decade-long (1996-2006) Maoist armed conflict in Nepal, the purpose of this study is to explore the potential of tourism in peace-building. The outcomes of this research are based on mixed methods (qualitative and quantitative). Though the armed conflict ended with the comprehensive peace agreement in 2006, there are constant manifestations of non-violent structural conflict, which continue to threaten the sustainability of the tourism industry. With the persistent application of coping strategies, tourism has proved resilient during the ongoing structural political conflict. The strong coping abilities of the private tourism sector have also intersected with peace-building efforts, with more reactive and less proactive (pro-peace) engagement. This paper examines the application of the 'theory of tourism security' by the Nepalese tourism industry while coping with conflict, reviving, and sustaining itself. It reveals that the multiple varieties of tourism at present have heterogeneous degrees of peace potential. The opportunities of 'peace through tourism' can be promoted subject to its molding with responsible, sustainable, and participatory characteristics.
This paper offers pragmatic policy recommendations for strengthening the position of tourism as a true peace-builder: (a) a broad shift from mainstream conventional tourism to community-based rural tourism with local participation and ownership, to fulfill Nepal's potential for peace, and (b) the building and application of managerial and operational codes of conduct for owners and workers (labor unions) at all tourism enterprises, and the strengthening of their practices.
Keywords: code of conduct, community based tourism, conflict, peace-building, tourism
Procedia PDF Downloads 264
16346 Business Intelligent to a Decision Support Tool for Green Entrepreneurship: Meso and Macro Regions
Authors: Anishur Rahman, Maria Areias, Diogo Simões, Ana Figeuiredo, Filipa Figueiredo, João Nunes
Abstract:
The circular economy (CE) has gained increased awareness among academics, businesses, and decision-makers, as it stimulates resource circularity in production and consumption systems. A large body of epistemological work has explored the principles of CE, but scant attention has focused on analysing how CE is evaluated, consented to, and enforced using economic metabolism data and a business intelligence framework. Economic metabolism involves the ongoing exchange of materials and energy within and across socio-economic systems and requires the assessment of vast amounts of data to provide quantitative analysis related to effective resource management. With this concern, the present work focuses on regional flows in a pilot region of Portugal. By addressing this gap, the study aims to promote eco-innovation and sustainability in the regions of the Intermunicipal Communities Região de Coimbra, Viseu Dão Lafões, and Beiras e Serra da Estrela, using these data to find precise synergies in terms of material flows and to give companies a competitive advantage in the form of valuable waste destinations, access to new resources and new markets, cost reduction, and risk-sharing benefits. In our work, emphasis is placed on applying artificial intelligence (AI) and, more specifically, on implementing state-of-the-art deep learning algorithms, contributing to the construction of a business intelligence approach. With the emergence of new approaches generally highlighted under the sub-heading of AI and machine learning (ML), the methods for statistical analysis of complex and uncertain production systems are facing significant changes. Therefore, various definitions of AI and its differences from traditional statistics are presented; ML is introduced to identify its place in data science and its differences from topics such as big data analytics, and the production problems for which AI and ML are suited are identified.
A lifecycle-based approach is then taken to analyse the use of different methods in each phase, to identify the most useful technologies and unifying attributes of AI in manufacturing. Most macroeconomic metabolism models are mainly directed at the context of large metropolises, neglecting rural territories; within this project, a dynamic decision support model coupled with artificial intelligence tools and information platforms will therefore be developed, focused on the reality of these transition zones between the rural and the urban. Thus, a real decision support tool is under development, which will surpass the scientific developments carried out to date and will allow overcoming limitations related to the availability and reliability of data.
Keywords: circular economy, artificial intelligence, economic metabolisms, machine learning
Procedia PDF Downloads 70
16345 Factors that Contribute to the Improvement of the Sense of Self-Efficacy of Special Educators in Inclusive Settings in Greece
Authors: Sotiria Tzivinikou, Dimitra Kagkara
Abstract:
A teacher's sense of self-efficacy can significantly affect both teacher and student performance. More specifically, teacher self-efficacy is associated with learning outcomes as well as students' motivation and self-efficacy. For example, teachers with a high sense of self-efficacy are more open to innovations and invest more effort in teaching. In addition, effective inclusive education is associated with higher levels of teacher self-efficacy. Pre-service teachers with high levels of self-efficacy can handle students' behavior better and assist students with special educational needs more effectively. Teacher preparation programs are also important, because teachers' efficacy beliefs are shaped early in learning; as a result, the quality of teacher education programs can affect the sense of self-efficacy of pre-service teachers. Usually, a number of pre-service teachers do not consider themselves well prepared to work with students with special educational needs and do not have the appropriate sense of self-efficacy. This study aims to investigate the factors that contribute to the improvement of the sense of self-efficacy of pre-service special educators through an academic practicum training program. The sample of this study is 159 pre-service special educators who participated in the academic practicum training program. Quantitative methods were used for data collection and analysis. Self-efficacy was assessed by the teachers themselves through a questionnaire based on the Teachers' Sense of Efficacy Scale, and pre- and post-measurements of teacher self-efficacy were taken. The results of the survey are consistent with the international literature: a significant number of pre-service special educators do not hold the appropriate sense of self-efficacy regarding teaching students with special educational needs. Moreover, a quality academic training program constitutes a crucial factor for the improvement of the sense of self-efficacy of pre-service special educators, as well as for the provision of high-quality inclusive education.
Keywords: inclusive education, pre-service, self-efficacy, training program
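A pre/post design of this kind is typically analysed with a paired comparison on the per-participant differences. A minimal stdlib sketch with synthetic scores (not the study's data) computing the paired t statistic and an effect size:

```python
import math

# Synthetic pre/post self-efficacy scores (e.g., 1-5 scale averages)
# for eight participants, for illustration only.
pre  = [3.1, 2.8, 3.5, 2.9, 3.2, 3.0, 2.7, 3.4]
post = [3.6, 3.2, 3.9, 3.1, 3.8, 3.5, 3.0, 3.7]

# Work on the per-participant differences, as the paired design requires.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = sum(diffs) / n
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))

t_stat = mean_d / (sd_d / math.sqrt(n))   # paired t, df = n - 1
cohens_d = mean_d / sd_d                  # standardized gain
```

The t statistic (compared against the t distribution with n-1 degrees of freedom) tests whether the mean gain differs from zero, while Cohen's d expresses how large that gain is relative to its variability.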
Procedia PDF Downloads 247
16344 A Comparison of Energy Calculations for a Single-Family Detached Home with Two Energy Simulation Methods
Authors: Amir Sattari
Abstract:
For newly produced houses and energy renovations, an energy calculation needs to be conducted. This is done to verify whether the energy consumption of the house is in line with the norms required to reach the energy targets for 2020 and 2050. The main purpose of this study is to confirm whether the easy-to-use energy calculation software or hand calculations used by small companies or individuals give logical results compared to the advanced energy simulation programs used by researchers or bigger companies. There are different methods for calculating energy consumption. In this paper, two energy calculation programs are used, and the relation of energy consumption to solar radiation is compared. A hand calculation is also done to validate whether hand calculations are still reasonable. The two computer programs used are TMF Energi (the easy energy calculation variant used by small companies or individuals) and IDA ICE - Indoor Climate and Energy (the advanced energy simulation program used by researchers or larger companies). The calculations are done for a standard house from the Swedish house supplier Fiskarhedenvillan. The method is based on having the same conditions and inputs in the different calculation forms so that the results can be compared and verified. The house was oriented differently to see how orientation affects energy consumption in the different methods. The results of the simulations are close to each other, and the hand calculation differs from the computer programs by only 5%. Even though solar factors differ with the orientation of the house, the energy calculation results from the different computer programs and even the hand calculation method are in line with each other.
Keywords: energy calculation, energy consumption, energy simulation, IDA ICE, TMF Energi
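A hand calculation of the kind compared here commonly follows the degree-hour method for transmission losses. This is a minimal sketch in which all envelope data and the climate value are assumed for illustration, not taken from the Fiskarhedenvillan house:

```python
# Degree-hour hand calculation of annual transmission heat demand.
# All U-values, areas, and the climate degree-hour figure below are
# illustrative assumptions.
U_values = {"walls": 0.18, "roof": 0.13, "floor": 0.15, "windows": 1.1}  # W/m2K
areas    = {"walls": 120.0, "roof": 90.0, "floor": 90.0, "windows": 20.0}  # m2
degree_hours = 100_000.0  # Kh/year, depends on climate and indoor set-point

# Transmission loss coefficient: sum of U*A over the building envelope.
UA = sum(U_values[k] * areas[k] for k in U_values)  # W/K

# Annual transmission heat demand, converted from Wh to kWh.
Q_transmission_kwh = UA * degree_hours / 1000.0
```

A full hand calculation would add ventilation and infiltration losses and subtract solar and internal gains; the simulation programs handle those terms (and the orientation-dependent solar factors discussed above) in much finer detail, which is why a ~5% discrepancy against them is considered reasonable.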
Procedia PDF Downloads 113