Search results for: expert testimony
143 Integrating Data Mining with Case-Based Reasoning for Diagnosing Sorghum Anthracnose
Authors: Mariamawit T. Belete
Abstract:
Cereal production and marketing are the means of livelihood for millions of households in Ethiopia. However, cereal production is constrained by technical and socio-economic factors. Among the technical factors, cereal crop diseases are major contributors to low yield. The aim of this research is to develop an integrated data mining and knowledge-based system for sorghum anthracnose disease diagnosis that assists agriculture experts and development agents in making timely decisions. The diagnosis system gathers information from the Melkassa Agricultural Research Center and scores anthracnose on a severity scale. Empirical research was designed for data exploration, modeling, and confirmatory procedures to test hypotheses and predictions and draw sound conclusions. WEKA (Waikato Environment for Knowledge Analysis) was employed for the modeling. Knowledge-based systems encompass a variety of approaches based on the knowledge representation method; case-based reasoning (CBR) is one of the most popular. CBR is a problem-solving strategy that uses previous cases to solve new problems. The system utilizes hidden knowledge extracted from a sampled anthracnose dataset by clustering, specifically the K-means algorithm. Clustered cases with centroid values are mapped to jCOLIBRI, and the integrator application was created using NetBeans with JDK 8.0.2. The main stages of a CBR model are retrieval (the similarity-measuring stage), reuse (in which a domain expert adapts the retrieved case solution to suit the current case), revision (testing the solution), and retention (storing the confirmed solution in the case base for future use). The system was evaluated for both performance and user acceptance. Seven test cases were used to test the prototype.
Experimental results show that the system achieves average precision and recall of 70% and 83%, respectively. User acceptance testing was also performed with five domain experts, yielding an average acceptance of 83%. Although these results are promising, further study of hybrid approaches, such as rule-based reasoning and a pictorial retrieval process, is recommended.
Keywords: sorghum anthracnose, data mining, case based reasoning, integration
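The retrieval stage described above, matching a new case against K-means cluster centroids by similarity, can be sketched in a few lines. The severity features, centroid values, and cluster labels below are illustrative placeholders, not figures from the Melkassa dataset:

```python
import math

# Hypothetical centroids from K-means over historical anthracnose cases.
# Feature order (assumed): [leaf-spot score, lesion size, stalk-rot score].
centroids = {
    "mild":     [1.0, 0.5, 0.2],
    "moderate": [3.0, 2.0, 1.0],
    "severe":   [5.0, 4.5, 3.5],
}

def euclidean(a, b):
    """Distance used as the (inverse) similarity measure."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve(query):
    """CBR retrieval step: label of the nearest cluster centroid."""
    return min(centroids, key=lambda label: euclidean(query, centroids[label]))

new_case = [4.8, 4.0, 3.0]
print(retrieve(new_case))  # → severe
```

In a full CBR cycle the retrieved cluster's stored solution would then be adapted (reuse), tested (revise), and stored back (retain); only the retrieval step is shown here.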
Procedia PDF Downloads 81
142 Systematic Mapping Study of Digitization and Analysis of Manufacturing Data
Authors: R. Clancy, M. Ahern, D. O’Sullivan, K. Bruton
Abstract:
The manufacturing industry is currently undergoing a digital transformation as part of the mega-trend Industry 4.0. As part of this phase of the industrial revolution, traditional manufacturing processes are being combined with digital technologies to achieve smarter and more efficient production. To successfully digitally transform a manufacturing facility, the processes must first be digitized: the conversion of information from an analogue format to a digital format. The objective of this study was to explore the research area of digitizing manufacturing data as part of the worldwide paradigm, Industry 4.0. The formal methodology of a systematic mapping study was utilized to capture a representative sample of the research area and assess its current state. Specific research questions were defined to assess the key benefits and limitations associated with the digitization of manufacturing data. Research papers were classified according to the type of research and type of contribution to the research area. Upon analyzing the 54 papers identified in this area, it was noted that 23 originated in Germany. This is unsurprising, as Industry 4.0 is originally a German strategy, supported by strong policy instruments to aid its implementation. It was also found that the Fraunhofer Institute for Mechatronic Systems Design, in collaboration with the University of Paderborn in Germany, was the most frequent contributing institution, with three papers published. The literature suggested future research directions and highlighted one specific gap in the area: an unresolved gap between the data science experts and the manufacturing process experts in industry. Data analytics expertise is not useful unless the manufacturing process information is utilized.
A legitimate understanding of the data is crucial to perform accurate analytics and gain true, valuable insights into the manufacturing process. There lies a gap between the manufacturing operations and the information technology/data analytics departments within enterprises, which was borne out by the results of many of the case studies reviewed as part of this work. To test whether this gap exists, the researcher initiated an industrial case study in which they embedded themselves between the subject matter expert of the manufacturing process and the data scientist. Of the papers resulting from the systematic mapping study, 12 contributed a framework, another 12 were based on a case study, and 11 focused on theory. However, only three papers contributed a methodology. This provides further evidence of the need for an industry-focused methodology for digitizing and analyzing manufacturing data, which will be developed in future research.
Keywords: analytics, digitization, industry 4.0, manufacturing
Procedia PDF Downloads 111
141 Cultural Heritage Resources for Tourism, Two Countries – Two Approaches: A Comparative Analysis of Cultural Tourism Products in Turkey and Austria
Authors: Irfan Arikan, George Christian Steckenbauer
Abstract:
Turkey and Austria are examples of highly developed tourism destinations, where tourism providers use cultural heritage and regional natural resources to develop modern tourism products in order to be successful on increasingly competitive international tourism markets. The use and exploitation of these resources follow, on the one hand, international standards of tourism marketing (such as 'sustainability'). Therefore, we find highly comparable internationalized products in these destinations (such as hotels, museums, and spas). On the other hand, development standards and processes strongly depend on local, regional, and national cultures, which influence how people work, cooperate, think, and create. Thus, cultural factors also influence the attitude towards cultural heritage and natural resources and the way these resources are used for the creation of tourism products. This leads to differences in the development of tourism products on several levels: 1. In the selection of cultural heritage and natural resources for the product development process 2. In the processes by which tourism products are created 3. In the way providers and marketing organisations work with tourism products based on cultural heritage or natural resources. The aim of this paper is to discover differences in these dimensions by analysing and comparing examples of tourism products in Turkey and Austria, both countries with a highly developed, highly professional tourism industry whose stakeholders have rich experience in product development and marketing. The cases are selected from the following fields: + Cultural tourism / heritage tourism + City tourism + Industrial heritage tourism + Nature and outdoor tourism + Health tourism The cases are analysed based on available secondary data (as several cases are scientifically described) and expert interviews with local and regional stakeholders of the tourism industry and tourism experts.
The available primary and secondary data will be analysed and displayed in a comparative structure that allows answers to the research questions stated above to be derived. The result of the project will therefore be a more precise picture of the influence of cultural differences on the use and exploitation of resources in the field of tourism, which allows recommendations to be developed for the tourism industry that must be taken into consideration to ensure cultural and natural resources are treated in a sustainable and responsible way. The authors will edit these cross-cultural recommendations in the form of a 'check-list' that can be used as a guideline for tourism professionals in the field of product development and marketing, thereby connecting theoretical research to the field of practical application and closing the gap between academic research and the field of tourism practice.
Keywords: cultural heritage, natural resources, Austria, Turkey
Procedia PDF Downloads 492
140 Transformation of Periodic Fuzzy Membership Function to Discrete Polygon on Circular Polar Coordinates
Authors: Takashi Mitsuishi
Abstract:
Fuzzy logic has gained acceptance in recent years in the fields of social sciences and humanities, such as psychology and linguistics, because it can manage the fuzziness of words and human subjectivity in a logical manner. However, the major field of application of fuzzy logic is control engineering, as it is a part of set theory and mathematical logic. The Mamdani method, which is the most popular technique for approximate reasoning in the field of fuzzy control, is one way to numerically represent the control afforded by human language and sensitivity, and it has been applied in various practical control plants. Fuzzy logic has been gradually developing as an artificial intelligence technique in different applications such as neural networks, expert systems, and operations research. The objects of inference vary across application fields. Some of these include time, angle, color, symptom, and medical condition, whose fuzzy membership functions are periodic. In the defuzzification stage, the domain of the membership function should be unique in order to obtain a unique defuzzified value. However, if the domain of a periodic membership function is made unique, an unintuitive defuzzified value may be obtained as the inference result when using the center-of-gravity method. Therefore, the authors propose a method of circular-polar-coordinate transformation and defuzzification of periodic membership functions in this study. The transformation to circular polar coordinates simplifies the domain of the periodic membership function. The defuzzified value in circular polar coordinates is an argument (angle). Furthermore, the argument must be calculated from a closed plane figure, namely the periodic membership function on the circular polar coordinates. If the closed plane figure is continuous, with the continuity of the membership function, a significant amount of computation is required.
Therefore, to simplify the practical example and significantly reduce the computational complexity, we have discretized the continuous interval and the membership function in this study. The following three methods are proposed to decide the argument from the discrete polygon into which the continuous plane figure is transformed. The first method provides the argument of a straight line passing through the origin and the coordinate of the arithmetic mean of each coordinate of the polygon (physical center of gravity). The second provides the argument of a straight line passing through the origin and the coordinate of the geometric center of gravity of the polygon. The third provides the argument of a straight line passing through the origin and bisecting the perimeter of the polygon (or the closed continuous plane figure).
Keywords: defuzzification, fuzzy membership function, periodic function, polar coordinates transformation
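The first of the three methods above (the argument of the line through the origin and the arithmetic mean of the discretized points) can be illustrated with a minimal sketch. The 24-hour domain and the triangular membership function below are assumed examples for illustration, not data from the paper:

```python
import math

def circular_defuzzify(domain, membership, period):
    """Defuzzify a periodic membership function by mapping each discrete
    sample to circular polar coordinates (radius = membership grade,
    angle = position in the period) and taking the argument of the
    arithmetic-mean point, per the 'physical center of gravity' variant.
    Returns a value in [0, period). A simplified sketch, not the paper's code."""
    xs, ys = 0.0, 0.0
    for t, mu in zip(domain, membership):
        theta = 2 * math.pi * t / period
        xs += mu * math.cos(theta)  # stronger grades pull the mean harder
        ys += mu * math.sin(theta)
    angle = math.atan2(ys, xs) % (2 * math.pi)
    return period * angle / (2 * math.pi)

# Triangular membership peaking at 23:00 on a 24-hour clock: a naive centroid
# over [0, 24) would land near midday, but the circular method stays near 23.
hours = [21, 22, 23, 0, 1]
grades = [0.2, 0.6, 1.0, 0.6, 0.2]
print(round(circular_defuzzify(hours, grades, 24), 2))  # → 23.0
```

This shows why the transformation matters: the same membership function defuzzified by an ordinary center-of-gravity over the linear domain [0, 24) would average 21, 22, 23, 0, and 1 to roughly 13.4, an unintuitive result.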
Procedia PDF Downloads 363
139 Predicting Personality and Psychological Distress Using Natural Language Processing
Authors: Jihee Jang, Seowon Yoon, Gaeun Son, Minjung Kang, Joon Yeon Choeh, Kee-Hong Choi
Abstract:
Background: Self-report multiple-choice questionnaires have been widely utilized to quantitatively measure one’s personality and psychological constructs. Despite several strengths (e.g., brevity and utility), self-report multiple-choice questionnaires have considerable limitations in nature. With the rise of machine learning (ML) and natural language processing (NLP), researchers in the field of psychology are widely adopting NLP to assess psychological constructs to predict human behaviors. However, there is a lack of connection between the work being performed in computer science and that in psychology, due to small data sets and unvalidated modeling practices. Aims: The current article introduces the study method and procedure of phase II, which includes the interview questions for the five-factor model (FFM) of personality developed in phase I. This study aims to develop the interview (semi-structured) and open-ended questions for the FFM-based personality assessments, specifically designed with experts in the field of clinical and personality psychology (phase 1), and to collect the personality-related text data using the interview questions and self-report measures on personality and psychological distress (phase 2). The purpose of the study includes examining the relationship between natural language data obtained from the interview questions, measuring the FFM personality constructs, and psychological distress to demonstrate the validity of the natural language-based personality prediction. Methods: The phase I (pilot) study was conducted on fifty-nine native Korean adults to acquire personality-related text data from the interview (semi-structured) and open-ended questions based on the FFM of personality. The interview questions were revised and finalized with feedback from the external expert committee, consisting of personality and clinical psychologists.
Based on the established interview questions, a total of 425 Korean adults were recruited using a convenience sampling method via an online survey. The text data collected from interviews were analyzed using natural language processing. The results of the online survey, including demographic data, depression, anxiety, and personality inventories, were analyzed together in the model to predict individuals’ FFM of personality and level of psychological distress (phase 2).
Keywords: personality prediction, psychological distress prediction, natural language processing, machine learning, the five-factor model of personality
Procedia PDF Downloads 78
138 Impact of Pandemics on Cities and Societies
Authors: Deepak Jugran
Abstract:
Purpose: The purpose of this study is to identify how past pandemics shaped social evolution and cities. Methodology: A historical and comparative analysis of major pandemics in human history: their origins, transmission routes, biological responses, and aftereffects. Using available secondary data, the study presents a comprehensive pre- and post-pandemic scenario and focuses selectively on the major issues and pandemics that have had the deepest and most lasting impact on society. Results: Past pandemics shaped the behavior of human societies and their cities and made them more resilient biologically, intellectually, and socially, endorsing Charles Darwin’s theory of “survival of the fittest”. Conclusion: Pandemics always resulted in great mortality, but they also improved overall individual human immunology and the collective social response; at the same time, they improved cities’ public health systems, health delivery systems, and water and sewage distribution systems, and institutionalized various welfare reforms. They made human beings more resilient biologically, intellectually, and socially, endorsing Prof. Talcott Parsons’ theory of “AGIL”. Pandemics and infectious diseases are here to stay, and as a society, we need to strengthen our collective response and preparedness, besides evolving mechanisms for strict controls on the inter-continental movement of people, and especially of animals, which have always acted as carriers for these novel viruses. Pandemics over the years acted like natural storms, mitigating prevailing social imbalances and laying the foundation for scientific discoveries.
We understand that post-Covid-19, institutionalized city, state, and national mechanisms will be strengthened, and the recommendations issued by various expert groups, which were ignored earlier, will now be implemented for reliable anticipation and better preparedness, helping to minimize the impact of pandemics. Our analysis does not intend to present chronological findings of pandemics but rather focuses selectively on major pandemics in history, their causes, how they wiped out entire city populations, and how they influenced societies and their behavior and facilitated social evolution.
Keywords: pandemics, Covid-19, social evolution, cities
Procedia PDF Downloads 112
137 Tumor Size and Lymph Node Metastasis Detection in Colon Cancer Patients Using MR Images
Authors: Mohammadreza Hedyehzadeh, Mahdi Yousefi
Abstract:
Colon cancer is one of the most common cancers, and its prevalence is predicted to increase due to poor eating habits. Nowadays, owing to people’s busy lifestyles, the consumption of fast food is increasing, and therefore the diagnosis of this disease and its treatment are of particular importance. To determine the best treatment approach for each specific colon cancer patient, the oncologist needs to know the stage of the tumor. The most common method to determine the tumor stage is the TNM staging system. In this system, M indicates the presence of metastasis, N indicates the extent of spread to the lymph nodes, and T indicates the size of the tumor. Clearly, in order to determine all three of these parameters, an imaging method must be used, and the gold-standard imaging protocols for this purpose are CT and PET/CT. In CT imaging, due to the use of X-rays, the risk of cancer and the absorbed dose of the patient are high, while for the PET/CT method, access to the device is limited due to its high cost. Therefore, in this study, we aimed to estimate the tumor size and the extent of its spread to the lymph nodes using MR images. More than 1300 MR images were collected from the TCIA portal, and in the first step (pre-processing), histogram equalization to improve image quality and resizing to a uniform image size were performed. Two expert radiologists, who have worked on colon cancer cases for more than 21 years, segmented the images and extracted the tumor region. The next step is feature extraction from the segmented images, followed by classification of the data into three classes: T0N0, T3N1, and T3N2. In this article, the VGG-16 convolutional neural network has been used to perform both of the above-mentioned tasks, i.e., feature extraction and classification. This network has 13 convolution layers for feature extraction and three fully connected layers with the softmax activation function for classification.
In order to validate the proposed method, 10-fold cross-validation was used: the data were randomly divided into three parts, training (70% of the data), validation (10%), and the remainder for testing. This was repeated 10 times; each time, the accuracy, sensitivity, and specificity of the model were calculated, and the average over the ten repetitions is reported as the result. The accuracy, specificity, and sensitivity of the proposed method on the testing dataset were 89.09%, 95.8%, and 96.4%, respectively. Compared to previous studies, the use of a safe imaging technique (MRI) and the avoidance of predefined hand-crafted imaging features to determine the stage of colon cancer patients are some of the study’s advantages.
Keywords: colon cancer, VGG-16, magnetic resonance imaging, tumor size, lymph node metastasis
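The repeated k-fold validation protocol described above can be sketched independently of the network itself. The stand-in classifier below replaces VGG-16 so that only the fold bookkeeping and accuracy averaging are visible; all names and the toy labels are illustrative:

```python
import random

def k_fold_indices(n, k, seed=0):
    """Partition sample indices 0..n-1 into k roughly equal shuffled folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(labels, predict, k=10):
    """Average test accuracy over k folds. `predict` stands in for training
    the model on train_idx and predicting on test_idx; any callable with
    the signature (train_idx, test_idx) -> predictions works."""
    folds = k_fold_indices(len(labels), k)
    accuracies = []
    for test_idx in folds:
        train_idx = [j for f in folds if f is not test_idx for j in f]
        preds = predict(train_idx, test_idx)
        correct = sum(p == labels[j] for p, j in zip(preds, test_idx))
        accuracies.append(correct / len(test_idx))
    return sum(accuracies) / k

# Toy check with a "classifier" that always predicts the majority class.
labels = ["T0N0"] * 8 + ["T3N1"] * 2
acc = cross_validate(labels, lambda tr, te: ["T0N0"] * len(te), k=5)
print(acc)  # → 0.8 (the majority-class baseline on this toy label set)
```

The study additionally holds out a validation split inside each training portion for model selection; that refinement is omitted here for brevity.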
Procedia PDF Downloads 59
136 Comparison of Parametric and Bayesian Survival Regression Models in Simulated and HIV Patient Antiretroviral Therapy Data: Case Study of Alamata Hospital, North Ethiopia
Authors: Zeytu G. Asfaw, Serkalem K. Abrha, Demisew G. Degefu
Abstract:
Background: HIV/AIDS remains a major public health problem in Ethiopia, heavily affecting people of productive and reproductive age. We aimed to compare the performance of parametric survival analysis and Bayesian survival analysis using simulations and in a real dataset application focused on determining predictors of HIV patient survival. Methods: Parametric survival models with Exponential, Weibull, Log-normal, Log-logistic, Gompertz, and Generalized gamma distributions were considered. A simulation study was carried out with two different algorithms, using informative and noninformative priors. A retrospective cohort study was implemented for HIV-infected patients under Highly Active Antiretroviral Therapy in Alamata General Hospital, North Ethiopia. Results: A total of 320 HIV patients were included in the study, of whom 52.19% were female and 47.81% male. According to Kaplan-Meier survival estimates for the two sex groups, females showed better survival time in comparison with their male counterparts. The median survival time of HIV patients was 79 months. During the follow-up period, 89 (27.81%) deaths and 231 (72.19%) censored individuals were registered. The average baseline cluster of differentiation 4 (CD4) cell count for HIV/AIDS patients was 126.01, but after a three-year antiretroviral therapy follow-up the average CD4 cell count was 305.74, which was quite encouraging. Age, functional status, tuberculosis screening, past opportunistic infection, baseline CD4 cell count, World Health Organization clinical stage, sex, marital status, employment status, occupation type, and baseline weight were found to be statistically significant factors for longer survival of HIV patients. The standard errors of all covariates in the Bayesian log-normal survival model were smaller than in the classical one.
Hence, Bayesian survival analysis showed better performance than classical parametric survival analysis when subjective data analysis was performed by considering expert opinions and historical knowledge about the parameters. Conclusions: HIV/AIDS patient mortality could thus be reduced through timely antiretroviral therapy with special care regarding the potential factors. Moreover, the Bayesian log-normal survival model was preferable to the classical log-normal survival model for determining predictors of HIV patients’ survival.
Keywords: antiretroviral therapy (ART), Bayesian analysis, HIV, log-normal, parametric survival models
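The Kaplan-Meier estimates referred to above follow a simple product-limit recipe: at each death time, multiply the running survival probability by the fraction of the at-risk group surviving that time. The sketch below uses invented follow-up times, not the Alamata data:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier (product-limit) survival estimate. `times` are follow-up
    durations (e.g. months); `events` is 1 for death, 0 for censoring.
    Returns the step curve as a list of (time, S(t)) pairs at death times.
    A textbook sketch, not the study's analysis code."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    s, curve = 1.0, []
    for t in sorted(set(times)):
        at_t = [(tt, e) for tt, e in data if tt == t]
        deaths = sum(e for _, e in at_t)
        if deaths:
            s *= 1 - deaths / n_at_risk
            curve.append((t, s))
        # everyone observed at this time (dead or censored) leaves the risk set
        n_at_risk -= len(at_t)
    return curve

# Five patients: deaths at months 5, 10, 15; censored at 10 and 20.
print(kaplan_meier([5, 10, 10, 15, 20], [1, 1, 0, 1, 0]))
# survival drops at each death time: ~0.8, ~0.6, ~0.3
```

Note how the censored patient at month 10 still counts in the risk set up to that time, which is what distinguishes this estimator from a naive death-rate calculation.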
Procedia PDF Downloads 196
135 Creating Futures: Using Fictive Scripting Methods for Institutional Strategic Planning
Authors: Christine Winberg, James Garraway
Abstract:
Many key university documents, such as vision and mission statements and strategic plans, are aspirational and future-oriented. There is a wide range of future-oriented methods that are used in planning applications, ranging from mathematical modelling to expert opinions. Many of these methods have limitations, and planners using these tools might, for example, make the technical-rational assumption that their plans will unfold in a logical and inevitable fashion, thus underestimating the many complex forces that are at play in planning for an unknown future. This is the issue that this study addresses. The overall project aim was to assist a new university of technology in developing appropriate responses to its social responsibility, graduate employability and research missions in its strategic plan. The specific research question guiding the research activities and approach was: how might the use of innovative future-oriented planning tools enable or constrain a strategic planning process? The research objective was to engage collaborating groups in the use of an innovative tool to develop and assess future scenarios, for the purpose of developing deeper understandings of possible futures and their challenges. The scenario planning tool chosen was ‘fictive scripting’, an analytical technique derived from Technology Forecasting and Innovation Studies. Fictive scripts are future projections that also take into account the present shape of the world and current developments. The process thus began with a critical diagnosis of the present, highlighting its tensions and frictions. The collaborative groups then developed fictive scripts, each group producing a future scenario that foregrounded different institutional missions, their implications and possible consequences. The scripts were analyzed with a view to identifying their potential contribution to the university’s strategic planning exercise. 
The unfolding fictive scripts revealed a number of insights in terms of unexpected benefits, unexpected challenges, and unexpected consequences. These insights were not evident in previous strategic planning exercises. The contribution of this study is to show how better choices can be made and potential pitfalls avoided through a systematic foresight exercise. When universities develop strategic planning documents, they are looking into the future. In this paper it is argued that the use of appropriate tools for future-oriented exercises can help planners to understand more fully what achieving desired outcomes might entail, what challenges might be encountered, and what unexpected consequences might ensue.
Keywords: fictive scripts, scenarios, strategic planning, technological forecasting
Procedia PDF Downloads 121
134 Strategic Public Procurement: A Lever for Social Entrepreneurship and Innovation
Authors: B. Orser, A. Riding, Y. Li
Abstract:
To inform government about how gender gaps in SME (small and medium-sized enterprise) contracting might be redressed, the research question was: what are the key obstacles to, and response strategies for, increasing the engagement of women business owners among SME suppliers to the Government of Canada? Thirty-five interviews were conducted with senior policymakers, supplier diversity organization executives, and expert witnesses to the Canadian House of Commons Standing Committee on Government Operations and Estimates, and the qualitative data were analysed using NVivo 11 software. High-order response categories included: (a) SME risk mitigation strategies, (b) SME procurement program design, and (c) performance measures. The primary obstacles cited were government red tape and long, complicated requests for proposals (RFPs). The majority of 'common' complaints occur when SMEs have questions about the federal procurement process. Witness responses included the use of outcome-based rather than prescriptive procurement practices, more agile procurement, simplified RFPs, and making payment within 30 days a procurement priority. Risk mitigation strategies included the provision of procurement officers to assess risks and opportunities for businesses and the development of more agile procurement procedures and processes. Recommendations to enhance program design included: improved definitional consistency of qualifiers and selection criteria; better co-ordination across agencies; clarification of how SME suppliers benefit from federal contracting; goal setting; specification of categories that are most suitable for women-owned businesses; and increasing primary contractors' awareness of the importance of subcontract relationships. Recommendations also included third-party certification of eligible firms and the need to enhance SMEs’ financial literacy to reduce financial errors.
Finally, there remains a need for clear and consistent pre-program statistics to establish baselines (by sector and issuing department), performance measures, targets based on the percentage of contracts granted, contract value, the percentage of target employees (women, Indigenous people), and community benefits, including the hiring of local employees. The study advances strategies to enhance federal procurement programs to facilitate socio-economic policy objectives.
Keywords: procurement, small business, policy, women
Procedia PDF Downloads 113
133 A Literature Review of How Cognitive Disability Is Represented in Higher Education Research in the African Academy
Authors: Fadzayi M. Maruza
Abstract:
The conversation about diversity in the African academy focuses on the need for an international and ethnically diverse population of scholars and students. Operationalising the concept of cognitive diversity offers us an opportunity to broaden our conception of who can know and who can proclaim knowledge by availing new understandings of what knowledge is and how it is made. Limited attention is paid to the value of diversity generated by cognitive disabilities in the African academy. The inclusion of persons with minds labelled disabled in African academia requires an epistemology of disability to reform the still-dominant notion of the expert and scholar as able-bodied and hyper-rational. This review explores how cognitive disabilities have been represented in higher education research in Africa, and whether the African academy has reinforced ignorance by promoting an able-bodied academia. The review tackles its exploratory objective by using Malcolm Tight’s framework. The main questions this paper focuses on are: (I) What are the major disability themes and concerns discussed in the disability-related articles? (II) What are the major methods or methodologies used to address the topic in the papers? (III) What are the levels of analysis the papers focus on? (IV) How do higher education researchers define and represent cognitive disabilities in higher education research in Africa? To answer these exploratory questions, aimed at mapping the disability-related higher education research landscape, Malcolm Tight’s framework is seen as most appropriate. In addition, a thematic categorization shall be made after reviewing published empirical studies on disability in African higher education from the period 2010–2017.
A synthesis of the findings and implications of African disability studies relating to students with cognitive disabilities in the African academy will be provided, using the categories suggested by Tight as a benchmark. Data for the proposed work shall be taken from well-reputed higher education journals published between 2010 and 2017. Using the keyword ‘disability’ in the titles, abstracts, and keywords sections of journal articles, a selection of disability-focused higher education articles shall be compiled for analysis regarding cognitive disability. It has to be noted as a limitation that the word ‘disability’ might not be sufficient to investigate the topic, as researchers may discuss many more specific disability concerns. Therefore, the paper is only intended to give a bird’s-eye view of cognitive disability in higher education research and is not comprehensive. The paper is expected to shed some light for me, as a beginning researcher, and for other researchers like myself, as to what has been the focus of higher education researchers regarding cognitive disability in the African academy.
Keywords: cognitive disability, cognitive diversity, disability, higher education
Procedia PDF Downloads 314
132 Geographical Information System and Multi-Criteria Based Approach to Locate Suitable Sites for Industries to Minimize Agriculture Land Use Changes in Bangladesh
Authors: Nazia Muhsin, Tofael Ahamed, Ryozo Noguchi, Tomohiro Takigawa
Abstract:
One of the most challenging issues in achieving sustainable development and food security is land use change. The crisis of land for agricultural production mainly arises from the unplanned transformation of agricultural land for infrastructure development, i.e., urbanization and industrialization. Land use without sustainability assessment can harm food security and environmental protection. Bangladesh, a densely populated country with limited arable land, now faces challenges in meeting sustainable food security. Agricultural land is being used for economic growth by establishing industries, which are spreading from urban areas to the suburbs and consuming agricultural land. To minimize agricultural land losses to unplanned industrialization, compact economic zones should be identified through a scientific approach. Therefore, the purpose of the study was to find suitable sites for industrial growth through land suitability analysis (LSA) using a Geographical Information System (GIS) and multi-criteria analysis (MCA). The goal of the study was to consider both agricultural land and industry for sustainable land use. The study also analyzed agricultural land use changes in a suburban area using statistical data on agricultural land and primary data on the existing industries of the study area. The criteria selected for the LSA were proximity to major roads, proximity to local roads, and distance to rivers, waterbodies, settlements, flood-flow zones, and agricultural land. The spatial datasets for the criteria were collected from the respective departments of Bangladesh. In addition, the elevation dataset was taken from the SRTM (Shuttle Radar Topography Mission) data source. The criteria were further analyzed as factors and constraints in ArcGIS®. Expert opinion was applied to weight the criteria according to the analytic hierarchy process (AHP), a multi-criteria technique.
The decision rule was set using the 'weighted overlay' tool to aggregate the factors and constraints with the weights of the criteria. The LSA found that only 5% of the land was most suitable for industrial sites, with few compact areas for industrial zones. The developed LSA is expected to help land use policymakers and urban developers ensure the sustainability of land use and agricultural production.
Keywords: AHP (analytic hierarchy process), GIS (geographic information system), LSA (land suitability analysis), MCA (multi-criteria analysis)
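As a rough illustration of the multi-criteria technique described above, the sketch below derives AHP weights from a hypothetical pairwise-comparison matrix and combines three toy criterion layers with a weighted overlay. The matrix values, criterion names, and layer scores are illustrative assumptions, not the study's actual expert judgments or spatial data.

```python
import numpy as np

# Hypothetical 3x3 AHP pairwise-comparison matrix for three criteria
# (e.g. proximity to major roads, distance to rivers, distance from
# agricultural land); values are assumed for illustration only.
pairwise = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 3.0],
    [1 / 5, 1 / 3, 1.0],
])

# AHP weights: the normalized principal eigenvector of the matrix.
eigvals, eigvecs = np.linalg.eig(pairwise)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency ratio (random index RI = 0.58 for n = 3);
# CR < 0.1 is conventionally considered acceptable.
lambda_max = eigvals.real[principal]
ci = (lambda_max - 3) / (3 - 1)
cr = ci / 0.58

# Weighted overlay: each criterion layer is scored (here 1-9 on a tiny
# 2x2 grid) and the layers are combined using the AHP weights.
layers = np.stack([
    np.array([[9, 7], [3, 1]]),  # proximity to major roads
    np.array([[5, 9], [7, 3]]),  # distance to rivers
    np.array([[1, 9], [9, 5]]),  # distance from agricultural land
])
suitability = np.tensordot(weights, layers, axes=1)
```

Cells with the highest `suitability` score would be candidate industrial sites; in practice the layers are full rasters and the overlay is done in the GIS.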
Procedia PDF Downloads 263
131 Followership Styles in the U.S. Hospitality Workforce: A Multi-Generational Comparison Study
Authors: Yinghua Huang, Tsu-Hong Yen
Abstract:
The latest advances in leadership research have revealed that leadership is co-created through the combined actions of leading and following. The role of followers is as important as that of leaders in the leadership process. However, previous leadership studies often conceptualize leadership as a leader-centric process, and the role of followers is largely neglected in the literature. Only recently have followership studies received more attention, because the character and behavior of followers are as vital as the leader's during the leadership process. Yet there is a dearth of followership research in the context of the tourism and hospitality industries. Therefore, this study seeks to fill this gap in knowledge and investigate followership styles in the U.S. hospitality workforce. In particular, the objectives of this study are to identify popular followership practices among hospitality employees and to evaluate hospitality employees' followership styles using Kelley's followership typology framework. This study also compared generational differences in followership styles among hospitality employees. According to the U.S. Bureau of Labor Statistics, the workforce in the lodging and foodservice sectors consisted of around 12% baby boomers, 29% Gen Xs, 23% Gen Ys, and 36% Gen Zs in 2019. The diversity of workforce demographics in the U.S. hospitality industry calls for more attention to understanding generational differences in followership styles and organizational performance. This study used an in-depth interview and a questionnaire survey to collect both qualitative and quantitative data. A snowball sampling method was used to recruit participants working in the hospitality industry in the San Francisco Bay Area, California, USA. A total of 120 hospitality employees participated in this study, including 22 baby boomers, 32 Gen Xs, 30 Gen Ys, and 36 Gen Zs. 45% of the participants were male and 55% were female.
The findings of this study identified good followership practices across the multi-generational participants. For example, a Gen Y participant said that 'followership involves learning and molding oneself after another person, usually an expert in an area of interest. I think of followership as personal and professional development. I learn and get better by hands-on training and experience'. A Gen X participant said, 'I can excel by not being fearful of taking on unfamiliar tasks and accepting challenges.' Furthermore, this study identified five of Kelley's followership typologies among the participants: 45% exemplary followers, 13% pragmatist followers, 2% alienated followers, 18% passive followers, and 23% conformist followers. Generational differences in followership styles were also identified. The findings of this study contribute to the hospitality human resource literature by identifying multi-generational perspectives on followership styles among hospitality employees. They provide valuable insights for hospitality leaders to understand their followers better; hospitality leaders are advised to adjust their leadership style and communication strategies based on employees' different followership styles.
Keywords: followership, hospitality workforce, generational diversity, Kelley's followership typology
Procedia PDF Downloads 129
130 Building User Behavioral Models by Processing Web Logs and Clustering Mechanisms
Authors: Madhuka G. P. D. Udantha, Gihan V. Dias, Surangika Ranathunga
Abstract:
Today's websites contain very interesting applications, but there are only a few methodologies for analyzing user navigation through a website and determining whether the website is being put to correct use. Web logs are typically consulted only when a major attack or malfunction occurs, yet they contain a wealth of interesting information about the users of a system. Analyzing web logs has become a challenge due to the huge log volume, and finding interesting patterns is not easy because of the size and distribution of the logs and the importance of minor details in each entry. Web logs contain very important data about users and the site that is not being put to good use. Retrieving interesting information from logs gives an idea of what users need, allows users to be grouped according to their various needs, and helps improve the site so that it is effective and efficient. The model we built is able to detect attacks or malfunctioning of the system and to perform anomaly detection. Logs become more complex as the volume of traffic and the size and complexity of the website grow. Unsupervised techniques are used in this solution, which is fully automated; expert knowledge is used only in validation. In our approach, we first clean and purify the logs to bring them to a common platform with a standard format and structure. After the cleaning module, the web session builder is executed. It outputs two files, a Web Sessions file and an Indexed URLs file. The Indexed URLs file contains the list of URLs accessed and their indices; the Web Sessions file lists the indices of each web session. Then the DBSCAN and EM algorithms are used iteratively and recursively to get the best clustering results for the web sessions. Using homogeneity, completeness, V-measure, intra- and inter-cluster distance, and the silhouette coefficient as parameters, these algorithms self-evaluate in order to choose better parametric values for subsequent runs. If a cluster is found to be too large, micro-clustering is used.
Using the Cluster Signature Module, the clusters are annotated with a unique signature called a fingerprint. In this module, each cluster is fed to an Associative Rule Learning Module. If it outputs confidence and support values of 1 for an access sequence, that sequence is a potential signature for the cluster. The occurrences of the access sequence are then checked in the other clusters; if it is found to be unique to the cluster under consideration, the cluster is annotated with that signature. These signatures are used in anomaly detection, preventing cyberattacks, real-time dashboards that visualize users accessing web pages, predicting user actions, and various other applications in finance, university websites, news and media websites, etc.
Keywords: anomaly detection, clustering, pattern recognition, web sessions
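The clustering stage described above can be sketched with off-the-shelf implementations. The toy session vectors, parameter values, and two-cluster structure below are assumptions for illustration; the actual pipeline iterates DBSCAN and EM over real session data and tunes parameters using several internal metrics.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.mixture import GaussianMixture
from sklearn.metrics import silhouette_score

# Toy session feature vectors (in the described pipeline these would be
# derived from the Web Sessions and Indexed URLs files).
rng = np.random.default_rng(0)
sessions = np.vstack([
    rng.normal(0, 0.3, size=(40, 2)),  # one assumed behavioral group
    rng.normal(5, 0.3, size=(40, 2)),  # another assumed behavioral group
])

# DBSCAN pass: density-based clusters, with noise points labeled -1.
db_labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(sessions)

# EM pass: a Gaussian mixture over the same sessions.
em_labels = GaussianMixture(n_components=2, random_state=0).fit_predict(sessions)

# Self-evaluation: an internal metric such as the silhouette coefficient
# guides the choice of parameters for the next iteration, alongside
# homogeneity, completeness, and V-measure as in the described pipeline.
score = silhouette_score(sessions, em_labels)
```

In the full system, the parameter values (`eps`, `min_samples`, `n_components`) would be adjusted iteratively until these internal scores stop improving, and oversized clusters would be re-clustered (micro-clustering).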
Procedia PDF Downloads 288
129 Bread-Making Properties of Rice Flour Dough Using Fatty Acid Salt
Authors: T. Hamaishi, Y. Morinaga, H. Morita
Abstract:
Introduction: Rice consumption in Japan has decreased, and the Japanese government has recommended the use of rice flour in order to expand the consumption of rice. There are two major protein components in wheat flour, called gliadin and glutenin. Gluten forms when water is added to flour and it is mixed; as mixing continues, glutenin interacts with gliadin to form the viscoelastic matrix of gluten. Rice flour bread does not expand as much as wheat flour bread, because rice flour contains no gluten and cannot construct a gluten network in the dough. In recent years, some food additives have been used as dough-improving agents in bread making; surfactants in particular have the effect of improving dough extensibility. We therefore focused on fatty acid salts, a class of anionic surfactants. A fatty acid salt is a salt consisting of a fatty acid and an alkali, and such salts are the main components of soap. According to JECFA (the FAO/WHO Joint Expert Committee on Food Additives), salts of myristic (C14), palmitic (C16), and stearic (C18) acids can be used as food additives; they have been evaluated, and an ADI was not specified. In this study, we investigated improving the bread-making properties of rice flour dough by adding a fatty acid salt. Materials and methods: The fatty acid salt sample was myristic acid (C14) dissolved in KOH solution to a concentration of 350 mM at pH 10.5. The rice dough consisted of 100 g of flour (rice flour and wheat gluten), 5 g of sugar, 1.7 g of salt, 1.7 g of dry yeast, 80 mL of water, and the fatty acid salt. Mixing was performed 500 times by hand. The concentration of C14K in the dough was 10% relative to flour weight, and the amount of gluten in the dough was 20% or 30% relative to flour weight. A dough expansion ability test was performed to measure the physical properties of the bread dough according to the Baker's Yeast methods of the Japan Yeast Industry Association. In this test, 150 g of dough was filled from the bottom of a cylinder and fermented at 30 °C and 85% humidity for 120 min in an incubator.
The height of the expansion of the dough was measured to determine its expansion ability. Results and Conclusion: The expansion ability of rice dough with gluten contents of 20% and 30% was 316 mL and 341 mL, respectively, after 120 min. When C14K was added to the rice dough, the expansion abilities were 314 mL and 368 mL after 120 min; there was no significant difference. It has conventionally been held that rice flour dough should contain 20% gluten, and a considerable improvement in dough expansion ability had previously been achieved when C14K was added to wheat flour. The experimental results show that adding C14K to rice dough with a gluten content of 20% or more did not improve its bread-making properties. In conclusion, it is suggested that rice bread made with a gluten content of 20% or more already forms a sufficient gluten network without C14K.
Keywords: expansion ability, fatty acid salt, gluten, rice flour dough
Procedia PDF Downloads 244
128 Assessing Professionalism, Communication, and Collaboration among Emergency Physicians by Implementing a 360-Degree Evaluation
Authors: Ahmed Al Ansari, Khalid Al Khalifa
Abstract:
Objective: Multisource feedback (MSF), also called 360-degree evaluation, is an evaluation process in which questionnaires are distributed among medical peers and colleagues to assess physician performance from sources other than the attending or supervising physicians. The aim of this study was to design, implement, and evaluate a 360-degree process for assessing emergency physician trainees in the Kingdom of Bahrain. Method: The study was undertaken at Bahrain Defence Force Hospital, a military teaching hospital in the Kingdom of Bahrain. Thirty emergency physicians (the total population of emergency physicians in our hospital) were assessed in this study. We developed an instrument modified from the Physician Achievement Review (PAR) instrument used to assess physicians in Alberta. Our instrument focused on assessing professionalism, communication skills, and collaboration only. To achieve face and content validity, a table of specifications was constructed and a working group was involved in constructing the instrument; expert opinion was considered as well. The instrument consisted of 39 items: 15 items to assess professionalism, 13 items to assess communication skills, and 11 items to assess collaboration. Each emergency physician was evaluated by three groups of raters: 4 emergency physician medical colleagues, 4 medical colleagues considered referral physicians from different departments, and 4 coworkers from the emergency department. An independent administrative team was formed to take responsibility for distributing the instruments and collecting them in closed envelopes. Each envelope contained the instrument and a guide to the implementation of the MSF and the purpose of the study. Results: A total of 30 emergency physicians, 16 males and 14 females, representing the total number of emergency physicians in our hospital, were assessed.
A total of 269 forms were collected: 105 surveys from coworkers in the emergency department, 93 from emergency physician medical colleagues, and 116 from referral physicians in different departments. The total mean response rate was 71.2%. The whole instrument was found suitable for factor analysis (KMO = 0.967; Bartlett's test significant, p < 0.001). Factor analysis showed that the questionnaire data decomposed into three factors, which accounted for 72.6% of the total variance: professionalism, collaboration, and communication. Reliability analysis indicated that the full-scale instrument had high internal consistency (Cronbach's α = 0.98). The generalizability coefficient (Ep2) was 0.71 for the surveys. Conclusions: Based on the present results, the current instrument and procedures have high reliability, validity, and feasibility for assessing emergency physician trainees in the emergency room.
Keywords: MSF system, emergency, validity, generalizability
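For readers unfamiliar with the internal-consistency statistic reported above, a minimal sketch of Cronbach's alpha follows. The rating matrix is simulated and purely illustrative (30 hypothetical raters, 5 items on a 5-point scale), not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Simulated ratings: items are derived from a shared latent trait plus
# small noise, so they are internally consistent by construction.
rng = np.random.default_rng(1)
trait = rng.normal(3, 1, size=(30, 1))
ratings = np.clip(np.rint(trait + rng.normal(0, 0.4, size=(30, 5))), 1, 5)

alpha = cronbach_alpha(ratings)
```

With correlated items like these, alpha approaches 1; uncorrelated items would drive it toward 0, which is why values near 0.98, as reported above, indicate very high internal consistency.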
Procedia PDF Downloads 357
127 Opportunities Forensics Biology in the Study of Sperm Traces after Washing
Authors: Saule Musabekova
Abstract:
The achievements of modern science, especially genetics, have sharply intensified the process of proof. Traces, even when subjected to attempts at destruction, are sources of evidentiary information about the circumstances of a crime and the persons who committed it. Currently, alongside the overall growth in the number of crimes against sexual inviolability or sexual freedom, the proportion of crimes in which perpetrators use detergents to destroy traces has increased. A characteristic feature of modern synthetic detergents is the presence of biological additives, enzymes that break down and gradually destroy stains of protein origin. To study the influence of modern washing powders on semen stains, various kinds of fabric were prepared in advance and stained with the sperm of men of different blood groups according to the ABO system. For the research, washing machines with different production characteristics from well-known household appliance manufacturers were used to wash the various kinds of fabric bearing semen stains. After washing, the fabrics were tested for visually preserved semen stains, for surviving sperm cells or their elements, and for the possibility of ABO group diagnostics or molecular genetic identification. The subsequent study of these stains by the morphological method showed that 100% detection of morphological sperm cells (spermatozoa) is not possible. As a result, 30% of further studies of these traces gave weakly positive results with the PSA SemiQuant immunoassay test. The percentage of positive results obtained from semen traces deposited on natural fiber fabrics was higher than from traces deposited on synthetic fabrics.
For 3% of the semen traces confirmed by the PSA test, it was possible to establish a genetic profile and obtain positive findings from molecular genetic examination. In the other cases, there was not a sufficient amount of material for DNA identification. The research results and practical expert studies found that, in most cases, conclusive identification of sperm traces is not possible. This is a consequence of the exposure of semen traces on physical evidence to the biological additives contained in modern detergents, and of the influence of other effective washing methods, as a result of which the DNA has undergone irreversible changes (degradation) under the influence of external factors. Using molecular genetic methods can partially solve the problems arising in the study of laundered physical evidence for the disclosure and investigation of crimes.
Keywords: study of sperm, modern detergents, washing powders, forensic medicine
Procedia PDF Downloads 298
126 Obesity and Cancer: Current Scientific Evidence and Policy Implications
Authors: Martin Wiseman, Rachel Thompson, Panagiota Mitrou, Kate Allen
Abstract:
Since 1997, World Cancer Research Fund (WCRF) International and the American Institute for Cancer Research (AICR) have been at the forefront of synthesising and interpreting the accumulated scientific literature on the link between diet, nutrition, physical activity and cancer, and deriving evidence-based Cancer Prevention Recommendations. The 2007 WCRF/AICR 2nd Expert Report was a landmark in the analysis of evidence linking diet, body weight and physical activity to cancer and led to the establishment of the Continuous Update Project (CUP). In 2018, as part of the CUP, WCRF/AICR will publish a new synthesis of the current evidence and update the Cancer Prevention Recommendations. This will ensure that everyone - from policymakers and health professionals to members of the public - has access to the most up-to-date information on how to reduce the risk of developing cancer. Overweight and obesity play a significant role in cancer risk, and rates of both are increasing in many parts of the world. This session will give an overview of new evidence relating obesity to cancer since the 2007 report. For example, since the 2007 Report, the number of cancers for which obesity is judged to be a contributory cause has increased from seven to eleven. The session will also shed light on the well-established mechanisms underpinning the links between obesity and cancer. Additionally, the session will provide an overview of diet- and physical activity-related factors that promote positive energy imbalance, leading to overweight and obesity. Finally, the session will highlight how policy can be used to address overweight and obesity at a population level, using WCRF International's NOURISHING Framework. NOURISHING formalises a comprehensive package of policies to promote healthy diets and reduce obesity and non-communicable diseases; it is a tool for policymakers to identify where action is needed and assess whether an approach is sufficiently comprehensive.
The framework brings together ten policy areas across three domains: food environment, food system, and behaviour change communication. The framework is accompanied by a regularly updated database providing an extensive overview of implemented government policy actions from around the world. In conclusion, the session will provide an overview of obesity and cancer, highlighting the links seen in the epidemiology and exploring the mechanisms underpinning these, as well as the influences that help determine overweight and obesity. Finally, the session will illustrate policy approaches that can be taken to reduce overweight and obesity worldwide.
Keywords: overweight, obesity, nutrition, cancer, mechanisms, policy
Procedia PDF Downloads 157
125 Consequences to Financial Reporting by Implementing Sri Lanka Financial Reporting Standard 13 on Measuring the Fair Value of Financial Instruments: Evidence from Three Sri Lankan Organizations
Authors: Nayoma Ranawaka
Abstract:
The demand for high-quality, internationally comparable financial information is greater than ever with the expansion of economic activities beyond national boundaries. Thus, the necessity of converging accounting practices across the world is now continuously discussed with greater emphasis. Global convergence to International Financial Reporting Standards (IFRS) has been one of the main objectives of the International Accounting Standards Board (IASB) since its establishment in 2001. Accordingly, Sri Lanka adopted IFRS in 2012. Among the standards newly introduced by the IASB, IFRS 13 plays a pivotal role as it deals with Fair Value Accounting (FVA). Therefore, it is valuable to obtain knowledge about the consequences of implementing IFRS 13 in Sri Lanka and to compare results across nations. In line with the IFRS jurisdictional provision for Sri Lanka, the Institute of Chartered Accountants of Sri Lanka has taken official steps to adopt IFRS 13 by introducing SLFRS 13, achieving de jure convergence. This study then examined the de facto convergence of SLFRS 13 in measuring the fair value of financial instruments in the Sri Lankan context. Accordingly, the objective of this study is to explore the consequences for financial reporting of implementing SLFRS 13 in measuring financial instruments. To achieve the objective of the study, an expert interview and in-depth interviews with interviewees from the three selected case studies and their independent auditors were carried out using three customized interview guides. The three cases were selected from three different industries: banking, manufacturing, and finance. NVivo version 10 was used to analyze the data collected through the in-depth interviews. Content analysis was then carried out, and conclusions were derived from the findings. This study contributes to knowledge in several ways.
The findings of this study help accounting practitioners get an overall picture of the application of the fair value standard in measuring financial instruments and identify the challenges and barriers in the adoption process. They also assist auditors in carrying out audit procedures to check the level of compliance with the fair value standard in measuring financial instruments. Moreover, the findings would enable foreign investors to assess the reliability of the financial statements of their target investments, given the role of SLFRS 13 in measuring the fair values of financial instruments. The findings could also open new avenues of thinking for policy formulators to provide the necessary infrastructure to eliminate disparities that exist among different regulatory bodies, thereby facilitating full convergence and growth of the economy. Further, the study provides insights into the dynamics of FVA implementation that are also relevant for other developing countries.
Keywords: convergence, fair value, financial instruments, IFRS 13
Procedia PDF Downloads 126
124 The Role of Demographics and Service Quality in the Adoption and Diffusion of E-Government Services: A Study in India
Authors: Sayantan Khanra, Rojers P. Joseph
Abstract:
Background and Significance: This study is aimed at analyzing the role of demographic and service quality variables in the adoption and diffusion of e-government services among users in India. The study proposes to examine users' perceptions of e-government services and investigate the key variables that are most salient to the Indian populace. Description of the Basic Methodologies: The methodology adopted in this study is hierarchical regression analysis, which helps explore the impact of the demographic variables and the quality dimensions on the willingness to use e-government services in two steps. First, the impact of demographic variables on the willingness to use e-government services is examined. In the second step, quality dimensions are used as inputs to the model to explain variance in excess of the prior contribution of the demographic variables. Present Status: Our study is in the data collection stage, in collaboration with a highly reliable, authentic, and adequate source of user data. Assuming that the population of the study comprises all Internet users in India, a sample of more than 10,000 random respondents is being approached. Data is being collected using an online survey questionnaire. A pilot survey has already been carried out to refine the questionnaire, with inputs from an expert in management information systems and a small group of users of e-government services in India. The first three questions in the survey pertain to the Internet usage pattern of a respondent and probe whether the person has used e-government services. If the respondent confirms that he/she has used e-government services, then an aggregate of 15 indicators is used to measure the quality dimensions under consideration and the willingness of the respondent to use e-government services, on a five-point Likert scale.
If the respondent reports that he/she has not used e-government services, then a few optional questions are asked to understand the reason(s). The last four questions in the survey collect data on the demographic variables. An Indication of the Major Findings: Based on the extensive literature review carried out to develop several propositions, an initial research model is proposed. A major outcome expected at the completion of the study is a research model that helps explain the relationship among the demographic variables, the service quality dimensions, and the willingness to adopt e-government services, particularly in an emerging economy like India. Concluding Statement: Governments of emerging economies and other relevant agencies can use the findings from the study in designing, updating, and promoting e-government services to enhance public participation, which in turn would help to improve efficiency, convenience, engagement, and transparency in implementing these services.
Keywords: adoption and diffusion of e-government services, demographic variables, hierarchical regression analysis, service quality dimensions
Procedia PDF Downloads 267
123 The Relationship between the Competence Perception of Student and Graduate Nurses and Their Autonomy and Critical Thinking Disposition
Authors: Zülfiye Bıkmaz, Aytolan Yıldırım
Abstract:
This study was planned as a descriptive, regression-based study to determine the relationships among the competency levels of working nurses, the competency levels expected by nursing students, nurses' critical thinking disposition, their perceived autonomy levels, and certain sociodemographic characteristics. It is also a methodological study with regard to the intercultural adaptation of the Nursing Competence Scale (NCS) in both working-nurse and student samples. The nurse sample consisted of 443 nurses who had worked at a university hospital for at least 6 months and who filled out the questionnaires. The student group consisted of 543 third- and fourth-year nursing students from four public universities. Data collection tools consisted of a questionnaire prepared to capture the sociodemographic, economic, and personal characteristics of the participants, the 'Nursing Competence Scale', the 'Autonomy Subscale of the Sociotropy-Autonomy Scale', and the 'California Critical Thinking Disposition Inventory'. For data evaluation, descriptive statistics, nonparametric tests, Rasch analysis, and correlation and regression tests were used. The language validity of the NCS was established by translation and back translation, and the content validity of the scale was established with expert views. The scale, in its final form, was piloted with a group consisting of graduate and student nurses. The temporal stability of the test was assessed using the test-retest method, and the split-half method was used to assess reliability. The Cronbach's alpha coefficient of the scale was found to be 0.980 for the nurse group and 0.986 for the student group.
Statistically significant relationships were found between competence and critical thinking and variables such as age, gender, marital status, family structure, having had critical thinking training, education level, students' year of study, service worked in, employment style and position, and employment duration. Statistically significant relationships were also found between autonomy and certain variables of the student group, such as year of study, employment status, decision-making style regarding oneself, total duration of employment, employment style, and education status. As a result, the interculturally adapted NCS was determined to be a valid and reliable measurement tool and was found to be associated with autonomy and critical thinking.
Keywords: nurse, nursing student, competence, autonomy, critical thinking, Rasch analysis
Procedia PDF Downloads 393
122 The Emergence of Cold War Heritage: United Kingdom Cold War Bunkers and Sites
Authors: Peter Robinson, Milka Ivanova
Abstract:
Despite the growing interest in the Cold War period and its heritage, little attention has been paid to the presentation and curatorship of Cold War heritage in eastern or western Europe. In 2021, Leeds Beckett University secured a British Academy grant to explore visitor experiences, curatorship, emotion, and memory at Cold War-related tourist sites, comparing the perspectives of eastern and western European sites through research carried out in the UK and Bulgaria. The research explores the themes of curatorship, experience, and memory. Many of the sites included in the UK-based part of the project are nuclear bunkers that have been decommissioned and are now open to visitors. The focus of this abstract is one of several perspectives drawn from the project, bringing together critical comparisons between western and eastern European sites. The project specifically identifies the challenges of ownership, preservation, and presentation, and discusses the challenges facing those who own, manage, and provide access to Cold War museums and sites. The research is underpinned by contested issues of authenticity and ownership, discussing narrative accounts of those involved in caring for and managing these sites. The research project draws on interviews with key stakeholders, site observations, visitor surveys, and content analysis of TripAdvisor posts. Key insights from the project include the external challenges owners and managers face from a lack of recognition of, and funding for, important Cold War sites in the UK, which is at odds with the interest visitors show in Cold War structures and landmarks.
The challenges centre on the lack of consistent approaches to Cold War heritage conservation, management, and ownership; a lack of curatorial expertise and an over-reliance on non-expert interpretation and presentation of heritage; the effect of the passage of time on personal connections to Cold War heritage sites; a dissipating technological knowledge base; structures that do not lend themselves easily to use as visitor attractions or museums; the questionable authenticity of artifacts; limited archival material; and, quite often, limited budgets. A particularly interesting insight concerning nuclear bunkers has been the difficulty of site reinterpretation, given the impossibility of fully conveying the enormity of nuclear war as a constant threat of the Cold War. Further insights from the research highlight the secrecy of many of the sites as a key marketing strategy, particularly in relation to the nuclear bunker sites included in the project.
Keywords: cold war, curatorship, heritage, nuclear bunkers
Procedia PDF Downloads 77
121 Dual-use UAVs in Armed Conflicts: Opportunities and Risks for Cyber and Electronic Warfare
Authors: Piret Pernik
Abstract:
Based on strategic, operational, and technical analysis of the ongoing armed conflict in Ukraine, this paper examines the opportunities and risks of using small commercial drones (dual-use unmanned aerial vehicles, UAVs) for military purposes. The paper discusses the opportunities and risks in the information domain, encompassing both cyber and electromagnetic interference and attacks, and draws conclusions on the possible strategic impact of the widespread use of dual-use UAVs on battlefield outcomes in modern armed conflicts. The article contributes to filling a gap in the literature by examining cyberattacks and electromagnetic interference on the basis of empirical data. Today, more than one hundred states and non-state actors possess UAVs, ranging from low-cost commodity models, many of them dual-use and widely available and affordable to anyone, to high-cost combat UAVs (UCAVs) with lethal kinetic strike capabilities, which can be enhanced with Artificial Intelligence (AI) and Machine Learning (ML). Dual-use UAVs have been used by various actors for intelligence, reconnaissance, surveillance, situational awareness, geolocation, and kinetic targeting. Thus, they function as force multipliers, enabling kinetic and electronic warfare attacks, and provide comparative and asymmetric operational and tactical advantages. Some go as far as to argue that automated (or semi-automated) systems can change the character of warfare, while others observe that the use of small drones has not changed the balance of power or battlefield outcomes. UAVs give commanders considerable opportunities; for example, because they can be operated without GPS navigation, they are less vulnerable to, and less dependent on, satellite communications. They can be, and have been, used to conduct cyberattacks, electromagnetic interference, and kinetic attacks. However, they are themselves highly vulnerable to such attacks.
So far, strategic studies, the literature, and expert commentary have overlooked the cybersecurity and electromagnetic interference dimensions of the use of dual-use UAVs. Studies that link technical analysis of opportunities and risks with strategic battlefield outcomes are missing. It is expected that the proliferation of dual-use commercial UAVs in armed and hybrid conflicts will continue and accelerate in the future. It is therefore important to understand the specific opportunities and risks related to the crowdsourced use of dual-use UAVs, which can have kinetic effects. Technical countermeasures to protect UAVs differ depending on the type of UAV (small, midsize, large, stealth combat), and this paper offers a unique analysis of small UAVs from the point of view of both opportunities and risks for commanders and other actors in armed conflict.
Keywords: dual-use technology, cyber attacks, electromagnetic warfare, case studies of cyberattacks in armed conflicts
Procedia PDF Downloads 102
120 An As-Is Analysis and Approach for Updating Building Information Models and Laser Scans
Authors: Rene Hellmuth
Abstract:
Factory planning has the task of designing products, plants, processes, organization, areas, and the construction of a factory. The requirements for factory planning and for the factory building itself have changed in recent years. Regular restructuring of the factory building is becoming more important in order to maintain the competitiveness of a factory. Restrictions in new areas, shorter life cycles of products and production technology, and a VUCA world (Volatility, Uncertainty, Complexity and Ambiguity) lead to more frequent restructuring measures within a factory. A building information model (BIM) is the planning basis for rebuilding measures and becomes an indispensable data repository for reacting quickly to changes. Its use as a planning basis for restructuring measures in factories succeeds only if the BIM model has adequate data quality. Under this aspect and given the industrial requirements, three data quality factors are particularly important for the BIM model in this paper: up-to-dateness, completeness, and correctness. The research question is: how can a BIM model be kept up to date with the required data quality, and which visualization techniques can be applied on the construction site within a short period of time during conversion measures? An as-is analysis is made of how BIM models and digital factory models (including laser scans) are currently kept up to date. Industrial companies are interviewed, and expert interviews are conducted. Subsequently, the results are evaluated, and a procedure is conceived for carrying out cost-effective and time-saving updating processes. The availability of low-cost hardware and the simplicity of the process are important to enable service personnel from facility management to keep digital factory models (BIM models and laser scans) up to date. The approach includes the detection of changes to the building, the recording of the changed area, and its insertion into the overall digital twin.
Finally, an overview of the possibilities for visualizations suitable for construction sites is compiled. An augmented reality application is created based on an updated BIM model of a factory and installed on a tablet. Conversion scenarios with costs and time expenditure are displayed. A user interface is designed in such a way that all relevant conversion information is available at a glance for the respective conversion scenario. A total of three essential research results are achieved: an as-is analysis of current update processes for BIM models and laser scans, the development of a time-saving and cost-effective update process, and the conception and implementation of an augmented reality solution for BIM models suitable for construction sites.
Keywords: building information modeling, digital factory model, factory planning, restructuring
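The "detect the changes, record the changed area" step described in the abstract can be illustrated with a minimal sketch. This is a generic, hypothetical voxel-based comparison of two laser scans, not the procedure developed in the paper; the grid size and coordinates are invented for illustration.

```python
import numpy as np

# Hypothetical sketch: two point clouds (before/after a conversion measure)
# are voxelised on a coarse grid, and voxels occupied in only one of the two
# scans are reported as the changed area to re-scan and update in the twin.
def occupied_voxels(points, voxel=0.5):
    """Map each XYZ point to an integer voxel index and return the set."""
    idx = np.floor(np.asarray(points) / voxel).astype(int)
    return {tuple(v) for v in idx}

scan_before = [[0.1, 0.1, 0.1], [2.2, 0.3, 0.1]]
scan_after  = [[0.1, 0.1, 0.1], [4.8, 0.2, 0.1]]   # one element moved

# Symmetric difference = voxels that appear or disappear between scans.
changed = occupied_voxels(scan_before) ^ occupied_voxels(scan_after)
print(sorted(changed))
```

In a real pipeline, the changed voxels would bound the region to re-scan, so only that region of the BIM model and laser scan needs updating rather than the whole building.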
Procedia PDF Downloads 114
119 Designing a Combined Outpatient and Day Treatment Eating Disorder Program for Adolescents and Transitional Aged Youth: A Naturalistic Case Study
Authors: Deanne McArthur, Melinda Wall, Claire Hanlon, Dana Agnolin, Krista Davis, Melanie Dennis, Elizabeth Glidden, Anne Marie Smith, Claudette Thomson
Abstract:
Background and significance: Patients with eating disorders have traditionally been an underserviced population within the publicly-funded Canadian healthcare system. This situation was worsened by the COVID-19 pandemic and accompanying public health measures, such as “lockdowns” which led to increased isolation, changes in routine, and other disruptions. Illness severity and prevalence rose significantly with corresponding increases in patient suffering and poor outcomes. In Ontario, Canada, the provincial government responded by increasing funding for the treatment of eating disorders, including the launch of a new day program at an intermediate, regional health centre that already housed an outpatient treatment service. The funding was received in March 2022. The care team sought to optimize this opportunity by designing a program that would fit well within the resource-constrained context in Ontario. Methods: This case study will detail how the team consulted the literature and sought patient and family input to design a program that optimizes patient outcomes and supports for patients and families while they await treatment. Early steps include a review of the literature, expert consultation and patient and family focus groups. Interprofessional consensus was sought at each step with the team adopting a shared leadership and patient-centered approach. Methods will include interviews, observations and document reviews to detail a rich description of the process undertaken to design the program, including evaluation measures adopted. Interim findings pertaining to the early stages of the program-building process will be detailed as well as early lessons and ongoing evolution of the program and design process. Program implementation and outcome evaluation will continue throughout 2022 and early 2023 with further publication and presentation of study results expected in the summer of 2023. 
The aim of this study is to contribute to the body of knowledge pertaining to the design and implementation of eating disorder treatment services that combine outpatient and day treatment services in a resource-constrained context.
Keywords: eating disorders, day program, interprofessional, outpatient, adolescents, transitional aged youth
Procedia PDF Downloads 108
118 Real-Time Working Environment Risk Analysis with Smart Textiles
Authors: Jose A. Diaz-Olivares, Nafise Mahdavian, Farhad Abtahi, Kaj Lindecrantz, Abdelakram Hafid, Fernando Seoane
Abstract:
Despite new recommendations and guidelines for the evaluation of occupational risk assessments and their prevention, work-related musculoskeletal disorders are still one of the biggest causes of work activity disruption, productivity loss, sick leave, and chronic work disability. They affect millions of workers throughout Europe, with a large-scale economic and social burden. Efforts to date have failed to produce significant results, probably due to the limited availability and high cost of occupational risk assessment at work, especially when the methods are complex, consume excessive resources, or depend on self-evaluations and observations of poor accuracy. To overcome these limitations, a pervasive system of real-time risk assessment tools has been developed, which takes a systematic approach with good precision, usability, and resource efficiency, essential for facilitating the prevention of musculoskeletal disorders in the long term. The system allows different wearable sensors, placed on different limbs, to be combined for data collection and evaluation by a software solution, according to the needs and requirements of each individual working environment. This is done in a manner that is non-disruptive for both the occupational health expert and the workers. This solution allows us to attend to different research activities that require, as an essential starting point, the recording of data of ergonomic value from very diverse origins, especially in real work environments. The software platform is presented here with a complementary smart clothing system for data acquisition, comprising a T-shirt containing inertial measurement units (IMUs), a vest sensorized with textile electronics, a wireless electrocardiogram (ECG) and thoracic electrical bio-impedance (TEB) recorder, and a glove sensorized with variable resistors dependent on the angular position of the wrist.
The collected data is processed in real time through a mobile application, implemented on commercially available Android-based smartphone and tablet platforms. Based on the collection of this information and its analysis, real-time risk assessment and feedback on postural improvement are possible, adapted to different contexts. The result is a tool that provides added value to ergonomists and occupational health agents, as in-situ analysis of postural behavior can assist quantitatively in the evaluation of work techniques and the occupational environment.
Keywords: ergonomics, mobile technologies, risk assessment, smart textiles
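The real-time risk flagging described above can be sketched in a few lines. This is a hypothetical illustration of a "sustained posture" rule applied to a stream of IMU-derived angles; the threshold and window length are invented placeholders, not the ergonomic criteria used by the system.

```python
import numpy as np

# Illustrative sketch: flag samples where an IMU-derived joint angle
# (degrees) has exceeded a limit for several consecutive samples.
# Both constants are assumptions for the sketch, not validated thresholds.
FLEXION_LIMIT_DEG = 60.0   # assumed "high risk" flexion angle
SUSTAINED_SAMPLES = 5      # assumed run length counted as "sustained"

def risk_flags(angles, limit=FLEXION_LIMIT_DEG, window=SUSTAINED_SAMPLES):
    """Return True at indices where the angle has been above `limit`
    for at least `window` consecutive samples."""
    angles = np.asarray(angles, dtype=float)
    over = angles > limit
    flags = np.zeros_like(over)
    run = 0
    for i, is_over in enumerate(over):
        run = run + 1 if is_over else 0   # length of the current excursion
        flags[i] = run >= window
    return flags

stream = [20, 65, 70, 72, 68, 66, 71, 30]
print(risk_flags(stream).astype(int).tolist())
```

A rule of this shape can run sample-by-sample on a phone or tablet, which is what makes in-situ feedback to the worker feasible.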
Procedia PDF Downloads 117
117 Life Cycle Assessment-Based Environmental Assessment of the Production and Maintenance of Wooden Windows
Authors: Pamela Del Rosario, Elisabetta Palumbo, Marzia Traverso
Abstract:
The building sector plays an important role in addressing pressing environmental issues such as climate change and resource scarcity. The energy performance of buildings is considerably affected by the external envelope; in fact, a considerable proportion of building energy demand is due to energy losses through the windows. Nevertheless, according to the literature, attending only to the contribution of windows to building energy performance, i.e., their influence on energy use during building operation, could result in a partial evaluation. Hence, it is important to consider not only the energy performance but also the environmental performance of windows, and not only during the operational stage but along their complete life cycle. Life Cycle Assessment (LCA) according to ISO 14040:2006 and ISO 14044:2006+A1:2018 is one of the most widely adopted and robust methods for evaluating the environmental performance of products throughout their complete life cycle. This life-cycle-based approach avoids shifting the environmental impacts of one life cycle stage to another, allowing impacts to be allocated to the stage in which they originate and measures to be adopted that optimize the environmental performance of the product. Moreover, the LCA method is widely implemented in the construction sector to assess whole buildings as well as construction products and materials. LCA is regulated by the European standards EN 15978:2011, at the building level, and EN 15804:2012+A2:2019, at the level of construction products and materials. In this work, the environmental performance of wooden windows was assessed by implementing the LCA method with primary data. More specifically, emphasis is given to embedded and operational impacts. Furthermore, correlations are made between these environmental impacts and aspects such as wood type and window transmittance.
In the particular case of the operational impacts, special attention is paid to the definition of suitable maintenance scenarios that consider the potential influence of climate on the environmental impacts. For this purpose, a literature review was conducted, and expert consultation was carried out. The study underlined the variability of the embedded environmental impacts of wooden windows by considering different wood types and transmittance values. The results also highlighted the need to define appropriate maintenance scenarios for precise assessment results. It was found that both the service life and the window maintenance requirements, in terms of treatment and its frequency, depend strongly not only on the wood type and its treatment during the manufacturing process but also on the weather conditions of the place where the window is installed. In particular, it became evident that maintenance-related environmental impacts were highest for climate regions with the lowest temperatures and the greatest amount of precipitation.
Keywords: embedded impacts, environmental performance, life cycle assessment, LCA, maintenance stage, operational impacts, wooden windows
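The accounting idea behind the maintenance scenarios, i.e. that a harsher climate shortens the treatment interval and so accumulates more maintenance impact over the same service life, can be sketched as follows. All numbers are invented placeholders, not data from the study.

```python
# Illustrative sketch of the life-cycle accounting: total impact is the
# embedded (production) impact plus the maintenance impact accumulated
# over the service life. Values are placeholders, not study results.
def total_gwp(embedded_kg_co2e, treatment_kg_co2e,
              service_life_yr, treatment_interval_yr):
    n_treatments = service_life_yr // treatment_interval_yr
    return embedded_kg_co2e + n_treatments * treatment_kg_co2e

# Same window, two climates: harsher weather -> shorter treatment interval.
mild  = total_gwp(embedded_kg_co2e=55.0, treatment_kg_co2e=4.0,
                  service_life_yr=40, treatment_interval_yr=10)
harsh = total_gwp(embedded_kg_co2e=55.0, treatment_kg_co2e=4.0,
                  service_life_yr=40, treatment_interval_yr=5)
print(mild, harsh)  # the harsher climate accumulates twice the maintenance impact
```

This is why the scenario definition matters: with identical embedded impacts, the climate-dependent treatment interval alone changes the life-cycle total.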
Procedia PDF Downloads 232
116 Finite Element Modeling of Mass Transfer Phenomenon and Optimization of Process Parameters for Drying of Paddy in a Hybrid Solar Dryer
Authors: Aprajeeta Jha, Punyadarshini P. Tripathy
Abstract:
Drying technologies for various food processing operations share an inevitable linkage with energy, cost, and environmental sustainability. Hence, solar drying of food grains has become an imperative choice to combat the dual challenges of meeting the high energy demand for drying and addressing the climate change scenario. However, the performance and reliability of solar dryers depend heavily on sunshine duration and climatic conditions; they therefore offer limited control over drying conditions and have lower efficiencies. Solar drying technology supported by a photovoltaic (PV) power plant and a hybrid-type solar air collector can potentially overcome the disadvantages of conventional solar dryers. For the development of such robust hybrid dryers, the optimization of process parameters becomes extremely critical to ensure the quality and shelf-life of paddy grains. Investigation of the moisture distribution profile within the grains is necessary in order to avoid over-drying or under-drying of food grains in a hybrid solar dryer. Computational simulations based on finite element modeling can serve as a potential tool for providing better insight into moisture migration during the drying process. Hence, the present work aims to optimize the process parameters and to develop a 3-dimensional (3D) finite element model (FEM) for predicting the moisture profile in paddy during solar drying. COMSOL Multiphysics was employed to develop the 3D finite element model. Furthermore, optimization of the process parameters (power level, air velocity, and moisture content) was done using response surface methodology in Design-Expert software. A 3D finite element model (FEM) predicting moisture migration in a single kernel for every time step was developed and validated with experimental data. The mean absolute error (MAE), mean relative error (MRE), and standard error (SE) were found to be 0.003, 0.0531, and 0.0007, respectively, indicating close agreement of the model with experimental results.
Furthermore, the optimized process parameters for drying paddy were found to be 700 W and 2.75 m/s at 13% (wb) moisture content, with an optimum temperature, milling yield, and drying time of 42˚C, 62%, and 86 min, respectively, at a desirability of 0.905. The above optimized conditions can be used to dry paddy in a PV-integrated solar dryer in order to attain maximum uniformity, quality, and yield of product. PV-integrated hybrid solar dryers can be employed as a potential, cutting-edge drying technology alternative for sustainable energy and food security.
Keywords: finite element modeling, moisture migration, paddy grain, process optimization, PV integrated hybrid solar dryer
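The governing physics of the moisture migration model is Fick's second law. The paper solves it with a 3D FEM in COMSOL; as a minimal, stand-alone illustration of the same equation, the following sketches an explicit finite-difference solution through a thin slab. The diffusivity, geometry, and moisture values are illustrative assumptions, not the study's parameters.

```python
import numpy as np

# Minimal sketch of Fick's second law, dM/dt = D * d2M/dx2, solved with an
# explicit finite-difference scheme on a half-slab (centre at x=0, drying
# surface at x=L). All values below are assumed for illustration.
D  = 1e-10            # effective moisture diffusivity, m^2/s (assumed)
L  = 2e-3             # half-thickness of the slab, m (assumed)
N  = 21               # grid points
dx = L / (N - 1)
dt = 0.4 * dx**2 / D  # respects the explicit stability limit dt <= dx^2/(2D)

M    = np.full(N, 0.25)   # initial moisture, kg water / kg dry matter (assumed)
M_eq = 0.13               # equilibrium moisture at the drying surface (assumed)

for _ in range(2000):
    M[-1] = M_eq                                    # surface at equilibrium
    lap = (M[2:] - 2 * M[1:-1] + M[:-2]) / dx**2    # discrete d2M/dx2
    M[1:-1] += D * dt * lap
    M[0] = M[1]                                     # symmetry at the centre

print(round(M.mean(), 4))   # mean moisture after the simulated drying time
```

A 3D FEM refines this in geometry and meshing, but the time-stepped moisture profile it predicts per kernel follows the same diffusion mechanics.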
Procedia PDF Downloads 150
115 Conflict Resolution in Fuzzy Rule Base Systems Using Temporal Modalities Inference
Authors: Nasser S. Shebka
Abstract:
Fuzzy logic is used in complex adaptive systems where classical tools of knowledge representation are unproductive. Nevertheless, the incorporation of fuzzy logic, as is the case with all artificial intelligence tools, has raised inconsistencies and limitations in dealing with increasingly complex systems and rules that apply to real-life situations, hindering the inference process of such systems; it also faces inconsistencies between the inferences generated by the fuzzy rules of complex or imprecise knowledge-based systems. The use of fuzzy logic enhanced the capability of knowledge representation in applications that require fuzzy representation of truth values or similar multi-valued constant parameters derived from multi-valued logic, which set the basis for the three fundamental t-norms and their derived connectives; these t-norms are continuous functions, and any other continuous t-norm can be described as an ordinal sum of these three basic ones. Some attempts to solve this dilemma altered fuzzy logic by means of non-monotonic logic, which is used to handle the defeasible inference of expert-system reasoning, for example, to allow for inference retraction upon additional data. However, even the introduction of non-monotonic fuzzy reasoning faces the major issue of conflict resolution, for which many principles have been introduced, such as the specificity principle and the weakest-link principle. The aim of our work is to improve the logical representation and functional modelling of AI systems by presenting a method for resolving existing and potential rule conflicts through the representation of temporal modalities within defeasible-inference rule-based systems.
Our paper investigates the possibility of resolving fuzzy rule conflicts in a non-monotonic fuzzy reasoning-based system by introducing temporal modalities and Kripke's general weak modal logic operators in order to expand its knowledge representation capabilities through flexibility in classifying newly generated rules and, hence, resolving potential conflicts between these fuzzy rules. We were able to address this problem by restructuring the inference process of the fuzzy rule-based system. This is achieved by using time-branching temporal logic in combination with restricted first-order logic quantifiers, as well as propositional logic to represent classical temporal modality operators. The resulting findings not only enhance the flexibility of the inference process of complex rule-based systems but also contribute to fundamental methods of building rule bases in such a manner as to allow for a wider range of applicable real-life situations, from both a quantitative and a qualitative knowledge representation perspective.
Keywords: fuzzy rule-based systems, fuzzy tense inference, intelligent systems, temporal modalities
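The three basic continuous t-norms referred to above, from which any continuous t-norm can be built as an ordinal sum, are the minimum (Gödel), product, and Łukasiewicz t-norms. The following is a generic sketch of their standard definitions, not code from the paper:

```python
# The three fundamental continuous t-norms of fuzzy logic; each maps a
# pair of truth degrees in [0, 1] to a conjunction degree in [0, 1].
def t_min(a, b):
    """Gödel (minimum) t-norm."""
    return min(a, b)

def t_prod(a, b):
    """Product t-norm."""
    return a * b

def t_luk(a, b):
    """Łukasiewicz t-norm."""
    return max(0.0, a + b - 1.0)

a, b = 0.7, 0.6
print(t_min(a, b), t_prod(a, b), t_luk(a, b))
```

Note how the Łukasiewicz t-norm is the most conservative of the three: for weak premises it can drive the conjunction all the way to 0, which is one source of the inference differences the non-monotonic extensions must reconcile.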
Procedia PDF Downloads 91
114 Risk Assessment of Flood Defences by Utilising Condition Grade Based Probabilistic Approach
Authors: M. Bahari Mehrabani, Hua-Peng Chen
Abstract:
Management and maintenance of coastal defence structures during the expected life cycle have become a real challenge for decision makers and engineers. Accurate evaluation of the current condition and future performance of flood defence structures is essential for effective practical maintenance strategies on the basis of available field inspection data. Moreover, as coastal defence structures age, it becomes more challenging to implement maintenance and management plans to avoid structural failure. Therefore, condition inspection data are essential for assessing damage and forecasting deterioration of ageing flood defence structures in order to keep the structures in an acceptable condition. The inspection data for flood defence structures are often collected using discrete visual condition rating schemes. In order to evaluate the future condition of the structure, a probabilistic deterioration model needs to be utilised. However, existing deterioration models may not provide a reliable prediction of performance deterioration over a long period, due to uncertainties. To tackle this limitation, a time-dependent, condition-based model associated with transition probabilities needs to be developed on the basis of a condition grade scheme for flood defences. This paper presents a probabilistic method for predicting future performance deterioration of coastal flood defence structures based on condition grading inspection data and deterioration curves estimated by expert judgement. In condition-based deterioration modelling, the main task is to estimate the transition probability matrices. The deterioration process of the structure across the transition states is modelled as a Markov chain, and a reliability-based approach is used to estimate the probability of structural failure. Visual inspection data according to the United Kingdom Condition Assessment Manual are used to obtain the initial condition grade curve of the coastal flood defences.
The initial curves are then modified in order to develop transition probabilities through non-linear regression-based optimisation algorithms. Monte Carlo simulations are then used to evaluate the future performance of the structure on the basis of the estimated transition probabilities. Finally, a case study is given to demonstrate the applicability of the proposed method under no-maintenance and medium-maintenance scenarios. Results show that the proposed method can provide an effective predictive model for various situations in terms of available condition grading data. The proposed model also provides useful information on the time-dependent probability of failure of coastal flood defences.
Keywords: condition grading, flood defense, performance assessment, stochastic deterioration modelling
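The Markov-chain machinery described above can be sketched compactly: a condition-grade transition matrix, a grade distribution propagated year by year, and the time-dependent probability of failure read off the absorbing state. The matrix entries below are invented placeholders for illustration, not the transition probabilities estimated in the paper.

```python
import numpy as np

# Sketch of condition-grade Markov deterioration: five grades
# (1 = very good, 5 = failed). Row i gives the one-year transition
# probabilities out of grade i+1; values are illustrative only.
P = np.array([
    [0.90, 0.10, 0.00, 0.00, 0.00],
    [0.00, 0.85, 0.15, 0.00, 0.00],
    [0.00, 0.00, 0.80, 0.20, 0.00],
    [0.00, 0.00, 0.00, 0.75, 0.25],
    [0.00, 0.00, 0.00, 0.00, 1.00],  # failed state is absorbing
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # new structure: grade 1
for year in range(50):
    state = state @ P                         # propagate one year

p_fail = state[-1]  # time-dependent probability of being in the failed grade
print(round(p_fail, 3))
```

A Monte Carlo variant would instead sample individual grade trajectories from the rows of P; the matrix propagation above gives the same distribution in expectation, which is what makes the reliability read-out possible.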
Procedia PDF Downloads 233