Search results for: medical software tools
9596 A Case Study on Theme-Based Approach in Health Technology Engineering Education: Customer Oriented Software Applications
Authors: Mikael Soini, Kari Björn
Abstract:
Metropolia University of Applied Sciences (MUAS) Information and Communication Technology (ICT) Degree Programme provides full-time Bachelor-level undergraduate studies. The ICT Degree Programme has seven different major options; this paper focuses on Health Technology. In Health Technology, a significant curriculum change in 2014 enabled the transition from a fragmented curriculum comprising dozens of courses to a new integrated curriculum built around three 30 ECTS themes. This paper focuses especially on the second theme, called Customer Oriented Software Applications. From the students’ point of view, the goal of this theme is to get familiar with existing health-related ICT solutions and systems, understand the business around health technology, recognize social and healthcare operating principles and services, and identify customers and users and their special needs and perspectives. This also acts as a background for health-related web application development. The built web application is tested, developed and evaluated with real users utilizing versatile user-centred development methods. This paper presents experiences obtained from the first implementation of the Customer Oriented Software Applications theme. Student feedback was gathered with two questionnaires, one in the middle of the theme and the other at the end of the theme. The questionnaires had qualitative and quantitative parts. A similar questionnaire was implemented in the first theme; this paper evaluates how the theme-based integrated curriculum has progressed in the Health Technology major by comparing results between themes 1 and 2. In general, students were satisfied with the implementation, timing and synchronization of the courses, and the amount of work. However, there is still room for development. Student feedback and teachers’ observations have been and will be used to develop the content and operating principles of the themes and the whole curriculum.
Keywords: engineering education, integrated curriculum, learning and teaching methods, learning experience
Procedia PDF Downloads 323
9595 Plasma Properties Effect on Fluorescent Tube Plasma Antenna Performance
Authors: A. N. Dagang, E. I. Ismail, Z. Zakaria
Abstract:
This paper presents an analysis of the performance of a monopole antenna with fluorescent tubes. In this research, both simulation and experimental approaches are used. Fluorescent tubes of different lengths and sizes are designed using Computer Simulation Technology (CST) software, and the antenna characteristics are simulated in the software. CST was used to simulate antenna parameters such as return loss, resonant frequency, gain and directivity. A Vector Network Analyzer (VNA) was used to measure the return loss of the plasma antenna in order to validate the simulation results. In the simulation and experiment, the supply frequency is swept from 1 GHz to 10 GHz. The results show that the return loss of the plasma antenna changes when the size of the fluorescent tubes is varied, corresponding to the different plasma properties. Different values of plasma properties such as plasma frequency and collision frequency give different results for return loss, gain and directivity. For the gain, the values range from 2.14 dB to 2.36 dB. The return loss of the plasma antenna offers higher values, ranging from -22.187 dB to -32.903 dB. The higher the values of plasma frequency and collision frequency, the higher the return loss obtained. The values obtained are comparable to those of a conventional metal antenna.
Keywords: plasma antenna, fluorescent tube, CST, plasma parameters
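As general background to the plasma parameters mentioned above, the electron plasma frequency follows directly from the electron density through the standard cold-plasma relation; the sketch below uses an assumed electron density, not a value reported by the authors.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19      # electron charge, C
EPSILON_0 = 8.8541878128e-12    # vacuum permittivity, F/m
M_ELECTRON = 9.1093837015e-31   # electron mass, kg

def plasma_frequency_hz(electron_density_m3: float) -> float:
    """Cold-plasma electron frequency f_p = (1/2*pi) * sqrt(n_e * e^2 / (eps0 * m_e))."""
    omega_p = math.sqrt(electron_density_m3 * E_CHARGE**2 / (EPSILON_0 * M_ELECTRON))
    return omega_p / (2.0 * math.pi)

# Illustrative electron density for a low-pressure discharge tube (assumed value)
n_e = 1.0e17  # m^-3
print(f"Plasma frequency: {plasma_frequency_hz(n_e) / 1e9:.2f} GHz")
```

For an assumed density of 1e17 m^-3 this gives roughly 2.8 GHz, which falls inside the 1-10 GHz sweep used in the study.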
Procedia PDF Downloads 391
9594 Using Risk Management Indicators in Decision Tree Analysis
Authors: Adel Ali Elshaibani
Abstract:
Risk management indicators augment the reporting infrastructure, particularly for the board and senior management, to identify, monitor, and manage risks. This enhancement facilitates improved decision-making throughout the banking organization. Decision tree analysis is a tool that visually outlines potential outcomes, costs, and consequences of complex decisions. It is particularly beneficial for analyzing quantitative data and making decisions based on numerical values. By calculating the expected value of each outcome, decision tree analysis can help assess the best course of action. In the context of banking, decision tree analysis can assist lenders in evaluating a customer’s creditworthiness, thereby preventing losses. However, applying these tools in developing countries may face several limitations, such as data availability, lack of technological infrastructure and resources, lack of skilled professionals, cultural factors, and cost. Moreover, decision trees can create overly complex models that do not generalize well to new data, known as overfitting. They can also be sensitive to small changes in the data, which can result in different tree structures, and can become computationally expensive when dealing with large datasets. In conclusion, while risk management indicators and decision tree analysis are beneficial for decision-making in banks, their effectiveness is contingent upon how they are implemented and utilized by the board of directors, especially in the context of developing countries. It’s important to consider these limitations when planning to implement these tools in developing countries.
Keywords: risk management indicators, decision tree analysis, developing countries, board of directors, bank performance, risk management strategy, banking institutions
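To make the decision-tree approach concrete, the following minimal sketch scores creditworthiness with a shallow tree in scikit-learn; the data and indicators are synthetic placeholders, not a bank's actual risk management indicators, and the limited tree depth is one simple guard against the overfitting issue noted above.

```python
# A minimal sketch (synthetic data, hypothetical indicators) of creditworthiness
# scoring with a decision tree in scikit-learn.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 1000
# Hypothetical risk indicators: income, debt-to-income ratio, missed payments
X = np.column_stack([
    rng.normal(50_000, 15_000, n),   # annual income
    rng.uniform(0.0, 0.8, n),        # debt-to-income ratio
    rng.poisson(1.0, n),             # missed payments in the last year
])
# Synthetic label: default more likely with high DTI and several missed payments
y = ((X[:, 1] > 0.5) & (X[:, 2] >= 2)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
# max_depth limits tree size to reduce the overfitting risk noted in the abstract
clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(f"Hold-out accuracy: {clf.score(X_test, y_test):.2f}")
```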
Procedia PDF Downloads 64
9593 Incorporating Lexical-Semantic Knowledge into Convolutional Neural Network Framework for Pediatric Disease Diagnosis
Authors: Xiaocong Liu, Huazhen Wang, Ting He, Xiaozheng Li, Weihan Zhang, Jian Chen
Abstract:
The utilization of electronic medical record (EMR) data to establish disease diagnosis models has become an important research topic in biomedical informatics. Deep learning can automatically extract features from massive data, which has brought about breakthroughs in the study of EMR data. The challenge is that deep learning lacks semantic knowledge, which limits its practicality in medical science. This research proposes a method of incorporating lexical-semantic knowledge from abundant entities into a convolutional neural network (CNN) framework for pediatric disease diagnosis. Firstly, medical terms are vectorized into Lexical Semantic Vectors (LSV), which are concatenated with the embedded word vectors of word2vec to enrich the feature representation. Secondly, the semantic distribution of medical terms serves as a Semantic Decision Guide (SDG) for the optimization of deep learning models. The study evaluates the performance of the LSV-SDG-CNN model on four kinds of Chinese EMR datasets. Additionally, CNN, LSV-CNN, and SDG-CNN are designed as baseline models for comparison. The experimental results show that the LSV-SDG-CNN model outperforms the baseline models on the four kinds of Chinese EMR datasets. The best configuration of the model yielded an F1 score of 86.20%. The results clearly demonstrate that the CNN has been effectively guided and optimized by lexical-semantic knowledge, and the LSV-SDG-CNN model improves the disease classification accuracy by a clear margin.
Keywords: convolutional neural network, electronic medical record, feature representation, lexical semantics, semantic decision
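The feature-enrichment step can be pictured as a simple vector concatenation. The sketch below is an illustration with toy vocabularies and random vectors (not the authors' trained embeddings): each medical term's word2vec-style embedding is concatenated with its Lexical Semantic Vector before the sequence matrix is passed to a 1-D CNN.

```python
# A minimal sketch (toy vocabularies, random vectors) of concatenating a
# lexical-semantic vector (LSV) with a word2vec-style embedding per term.
import numpy as np

rng = np.random.default_rng(42)
WORD_DIM, LSV_DIM = 100, 20

word2vec = {"fever": rng.normal(size=WORD_DIM), "cough": rng.normal(size=WORD_DIM)}
lexical_semantic = {"fever": rng.normal(size=LSV_DIM), "cough": rng.normal(size=LSV_DIM)}

def enriched_vector(term: str) -> np.ndarray:
    """Concatenate the word embedding with the lexical-semantic vector."""
    return np.concatenate([word2vec[term], lexical_semantic[term]])

tokens = ["fever", "cough"]
# Shape (sequence_length, WORD_DIM + LSV_DIM): the input matrix for a 1-D CNN
sentence_matrix = np.stack([enriched_vector(t) for t in tokens])
print(sentence_matrix.shape)  # (2, 120)
```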
Procedia PDF Downloads 129
9592 Using the Technological, Pedagogical, and Content Knowledge (TPACK) Model to Address College Instructors' Weaknesses in Integration of Technology in Their Content Area Curricula
Authors: Junior George Martin
Abstract:
The purpose of this study was to explore college instructors’ integration of technology in their content area curriculum. The instructors indicated that they were in need of additional training to successfully integrate technology in their subject areas. The findings point to the implementation of a proposed Technological, Pedagogical, and Content Knowledge (TPACK) model professional development workshop to satisfactorily address the weaknesses of the instructors in technology integration. The professional development workshop is proposed as a rational solution to adequately address the instructors’ inability to successfully integrate technology in their subject area, in an effort to improve their pedagogy. The intensive workshop would last for 5 days and would be designed to provide instructors with training in areas such as the use of technology applications and tools, and the use of modern methodologies to improve technology integration. Exposing the instructors to the specific areas identified will address the weaknesses they demonstrated during the study. Professional development is deemed the most appropriate intervention based on the opportunities it provides the instructors to access hands-on training to overcome their weaknesses. The purpose of the TPACK professional development workshop will be to improve the competence of the instructors so that they are adequately prepared to integrate technology successfully in their curricula. At the end of the training period, the instructors are expected to adopt strategies that will have a positive impact on the learning experiences of the students.
Keywords: higher education, modern technology tools, professional development, technology integration
Procedia PDF Downloads 315
9591 Prototyping the Problem Oriented Medical Record for Connected Health Based on TypeGraphQL
Authors: Sabah Mohammed, Jinan Fiaidhi, Darien Sawyer
Abstract:
Data integration of health through connected services can save lives in the event of a medical emergency or provide efficient and effective interventions for the benefit of patients through the integration of bedside and bench-side clinical research. Such integration will support all winds of change in healthcare by being predictive, pre-emptive, personalized, problem-oriented and participatory. Prototyping a healthcare system that enables data integration has long been a big challenge for healthcare. However, an innovative solution started to emerge by focusing on problem lists, where everything connects to the problem list, forming a growing graph. This notion was introduced by Dr. Lawrence Weed in the early 1970s, but the enabling technologies were not mature enough to provide a successful implementation prototype. In this article, we describe our efforts in prototyping Dr. Lawrence Weed's problem-oriented medical record (POMR) and his patient case schema (SOAP) to shape a prototype for connected health. For this, we are using the TypeGraphQL API and our enterprise-based QL4POMR to describe a Web-based gateway for healthcare services connectivity. Our prototype has reported success in connecting to the HL7 FHIR medical record and the OpenTarget biomedical repositories.
Keywords: connected health, problem-oriented healthcare record, SOAP, QL4POMR, TypeGraphQL
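The paper's gateway is built with TypeGraphQL; purely as an illustration of the underlying data model, the following Python sketch (field names are assumptions, not the QL4POMR schema) shows how SOAP-structured notes can hang off a growing problem list.

```python
# Illustrative sketch only: a problem-oriented record linking SOAP notes to a
# problem list. Field names are assumed for the example, not taken from QL4POMR.
from dataclasses import dataclass, field
from typing import List

@dataclass
class SOAPNote:
    subjective: str   # patient-reported symptoms
    objective: str    # findings, vitals, labs
    assessment: str   # clinician's interpretation
    plan: str         # next steps / treatment

@dataclass
class Problem:
    name: str
    notes: List[SOAPNote] = field(default_factory=list)

@dataclass
class ProblemOrientedRecord:
    patient_id: str
    problem_list: List[Problem] = field(default_factory=list)

record = ProblemOrientedRecord(patient_id="demo-001")
asthma = Problem(name="Asthma")
asthma.notes.append(SOAPNote("Wheezing at night", "SpO2 94%",
                             "Mild exacerbation", "Adjust inhaler dose"))
record.problem_list.append(asthma)
print(len(record.problem_list), "problem(s) on the list")
```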
Procedia PDF Downloads 101
9590 An Experience of Translating an Excerpt from Sophie Adonon’s Echos de Femmes from French to English, Using Reverso
Authors: Michael Ngongeh Mombe
Abstract:
This paper seeks to investigate an assertion made by some colleagues that there is no need to pay a human translator to translate their literary texts, since there are software tools such as Reverso that can be used to do the translation. The main objective of this study is to examine the veracity of this assertion by using Reverso to translate a literary text without any post-editing by a human translator. The work is based on two theories: the Skopos and Communicative theories of translation. The work is documentary research where data were collected from published documents in libraries, on the internet and from the translation produced by Reverso. We made a comparative text analysis of both source and target texts in a bid to highlight the weaknesses and strengths of the software. Findings of this work revealed that those who advocate the use of only machine translation do so in ignorance of the translation mistakes usually made by the software. From the review of all 268 segments of translation, we found out that the translation produced by Reverso is fraught with errors. We therefore recommend the use of human translators to either do the translation of their literary texts or revise the translation produced by machine to conform to the skopos of the work. This paper is based on Reverso translation. Similar works in the near future will be based on other translation software tools to determine their weaknesses and strengths.
Keywords: machine translation, human translator, Reverso, literary text
Procedia PDF Downloads 100
9589 Intelligent Platform for Photovoltaic Park Operation and Maintenance
Authors: Andreas Livera, Spyros Theocharides, Michalis Florides, Charalambos Anastassiou
Abstract:
A main challenge in the quest for ensuring quality of operation, especially for photovoltaic (PV) systems, is to safeguard reliability and optimal performance by detecting and diagnosing potential failures and performance losses at early stages, or before they occur, through real-time monitoring, supervision, fault detection, and predictive maintenance. The purpose of this work is to present the functionalities and results related to the development and validation of a software platform for PV asset diagnosis and maintenance. The platform brings together proprietary hardware sensors and software algorithms to enable the early detection and prediction of the most common and critical faults in PV systems. It was validated using field measurements from operating PV systems. The results showed the effectiveness of the platform for detecting faults and losses (e.g., inverter failures, string disconnections, and potential induced degradation) at early stages and forecasting PV power production, while also providing recommendations for maintenance actions. Increased PV energy yield and revenue can thus be achieved while also minimizing operation and maintenance (O&M) costs.
Keywords: failure detection and prediction, operation and maintenance, performance monitoring, photovoltaic, platform, recommendations, predictive maintenance
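As an illustration of the kind of rule such a platform can apply (the thresholds and expected-power model below are simplifying assumptions, not the platform's proprietary algorithms), underperformance can be flagged by comparing measured power against an irradiance-based expectation.

```python
# A minimal, illustrative sketch of flagging PV underperformance by comparing
# measured power with an irradiance-based expectation (assumed model/thresholds).
def expected_power_kw(irradiance_w_m2: float, p_rated_kw: float,
                      derate: float = 0.85) -> float:
    """Simple performance model: rated power scaled by irradiance and a derate factor."""
    return p_rated_kw * (irradiance_w_m2 / 1000.0) * derate

def detect_loss(measured_kw: float, irradiance_w_m2: float, p_rated_kw: float,
                threshold: float = 0.15) -> bool:
    """Flag a potential fault when measured power falls more than `threshold`
    below the expected value."""
    expected = expected_power_kw(irradiance_w_m2, p_rated_kw)
    if expected <= 0:
        return False
    return (expected - measured_kw) / expected > threshold

# Example: 100 kWp system at 800 W/m2 producing only 50 kW -> flagged for review
print(detect_loss(measured_kw=50.0, irradiance_w_m2=800.0, p_rated_kw=100.0))
```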
Procedia PDF Downloads 57
9588 The Correlation between the Anxiety of the Family Members of the Patients Referring to the Emergency Department and Their Views on the Communication Skills of Nurses
Authors: Mahnaz Seyedoshohadaee
Abstract:
Background and Aims: Hospitalization of a family member in the hospital, especially in the emergency department, causes anxiety and psychological problems in family members and others. The way nurses interact with patients and their companions can play an important role in controlling and managing their anxiety. This study aims to determine the relationship between the anxiety of family members of patients referring to emergency departments and their views on the communication skills of nurses. Materials and Methods: The current research was a descriptive-correlational cross-sectional study of 263 family members of patients referred to the emergency departments of two selected medical training centers affiliated with Iran University of Medical Sciences. The samples were selected continuously in 2018 based on the inclusion criteria. Information was collected using the Health Communication Questionnaire (HCCQ) and the Beck Anxiety Inventory (BAI). To analyze the data, Pearson's correlation coefficient, independent t-tests, analysis of variance, and Kruskal-Wallis tests were used at a significance level of 0.05. The data were analyzed using SPSS version 16 statistical software. Results: The mean score of communication skills of emergency department nurses from the point of view of patients' companions was at a low level (74.36 with a standard deviation of 3.7). 3.75% of patients' companions had anxiety at a mild level. There was no statistically significant correlation between the anxiety of the patients' companions and their views on the nurses' communication skills. The anxiety of the patients' companions had a statistically significant relationship with educational level (P=0.039), economic status (P=0.033), and family relationship with the patient (P=0.001). Also, the average anxiety score in children was significantly higher than that of patients' wives (P=0.008). The triage level of the patient also had a statistically significant relationship with the anxiety of the patients' companions (P<0.001). Conclusion: Most of the family members of the patients referred to the emergency room experienced mild anxiety. Also, from their point of view, the communication skills of emergency nurses were at a weak level. Although there was no statistically significant relationship between the family members' anxiety and their opinion of nurses' communication skills in this study, the weak communication skills of nurses from the family members' point of view need special attention. The results of the present study can provide the necessary grounds for planning to improve the communication skills of nurses and also control the anxiety of patient caregivers through in-service training or other incentive mechanisms.
Keywords: anxiety, family, emergency department, communication skills, nurse
Procedia PDF Downloads 63
9587 Feature Engineering Based Detection of Buffer Overflow Vulnerability in Source Code Using Deep Neural Networks
Authors: Mst Shapna Akter, Hossain Shahriar
Abstract:
One of the most important challenges in the field of software code audit is the presence of vulnerabilities in software source code. Every year, more and more software flaws are found, either internally in proprietary code or revealed publicly. These flaws are highly likely to be exploited and can lead to system compromise, data leakage, or denial of service. C and C++ open-source code are now available in order to create a large-scale, machine-learning system for function-level vulnerability identification. We assembled a sizable dataset of millions of open-source functions that point to potential exploits. We developed an efficient and scalable vulnerability detection method based on deep neural network models that learn features extracted from the source code. The source code is first converted into a minimal intermediate representation to remove irrelevant components and shorten dependencies. Moreover, we keep the semantic and syntactic information using state-of-the-art word embedding algorithms such as GloVe and fastText. The embedded vectors are subsequently fed into deep learning networks such as LSTM, BiLSTM, LSTM-Autoencoder, word2vec, BERT, and GPT-2 to classify the possible vulnerabilities. Furthermore, we proposed a neural network model which can overcome issues associated with traditional neural networks. Evaluation metrics such as F1 score, precision, recall, accuracy, and total execution time have been used to measure the performance. We made a comparative analysis between results derived from features containing a minimal text representation and features containing semantic and syntactic information. We found that all of the deep learning models provide comparatively higher accuracy when we use semantic and syntactic information as the features, but they require higher execution time, as the word embedding algorithm adds complexity to the overall system.
Keywords: cyber security, vulnerability detection, neural networks, feature extraction
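The classification stage can be sketched as follows with toy data; this is an illustrative outline of an embedding-plus-LSTM classifier, not the paper's exact architecture, vocabulary, or dataset.

```python
# A minimal sketch (placeholder data) of the classification stage: token IDs
# from a tokenized C function are embedded and fed to an LSTM that predicts
# whether the function is vulnerable.
import numpy as np
import tensorflow as tf

VOCAB_SIZE, MAX_LEN = 5000, 200

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # vulnerable / not vulnerable
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data: random token-ID sequences and random labels
rng = np.random.default_rng(0)
X = rng.integers(0, VOCAB_SIZE, size=(256, MAX_LEN))
y = rng.integers(0, 2, size=(256,))
model.fit(X, y, epochs=1, batch_size=32, verbose=0)
print(model.predict(X[:2], verbose=0))
```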
Procedia PDF Downloads 94
9586 Current Practices of Permitted Daily Exposure (PDE) Calculation and Selection
Authors: Annie Ramanbhai Mecwan
Abstract:
Cleaning validation in a pharmaceutical manufacturing facility is documented evidence that a cleaning process has effectively removed contaminants, residues from previous drug products, and cleaning agents below a pre-defined threshold from the reusable tools and parts of equipment. In shared manufacturing facilities, more than one drug product is prepared. After cleaning the reusable tools and parts of equipment following the manufacture of one drug product, there are chances that some residues of the drug substance from previously manufactured drug products may be retained on the equipment and can be carried forward to the next drug product, thus causing cross-contamination. Health-based limits, through the derivation of a safe threshold value called the permitted daily exposure (PDE) for the residues of drug substances, should be employed to identify the risks posed at these manufacturing facilities. The PDE represents a substance-specific dose that is unlikely to cause an adverse effect if an individual is exposed to or below this dose every day for a lifetime. There are different practices to calculate the PDE. Data for all APIs in the public domain are considered to calculate the PDE value, though the final PDE value may vary from company to company based on different toxicologists' perspectives or their subjective evaluation. Hence, regulatory agencies should take responsibility for publishing PDE values for all APIs, as is done for elemental PDEs. This will harmonize PDE values all over the world and prevent the unnecessary load on manufacturers for cleaning validation.
Keywords: active pharmaceutical ingredient, good manufacturing practice, NOAEL, no observed adverse effect level, permitted daily exposure
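For reference, the widely used health-based formula divides the NOAEL, scaled by body weight, by a product of adjustment factors. The sketch below uses illustrative values only and is not a PDE determination for any specific API.

```python
# A worked sketch of the health-based formula:
# PDE = NOAEL x body weight / (F1 x F2 x F3 x F4 x F5).
# The NOAEL and factor values below are illustrative assumptions only.
def permitted_daily_exposure(noael_mg_per_kg: float, body_weight_kg: float = 50.0,
                             f1: float = 5, f2: float = 10, f3: float = 1,
                             f4: float = 1, f5: float = 1) -> float:
    """Return the PDE in mg/day.

    F1: interspecies extrapolation, F2: inter-individual variability,
    F3: study duration, F4: severity of toxicity, F5: NOAEL/LOAEL adjustment.
    """
    return noael_mg_per_kg * body_weight_kg / (f1 * f2 * f3 * f4 * f5)

# Example: an assumed NOAEL of 10 mg/kg/day from a rat study -> PDE of 10 mg/day
print(permitted_daily_exposure(noael_mg_per_kg=10.0))
```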
Procedia PDF Downloads 95
9585 Building Data Infrastructure for Public Use and Informed Decision Making in Developing Countries-Nigeria
Authors: Busayo Fashoto, Abdulhakeem Shaibu, Justice Agbadu, Samuel Aiyeoribe
Abstract:
Data has gone from just rows and columns to being an infrastructure itself. The traditional medium of data infrastructure has been managed by individuals in different industries and saved on personal work tools; one such tool is the laptop. This hinders data sharing and Sustainable Development Goal (SDG) 9 on infrastructure sustainability across all countries and regions. However, there has been a constant demand for data across different agencies and ministries by investors and decision-makers. The rapid development and adoption of open-source technologies that promote the collection and processing of data in new ways and in ever-increasing volumes are creating new data infrastructure in sectors such as lands and health, among others. This paper examines the process of developing data infrastructure and, by extension, a data portal to provide baseline data for sustainable development and decision making in Nigeria. This paper employs the FAIR principles (Findable, Accessible, Interoperable, and Reusable) of data management, using open-source technology tools to develop data portals for public use. eHealth Africa, an organization that uses technology to drive public health interventions in Nigeria, developed a data portal which is a typical data infrastructure that serves as a repository for various datasets on administrative boundaries, points of interest, settlements, social infrastructure, amenities, and others. This portal makes it possible for users to have access to datasets of interest at any point in time at no cost. The skeletal infrastructure of this data portal encompasses the use of open-source technologies such as the Postgres database, GeoServer, GeoNetwork, and CKAN. These tools made the infrastructure sustainable, thus promoting the achievement of SDG 9 (Industry, Innovation, and Infrastructure). As of 6th August 2021, 8192 users had been registered, 2262 datasets had been downloaded, and 817 maps had been created from the platform. This paper shows the use of rapid development and adoption of technologies that facilitate data collection, processing, and publishing in new ways and in ever-increasing volumes. In addition, the paper is explicit on new data infrastructure in sectors such as health, social amenities, and agriculture. Furthermore, this paper reveals the importance of cross-sectional data infrastructures for planning and decision making, which in turn can form a central data repository for sustainable development across developing countries.
Keywords: data portal, data infrastructure, open source, sustainability
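Because the portal is CKAN-backed, its datasets can also be reached programmatically through the standard CKAN Action API. The sketch below uses a placeholder URL, not the actual eHealth Africa endpoint.

```python
# A minimal sketch of programmatic access to a CKAN-backed portal via the CKAN
# Action API. The base URL is a placeholder for illustration.
import requests

PORTAL_URL = "https://data.example.org"  # placeholder CKAN instance

def list_datasets(base_url: str) -> list:
    """Return the names of all public datasets registered in the CKAN portal."""
    resp = requests.get(f"{base_url}/api/3/action/package_list", timeout=30)
    resp.raise_for_status()
    payload = resp.json()
    if not payload.get("success"):
        raise RuntimeError("CKAN API call failed")
    return payload["result"]

if __name__ == "__main__":
    print(list_datasets(PORTAL_URL)[:10])  # first ten dataset names
```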
Procedia PDF Downloads 103
9584 Coastal Adaptation to Climate Change: A Review of EU Tools, Legislation, National Strategies and Projects in the Mediterranean Basin
Authors: Dimitris Kokkinos, Panagiotis Prinos
Abstract:
In the last three decades, climate change has been studied extensively by the scientific community, and its consequences are more than clear all around the world. Most countries have made a great effort to reduce global warming rates with the ratification and implementation of several international treaties. Moreover, many of them have already adopted national plans in order to adapt to climate change effects and mitigate human and economic losses. Coastal environments, with their inherent physical sensitivity, will face important challenges as a result of projected changes in climate conditions, and hundreds of millions of people will be affected. Coastal zones are of high social and economic value, and this research focuses on the Mediterranean basin, which is a densely populated and highly urbanized area. With 40% of its land used for human activity and the inevitability of the impacts of climate change, it is obvious that some form of adaptation measures will be necessary. In this regard, the EU tools, policies and legislation concerning adaptation to climate change are presented. Additionally, the National Adaptation Strategies of the member states of the Mediterranean basin are compared and analyzed concerning coastal areas, along with an overview of the results of projects and programs focused on coastal issues at different spatial scales. The purpose of this research is to stress the differences between Mediterranean member states in the methodologies implemented, to highlight possible gaps in co-ordination and to emphasize research initiatives that the EU can build upon, moving towards integrated adaptation planning on a region-wide basis.
Keywords: coastal adaptation, Mediterranean Basin, climate change, coastal environments
Procedia PDF Downloads 312
9583 NanoSat MO Framework: Simulating a Constellation of Satellites with Docker Containers
Authors: César Coelho, Nikolai Wiegand
Abstract:
The advancement of nanosatellite technology has opened new avenues for cost-effective and faster space missions. The NanoSat MO Framework (NMF) from the European Space Agency (ESA) provides a modular and simpler approach to the development of flight software and the operations of small satellites. This paper presents a methodology using the NMF together with Docker for simulating constellations of satellites. By leveraging Docker containers, the software environment of individual satellites can be easily replicated within a simulated constellation. This containerized approach allows for rapid deployment, isolation, and management of satellite instances, facilitating comprehensive testing and development in a controlled setting. By integrating the NMF lightweight simulator in the container, a comprehensive simulation environment was achieved. A significant advantage of using Docker containers is their inherent scalability, enabling the simulation of hundreds or even thousands of satellites with minimal overhead. Docker's lightweight nature ensures efficient resource utilization, allowing for deployment on a single host or across a cluster of hosts. This capability is crucial for large-scale simulations, such as in the case of mega-constellations, where multiple traditional virtual machines would be impractical due to their higher resource demands. This ability for easy horizontal scaling based on the number of simulated satellites provides tremendous flexibility for different mission scenarios. Our results demonstrate that leveraging Docker containers with the NanoSat MO Framework provides a highly efficient and scalable solution for simulating satellite constellations, offering not only significant benefits in terms of resource utilization and operational flexibility but also enabling the testing and validation of ground software for constellations. The findings underscore the importance of taking advantage of existing technologies in computer science to create new solutions for future satellite constellations in space.
Keywords: containerization, docker containers, NanoSat MO framework, satellite constellation simulation, scalability, small satellites
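A minimal sketch of the container-per-satellite idea using the Docker SDK for Python is shown below; the image name and environment variable are assumptions for illustration, not the actual NMF simulator artifacts.

```python
# A minimal sketch: spin up N identical satellite containers, one per simulated
# spacecraft. The image name and environment variable are assumed placeholders.
import docker

SATELLITE_IMAGE = "nanosat-mo-sim:latest"  # hypothetical image with the NMF simulator
NUM_SATELLITES = 10

client = docker.from_env()
containers = []
for i in range(NUM_SATELLITES):
    container = client.containers.run(
        SATELLITE_IMAGE,
        name=f"sat-{i:03d}",
        environment={"SATELLITE_ID": str(i)},  # lets each instance identify itself
        detach=True,
    )
    containers.append(container)

print(f"Started {len(containers)} simulated satellites")
# Tear down the constellation when the simulation run is finished
for c in containers:
    c.stop()
    c.remove()
```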
Procedia PDF Downloads 54
9582 Correlation between Sleeping Disturbance and Academic Achievement in University Female Students
Authors: Amel Fayed, Shaden AlSubaih, Nouf Al-Qahtani, Asmaa Gosty, Asma Aljuhaimi
Abstract:
Introduction: Sleep difficulties are highly prevalent among adults and affect different aspects of their life. Many studies have found that females are more liable to suffer from sleeping problems. College students are a typical example of people dealing with daily pressure and stress to fulfil daily tasks and responsibilities, in addition to their ultimate goal of achieving excellent academic records, which requires their full concentration and effort. Consequently, many of them start complaining of sleep deprivation, which can undesirably affect their academic achievements. This study aimed to investigate how prevalent sleeping disorders are among students of different colleges in the university and their relation to academic achievement. Methods: A cross-sectional study of female university students at Princess Norah Bint Abdulrahman University using a self-administered questionnaire was conducted. The Insomnia Severity Index (ISI) was used to assess different grades of insomnia. Students were requested to answer questions evaluating their sleeping habits over the last two weeks. Participants reported their latest Grade Point Average (GPA). According to the ISI, insomnia severity is reported as ‘No clinically significant’, ‘Subthreshold’, ‘Clinical moderate insomnia’ and ‘Clinical severe’. Results: In the current study, 228 students participated; 172 (75.4%) from medical colleges and 56 (24.6%) from non-medical colleges. About 80% of them claimed to have never taken any medications to help them sleep, while only three students confirmed their regular use of sleep-inducing medications. About 16% of the students drink milk or other hot drinks to help them fall asleep. None of the students was suspected of having obstructive sleep apnea or an apparent psychiatric disorder. According to the ISI, 182 (79.8%) students suffered from subthreshold insomnia, 37 (16.2%) had clinical insomnia (moderate severity), and 9 (3.9%) had sleeping problems of a non-clinically significant level. However, none of the students was found to have severe clinical insomnia. Clinical moderate insomnia was reported in 15.1% of medical students and 19.6% of non-medical students. Moreover, about 82% of medical students suffered from subthreshold insomnia compared to 73.2% of non-medical students. This difference was not statistically significant (P=0.24). About 63% of medical students and 48% of non-medical students believed that a high percentage of their colleagues are suffering from insomnia (p-value 0.08). The association between GPA and insomnia revealed that 19.5% of the low GPA group compared to 9.3% of the high GPA group had clinical moderate insomnia. This association was not statistically significant (p=0.15). The correlation between the GPA and the ISI score was negative but not conclusive (r=-0.08, p-value = 0.29). More than 92% of all students agreed that sleeping problems affect their academic achievement to varying degrees. Conclusion: Our results suggest that insomnia is commonly prevalent among female university students and might affect the students' achievement. This study provides preliminary data about the quality of sleep among medical and non-medical university students, which may be used to promote healthy sleeping habits among female students.
Keywords: academic achievement, females, insomnia, university student
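The severity categories reported above correspond to the commonly used ISI cut-offs; the helper below reflects those cut-offs as a sketch for illustration (consult the ISI scoring manual for authoritative use).

```python
# Commonly used Insomnia Severity Index (ISI) cut-offs, matching the categories
# reported in the abstract. Illustrative helper only.
def isi_category(score: int) -> str:
    """Map a total ISI score (0-28) to its severity category."""
    if not 0 <= score <= 28:
        raise ValueError("ISI total score must be between 0 and 28")
    if score <= 7:
        return "No clinically significant insomnia"
    if score <= 14:
        return "Subthreshold insomnia"
    if score <= 21:
        return "Clinical insomnia (moderate severity)"
    return "Clinical insomnia (severe)"

print(isi_category(12))  # -> Subthreshold insomnia
```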
Procedia PDF Downloads 332
9581 Modern Agriculture and Industrialization Nexus in the Nigerian Context
Authors: Ese Urhie, Olabisi Popoola, Obindah Gershon, Olabanji Ewetan
Abstract:
Modern agriculture involves the use of improved tools and equipment (instead of crude and ineffective tools) like tractors, hand-operated planters, hand-operated fertilizer drills and combine harvesters, which increase agricultural productivity. Farmers in Nigeria still have huge potential to enhance their productivity. The study argues that the increase in agricultural output due to increased productivity, orchestrated by modern agriculture, will promote forward linkages and opportunities in the processing sub-sector: both the manufacturing of machines and the processing of raw materials. Depending on existing incentives, foreign investment could be attracted to augment local investment in the sector. The availability of raw materials in large quantities at competitive prices will attract investment in other industries. In addition, potential for backward linkages will also be created. In a nutshell, adopting the unbalanced growth theory in favour of the agricultural sector could engender industrialization in a country with untapped potential. The paper highlights the numerous potentials of modern agriculture that are yet to be tapped in Nigeria and also provides a theoretical analysis of how the realization of such potentials could promote industrialization in the country. The study adopts Lewis's structural-change model and Hirschman's theory of unbalanced growth in the design of the analytical framework. The framework will be useful in empirical studies that will guide policy formulation.
Keywords: modern agriculture, industrialization, structural change model, unbalanced growth
Procedia PDF Downloads 310
9580 Comparative Analysis of Internal Combustion Engine Cooling Fins Using Ansys Software
Authors: Aakash Kumar R. G., Anees K. Ahamed, Raj M. Mohan
Abstract:
Effective engine cooling can improve the engine's life and efficiency. The design of the fins of the cylinder head and block determines the cooling mechanism of an air-cooled engine. Heat conduction takes place through the engine parts, and convection of heat from the surface of the fins takes place with air as the heat-transferring medium. The air surrounding the cooling fins helps in the removal of heat built up by the air-cooled engine. If the heat removal rate is inadequate, it will result in lower engine efficiency and high thermal stresses in the engine. The main drawback of the air-cooled engine is the low heat transfer rate of the cooling fins. This work is based on a scrutiny of previous research on enhancing the heat transfer rate of cooling fins. The current research is about augmenting the heat transfer rate of longitudinal rectangular fin profiles by varying the length of the fin and the diameter of holes on the fins. Thermal and flow analysis is done for two different models of fins. One is a simple fin without holes, and the other is perforated (consists of holes). It can be inferred from the research that the fins with holes have a higher fin efficiency than the fins without holes. The geometry of the fin is modelled in CREO. The heat transfer analysis is done using ANSYS software.
Keywords: fins, heat transfer, perforated fins, thermal analysis, thermal flux
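For context, the fin efficiency referred to above can be estimated analytically for a straight rectangular fin; the sketch below uses illustrative material and geometry values, not the paper's engine geometry or ANSYS results.

```python
# A worked sketch of the classic rectangular-fin relations:
# m = sqrt(h*P / (k*A_c)), efficiency = tanh(m*Lc) / (m*Lc).
# Parameter values below are illustrative assumptions only.
import math

def fin_efficiency(h: float, k: float, thickness: float, width: float,
                   length: float) -> float:
    """Efficiency of a straight rectangular fin with an adiabatic-tip correction."""
    perimeter = 2.0 * (width + thickness)
    area_c = width * thickness                 # cross-sectional area
    m = math.sqrt(h * perimeter / (k * area_c))
    l_corr = length + thickness / 2.0          # corrected length for tip convection
    return math.tanh(m * l_corr) / (m * l_corr)

# Aluminium fin (k ~ 200 W/m.K) in air (h ~ 50 W/m^2.K), 50 mm long, 3 mm thick
eta = fin_efficiency(h=50.0, k=200.0, thickness=0.003, width=0.05, length=0.05)
print(f"Fin efficiency: {eta:.2f}")
```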
Procedia PDF Downloads 377
9579 Field Management Solutions Supporting Foreman Executive Tasks
Authors: Maroua Sbiti, Karim Beddiar, Djaoued Beladjine, Romuald Perrault
Abstract:
Productivity is decreasing in construction compared to the manufacturing industry. It seems that the sector is suffering from organizational problems and has low maturity regarding technological advances. High international competition due to the growing context of globalization, complex projects, and shorter deadlines increases these challenges. Field employees are more exposed to coordination problems than design officers. Execution collaboration is thus a major issue that can threaten the cost, time, and quality of project completion. Initially, this paper will try to identify field professionals' requirements so as to address building management process weaknesses such as the unreliability of scheduling, the fickleness of monitoring and inspection processes, the inaccuracy of project indicators, the inconsistency of building documents and ad hoc logistics management. Subsequently, we will focus our attention on providing solutions to improve scheduling, inspection, and hours-tracking processes using emerging lean tools and field mobility applications that bring new perspectives in terms of cooperation. They have shown a great ability to connect various field teams and make information visual and accessible, in order to plan accurately and eliminate potential defects at the source. In addition to the use of software as a service, the adoption of the human resource module of the Enterprise Resource Planning system can allow meticulous time accounting and thus faster decision making. The next step is to integrate external data sources received from or destined for design engineers, logisticians, and suppliers in a holistic system. Creating a monolithic system that consolidates planning, quality, procurement, and resource management modules should be our ultimate target to build the construction industry supply chain.
Keywords: lean, last planner system, field mobility applications, construction productivity
Procedia PDF Downloads 119
9578 Calculation of the Supersonic Air Intake with the Optimization of the Shock Wave System
Authors: Elena Vinogradova, Aleksei Pleshakov, Aleksei Yakovlev
Abstract:
During the flight of a supersonic aircraft under various conditions (altitude, Mach number, etc.), it becomes necessary to coordinate the operating modes of the air intake and the engine. On supersonic aircraft, this is done by changing various control factors (the angle of rotation of the wedge panels, etc.). This paper investigates the possibility of using modern optimization methods to determine the optimal position of the supersonic air intake wedge panels in order to maximize the total pressure recovery coefficient. Modern software allows us to conduct auto-optimization, which determines the optimal position of the control elements of the investigated product to achieve its maximum efficiency. In this work, the flow in the supersonic aircraft inlet has been investigated and the operation of the inlet flaps optimized in a 2-D setting. This work has been done using ANSYS CFX software. The supersonic aircraft inlet is a flat, adjustable, external-compression inlet. The braking surface is made in the form of a three-stage wedge. The IOSO NM software package was chosen for optimization. The change in the position of the panels of the input device is carried out by changing the angle between the first and second stages of the three-stage wedge. The position of the rest of the panels is changed automatically. Within the framework of the presented work, the position of the moving air intake panel was optimized under fixed flight conditions of the aircraft and a certain engine operating mode. As a result of the numerical modeling, the distribution of total pressure losses was obtained for various cases of engine operation, depending on the incoming flow velocity and the flight altitude of the aircraft. The results make it possible to obtain the maximum total pressure recovery coefficient under given conditions. Also, the initial geometry was set with a certain angle between the first and second wedge panels. Having performed all the calculations, as well as the subsequent optimization of the aircraft input device, it can be concluded that the initial angle was set sufficiently close to the optimal angle.
Keywords: optimal angle, optimization, supersonic air intake, total pressure recovery coefficient
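For intuition about why the wedge geometry matters, the total pressure recovery across a single normal shock can be computed from the standard gas-dynamic relation below (a textbook sketch, not the paper's CFD or IOSO NM setup); a multi-wedge intake replaces one strong normal shock with a series of weaker oblique shocks to raise the overall recovery.

```python
# Standard normal-shock relation for the total pressure ratio p02/p01,
# shown for illustration of the total pressure recovery coefficient.
def normal_shock_recovery(mach: float, gamma: float = 1.4) -> float:
    """Total pressure ratio p02/p01 across a normal shock at upstream Mach `mach`."""
    if mach <= 1.0:
        return 1.0  # no shock below Mach 1
    term1 = ((gamma + 1.0) * mach**2 / ((gamma - 1.0) * mach**2 + 2.0)) ** (gamma / (gamma - 1.0))
    term2 = ((gamma + 1.0) / (2.0 * gamma * mach**2 - (gamma - 1.0))) ** (1.0 / (gamma - 1.0))
    return term1 * term2

for m in (1.5, 2.0, 2.5):
    print(f"M = {m}: sigma = {normal_shock_recovery(m):.3f}")
```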
Procedia PDF Downloads 245
9577 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus
Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti
Abstract:
Vessel segmentation of the retinal fundus is important in the biomedical sciences for diagnosing ailments related to the eye. Segmentation can help medical experts in diagnosing the state of the retinal fundus image. Therefore, in this study, we designed software using MATLAB which enables the segmentation of the retinal blood vessels in retinal fundus images. There are two main steps in the process of segmentation. The first step is image preprocessing, which aims to improve the quality of the image so that it is optimally segmented. The second step is image segmentation, in order to perform the extraction process that retrieves the retina's blood vessels from the eye fundus image. The image segmentation methods that will be analyzed in this study are Morphology Operation, Discrete Wavelet Transform and a combination of both. The dataset used in this project consists of 40 retinal images and 40 manually segmented images. After running several testing scenarios, the average accuracy for the Morphology Operation method is 88.46%, while for the Discrete Wavelet Transform it is 89.28%. By combining the two methods mentioned above, the average accuracy was increased to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time.
Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel
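A minimal sketch of the morphology step is shown below, using OpenCV in Python rather than MATLAB and a synthetic image in place of a real fundus: a black-hat transform enhances dark, elongated vessels against the brighter background before thresholding to a binary mask.

```python
# Illustrative morphology-based vessel enhancement on a synthetic stand-in for
# the green channel of a fundus image (bright background, dark curved "vessel").
import cv2
import numpy as np

img = np.full((256, 256), 180, dtype=np.uint8)
for x in range(256):
    y = int(128 + 40 * np.sin(x / 25.0))
    cv2.line(img, (x, y), (x, y + 3), color=80, thickness=1)  # draw a dark vessel

kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (15, 15))
blackhat = cv2.morphologyEx(img, cv2.MORPH_BLACKHAT, kernel)     # enhance dark vessels
_, vessel_mask = cv2.threshold(blackhat, 20, 255, cv2.THRESH_BINARY)

print("Vessel pixels detected:", int(np.count_nonzero(vessel_mask)))
```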
Procedia PDF Downloads 198
9576 Innovating Assessment: Exploring AI-Driven Scoring for Language Tests in Pre-Service Education Admissions
Authors: Lucie Bartosova
Abstract:
The rapid advancements in generative artificial intelligence (AI) have introduced transformative possibilities in education, particularly in assessment methodologies. This work provides an overview of the current state of the literature on AI-scoring methodologies for evaluating student-written responses. The focus is on how these innovations can be leveraged within large-scale assessments to address resource constraints such as limited assessors, time, and budget. Drawing from an initiative tied to a language test used for admitting candidates into a pre-service education program in the Faculty of Education at an Ontario university, the review explores the practical and ethical implications of integrating AI-driven tools into assessment processes. These tools are designed to automate the evaluation of learners' written compositions, provide performance feedback, and support grading procedures. By synthesizing findings from recent research, the review highlights the effectiveness, reliability, and potential biases of AI in scoring, alongside considerations for transparency and fairness. This work emphasizes the dual role of generative AI as both a practical solution for scaling assessments and a subject of critical scrutiny to ensure its responsible implementation. The proposed integration of AI-scoring methodologies in our language test underscores the need to balance innovation with accountability, ensuring that AI tools enhance, rather than compromise, educational equity and rigor. Objectives: to determine which generative AI model is most capable of evaluating written responses for university assessments based on specific criteria, and to investigate potential biases within AI models to ensure fair assessments. Methodologies: evaluating generative AI models to determine their performance in assessing written responses against specific criteria, and collecting responses from previous assessments and annotating them with expert feedback to train and validate the AI models. Main contributions: introducing a tailored AI model to assess written responses on language tests, and offering a scalable and replicable model that informs broader applications of AI in educational assessments, contributing to policy-making and institutional best practices.
Keywords: artificial intelligence, assessment practices, student written performance, automated essay scoring, language proficiency
Procedia PDF Downloads 12
9575 Agricultural Mechanization for Transformation
Authors: Lawrence Gumbe
Abstract:
Kenya Vision 2030 is the country's programme for transformation covering the period 2008 to 2030. Its objective is to help transform Kenya into a newly industrializing, middle-income country (income exceeding US$10,000) providing a high quality of life to all its citizens by 2030, in a clean and secure environment. Increased agricultural production and productivity is crucial for the realization of Vision 2030. Mechanization of agriculture in order to achieve greater yields is the only way to achieve these objectives. There are contending groups and views on the strategy for agricultural mechanization. The first group are those who oppose the widespread adoption of advanced technologies (mostly internal combustion engines and tractors) in agricultural mechanization as entirely inappropriate in most situations in developing countries. This group argues that mechanically powered agricultural mechanization often leads to displacement of labour and hence increased unemployment, and this results in a host of other socio-economic problems, amongst them rural-urban migration, inequitable distribution of wealth, in many cases an increase in absolute poverty, and balance-of-payments problems due to the need to import machinery, fuel and sometimes technical assistance to manage them. The second group comprises those who view the use of improved hand tools and animal-powered technology as a transitional step between the most rudimentary step in technological development (characterized by entire reliance on human muscle power) and the advanced technologies (characterized by reliance on tractors and other machinery). The third group comprises those who regard these intermediate technologies (i.e., improved hand tools and draught animal technology in agriculture) as a 'delaying' tactic, and they advocate the use of mechanical technologies as the most appropriate. This group argues that alternatives to the mechanical technologies do not exist as a practical matter, or, if they are available, they are inefficient and cannot be compared to the mechanical technologies in terms of economics and productivity. The fourth group advocates a compromise between the second and third groups above. This group views improved hand tools and draught animal technology as more of an 18th-century technology and the modern tractor and combine harvester as too advanced for developing countries. This group has been busy designing an 'intermediate', 'appropriate', 'mini', 'micro' tractor for use by farmers in developing countries. This paper analyses and concludes on the different agricultural mechanization strategies available to Kenya and other third-world countries.
Keywords: agriculture, mechanization, transformation, industrialization
Procedia PDF Downloads 342
9574 Numerical Solution of Magneto-Hydrodynamic Flow of a Viscous Fluid in the Presence of Nanoparticles with Fractional Derivatives through a Cylindrical Tube
Authors: Muhammad Abdullah, Asma Rashid Butt, Nauman Raza
Abstract:
Biomagnetic fluids like blood play a key role in different applications of medical science and bioengineering. In this paper, the magnetohydrodynamic flow of a viscous fluid with magnetic particles through a cylindrical tube is investigated. The fluid is electrically charged in the presence of a uniform external magnetic field. The movement in the fluid is produced by the motion of the cylindrical tube. Initially, the fluid and tube are at rest, and at time t=0⁺ the tube starts to move along its axis. To obtain the mathematical model of the flow with fractional derivatives, a fractional calculus approach is used. The solution of the flow model is obtained by using the Laplace transformation. Simon's numerical algorithm is employed to obtain the inverse Laplace transform. The hybrid technique we are employing requires less computational effort compared to other methods. The numerical calculations have been performed with Mathcad software. As special cases of our problem, the solutions of the flow model with ordinary derivatives and of the flow without magnetic particles have been obtained. Finally, the impact of the non-integer fractional parameter alpha, the Hartmann number Ha, and the Reynolds number Re on the fluid and magnetic particle velocities is analyzed and depicted by graphs.
Keywords: viscous fluid, magnetic particles, fractional calculus, Laplace transformation
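As general background (the abstract does not state which fractional operator the authors adopt), the Caputo derivative is the definition most commonly paired with Laplace-transform solutions of such models, since its transform introduces the order α and the initial condition directly:

```latex
D_t^{\alpha} f(t) = \frac{1}{\Gamma(1-\alpha)} \int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau ,
\quad 0 < \alpha < 1,
\qquad
\mathcal{L}\left\{ D_t^{\alpha} f(t) \right\}(s) = s^{\alpha} F(s) - s^{\alpha - 1} f(0).
```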
Procedia PDF Downloads 210
9573 Agricultural Knowledge Management System Design, Use, and Consequence for Knowledge Sharing and Integration
Authors: Dejen Alemu, Murray E. Jennex, Temtim Assefa
Abstract:
This paper investigates the design, the use, and the consequences of a Knowledge Management System (KMS) for knowledge sharing and integration. A KMS for knowledge sharing and integration is designed to meet the challenges raised by knowledge management researchers and practitioners: the technical, the human, and the social factors. An agricultural KMS involves various members coming from different Communities of Practice (CoPs) who possess their own knowledge of multiple practices, which needs to be combined in the system development. However, the current development of the technology has ignored the indigenous knowledge of the local communities, which is the key success factor for agriculture. This research employed a multi-methodological approach to KMS research from an action research perspective, which consists of four strategies: theory building, experimentation, observation, and system development. Using the KMS development practice of the Ethiopian Agricultural Transformation Agency as a case study, this research employed an interpretive analysis using primary qualitative data acquired through in-depth semi-structured interviews and participant observations. Orlikowski's structuration model of technology has been used to understand the design, the use, and the consequences of the KMS. As a result, the research identified three basic components for the architecture of the shared KMS, namely, the people, the resources, and the implementation subsystems. The KMS was developed using Web 2.0 tools to promote knowledge sharing and integration among diverse groups of users in a distributed environment. The use of a shared KMS allows users to access diverse knowledge from a number of users in different groups of participants, enhances the exchange of different forms of knowledge and experience, and creates high interaction and collaboration among participants. The consequences of a shared KMS for the social system include the elimination of hierarchical structure; enhanced participation, collaboration, and negotiation among users from different CoPs having common interests; knowledge and skill development; integration of diverse knowledge resources; and the requirement of policies and guidelines. The research contributes methodologically to the application of system development action research for understanding a conceptual framework for KMS development and use. The research also makes a theoretical contribution by extending the structuration model of technology to incorporate a variety of knowledge, and has practical implications in providing management understanding for developing strategies on the potential of Web 2.0 tools for sharing and integration of indigenous knowledge.
Keywords: communities of practice, indigenous knowledge, participation, structuration model of technology, Web 2.0 tools
Procedia PDF Downloads 258
9572 Use and Effects of Kanban Board from the Aspects of Brothers Furniture Limited
Authors: Kazi Rizvan, Yamin Rekhu
Abstract:
Due to high competitiveness in industries throughout the world, every industry is trying hard to utilize all of its resources to keep its productivity as high as possible. Many tools are used to ensure a smoother flow of operations, to balance tasks, to maintain proper schedules and sequences for tasks, and to reduce unproductive time. All of these tools are used to augment productivity within an industry. The Kanban board is one of them and one of the many important tools of the lean production system. A Kanban board is a visual depiction of the status of tasks. It shows the actual status of tasks and conveys their progress and issues as well. Using a Kanban board, tasks can be distributed among workers, and operation targets can be visually represented to them. In this paper, an example of a Kanban board from Brothers Furniture Limited is presented, covering how the Kanban board system was implemented, how the board was designed, and how it was made easily perceivable for less literate or illiterate workers. The Kanban board was designed for the packing section of Brothers Furniture Limited. It was implemented for the purpose of representing the task flow to the workers and to mitigate the time that was wasted while workers were left wondering which task they should start after finishing one. The Kanban board comprised seven columns, including a column for comments on any problems that occurred while working on the tasks. The Kanban board was helpful for the workers as it showed the urgency of the tasks. It was also helpful for the store section, as they could understand which products, and how many of them, could be delivered to the store at any given time. The Kanban board centralized all the information, which paced up the workflow and minimized idle time. Even though many workers were illiterate or less literate, the Kanban board was still comprehensible to them because the Kanban cards were colored. Since the significance of colors is conveniently interpretable, the colored cards helped a great deal in that matter. Hence, the illiterate or less literate workers did not have to spend time wondering about the significance of the cards. Even when the workers were not told the significance of the colored cards, they could develop a feeling for their meaning, as colors can trigger anyone's mind to perceive the situation. As a result, the board made clear to the workers what it required them to do, when to do it, and what to do next. The Kanban board alleviated excessive time between tasks by setting a day plan for targeted tasks, and it also reduced time during tasks, as the workers were aware of the forthcoming tasks for the day. Being very specific to the tasks, the Kanban board helped the workers become more focused on their tasks and helped them do their job with more precision. As a result, the Kanban board helped achieve an 8.75% increase in productivity compared to the productivity before the Kanban board was implemented.
Keywords: color, Kanban Board, Lean Tool, literacy, packing, productivity
Procedia PDF Downloads 235
9571 Digital Transformation in Education: Artificial Intelligence Awareness of Preschool Teachers
Authors: Cansu Bozer, Saadet İrem Turgut
Abstract:
Artificial intelligence (AI) has become one of the most important technologies of the digital age and is transforming many sectors, including education. The advantages offered by AI, such as automation, personalised learning, and data analytics, create new opportunities for both teachers and students in education systems. Preschool education plays a fundamental role in the cognitive, social, and emotional development of children. In this period, the foundations of children's creative thinking, problem-solving, and critical thinking skills are laid. Educational technologies, especially artificial intelligence-based applications, are thought to contribute to the development of these skills. For example, artificial intelligence-supported digital learning tools can support learning processes by offering activities that can be customised according to the individual needs of each child. However, the successful use of artificial intelligence-based applications in preschool education can be realised under the guidance of teachers who have the right knowledge about this technology. Therefore, it is of great importance to measure preschool teachers' awareness levels of artificial intelligence and to understand which variables affect this awareness. The aim of this study is to measure preschool teachers' awareness levels of artificial intelligence and to determine which factors are related to this awareness. In line with this purpose, teachers' level of knowledge about artificial intelligence, their thoughts about the role of artificial intelligence in education, and their attitudes towards artificial intelligence will be evaluated. The study will be conducted with 100 teachers working in Turkey using a descriptive survey model. In this context, the ‘Artificial Intelligence Awareness Level Scale for Teachers’ developed by Ferikoğlu and Akgün (2022) will be used. The collected data will be analysed using SPSS (Statistical Package for the Social Sciences) software. Descriptive statistics (frequency, percentage, mean, standard deviation) and relationship analyses (correlation and regression analyses) will be used in data analysis. As a result of the study, the level of artificial intelligence awareness of preschool teachers will be determined, and the factors affecting this awareness will be identified. The findings obtained will contribute to the determination of studies that can be done to increase artificial intelligence awareness in preschool education.
Keywords: education, child development, artificial intelligence, preschool teachers
Procedia PDF Downloads 26
9570 Virtual Reality Learning Environment in Embryology Education
Authors: Salsabeel F. M. Alfalah, Jannat F. Falah, Nadia Muhaidat, Amjad Hudaib, Diana Koshebye, Sawsan AlHourani
Abstract:
Educational technology is changing the way students engage and interact with learning materials, and this has improved the learning process across various subjects. Virtual Reality (VR) applications are considered one of the evolving methods that have contributed to enhancing medical education. This paper utilizes VR to improve the delivery of the subject of Embryology to medical students and to facilitate the teaching process by providing a useful aid to lecturers, while demonstrating the effectiveness of this new technology in this particular area. After evaluating the current teaching methods and identifying students' needs, a VR system was designed that demonstrates in an interactive fashion the development of the human embryo from fertilization to week ten of intrauterine development. The system aims to overcome some of the problems students face with the current educational methods and to increase the efficacy of the learning process.
Keywords: virtual reality, student assessment, medical education, 3D, embryology
Procedia PDF Downloads 195
9569 CT Images Based Dense Facial Soft Tissue Thickness Measurement by Open-source Tools in Chinese Population
Authors: Ye Xue, Zhenhua Deng
Abstract:
Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring face-to-skull distances at sparsely distributed anatomical landmarks located manually on the face and skull. However, with the development of computer-assisted imaging technologies, automated measurement at dense points on 3D face and skull models using open-source software has become a viable option. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Establishing a comprehensive, detailed and densely calculated FSTT database is therefore crucial for enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants born and residing in northern China and 80 in southern China. The age of the participants ranged from 14 to 82 years, and the samples were divided into five non-overlapping age groups; they were also divided into three categories based on BMI. The 3D Slicer software was used to segment bone and soft tissue at different Hounsfield unit (HU) thresholds, and surface models of the face and skull were reconstructed from the CT data for all samples. The following procedures were performed using MeshLab: the face models were converted into hollowed, cropped surface models, and the Hausdorff distance (referred to as FSTT) between the skull and face models was measured automatically. The Hausdorff point clouds were colorized by depth value and exported as PLY files, and a histogram of the depth distribution could be viewed and subdivided into smaller increments. All PLY files were visualized to show the Hausdorff distance value at each vertex. Basic descriptive statistics (mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analysed with respect to sex, age, BMI and birthplace. The statistical methods employed included multiple regression analysis, ANOVA and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the PCA results. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in the forehead, orbital, mandibular and zygomatic regions. Specifically, there are distribution differences in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead than southern males, while females show fewer north-south differences except in the zygomatic region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the FSTT distribution of Chinese individuals and suggests that open-source tools perform well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool
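The dense FSTT measurement above relies on MeshLab's Hausdorff Distance filter between the reconstructed face and skull surfaces. As a hedged illustration of the same idea, the sketch below uses trimesh and SciPy instead of MeshLab and approximates the per-vertex depth as the distance from each face vertex to the nearest skull vertex (a simplification of MeshLab's point-to-surface sampling). The file names are placeholders.

```python
# A simplified stand-in for the MeshLab Hausdorff Distance step: per-vertex
# face-to-skull distances computed with trimesh + SciPy. File names are placeholders,
# and vertex-to-vertex distances only approximate point-to-surface sampling.
import numpy as np
import trimesh
from scipy.spatial import cKDTree

face = trimesh.load("face_surface.ply")    # reconstructed soft-tissue surface (PLY)
skull = trimesh.load("skull_surface.ply")  # reconstructed bone surface (PLY)

# For every face vertex, find the distance to the closest skull vertex (assumed mm).
tree = cKDTree(skull.vertices)
fstt_mm, _ = tree.query(face.vertices)

# Basic descriptive statistics of the dense FSTT map.
print(f"mean={fstt_mm.mean():.2f}  min={fstt_mm.min():.2f}  "
      f"max={fstt_mm.max():.2f}  sd={fstt_mm.std():.2f}")

# Colorize the face mesh by depth (blue = thin, red = thick) and export it,
# similar to the colorized Hausdorff point clouds described above.
t = (fstt_mm - fstt_mm.min()) / (np.ptp(fstt_mm) + 1e-9)
colors = np.column_stack([255 * t, np.zeros_like(t), 255 * (1 - t),
                          np.full_like(t, 255)]).astype(np.uint8)
face.visual.vertex_colors = colors
face.export("face_fstt_colored.ply")
```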
Procedia PDF Downloads 62
9568 The Visualization of Hydrological and Hydraulic Models Based on the Platform of Autodesk Civil 3D
Authors: Xiyue Wang, Shaoning Yan
Abstract:
Cities in China today are faced with an increasingly serious river ecological crisis accompanying urbanization: waterlogging caused by a fragmented natural urban hydrological system, and limited ecological function of that system caused by the destruction of the water system and the waterfront ecological environment. Additionally, the eco-hydrological processes of rivers are affected by various environmental factors, which are even more complex in an urban context. Therefore, efficient hydrological monitoring and analysis tools, together with accurate and visual hydrological and hydraulic models, are becoming an increasingly important basis for decision-makers and an important means for landscape architects to solve urban hydrological problems and formulate sustainable, forward-looking schemes. This study introduces a river and flood analysis model based on the Autodesk Civil 3D platform. Taking the Luanhe River in Qian'an City, Hebei Province, as an example, 3D models of the landform, river, embankment, shoal, pond, underground stream and other land features were first built. With these models, water transfer simulation, river floodplain analysis and river ecology analysis were carried out, ultimately enabling real-time, visualized simulation and analysis of the river under various hypothetical scenarios. Through the establishment of a digital hydrological and hydraulic model, hydraulic data can be simulated accurately and intuitively, providing a basis for the rational design of water systems and a healthy urban ecological system. The Autodesk Civil 3D-based hydrological and hydraulic model nevertheless has its limitations: interaction between the model and other data and software is poor, and the huge volume of 3D data, together with a lack of basic data, restricts its accuracy and range of application. Even so, the hydrological and hydraulic model based on the Autodesk Civil 3D platform offers a convenient and intelligent tool for urban planning and monitoring, and a solid basis for further urban research and design.
Keywords: visualization, hydrological and hydraulic model, Autodesk Civil 3D, urban river
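The floodplain analysis mentioned above is performed inside Civil 3D; purely as an illustration of the underlying calculation (not the Civil 3D workflow or API), the sketch below derives inundation depth by subtracting a terrain grid from a hypothetical water surface elevation. The grids, resolution and flood stage are synthetic assumptions.

```python
# An illustration of the basic calculation behind a river floodplain analysis
# (water surface elevation minus terrain elevation), NOT the Autodesk Civil 3D
# workflow or API; the grids below are synthetic placeholders for a DEM and a
# simulated water surface.
import numpy as np

rng = np.random.default_rng(1)
dem = 50.0 + rng.random((200, 200)) * 10.0       # terrain elevation grid (m)
water_surface = np.full_like(dem, 56.0)          # hypothetical flood stage (m)

# Inundation depth is positive only where the water surface is above the terrain.
depth = np.clip(water_surface - dem, 0.0, None)
flooded = depth > 0.0

cell_area = 5.0 * 5.0                            # assumed 5 m grid resolution
print(f"flooded area: {flooded.sum() * cell_area / 1e4:.1f} ha")
print(f"max depth: {depth.max():.2f} m, mean depth in flooded cells: "
      f"{depth[flooded].mean():.2f} m")
```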
Procedia PDF Downloads 299
9567 Knowledge Management Barriers: A Statistical Study of Hardware Development Engineering Teams within Restricted Environments
Authors: Nicholas S. Norbert Jr., John E. Bischoff, Christopher J. Willy
Abstract:
Knowledge Management (KM) is globally recognized as a crucial element in securing competitive advantage by building and maintaining organizational memory, codifying and protecting intellectual capital and business intelligence, and providing mechanisms for collaboration and innovation. KM frameworks and approaches have been developed that identify critical success factors for conducting KM in numerous industries, from scientific to business, and across organizational scales from small groups to large enterprises. However, engineering and technical teams operating within restricted environments are subject to unique barriers and KM challenges that cannot be treated directly with the approaches and tools prescribed for other industries. This research identifies barriers to conducting KM within Hardware Development Engineering (HDE) teams and statistically compares the significance of those barriers for upholding the four KM pillars of organization, technology, leadership, and learning in HDE teams. HDE teams suffer from restrictions on knowledge sharing (KS) due to the classification of information (national security risks), customer proprietary restrictions (non-disclosure agreements executed for designs), the types of knowledge involved, the complexity of the knowledge to be shared, and the expertise of the knowledge seeker. As KM has evolved, leveraging information technology (IT) and web-based tools and approaches from Web 1.0 to Enterprise 2.0, it may also seek to leverage emergent tools and analytics, including expert locators and hybrid recommender systems, to enable KS across the barriers faced by these technical teams. The research will statistically test the hypothesis that KM barriers for HDE teams affect the general set of expected benefits of a KM system identified in previous research. If correlations are identified, generalized success factors and approaches may also be derived for HDE teams. Expert elicitation will be conducted using a questionnaire hosted on the internet and delivered to a panel of experts including engineering managers, principal and lead engineers, senior systems engineers, and knowledge management experts. The questionnaire feedback will be processed using analysis of variance (ANOVA) to identify and rank statistically significant barriers for HDE teams within the four KM pillars. Subsequently, KM approaches will be recommended for upholding the KM pillars within the restricted environments of HDE teams.
Keywords: engineering management, knowledge barriers, knowledge management, knowledge sharing
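The planned ANOVA step described above compares questionnaire ratings of barriers across the four KM pillars; the hedged sketch below shows that kind of comparison with scipy.stats.f_oneway on synthetic Likert-style ratings. The rating values, group sizes and pillar means are invented for illustration and are not results from the expert panel.

```python
# A sketch of the planned ANOVA step: comparing questionnaire ratings of KM barriers
# grouped by the four KM pillars. The ratings below are synthetic Likert-style data,
# not responses from the study's expert panel.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
pillars = ["organization", "technology", "leadership", "learning"]

# Hypothetical barrier-severity ratings (1-5) from panel respondents, per pillar.
ratings = {p: np.clip(rng.normal(loc=m, scale=0.8, size=30).round(), 1, 5)
           for p, m in zip(pillars, [3.8, 3.2, 3.5, 2.9])}

# One-way ANOVA: do mean barrier ratings differ significantly across pillars?
f_stat, p_value = stats.f_oneway(*ratings.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")

# Rank pillars by mean rated barrier severity, as a simple follow-up summary.
for p in sorted(pillars, key=lambda p: ratings[p].mean(), reverse=True):
    print(f"{p:>12}: mean rating {ratings[p].mean():.2f}")
```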
Procedia PDF Downloads 285