Search results for: semantic technology
243 Monitoring the Production of Large Composite Structures Using Dielectric Tool Embedded Capacitors
Authors: Galatee Levadoux, Trevor Benson, Chris Worrall
Abstract:
With the rise of public awareness of climate change comes an increasing demand for renewable sources of energy. As a result, the wind power sector is striving to manufacture longer, more efficient, and more reliable wind turbine blades. Currently, one of the leading causes of blade failure in service is improper cure of the resin during manufacture. The infusion process creating the main part of the composite blade structure remains a critical step that is yet to be monitored in real time. This stage consists of a viscous resin being drawn into a mould under vacuum, then undergoing a curing reaction until solidification. Successful infusion assumes the resin fills all the voids and cures completely. Given that the electrical properties of the resin change significantly during its solidification, both the filling of the mould and the curing reaction can be followed using dielectrometry. However, industrially available dielectric sensors are currently too small to monitor the entire surface of a wind turbine blade. The aim of the present research project is to scale up the dielectric sensor technology and develop a device able to monitor the manufacturing process of large composite structures, assessing the conformity of the blade before it even comes out of the mould. An array of flat copper wires acting as electrodes is embedded in a polymer matrix fixed in an infusion mould. A multi-frequency analysis from 1 Hz to 10 kHz is performed during the filling of the mould with an epoxy resin and the subsequent hardening of that resin. By following the variations of the complex admittance Y*, both the filling of the mould and the curing process are monitored. Results are compared to numerical simulations of the sensor in order to validate a virtual cure-monitoring system. The results obtained by drawing glycerol on top of the copper sensor displayed a linear relation between the wetted length of the sensor and the measured complex admittance.
Drawing epoxy resin on top of the sensor and letting it cure at room temperature for 24 hours yielded the characteristic curves obtained when conventional interdigitated sensors are used to follow the same reaction. The response of the developed sensor showed the different stages of the polymerization of the resin, validating the geometry of the prototype. The model created and analysed in COMSOL showed that the dielectric cure process can be simulated, provided that sufficiently accurate time- and temperature-dependent material properties can be determined. The model can be used to help design larger sensors suitable for use with full-sized blades. The preliminary results obtained with the sensor prototype indicate that the infusion and curing process of an epoxy resin can be followed with the chosen configuration on a scale of several decimetres. Further work will be devoted to studying the influence of the sensor geometry and the infusion parameters on the results obtained. Ultimately, the aim is to develop a larger-scale sensor able to monitor the flow and cure of large composite panels industrially.
Keywords: composite manufacture, dielectrometry, epoxy, resin infusion, wind turbine blades
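The reported linear relation between wetted length and complex admittance can be illustrated with a simple lumped-element sketch, modelling the wetted electrode pair as a conductance and a capacitance per unit length in parallel; the per-unit-length values `G_per_m` and `C_per_m` below are illustrative placeholders, not measured properties of the glycerol or resin.

```python
import numpy as np

# Lumped-element sketch: a wetted electrode pair modelled as a parallel
# conductance G' and capacitance C' per unit length, so the complex
# admittance Y* = L * (G' + j*omega*C') scales linearly with the wetted
# length L. The G' and C' values are assumed, not measured data.
G_per_m = 2.0e-6      # S/m, assumed
C_per_m = 5.0e-10     # F/m, assumed

def admittance(wetted_length_m, freq_hz):
    """Complex admittance of the wetted portion of the sensor."""
    omega = 2.0 * np.pi * freq_hz
    return wetted_length_m * (G_per_m + 1j * omega * C_per_m)

lengths = np.array([0.1, 0.2, 0.3, 0.4])   # wetted lengths in metres
Y = admittance(lengths, freq_hz=1.0e3)
# |Y*| grows in direct proportion to the wetted length, which is the
# behaviour reported for the glycerol filling experiment.
```

As the resin cures, G' and C' themselves change, which is what makes the same admittance measurement usable for following the curing stage as well.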
Procedia PDF Downloads 168
242 Consumers and Voters’ Choice: Two Different Contexts with a Powerful Behavioural Parallel
Authors: Valentina Dolmova
Abstract:
What consumers choose to buy and whom voters select on election day are two questions that have captivated the interest of both academics and practitioners for many decades. The importance of understanding what influences the behavior of those groups, and whether or not we can predict or control it, fuels a steady stream of research in a range of fields. In the past 40 years alone, more than 70 thousand scientific papers have been published in each field, consumer behavior and political psychology respectively. From marketing, economics, and the science of persuasion to political and cognitive psychology, we have all remained heavily engaged. Ever-evolving technology, inevitable socio-cultural shifts, global economic conditions, and much more play an important role in choice equations regardless of context. On the one hand, this keeps the research efforts relevant and needed. On the other, the relatively low number of cross-field collaborations, which seem to be picking up only in more recent years, leaves the existing findings isolated in framed bubbles. By performing systematic research across both areas of psychology and building a parallel between theories and factors of influence, however, we find not only that there is definitive common ground between the behaviors of consumers and voters, but that we are moving towards a global model of choice. This means that the lines between contexts are fading, which has a direct implication for what we should focus on when predicting or navigating buyers’ and voters’ behavior. Internal and external factors in four main categories determine the choices we make as consumers and as voters. Together, personal, psychological, social, and cultural factors create a holistic framework through which all stimuli in relation to a particular product or political party are filtered. The analogy “consumer-voter” solidifies further.
Leading academics suggest that this fundamental parallel is the key to successfully managing political and consumer brands alike. However, we distinguish an additional four key stimuli that relate to those factor categories (1/ opportunity costs; 2/ the memory of the past; 3/ recognisable figures/faces; and 4/ conflict), arguing that the level of expertise a person has determines the prevalence of particular factors or stimuli. Our efforts take into account global trends such as the establishment of “celebrity politics” and the image of “ethically concerned consumer brands”, which bridge the gap between contexts to an even greater extent. Scientists and practitioners are pushed to accept the transformative nature of both fields of social psychology. Existing blind spots, as well as the limited amount of research conducted outside American and European societies, open up space for more collaborative efforts in this highly demanding and lucrative field. A mixed method of research tests three main hypotheses: the first two are focused on the irrelevance of context when comparing voting and consumer behavior, from both the factors and the stimuli lenses; the third on determining whether or not the level of expertise in either field skews the weight of the prism we are more likely to choose when evaluating options.
Keywords: buyers’ behaviour, decision-making, voters’ behaviour, social psychology
Procedia PDF Downloads 154
241 Nuclear Powered UAV for Surveillances and Aerial Photography
Authors: Rajasekar Elangopandian, Anand Shanmugam
Abstract:
Nowadays, unmanned aerial vehicles (UAVs) play a vital role in surveillance, and not only in surveillance: UAVs are also carefully envisaged for aerial photography, disaster management, and observation of Earth behaviour. To reduce maintenance and fuel demands, nuclear-powered vehicles offer great support. Design considerations are very important for the UAV manufacturing industry and for research and development agencies. The resulting design has a pentagon-shaped fuselage with a black rubber-coated paint in order to escape enemy radar and other detection. The pentagonal fuselage has a large space to keep the mini nuclear reactor inside, and the material is carbon-carbon fibre, specially designed using the software COMSOL and HyperMesh 14.2, so the weight consideration produces a positive result for productivity. The walls of the fuselage are coated with lead and a protective shield. A double layer of W/Bi (tungsten/bismuth) sheet is proposed for radiation protection in the energy range of 70 keV to 90 keV; the designed W/Bi sheet is only 0.14 mm thick and 36% lighter. The properties of the fillers were determined from zeta potential and particle size measurements. Radiation exposure can be attenuated in three ways: minimizing exposure time, maximizing distance from the radiation source, and shielding the whole vehicle. The onboard reactor is switched on when the UAV starts its cruise, and the moderators and control rods are inserted automatically by newly developed software. The heat generated by the reactor is used to run a mini turbine fixed inside the UAV, with a natural-rubber-composite shaft radiation shield. The cooling system operates in two modes, liquid-cooled and air-cooled; the liquid coolants for heat regeneration are ordinary water, liquid sodium, and helium, and the walls are made of regenerative and radiation-protective material.
The other components, such as the camera and the arms bay, are located at the bottom of the UAV and are specially made products shielded from the radiation; they are coated with lead (Pb) and a natural rubber composite material. This technique provides the long range and endurance for an effectively eternal flight mission, until any parts or products need to be changed. This UAV has the special advantage of 'land on string': it can land on an electric power line to charge its automated electronics. The fuel is enriched uranium (< 5% U-235) contained in hundreds of fuel pins. This technique provides continuous duty for surveillance and aerial photography. Landing the vehicle is easy to operate, and takeoff is likewise easier than with other present-day mechanisms. This UAV offers immense and immaculate technology for surveillance, target detection, and target destruction.
Keywords: mini turbine, liquid coolant for heat regeneration, radiation shielding, eternal flight mission, land on string
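The shielding option above follows the Beer-Lambert exponential attenuation law, which can be sketched as below; the attenuation coefficient `mu` is a placeholder for illustration, not a measured value for the 0.14 mm W/Bi laminate at 70-90 keV.

```python
import math

def transmitted_fraction(mu_cm_inv, thickness_cm):
    """Beer-Lambert attenuation: fraction of photons passing a shield,
    I/I0 = exp(-mu * x)."""
    return math.exp(-mu_cm_inv * thickness_cm)

# Hypothetical linear attenuation coefficient for a dense high-Z sheet
# at ~80 keV; this is an assumed placeholder, not measured data for W/Bi.
mu = 60.0    # cm^-1, assumed
t = 0.014    # the 0.14 mm sheet expressed in cm
frac = transmitted_fraction(mu, t)   # fraction transmitted; the rest is absorbed or scattered
```

Doubling the sheet thickness squares the transmitted fraction, which is why even thin layers of high-Z material are effective at these photon energies.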
Procedia PDF Downloads 410
240 Modeling the Demand for the Healthcare Services Using Data Analysis Techniques
Authors: Elizaveta S. Prokofyeva, Svetlana V. Maltseva, Roman D. Zaitsev
Abstract:
Rapidly evolving modern data analysis technologies play a large role in understanding the operation of the healthcare system and its characteristics. One of the key tasks in urban healthcare today is optimizing resource allocation, so the application of data analysis in medical institutions to solve optimization problems determines the significance of this study. The purpose of this research was to establish the dependence between indicators of the effectiveness of a medical institution and its resources. Hospital discharges by diagnosis, hospital days of in-patients, and in-patient average length of stay were selected as the performance indicators and the measures of demand on the medical facility. Hospital beds by type of care, medical technology (magnetic resonance tomography, gamma cameras, angiographic complexes, and lithotripters), and physicians characterized the resource provision of medical institutions in the developed models. The data source for the research was the open database of the statistical service Eurostat. This source was chosen because its databases contain the complete, open information necessary for research tasks in the field of public health; in addition, the statistical database has a user-friendly interface that allows analytical reports to be built quickly. The study covers 28 European countries for the period from 2007 to 2016. For all countries included in the study with sufficiently accurate and complete data for the period under review, predictive models were developed based on historical panel data. An attempt to improve the quality and interpretability of the models was made through cluster analysis of the investigated set of countries. The main idea was to assess the similarity of the joint behaviour of the variables throughout the time period under consideration, in order to identify groups of similar countries and construct separate regression models for them.
Therefore, the original time series were used as the objects of clustering. The k-medoids partitioning algorithm was used, with sampled objects as the centres of the clusters obtained, since determining a centroid when working with time series involves additional difficulties. The number of clusters was chosen using the silhouette coefficient. After the cluster analysis, it was possible to significantly improve the predictive power of the models: in one of the clusters, for example, the MAPE was only 0.82%, which makes it possible to conclude that the forecast is highly reliable in the short term. The predicted values of the developed models have a relatively low level of error and can be used to make decisions on the staffing and resource provision of hospitals. The research displays strong dependencies between the demand for medical services and the modern-medical-equipment variable, which highlights the importance of the technological component for the successful development of a medical facility. Data analysis currently has huge potential to significantly improve health services, and medical institutions that are the first to introduce these technologies will certainly have a competitive advantage.
Keywords: data analysis, demand modeling, healthcare, medical facilities
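A minimal sketch of the clustering step described above, assuming Euclidean distances between the raw series; the farthest-point seeding and the synthetic test data are illustrative choices, not the paper's actual implementation or Eurostat data.

```python
import numpy as np

def k_medoids(X, k, n_iter=100):
    """Minimal k-medoids on rows of X (each row one time series).
    Deterministic farthest-point seeding, then alternation between
    assignment and medoid update until the medoids stop moving."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    medoids = [0]
    for _ in range(k - 1):                      # farthest-point seeding
        medoids.append(int(np.argmax(D[:, medoids].min(axis=1))))
    medoids = np.array(medoids)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if len(members):                    # medoid minimizes intra-cluster cost
                new[j] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return np.argmin(D[:, medoids], axis=1), D

def mean_silhouette(D, labels):
    """Average silhouette coefficient, used to pick the cluster count."""
    scores = []
    for i in range(len(labels)):
        same = (labels == labels[i])
        same[i] = False
        if not same.any():
            continue
        a = D[i, same].mean()                   # mean intra-cluster distance
        b = min(D[i, labels == c].mean()        # nearest other cluster
                for c in set(labels) if c != labels[i])
        scores.append((b - a) / max(a, b))
    return float(np.mean(scores))
```

Given the country-level series as rows of a matrix, the cluster count would be chosen as the k that maximizes the mean silhouette, and a separate regression model would then be fitted per cluster.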
Procedia PDF Downloads 145
239 Improving a Stagnant River Reach Water Quality by Combining Jet Water Flow and Ultrasonic Irradiation
Authors: A. K. Tekile, I. L. Kim, J. Y. Lee
Abstract:
Human activities put freshwater quality at risk, mainly due to the expansion of agriculture and industry, damming, diversion, and the discharge of inadequately treated wastewater. Rapid human population growth and climate change have escalated the problem. External controls on point and non-point pollution sources are the long-term solution for managing water quality; for a holistic approach, these mechanisms should be coupled with in-water control strategies. The available in-lake or in-river methods are either costly or have adverse effects on the ecological system, so the search for an alternative, effective solution with a reasonable balance is still going on. This study aimed at physical and chemical water quality improvement in a stagnant reach of the Yeo-cheon River (Korea), which has recently shown signs of water quality problems such as scum formation and fish death. The river water quality was monitored for three months, operating only the water flow generator for the first two weeks and then coupling an ultrasonic irradiation device to the flow unit for the remaining duration of the experiment. In addition to assessing the water quality improvement, the correlation among the parameters was analyzed to explain the contribution of the ultra-sonication. Generally, the combined strategy showed localized improvement of water quality in terms of dissolved oxygen, chlorophyll-a, and dissolved reactive phosphate. At locations under limited influence of the system's operation, chlorophyll-a increased sharply, but within 25 m of the operation the low initial value was maintained. The inverse correlation coefficient between dissolved oxygen and chlorophyll-a decreased from 0.51 to 0.37 when the ultrasonic irradiation unit was used with the flow, showing that the ultrasonic treatment reduced the chlorophyll-a concentration and inhibited photosynthesis.
The relationship between dissolved oxygen and reactive phosphate also indicated that the influence of ultra-sonication on the reactive phosphate concentration was greater than that of flow. Even though flow increased turbidity by suspending sediments, the ultrasonic waves cancelled out this effect through the agglomeration of suspended particles and their subsequent settling. There was also variation of interaction in the water column, as the decrease of pH and dissolved oxygen from the surface to the bottom played a role in phosphorus release into the water column. The variation of nitrogen and dissolved organic carbon concentrations showed mixed trends, probably due to the complex chemical reactions following the operation. Besides, the intensive rainfall and strong wind around the end of the field trial had an apparent impact on the results. The combined effect of water flow and ultrasonic irradiation was a cumulative water quality improvement, and it maintained the dissolved oxygen and chlorophyll-a levels the river requires for healthy ecological interaction. However, overall improvement of water quality is not guaranteed, as assessing the effectiveness of ultrasonic technology requires long-term monitoring of water quality before, during, and after treatment. Even though the short duration of the study limited the recognition of nutrient patterns, the use of ultrasound at field scale to improve water quality is promising.
Keywords: stagnant, ultrasonic irradiation, water flow, water quality
Procedia PDF Downloads 194
238 Spin Rate Decaying Law of Projectile with Hemispherical Head in Exterior Trajectory
Authors: Quan Wen, Tianxiao Chang, Shaolu Shi, Yushi Wang, Guangyu Wang
Abstract:
As part of the working environment of the fuze, the spin rate decay law of a projectile on its exterior trajectory is of great value in the design of rotation-count fixed-distance fuzes. In addition, it is significant for devices that simulate the fuze exterior ballistic environment in tests, for the flight stability and dispersion accuracy of gun projectiles, and for the opening and scattering design of submunitions and illuminating cartridges. Besides, the self-destruct mechanism of the fuze in small-calibre projectiles often works by utilizing the attenuation of centrifugal force. In the theory of projectile aerodynamics and fuze design, there are many formulas describing the change of projectile angular velocity in exterior ballistics, such as the Roggla formula and exponential- and power-function formulas. However, these formulas are mostly semi-empirical, owing to the poor test conditions and insufficient test data available at the time; they are not accurate enough and have too narrow a range of application to meet the design requirements of modern fuzes. In order to provide more accurate ballistic environment parameters for the design of a hemispherical-head projectile fuze, the projectile's spin rate decay law on the exterior trajectory under the effect of air resistance was studied. In the analysis, the projectile shape was simplified as a hemispherical head, a cylindrical part, a rotating band, and an anti-truncated conical tail.
The main assumptions are as follows: a) the shape and mass are symmetrical about the longitudinal axis; b) there is a smooth transition between the ball head and the cylindrical part; c) the air flow on the outer surface is treated as flat-plate flow over the same area as the developed outer surface of the projectile, with a turbulent boundary layer; d) the polar damping moment attributable to the wrench hole and rifling marks on the projectile is not considered; e) the grooves cut by the rifling into the rotating band are uniform, smooth, and regular. The impacts of the four parts on the aerodynamic moment resisting the projectile's rotation were obtained from aerodynamic theory. The surface friction stress of the projectile, the polar damping moment formed by the head, and the surface friction moments formed by the cylindrical part, the rotating band, and the anti-truncated conical tail were obtained by mathematical derivation, after which the mathematical model of spin rate attenuation was established. Over the whole trajectory at the maximum-range angle (38°), the absolute error between the polar damping torque coefficient obtained by simulation and the coefficient calculated by the mathematical model established in this paper is no more than 7%, verifying the credibility of the model. The mathematical model can be described as a first-order nonlinear differential equation, which has no analytical solution; a numerical solution can be obtained by coupling the model with the projectile mass-motion equations of exterior ballistics.
Keywords: ammunition engineering, fuze technology, spin rate, numerical simulation
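Since the decay model must be solved numerically, an integration of this kind can be sketched with classical fourth-order Runge-Kutta applied to a generic power-law damping law dω/dt = -cωⁿ; the coefficients c and n are placeholders, not the paper's fitted moment model, and the closed form quoted in the comments applies only to this standalone power law, not to the fully coupled ballistic equations.

```python
def spin_decay_rk4(omega0, c, n, t_end, dt=0.01):
    """Integrate the illustrative spin-decay ODE d(omega)/dt = -c * omega**n
    with classical fourth-order Runge-Kutta. c and n are assumed damping
    parameters, not values fitted to the projectile in the paper."""
    f = lambda w: -c * w**n
    w = omega0
    for _ in range(int(round(t_end / dt))):
        k1 = f(w)
        k2 = f(w + 0.5 * dt * k1)
        k3 = f(w + 0.5 * dt * k2)
        k4 = f(w + dt * k3)
        w += dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0
    return w

# For n != 1 this standalone ODE has the closed form
#   omega(t) = (omega0**(1-n) + (n-1)*c*t)**(1/(1-n)),
# which provides a check on the integrator.
omega0, c, n, t = 300.0, 1.0e-4, 1.8, 10.0    # all values illustrative
numeric = spin_decay_rk4(omega0, c, n, t)
exact = (omega0**(1 - n) + (n - 1) * c * t)**(1.0 / (1 - n))
```

In the paper's setting the same integrator would instead advance the derived damping-moment model together with the mass-motion equations, since the coupled system admits no closed form.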
Procedia PDF Downloads 147
237 Developing Effective Strategies to Reduce HIV, AIDS and Sexually Transmitted Infections, Nakuru, Kenya
Authors: Brian Bacia, Esther Githaiga, Teresia Kabucho, Paul Moses Ndegwa, Lucy Gichohi
Abstract:
Purpose: The aim of the study is to ensure an appropriate mix of evidence-based prevention strategies geared towards reducing new HIV infections and the incidence of sexually transmitted illnesses. Background: In Nakuru County, more than 90% of all HIV-infected patients are adults on a single-dose medication, one pill that contains a combination of several different HIV drugs. Nakuru town has been identified as the hardest hit by HIV/AIDS in the county according to the latest statistics from the County AIDS and STI group, with a prevalence rate of 5.7 percent attributed to the high population and an active urban center. Method: Two key studies were carried out to provide evidence for the effectiveness of antiretroviral therapy (ART), when used optimally, in preventing sexual transmission of HIV. Discussions based on an examination and assessment of successes in planning, program implementation, and the ultimate impact of prevention and treatment were undertaken with health managers, health workers, community health workers, and people living with HIV/AIDS between February and August 2021. Questionnaires were administered by two staff trained in ethical procedures at 15 HIV treatment clinics, targeting patients on ARVs and caregivers, on ARV prevention and treatment of pediatric HIV infection. Findings: Levels of AIDS awareness are extremely high. Advances in HIV treatment have led to an enhanced understanding of the virus, improved care of patients, and control of the spread of drug-resistant HIV. There has been a tremendous increase in the number of people living with HIV with access to life-long antiretroviral drugs (ARVs), mostly generic medicines. Healthcare facilities providing treatment are stressed, challenging the administration of the drugs, which require a clinical setting. Women find it difficult to take a daily pill, which reduces the effectiveness of the medicine. ART adherence can be strengthened largely through the use of innovative digital technology.
The case management approach is useful in resource-limited settings. The county has made tremendous progress in reducing mother-to-child transmission through enhanced early antenatal care (ANC) attendance and mapping of pregnant women. Recommendations: Treatment reduces the risk of transmission to the child during pregnancy, labor, and delivery. Promote research into medicines through patient and community engagement. Reduce the risk of transmission through breastfeeding. Enhance testing strategies and strengthen health systems for sustainable HIV service delivery. A need exists for improved antenatal care and delivery by skilled birth attendants. Develop a comprehensive maternal reproductive health policy covering equitable, efficient, and effective delivery of services. Put referral systems in place.
Keywords: evidence-based prevention strategies, service delivery, human management, integrated approach
Procedia PDF Downloads 89
236 Treatment Process of Sludge from Leachate with an Activated Sludge System and Extended Aeration System
Authors: A. Chávez, A. Rodríguez, F. Pinzón
Abstract:
Society is concerned about the environmental, economic, and social impacts generated by solid waste disposal. These places of confinement, also known as landfills, are locations where problems of pollution and damage to human health are reduced: they are technically designed and operated using engineering principles, storing the residue in a small area, compacting it to reduce volume, and covering it with soil layers, preventing problems from the liquid (leachate) and gases produced by the decomposition of organic matter. Despite planning and site selection for disposal, and monitoring and control of the selected processes, the dilemma of the leachate remains: its extreme concentration of pollutants devastates soil, flora, and fauna in aggressive processes requiring priority attention. One biological technology is the activated sludge system, used for influents with high pollutant loads, since it transforms biodegradable dissolved and particulate matter into CO2, H2O, and sludge; removes suspended and non-settleable solids; removes nutrients such as nitrogen and phosphorus; and degrades heavy metals. The microorganisms that remove organic matter in these processes are generally facultative heterotrophic bacteria, forming heterogeneous populations; it is also possible to find unicellular fungi, algae, protozoa, and rotifers, which process the organic carbon source and oxygen, as well as the nitrogen and phosphorus that are vital for cell synthesis. The mixture of the substrate, in this case sludge leachate, molasses, and wastewater, is kept aerated by mechanical aeration diffusers, given that the biological processes remove dissolved material (< 45 microns) by generating biomass that is easily separated by decantation. The design consists of an artificial support and aeration pumps, favouring the development of (denitrifying) microorganisms that reduce nitrate, resulting in nitrogen (N) in the gas phase.
Thus, the negative effects of the presence of ammonia or phosphorus are avoided. Overall, the activated sludge system uses about 8 hours of hydraulic retention time, which does not prevent the demand for nitrification, which occurs on average at an MLSS value of 3,000 mg/L. Extended aeration works with detention times greater than 24 hours, a ratio of organic load to biomass inventory under 0.1, and an average sludge retention time (sludge age) of more than 8 days. This project developed a pilot system with sludge leachate from the Doña Juana landfill (RSDJ), located in Bogotá, Colombia, where the leachate was subjected to activated sludge and extended aeration treatment in a sequencing batch reactor (SBR) before discharge into water bodies, avoiding ecological collapse. The system worked with a retention time of 8 days and a 30 L capacity, mainly achieving BOD and COD removal values above 90%, from initial values of 1,720 mg/L and 6,500 mg/L respectively. Given the deliberate nitrification achieved, the commercial use of diffused aeration systems for sludge leachate from landfills is expected to be possible.
Keywords: sludge, landfill, leachate, SBR
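The >90% BOD and COD removal figures above follow from the standard removal-efficiency formula (Cin − Cout)/Cin; the effluent concentrations in this sketch are assumed values chosen to be consistent with the reported result, not measured data.

```python
def removal_efficiency(influent_mg_l, effluent_mg_l):
    """Percent of a pollutant removed between influent and effluent:
    100 * (Cin - Cout) / Cin."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# Influent concentrations reported for the pilot SBR; the effluent values
# are assumed placeholders consistent with the ">90% removal" finding.
bod_removal = removal_efficiency(1720.0, 150.0)   # BOD, assumed effluent 150 mg/L
cod_removal = removal_efficiency(6500.0, 600.0)   # COD, assumed effluent 600 mg/L
```

Both assumed effluents give removals just above 90%, matching the reported pilot performance.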
Procedia PDF Downloads 273
235 Using Business Simulations and Game-Based Learning for Enterprise Resource Planning Implementation Training
Authors: Carin Chuang, Kuan-Chou Chen
Abstract:
An Enterprise Resource Planning (ERP) system is an integrated information system that supports the seamless integration of all the business processes of a company. Implementing an ERP system can increase efficiency and decrease costs while helping to improve productivity. Many organizations, including large, medium, and small-sized companies, have adopted ERP systems over the past decades. Although an ERP system can bring competitive advantages to organizations, the lack of a proper training approach for ERP implementation is still a major concern. Organizations understand the importance of ERP training in adequately preparing managers and users. The low return on investment for ERP training, however, makes it difficult for knowledge workers to transfer what is learned in training to their jobs in the workplace. Inadequate and inefficient ERP training limits the value realization and success of an ERP system. Hence the need for profound change and innovation in ERP training, both in the industrial workplace and in Information Systems (IS) education in academia. An innovative ERP training approach can improve users’ knowledge of business processes and their hands-on skills in mastering an ERP system, and it can also serve as educational material for IS students in universities. The purpose of this study is to examine the use of ERP simulation games via the ERPsim system to train IS students in ERP implementation. ERPsim is the business simulation game developed by the ERPsim Lab at HEC Montréal; the game runs on a real-life SAP (Systems, Applications and Products) ERP system. The training uses the ERPsim system as the tool for Internet-based simulation games and is designed as online student competitions during class. The competitions involve student teams, with the facilitation of the instructor, and put the students’ business skills to the test through intensive simulation games on a real-world SAP ERP system.
The teams run the full business cycle of a manufacturing company while interacting with suppliers, vendors, and customers through sending and receiving orders, delivering products, and completing the entire cash-to-cash cycle. To learn a range of business skills, each student needs to adopt an individual business role and make business decisions around the products and business processes. Based on the training experiences learned from rounds of business simulations, the findings show that learners face reduced risk in making mistakes, which helps them build self-confidence in problem-solving. In addition, learners’ reflections on their mistakes help them identify the root causes of problems and further improve the efficiency of the training. ERP instructors teaching with this innovative approach report significant improvements in student evaluations, learner motivation, attendance, and engagement, as well as increased learner technology competency. The findings of the study can provide ERP instructors with guidelines for creating an effective learning environment and can be transferred to a variety of other educational fields in which trainers are migrating towards a more active learning approach.
Keywords: business simulations, ERP implementation training, ERPsim, game-based learning, instructional strategy, training innovation
Procedia PDF Downloads 141
234 Novel Numerical Technique for Dusty Plasma Dynamics (Yukawa Liquids): Microfluidic and Role of Heat Transport
Authors: Aamir Shahzad, Mao-Gang He
Abstract:
Dusty plasmas have recently attracted widespread research interest. Over the last two decades, substantial efforts have been made by the scientific and technological community to investigate the transport properties, and their nonlinear behaviour, of three-dimensional and two-dimensional nonideal complex (dusty plasma) liquids (NICDPLs). Different calculations have been made to sustain and utilize strongly coupled NICDPLs because of their remarkable scientific and industrial applications. Understanding the thermophysical properties of complex liquids under various conditions is of practical interest in science and technology. The determination of thermal conductivity is a demanding question for thermophysical researchers, and for several reasons very few results are available for this significant property. The lack of thermal conductivity data for dense and complex liquids at the parameters relevant to industrial developments is a major barrier to quantitative knowledge of the heat flux flowing from one medium to another medium or surface. The exact numerical investigation of transport properties of complex liquids is a fundamental research task in thermophysics, as various transport data are closely related to the setup and confirmation of equations of state. Reliable transport data are also important for the optimized design of processes and apparatus in various engineering and science fields (for example, thermoelectric devices); in particular, precise data for the parameters of heat, mass, and momentum transport are required. One of the promising computational techniques, homogeneous nonequilibrium molecular dynamics (HNEMD) simulation, is reviewed here with special emphasis on its application to transport problems of complex liquids.
This work is, to the best of our knowledge, the first to recast the heat-conduction problem, which leads to polynomial velocity and temperature profiles, into an algorithm for investigating transport properties and their nonlinear behaviors in NICDPLs. The aim of the proposed work is to implement a NEMD (Poiseuille flow) algorithm and to deepen the understanding of thermal conductivity behavior in Yukawa liquids. The Yukawa system is equilibrated through the Gaussian thermostat in order to maintain a constant system temperature (canonical ensemble, NVT). Between 3.0×10^5/ωp and 1.5×10^5/ωp simulation time steps are used for the computation of λ data. The HNEMD algorithm shows that the thermal conductivity depends on the plasma parameters and that the position of the minimum, λmin, shifts toward higher Γ with an increase in κ, as expected. The new investigation gives more reliable simulated data for the plasma conductivity than earlier known simulation data, generally differing from the earlier plasma λ0 by 2%-20%, depending on Γ and κ. It has been shown that the results obtained at the normalized force field are in satisfactory agreement with various earlier simulation results. The new technique provides more accurate results with fast convergence and small size effects over a wide range of plasma states. Keywords: molecular dynamics simulation, thermal conductivity, nonideal complex plasma, Poiseuille flow
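The grain-grain interaction in such Yukawa-liquid simulations is conventionally the screened Coulomb (Yukawa) pair potential, characterized by the coupling parameter Γ and screening parameter κ discussed above. A minimal sketch in reduced units (distance in Wigner-Seitz radii, energy in units of the thermal energy); the function name and normalization are illustrative, not the authors' HNEMD code:

```python
import math

def yukawa_potential(r, gamma, kappa):
    """Reduced Yukawa (screened Coulomb) pair potential.

    r     : inter-particle distance in units of the Wigner-Seitz radius a
    gamma : Coulomb coupling parameter (potential / kinetic energy ratio)
    kappa : screening parameter (a divided by the Debye length)
    Returns the pair energy in units of k_B * T.
    """
    return gamma * math.exp(-kappa * r) / r

# Screening suppresses the bare Coulomb interaction exponentially:
unscreened = yukawa_potential(1.0, gamma=100.0, kappa=0.0)  # 100.0
screened = yukawa_potential(1.0, gamma=100.0, kappa=2.0)    # ~13.53
```

Varying Γ and κ over a grid of such states is what lets the simulation map out where λ reaches its minimum.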
Procedia PDF Downloads 274233 Big Data and Health: An Australian Perspective Which Highlights the Importance of Data Linkage to Support Health Research at a National Level
Authors: James Semmens, James Boyd, Anna Ferrante, Katrina Spilsbury, Sean Randall, Adrian Brown
Abstract:
‘Big data’ is a relatively new concept that describes data so large and complex that it exceeds the storage or computing capacity of most systems to perform timely and accurate analyses. Health services generate large amounts of data from a wide variety of sources such as administrative records, electronic health records, health insurance claims, and even smartphone health applications. Health data is viewed in Australia and internationally as highly sensitive, and strict ethical requirements must be met for its use to support health research. These requirements differ markedly from those imposed on data use from industry or other government sectors and may reduce the capacity of health data to be incorporated into the real-time demands of the big data environment. This ‘big data revolution’ is increasingly supported by national governments, which have invested significant funds in initiatives designed to develop and capitalize on big data and on methods for data integration using record linkage. The benefits to health following research using linked administrative data are recognised internationally and by the Australian Government through the National Collaborative Research Infrastructure Strategy Roadmap, which outlined a multi-million dollar investment strategy to develop national record linkage capabilities. This led to the establishment of the Population Health Research Network (PHRN) to coordinate and champion this initiative. The purpose of the PHRN was to establish record linkage units in all Australian states, to support the implementation of secure data delivery and remote access laboratories for researchers, and to develop the Centre for Data Linkage for the linkage of national and cross-jurisdictional data. 
The Centre for Data Linkage has been established within Curtin University in Western Australia; it provides the record linkage infrastructure necessary for large-scale, cross-jurisdictional linkage of health-related data in Australia and uses a best-practice ‘separation principle’ to support data privacy and security. Privacy-preserving record linkage technology is also being developed to link records without the use of names, overcoming important legal and privacy constraints. This paper will present the findings of the first ‘Proof of Concept’ project selected to demonstrate the effectiveness of increased record linkage capacity in supporting nationally significant health research. This project explored how cross-jurisdictional linkage can inform the nature and extent of cross-border hospital use and hospital-related deaths. The technical challenges associated with national record linkage, and the extent of cross-border population movements, were explored as part of this pioneering research project. Access to person-level data linked across jurisdictions identified geographical hot spots of cross-border hospital use and hospital-related deaths in Australia. This has implications for planning of health service delivery and for longitudinal follow-up studies, particularly those involving mobile populations. Keywords: data integration, data linkage, health planning, health services research
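One widely published family of techniques for linking records without exchanging names encodes name fragments into Bloom filters, so that similarity can be computed on the encodings alone. The sketch below illustrates that general idea only; the parameter choices and helper names are hypothetical, not the Centre for Data Linkage's actual implementation:

```python
import hashlib

def bigrams(name):
    """Character bigrams of a padded, lower-cased name."""
    s = f"_{name.lower()}_"
    return {s[i:i + 2] for i in range(len(s) - 1)}

def bloom_encode(name, size=64, hashes=2):
    """Encode a name's bigrams as a set of Bloom-filter bit positions.
    Only these positions, never the name itself, would be shared."""
    bits = set()
    for gram in bigrams(name):
        for k in range(hashes):
            h = int(hashlib.sha256(f"{k}:{gram}".encode()).hexdigest(), 16)
            bits.add(h % size)
    return bits

def dice_similarity(a, b):
    """Dice coefficient between two bit-position sets (1.0 = identical)."""
    return 2 * len(a & b) / (len(a) + len(b))

# Spelling variants of the same name stay highly similar after encoding,
# while unrelated names score low, enabling linkage without identifiers.
sim = dice_similarity(bloom_encode("catherine"), bloom_encode("katherine"))
```

In a real deployment the encodings are generated separately by each data custodian, so the linkage unit never sees a name, which is the point of the ‘separation principle’ mentioned above.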
Procedia PDF Downloads 216232 Application of Infrared Thermal Imaging, Eye Tracking and Behavioral Analysis for Deception Detection
Authors: Petra Hypšová, Martin Seitl
Abstract:
One of the challenges of forensic psychology is to detect deception during a face-to-face interview. In addition to the classical approaches of monitoring the utterance and its components, detection is also sought by observing behavioral and physiological changes that occur as a result of the increased emotional and cognitive load caused by the production of distorted information. Typical are changes in facial temperature, eye movements and their fixation, pupil dilation, emotional micro-expression, heart rate and its variability. Expanding technological capabilities have opened the space to detect these psychophysiological changes and behavioral manifestations through non-contact technologies that do not interfere with face-to-face interaction. Non-contact deception detection methodology is still in development, and there is a lack of studies that combine multiple non-contact technologies to investigate their accuracy, as well as studies that show how different types of lies produced by different interviewers affect physiological and behavioral changes. The main objective of this study is to apply a specific non-contact technology for deception detection. The next objective is to investigate scenarios in which non-contact deception detection is possible. A series of psychophysiological experiments using infrared thermal imaging, eye tracking and behavioral analysis with FaceReader 9.0 software was used to achieve our goals. In the laboratory experiment, 16 adults (12 women, 4 men) between 18 and 35 years of age (SD = 4.42) were instructed to produce alternating prepared and spontaneous truths and lies. The baseline of each proband was also measured, and its results were compared to the experimental conditions. Because the personality of the examiner (particularly gender and facial appearance) to whom the subject is lying can influence physiological and behavioral changes, the experiment included four different interviewers. 
The interviewer was represented by a photograph of a face that met the required parameters in terms of gender and facial appearance (i.e., interviewer likability/antipathy) to follow standardized procedures. The subject provided all information to this simulated interviewer. During follow-up analyses, facial temperature (main ROIs: forehead, cheeks, the tip of the nose, chin, and corners of the eyes), heart rate, emotional expression, intensity and fixation of eye movements, and pupil dilation were observed. The results showed that the variables studied varied with respect to the production of prepared truths and lies versus the production of spontaneous truths and lies, as well as with the variability of the simulated interviewer. The results also supported the assumption of variability in physiological and behavioral values between the subject's resting state, the so-called baseline, and the production of prepared and spontaneous truths and lies. The series of psychophysiological experiments provided evidence of variability in the areas of interest in the production of truths and lies to different interviewers. The combination of technologies used also allowed a comprehensive assessment of the physiological and behavioral changes associated with false and true statements. The study presented here opens the space for further research in the field of lie detection with non-contact technologies. Keywords: emotional expression decoding, eye-tracking, functional infrared thermal imaging, non-contact deception detection, psychophysiological experiment
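Comparisons of baseline against experimental conditions of this kind typically rest on paired statistics, such as a paired t test on each participant's two measurements. A self-contained illustration on invented nose-tip temperature values (the numbers are hypothetical, chosen only to show the computation; the study's own analyses would use its recorded ROI data):

```python
from statistics import mean, stdev

def paired_t(baseline, condition):
    """t statistic for paired samples: each participant measured twice,
    e.g. resting-state vs. deception-condition nose-tip temperature."""
    diffs = [c - b for b, c in zip(baseline, condition)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / n ** 0.5)

# Hypothetical nose-tip temperatures (deg C) for five participants:
rest = [34.1, 33.8, 34.5, 34.0, 33.9]
lie = [33.2, 33.1, 33.9, 33.3, 33.4]
t = paired_t(rest, lie)  # negative: temperature drops under cognitive load
```

A large-magnitude t on such paired differences is what would support the reported baseline-versus-condition variability.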
Procedia PDF Downloads 100231 “laws Drifting Off While Artificial Intelligence Thriving” – A Comparative Study with Special Reference to Computer Science and Information Technology
Authors: Amarendar Reddy Addula
Abstract:
Definition of Artificial Intelligence: Artificial intelligence is the simulation of human intelligence processes by machines, especially computer systems. Specific applications of AI include expert systems, natural language processing, speech recognition, and machine vision. Artificial Intelligence (AI) is an innovative medium for digital business, according to a new report by Gartner. The last 10 years represent a period of advancement in AI’s development, spurred by a confluence of factors, including the rise of big data, advancements in compute infrastructure, new machine learning techniques, the emergence of cloud computing, and the vibrant open-source ecosystem. AI is extending to a broader set of use cases and users and gaining popularity because this improves AI’s versatility, effectiveness, and adaptability. Edge AI will enable digital moments by employing AI for real-time analytics closer to data sources. Gartner predicts that by 2025, more than 50% of all data analysis by deep neural networks will occur at the edge, up from less than 10% in 2021. Responsible AI is an umbrella term for making suitable business and ethical choices when adopting AI. It requires considering business and societal value, risk, trust, transparency, fairness, bias mitigation, explainability, responsibility, safety, privacy, and regulatory compliance. Responsible AI is ever more significant amidst growing regulatory oversight, consumer expectations, and rising sustainability goals. Generative AI is the use of AI to generate new artifacts and produce innovative products. To date, generative AI efforts have concentrated on creating media content such as photorealistic images of people and objects, but it can also be used for code generation, creating synthetic data, and designing medicines and materials with specific properties. AI is the subject of a wide-ranging debate in which there is growing concern about its ethical and legal aspects. 
Frequently, the two are conflated and confused despite being different issues and areas of knowledge. The ethical debate raises two main problems: the first, abstract, relates to the idea and content of ethics; the second, functional, concerns its relationship with the law. Both set up models of social behavior, but they are different in scope and nature. The juridical analysis is grounded on a non-formalistic scientific methodology. This means that it is essential to consider the nature and characteristics of AI as a primary step toward the description of its legal paradigm. In this regard, there are two main issues: the relationship between artificial and human intelligence, and the question of the unitary or different nature of AI. From that theoretical and practical base, the study of the legal system is carried out by examining its foundations, the governance model, and the regulatory bases. According to this analysis, throughout the work and in the conclusions, International Law is identified as the top legal framework for the regulation of AI. Keywords: artificial intelligence, ethics & human rights issues, laws, international laws
Procedia PDF Downloads 96230 Optimal Framework of Policy Systems with Innovation: Use of Strategic Design for Evolution of Decisions
Authors: Yuna Lee
Abstract:
In the current policy process, there has been growing interest in more open approaches that incorporate creativity and innovation, based on forecasting groups composed of the public and experts together, into scientific data-driven foresight methods to implement more effective policymaking. In particular, citizen participation as collective intelligence in policymaking, combined with design and deep-scale innovation at the global level, has been developing, and human-centred design thinking is considered one of the most promising methods for strategic foresight. Yet there is a lack of a common theoretical foundation for a comprehensive approach to the current and post-COVID-19 era, and substantial changes in policymaking practice remain insignificant and proceed by trial and error. This project hypothesized that rigorously developed policy systems and tools that support strategic foresight by considering public understanding could maximize ways to create new possibilities for a preferable future; however, this must involve a better understanding of behavioural insights, including individual and cultural values, profit motives and needs, and psychological motivations, in order to implement holistic and multilateral foresight and create more positive possibilities. To what extent is a policymaking system theoretically possible that incorporates holistic and comprehensive foresight into policy process implementation, assuming that theory and practice, in reality, are different and not connected? What components and environmental conditions should be included in a strategic foresight system to enhance policymakers' capacity to predict alternative futures, or to detect uncertainties of the future more accurately? And, compared to the required environmental conditions, what are the environmental vulnerabilities of the current policymaking system? 
In this light, this research contemplates the question of how effectively policymaking practices have been implemented through the synthesis of scientific, technology-oriented innovation with strategic design for tackling complex societal challenges and devising more significant insights to make society greener and more liveable. Here, this study conceptualizes the notion of a new collaborative way of strategic foresight that aims to maximize mutual benefits between policy actors and citizens through cooperation stemming from evolutionary game theory. The study applies a mixed methodology, including interviews with policy experts, to cases in which digital transformation and strategic design provided future-oriented solutions or directions for cities’ sustainable development goals and society-wide urgent challenges such as COVID-19. As a result, the artistic and sensory interpretive capabilities fostered by strategic design promote a concrete form of ideas, build a stable connection from the present to the future, and enhance understanding and active cooperation among decision-makers, stakeholders, and citizens. Ultimately, the improved theoretical foundation proposed in this study is expected to help strategically respond to the highly interconnected future changes of the post-COVID-19 world. Keywords: policymaking, strategic design, sustainable innovation, evolution of cooperation
Procedia PDF Downloads 195229 Innovation Outputs from Higher Education Institutions: A Case Study of the University of Waterloo, Canada
Authors: Wendy De Gomez
Abstract:
The University of Waterloo is situated in central Canada in the Province of Ontario, one hour from the metropolitan city of Toronto. For over 30 years, it has held Canada’s top spot as the most innovative university and has been consistently ranked among the top 25 computer science and top 50 engineering schools in the world. Waterloo benefits from the federal government’s more than 100 domestic innovation policies, which have contributed to the country’s 15th-place global ranking in the World Intellectual Property Organization’s (WIPO) 2022 Global Innovation Index. Yet undoubtedly, the University of Waterloo’s unique characteristics are what propel its innovative creativity forward. This paper will provide a contextual definition of innovation in higher education and then demonstrate the five operational attributes that contribute to the University of Waterloo’s innovative reputation. The methodology is based on statistical analyses obtained from ranking bodies such as the QS World University Rankings, a secondary literature review related to higher education innovation in Canada, and case studies that exhibit the operationalization of the attributes outlined below. The first attribute is geography. Specifically, the paper investigates the network structure effect of the Toronto-Waterloo high-tech corridor and the resultant industrial relationships built there. The second attribute is University Policy 73, Intellectual Property Rights. This creator-owned policy grants all ownership to the creator/inventor regardless of the use of University of Waterloo property or funding. Essentially, through the incentivization of IP ownership by all researchers, further commercialization and entrepreneurship are formed. Third, this IP policy works hand in hand with world-renowned business incubators such as the Accelerator Centre in the dedicated research and technology park and Velocity, a 14-year-old facility that equips and guides founders to build and scale companies. 
Communitech, a 25-year-old provincially backed facility in the region, also works closely with the University of Waterloo to build strong teams, access capital, and commercialize products. Fourth, Waterloo’s co-operative education program contributes 31% of all co-op participants to the Canadian economy. Home to the world’s largest co-operative education program, data show that over 7,000 employers from around the world recruit Waterloo students for short- and long-term placements, directly contributing to the students’ ability to learn and refine essential employment skills by the time they graduate. Finally, the students themselves at Waterloo are exceptional. The entrance average ranges from the low 80s to the mid-90s depending on the program. In computer, electrical, mechanical, mechatronics, and systems design engineering, an applicant’s average must be 95% or above to have a 66% chance of acceptance. Individually, none of these five attributes could account for the university’s outstanding track record of innovative creativity, but when bundled into a 1,000-acre, 100-building main campus with 6 academic faculties, 40,000+ students, and over 1,300 world-class faculty, the recipe for success becomes quite evident. Keywords: IP policy, higher education, economy, innovation
Procedia PDF Downloads 70228 Seawater Desalination for Production of Highly Pure Water Using a Hydrophobic PTFE Membrane and Direct Contact Membrane Distillation (DCMD)
Authors: Ahmad Kayvani Fard, Yehia Manawi
Abstract:
Qatar’s primary source of fresh water is seawater desalination. Amongst the major processes commercially available on the market, the most common large-scale techniques are Multi-Stage Flash distillation (MSF), Multi-Effect Distillation (MED), and Reverse Osmosis (RO). Although commonly used, these three processes are highly expensive owing to high energy input requirements and high operating costs associated with maintenance and the stress induced on the systems in harsh alkaline media. Beyond cost, the environmental footprint of these desalination techniques is significant: from damage to the marine ecosystem, to large land use, to the discharge of tons of greenhouse gases and a huge carbon footprint. One less energy-intensive technique, based on membrane separation and being sought to reduce both the carbon footprint and operating costs, is membrane distillation (MD). Having emerged in the 1960s, MD is an alternative technology for water desalination that has attracted increasing attention since the 1980s. The MD process involves the evaporation of a hot feed, typically below the boiling point of brine at standard conditions, by creating a water vapor pressure difference across a porous, hydrophobic membrane. The main advantages of MD compared to other commercially available technologies (MSF and MED), and especially RO, are: reduction of membrane and module stress due to the absence of trans-membrane pressure, less impact of contaminant fouling on the distillate because only water vapor is transferred, utilization of low-grade or waste heat from the oil and gas industries to bring the feed to the required temperature difference across the membrane, superior water quality, and relatively lower capital and operating cost. To achieve the objective of this study, a state-of-the-art flat-sheet cross-flow DCMD bench-scale unit was designed, commissioned, and tested. 
The objective of this study is to analyze the characteristics and morphology of a membrane suitable for DCMD, through SEM imaging and contact angle measurement, and to study the water quality of the distillate produced by the DCMD bench-scale unit. Comparison with available literature data is undertaken where appropriate, and laboratory data are used to compare the DCMD distillate quality with that of other desalination techniques and standards. SEM analysis showed that the PTFE membrane used for the study has a contact angle of 127° and a highly porous surface, supported by a less porous, larger-pore-size PP membrane. The study of the effect of feed salinity and temperature on distillate water quality, from ICP and IC analysis, showed that at any salinity and at feed temperatures up to 70°C the electrical conductivity of the distillate is less than 5 μS/cm, with 99.99% salt rejection. DCMD proved to be a feasible and effective process capable of consistently producing high-quality distillate from very high-salinity feed solutions (i.e., 100,000 mg/L TDS), even with a substantial quality difference compared to other desalination methods such as RO and MSF. Keywords: membrane distillation, waste heat, seawater desalination, membrane, freshwater, direct contact membrane distillation
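Two quantities behind these results are simple to state explicitly: the salt rejection computed from feed and distillate concentrations, and the trans-membrane vapor-pressure difference that drives the flux. A sketch under stated assumptions (the Antoine constants are the commonly tabulated set for water between roughly 1 and 100 °C; function names are illustrative, not from the study):

```python
def salt_rejection(feed_tds, distillate_tds):
    """Percent of feed solutes excluded from the distillate (TDS in mg/L)."""
    return (1.0 - distillate_tds / feed_tds) * 100.0

def water_vapor_pressure(t_celsius):
    """Saturation vapor pressure of water in mmHg via the Antoine equation,
    with constants commonly tabulated for ~1-100 degrees C."""
    return 10 ** (8.07131 - 1730.63 / (233.426 + t_celsius))

# A 100,000 mg/L feed yielding a ~2.5 mg/L distillate:
rejection = salt_rejection(100_000, 2.5)  # 99.9975 %

# Vapor-pressure driving force for a 70 C feed against a 20 C distillate:
dp = water_vapor_pressure(70) - water_vapor_pressure(20)  # positive, hot -> cold
```

Because only vapor crosses the hydrophobic membrane, rejection stays near 100% regardless of feed salinity, consistent with the conductivity results reported above.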
Procedia PDF Downloads 227227 Effects of Live Webcast-Assisted Teaching on Physical Assessment Technique Learning of Young Nursing Majors
Authors: Huey-Yeu Yan, Ching-Ying Lee, Hung-Ru Lin
Abstract:
Background: Physical assessment is a vital clinical nursing competence. The gap between conventional teaching methods and the way e-generation students prefer to learn can be bridged with the support of Internet technology, i.e., interacting with online media to manage learning tasks. Nursing instructors, in the wake of the new learning patterns of e-generation students, are challenged to actively adjust and make teaching contents and methods more versatile. Objective: The objective of this research is to explore the effects of live webcast-assisted teaching and learning on a specific topic, physical assessment techniques, in a designated group of young nursing majors. It is hoped that, with this way of nursing instruction, more versatile learning resources may be provided to facilitate self-directed learning. Design: This research adopts a cross-sectional descriptive survey. The instructor demonstrated physical assessment techniques and operating procedures via live webcast broadcast online to all students, which increased the out-of-class interaction between teacher and students concerning the teaching materials. Methods: Convenience sampling was used to recruit a total of 52 nursing majors at a university. The nursing majors took two-hour classes of physical assessment per week for 18 weeks (36 hours in total). The instruction covered four units with live webcasting, after which an anonymous online questionnaire survey of learning outcomes was conducted. The research instrument was the online questionnaire, covering three major domains: online media use, learning outcome evaluation, and evaluation results. Data analysis was conducted via IBM SPSS Statistics Version 2.0. Descriptive statistics were used to summarize the basic data and learning outcomes, and statistical methods such as the t-test, ANOVA, and Pearson’s correlation were employed for verification. Results: Results indicated the following five major findings. 
(1) Learning motivation: about four-fifths of the participants agreed the online instruction resources were very helpful in improving learning motivation and raising learning interest. (2) Learning needs: about four-fifths agreed it was helpful for planning self-directed practice after the instruction and met their needs for repetitive learning and/or practice in their leisure time. (3) Learning effectiveness: about two-thirds agreed it was helpful in reducing pre-exam anxiety and improving their test scores. (4) Course objectives: about three-fourths agreed it was helpful in achieving the goal of ‘executing the complete physical assessment procedures with proper skills’. (5) Finally, learning reflection: nearly all participants agreed this experience of online instructing, learning, and practicing was beneficial to them; they recommended that the instructor share it with other nursing majors, and they would recommend it to fellow students too. Conclusions: Live webcasting is a low-cost, convenient, efficient, and interactive resource that facilitates nursing majors’ motivation for learning, their need for self-directed learning and practice, and their learning outcomes. When live webcasting is integrated into nursing teaching, it provides an opportunity for self-directed learning that promotes learning effectiveness and thereby fulfills the teaching objective. Keywords: innovative teaching, learning effectiveness, live webcasting, physical assessment technique
Procedia PDF Downloads 132226 Hydrogeomatic System for the Economic Evaluation of Damage by Flooding in Mexico
Authors: Alondra Balbuena Medina, Carlos Diaz Delgado, Aleida Yadira Vilchis Fránces
Abstract:
In Mexico, news spreads each year about the ravages of floods: the total loss of housing, damage to fields, increases in food costs derived from lost harvests, coupled with health problems such as skin infections, as well as social problems such as delinquency, damage to educational institutions, and harm to the population in general. Flooding is a consequence of heavy rains, tropical storms, or hurricanes that generate excess water in drainage systems exceeding their capacity. In urban areas, heavy rains can be one of the main factors causing flooding, in addition to excessive precipitation, dam breakage, and human activities, for example, excessive garbage in the strainers. In agricultural areas, floods can devastate large areas of cultivation. It should be mentioned that, for both areas, one of the significant impacts of floods is that they can permanently affect the livelihoods of many families and cause damage, for example, in workplaces such as farmland, commercial or industrial areas, and places where services are provided. In recent years, Information and Communication Technologies (ICT) have developed at an accelerated pace, reflected in exponential growth and innovation that result in the daily generation of new technologies, updates, and applications. Innovation in the development of information technology applications has had an impact on all areas of human activity. These technologies influence all orders of individuals’ lives, reconfiguring the way the world is perceived and analyzed, for instance, in interrelating with people as individuals and as a society, in the economic, political, social, cultural, educational, and environmental spheres. 
Therefore, the present work describes the creation of a system for calculating flood costs for housing areas, retail establishments, and agricultural areas of the Mexican Republic, based on the use and application of geoinformatic tools, which can be useful for the benefit of the public, educational, and private sectors. To analyze hydrometeorological impacts and make use of the results obtained, the geoinformatic tool was constructed from two different points of view: the geoinformatic one (design and development of GIS software) and the methodology of flood damage validation, in order to integrate a tool that provides the user with a monetary estimate of the effects caused by floods. With information from the period 2000-2014, the functionality of the application was corroborated. For the years 2000 to 2009, only the analysis of the agricultural and housing areas was carried out, with information on commercial establishments incorporated for the period 2010-2014. The method proposed in this research project is a fundamental contribution to society, in addition to the tool itself. It can therefore be summarized that conceiving the problems of the physical-geographical environment from the point of view of spatial analysis makes it possible to offer different solution alternatives and to open new avenues for academia and research. Keywords: floods, technological innovation, monetary estimation, spatial analysis
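At its core, the monetary estimate such a tool produces reduces to multiplying the flood-affected extent of each land-use class by a unit damage cost. A deliberately simplified sketch of that aggregation step (the cost figures and class names are hypothetical placeholders, not values from the Mexican study, which would derive them from its validated damage methodology):

```python
# Hypothetical unit damage costs in MXN; housing and commercial are per
# affected unit, agriculture per flooded hectare. Placeholder values only.
UNIT_COST = {
    "housing": 85_000,       # per flooded dwelling
    "commercial": 120_000,   # per flooded establishment
    "agricultural": 18_000,  # per flooded hectare of cropland
}

def flood_damage_estimate(affected):
    """affected: mapping of land-use class -> flooded extent (units or ha).
    Returns the total estimated monetary damage in MXN."""
    return sum(UNIT_COST[land_use] * extent
               for land_use, extent in affected.items())

# 40 flooded dwellings plus 120.5 flooded hectares of cropland:
total = flood_damage_estimate({"housing": 40, "agricultural": 120.5})
# 40 * 85,000 + 120.5 * 18,000 = 5,569,000 MXN
```

In the GIS itself, the `affected` extents would come from overlaying flood polygons on land-use layers rather than being entered by hand.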
Procedia PDF Downloads 225225 PolyScan: Comprehending Human Polymicrobial Infections for Vector-Borne Disease Diagnostic Purposes
Authors: Kunal Garg, Louise Theusen Hermansan, Kanoktip Puttaraska, Oliver Hendricks, Heidi Pirttinen, Leona Gilbert
Abstract:
The Germ Theory (one infectious determinant is equal to one disease) has unarguably advanced our capability to diagnose and treat infectious diseases over the years. Nevertheless, the advent of technology, climate change, and volatile human behavior have brought about drastic changes in our environment, leading us to question the relevance of the Germ Theory today: will vector-borne disease (VBD) sufferers produce multiple immune responses when tested for multiple microbes? Vector-diseased patients producing multiple immune responses to different microbes would evidently suggest human polymicrobial infections (HPI). Current diagnostic tools are poorly equipped to apply the research findings that would aid in diagnosing patients with polymicrobial infections. This shortcoming has caused misdiagnosis at very high rates, consequently diminishing patients’ quality of life due to inadequate treatment. Equipped with state-of-the-art scientific knowledge, PolyScan intends to address the pitfalls in current VBD diagnostics. PolyScan is a multiplex and multifunctional enzyme-linked immunosorbent assay (ELISA) platform that can test for numerous VBD microbes and allows simultaneous screening for multiple types of antibodies. To validate PolyScan, Lyme borreliosis (LB) and spondyloarthritis (SpA) patient groups (n = 54 each) were tested for Borrelia burgdorferi, Borrelia burgdorferi round body (RB), Borrelia afzelii, Borrelia garinii, and Ehrlichia chaffeensis against IgM and IgG antibodies. LB serum samples were obtained from Germany and SpA serum samples from Denmark under relevant ethical approvals. The SpA group represented the chronic LB stage because reactive arthritis (an SpA subtype) in the form of Lyme arthritis is linked to LB. It was hypothesized that patients from both groups would produce multiple immune responses, which as a consequence would evidently suggest HPI. 
It was also hypothesized that the proportion of multiple immune responses in the SpA patient group would be significantly larger than in the LB patient group across both antibodies. It was observed that 26% of LB patients and 57% of SpA patients produced multiple immune responses, in contrast to the 33% of LB patients and 30% of SpA patients that produced solitary immune responses, when tested against IgM. Similarly, 52% of LB patients and an astounding 73% of SpA patients produced multiple immune responses, in contrast to the 30% of LB patients and 8% of SpA patients that produced solitary immune responses, when tested against IgG. Interestingly, IgM immune dysfunction was also recorded in both patient groups. Atypically, 6% of the 18% of LB patients unresponsive with the IgG antibody were recorded producing multiple immune responses with the IgM antibody. Similarly, 12% of the 19% of SpA patients unresponsive with the IgG antibody were recorded producing multiple immune responses with the IgM antibody. Thus, the results not only supported the hypotheses but also suggested that IgM may atypically prevail longer than IgG. The PolyScan concept will aid clinicians in detecting early, persistent, late, polymicrobial, and immune dysfunction conditions linked to different VBDs. PolyScan provides a paradigm shift for the VBD diagnostic industry that will drastically shorten patients’ time to receive adequate treatment. Keywords: diagnostics, immune dysfunction, polymicrobial, TICK-TAG
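The classification underlying these percentages, counting how many antigens a serum reacts to per antibody class, can be sketched as follows (the optical-density cutoff and antigen keys are hypothetical illustrations; an actual assay's positivity thresholds would come from its validation data):

```python
CUTOFF = 0.5  # hypothetical ELISA optical-density positivity threshold

def response_profile(od_by_antigen):
    """od_by_antigen: mapping antigen -> optical density for one antibody
    class (IgM or IgG). Classifies the serum by how many antigens react."""
    positives = sum(1 for od in od_by_antigen.values() if od >= CUTOFF)
    if positives >= 2:
        return "multiple"
    if positives == 1:
        return "solitary"
    return "unresponsive"

# A serum reacting to two Borrelia antigens counts as a multiple response:
profile = response_profile({
    "B. burgdorferi": 0.81,
    "B. afzelii": 0.64,
    "B. garinii": 0.22,
    "E. chaffeensis": 0.10,
})  # -> 'multiple'
```

Running this per antibody class is also what surfaces the dysfunction pattern above: a serum can be 'unresponsive' for IgG yet 'multiple' for IgM.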
Procedia PDF Downloads 329
224 Impact of Climate Change on Crop Production: Climate Resilient Agriculture Is the Need of the Hour
Authors: Deepak Loura
Abstract:
Climate change is considered one of the major environmental problems of the 21st century: a lasting change in the statistical distribution of weather patterns over periods ranging from decades to millions of years. Agriculture and climate change are closely interrelated in various aspects, and the threat of a varying global climate has greatly drawn the attention of scientists, as these variations are having a negative impact on global crop production and compromising food security worldwide. The fast pace of development and industrialization and the indiscriminate destruction of the natural environment, more so in the last century, have altered the concentration of atmospheric gases that leads to global warming. Carbon dioxide (CO₂), methane (CH₄), and nitrous oxide (N₂O) are important biogenic greenhouse gases (GHGs) from the agricultural sector contributing to global warming, and their concentrations are increasing alarmingly. Agricultural productivity can be affected by climate change in two ways: first, directly, by affecting plant growth, development, and yield due to changes in rainfall/precipitation, temperature, and/or CO₂ levels; and second, indirectly, through a considerable impact on agricultural land use due to snow melt, availability of irrigation, frequency and intensity of inter- and intra-seasonal droughts and floods, soil organic matter transformations, soil erosion, changes in the distribution and frequency of infestation by insect pests, diseases, or weeds, the decline in arable area (due to submergence of coastal lands), and availability of energy. An increase in atmospheric CO₂ promotes the growth and productivity of C3 plants. On the other hand, an increase in temperature can reduce crop duration, increase crop respiration rates, affect the equilibrium between crops and pests, hasten nutrient mineralization in soils, decrease fertilizer-use efficiencies, and increase evapotranspiration, among others.
All of these could considerably affect crop yield in the long run. Climate-resilient agriculture, consisting of adaptation, mitigation, and other agricultural practices, can potentially enhance the capacity of the system to withstand climate-related disturbances by resisting damage and recovering quickly. Climate-resilient agriculture turns the climate change threats that must be tackled into new business opportunities for the sector in different regions and therefore provides a triple win: mitigation, adaptation, and economic growth. Improving the soil organic carbon stock is integral to any strategy for adapting to and mitigating abrupt climate change, advancing food security, and improving the environment. Soil carbon sequestration is one of the major mitigation strategies for achieving climate-resilient agriculture. Climate-smart agriculture is the only way to lower the negative impact of climate variations on crop adaptation before it affects global crop production drastically. To cope with these extreme changes, future development needs to make adjustments in technology, management practices, and legislation. Adaptation and mitigation are twin approaches to bringing resilience to climate change in agriculture.
Keywords: climate change, global warming, crop production, climate resilient agriculture
Procedia PDF Downloads 74
223 Seek First to Regulate, Then to Understand: The Case for Preemptive Regulation of Robots
Authors: Catherine McWhorter
Abstract:
Robotics is a fast-evolving field lacking comprehensive and harm-mitigating regulation; it also lacks critical data on how human-robot interaction (HRI) may affect human psychology. As most anthropomorphic robots are intended as substitutes for humans, this paper asserts that the commercial robotics industry should be preemptively regulated at the federal level such that robots capable of embodying a victim role in criminal scenarios (“vicbots”) are prohibited until clinical studies determine their effects on the user and society. The results of these studies should then inform more permanent legislation that strives to mitigate risks of harm without infringing upon fundamental rights or stifling innovation. This paper explores these concepts through the lens of the sex robot industry. The sexbot industry offers some of the most realistic, interactive, and customizable robots for sale today. From approximately 2010 until 2017, some sex robot producers, such as True Companion, actively promoted ‘vicbot’ culture with personalities like “Frigid Farrah” and “Young Yoko” but received significant public backlash for fetishizing rape and pedophilia. Today, “Frigid Farrah” and “Young Yoko” appear to have vanished. Sexbot producers have replaced preprogrammed vicbot personalities in favor of one generic, customizable personality. According to the manufacturer ainidoll.com, when asked, there is only one thing the user won’t be able to program the sexbot to do – “…give you drama”. The ability to customize vicbot personas is possible with today’s generic personality sexbots and may undermine the intent of some current legislative efforts. Current debate on the effects of vicbots indicates a lack of consensus. Some scholars suggest vicbots may reduce the rate of actual sex crimes, and some suggest that vicbots will, in fact, create sex criminals, while others cite their potential for rehabilitation. 
Vicbots may have value in some instances when prescribed by medical professionals, but the overall uncertainty and lack of data further underscore the need for preemptive regulation and clinical research. Existing literature on exposure to media violence and its effects on prosocial behavior, human aggression, and addiction may serve as launch points for specific studies into the hyperrealism of vicbots. Of course, the customization, anthropomorphism, and artificial intelligence of sexbots, and therefore of more mainstream robots, will continue to evolve. The existing sexbot industry offers an opportunity to regulate preemptively and to research answers to these and many more questions before this type of technology becomes even more advanced and mainstream. Robots pose complicated moral, ethical, and legal challenges, most of which are beyond the scope of this paper. By examining the possibility of custom vicbots via the sexbot industry and reviewing existing literature on regulation, media violence, and vicbot user effects, this paper strives to underscore the need for preemptive federal regulation prohibiting vicbot capabilities in robots, while advocating for further research into the potential for user and societal harm.
Keywords: human-robot interaction effects, regulation, research, robots
Procedia PDF Downloads 207
222 Extended Knowledge Exchange with Industrial Partners: A Case Study
Authors: C. Fortin, D. Tokmeninova, O. Ushakova
Abstract:
Among 500 Russian universities, Skolkovo Institute of Science and Technology (Skoltech) is one of the youngest (established in 2011), quite small, and vastly international, with 20 percent international students and 70 percent of faculty having significant academic experience at top-100 universities (QS, THE). The institute emerged from close collaboration with MIT and leading Russian universities. Skoltech is an entirely English-speaking environment. The curricula of Skoltech's ten Master's programs are based on the CDIO learning outcomes model. However, despite the Institute's unique focus on industrial innovations and startups, one of the main challenges has been that nearly half of MSc graduates enter PhD programs at Skoltech or other universities rather than industry or entrepreneurship. In order to increase the share of students joining the industrial sector after graduation, Skoltech started implementing a number of unique practices, with employers' expectations incorporated into the curriculum redesign. In this sense, extended knowledge exchange with industrial partners via collaboration in learning activities, industrial projects, and assessments became essential for students' headway into industrial and entrepreneurship pathways. The current academic curriculum includes the following components based on extended knowledge exchange with industrial partners: innovation workshop, industrial immersion, special industrial tracks, and MSc defenses. The innovation workshop is a four-week, full-time dive into the vibrant Skoltech ecosystem designed to foster innovators; it focuses on teamwork and group projects and sparks entrepreneurial instincts from the very first days of study. From 2019, the number of mentors from industry and startups significantly increased to guide students across these sectors' demands.
Industrial immersion is an exclusive part of the Skoltech curriculum in which students, after the first year of study, spend eight weeks in an industrial company carrying out an individual or team project, guided jointly by Skoltech and company supervisors. The aim of the industrial immersion is to familiarize students with the relevant needs of Russian industry and to prepare graduates for job placement. During the immersion, the company plays the role of challenge provider for the students. Skoltech has also started a special industrial track in deep collaboration with IPG Photonics, a leading R&D company and manufacturer of high-performance fiber lasers and amplifiers for diverse applications. The track aims to train a new cohort of engineers and includes a variety of activities for students within the "Photonics" MSc program. It is expected to become a success story and serve as an example for similar initiatives with other Russian high-tech companies. Another pathway of extended knowledge exchange with industrial partners is the active involvement of potential employers in MSc Defense Committees to review and assess MSc thesis projects and to participate in defense procedures. The paper will evaluate the effect and results of the measures described above.
Keywords: curriculum redesign, knowledge exchange model, learning outcomes framework, stakeholder engagement
Procedia PDF Downloads 81
221 Automation of Finite Element Simulations for the Design Space Exploration and Optimization of Type IV Pressure Vessel
Authors: Weili Jiang, Simon Cadavid Lopera, Klaus Drechsler
Abstract:
Fuel cell vehicles have become the most competitive solution for the transportation sector in the hydrogen economy. The Type IV pressure vessel is currently the most popular and widely developed technology for on-board storage, based on its high reliability and relatively low cost. Due to the stringent requirements on mechanical performance, the pressure vessel requires a great amount of composite material, a major cost driver for hydrogen tanks. Evidently, the optimization of the composite layup design shows great potential in reducing overall material usage, yet it requires a comprehensive understanding of the underlying mechanisms as well as of the influence of different design parameters on mechanical performance. Given the materials and manufacturing processes by which Type IV pressure vessels are made, their design and optimization are a nuanced subject. The manifold of possible stacking sequences and fiber orientations has an outstanding effect on vessel strength due to the anisotropic properties of carbon fiber composites, which makes the design space high-dimensional. Each variation of the design parameters requires computational resources. Using finite element analysis to evaluate different designs is the most common method; however, the modeling, setup, and simulation process can be very time consuming and result in high computational cost. For this reason, it is necessary to build a reliable automation scheme to set up and analyze the diverse composite layups. In this research, the simulation process for different tank designs with various parameters is conducted and automated in a commercial finite element analysis framework, Abaqus. Worth mentioning, the model of the composite overwrap is automatically generated using the Abaqus-Python scripting interface.
The prediction of the winding angle of each layer and the corresponding thickness variation in the dome region is the most crucial step of the modeling; it is calculated and implemented using analytical methods. Subsequently, the different composite layups are simulated as axisymmetric models to reduce the computational complexity and calculation time. Finally, the results are evaluated and compared with regard to ultimate tank strength. By automatically modeling, evaluating, and comparing various composite layups, this system is applicable to the optimization of tank structures. As mentioned above, the mechanical properties of the pressure vessel are highly dependent on the composite layup, which requires a large number of simulations. Consequently, automating the simulation process provides a rapid way to compare the various designs and obtain an indication of the optimum one. Moreover, this automation process can also be used to create a data bank of layups and corresponding mechanical properties, with few preliminary configuration steps, for further case analyses. Subsequently, machine learning, for example, could be used to identify the optimum directly from the data pool without running additional simulations.
Keywords: type IV pressure vessels, carbon composites, finite element analysis, automation of simulation process
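The abstract notes that the winding angle of each layer on the dome is obtained analytically. One common analytical model for filament winding is the geodesic path, governed by Clairaut's relation r · sin(α) = r_polar. The sketch below illustrates this relation with hypothetical dome dimensions; the 25 mm polar opening and 175 mm cylinder radius are assumed values, not taken from the study.

```python
import math

def geodesic_winding_angle(r, r_polar):
    """Winding angle (degrees) at parallel radius r for a geodesic path,
    from Clairaut's relation: r * sin(alpha) = r_polar (constant)."""
    if r < r_polar:
        raise ValueError("radius must be >= polar opening radius")
    return math.degrees(math.asin(r_polar / r))

# Hypothetical dome profile: 25 mm polar opening, 175 mm cylinder radius.
for r in (175.0, 100.0, 50.0, 25.0):
    alpha = geodesic_winding_angle(r, 25.0)
    print(f"r = {r:6.1f} mm -> alpha = {alpha:5.1f} deg")
```

The angle rises from a shallow helical value on the cylinder to 90 degrees at the polar opening, which is where the characteristic thickness build-up on the dome originates.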
Procedia PDF Downloads 135
220 MEIOSIS: Museum Specimens Shed Light in Biodiversity Shrinkage
Authors: Zografou Konstantina, Anagnostellis Konstantinos, Brokaki Marina, Kaltsouni Eleftheria, Dimaki Maria, Kati Vassiliki
Abstract:
Body size is crucial to ecology, influencing everything from individual reproductive success to the dynamics of communities and ecosystems. Understanding how temperature affects variations in body size is vital for both theoretical and practical purposes, as changes in size can modify trophic interactions by altering predator-prey size ratios and changing the distribution and transfer of biomass, which ultimately impacts food web stability and ecosystem functioning. Notably, a decrease in body size is frequently mentioned as the third "universal" response to climate warming, alongside shifts in distribution and changes in phenology. This trend is backed by ecological theories like the temperature-size rule (TSR) and Bergmann's rule, which have been observed in numerous species, indicating that many species are likely to shrink in size as temperatures rise. However, the thermal responses related to body size are still contradictory, and further exploration is needed. To tackle this challenge, we developed the MEIOSIS project, aimed at providing valuable insights into the relationship between the body size of species, species' traits, environmental factors, and their response to climate change. We combined a digitized butterfly collection from the Swiss Federal Institute of Technology in Zürich with our newly digitized butterfly collection from the Goulandris Natural History Museum in Greece to analyse trends over time. For a total of 23,868 images, the length of the right forewing was measured using the ImageJ software. Each forewing was measured from the point at which the wing meets the thorax to the apex of the wing. The forewing length of museum specimens has been shown to correlate strongly with wing surface area and has been utilized in prior studies as a proxy for overall body size. Temperature data corresponding to the years of collection were also incorporated into the datasets.
A second dataset was generated with a custom computer vision tool implemented for the automated morphological measurement of samples from the digitized collection in Zürich. Using this second dataset, we corrected the manual ImageJ measurements, and a final dataset containing 31,922 samples was used for analysis. Setting time as a smooth term, species identity as a random factor, and the length of the right wing (a proxy for body size) as the response variable, we ran a global model for a maximum period of 110 years (1900–2010). We then investigated functional variability between different terrestrial biomes in a second model. Both models confirmed our initial hypothesis and revealed a decreasing trend in body size over the years. We expect that this first output can serve as baseline data for the next challenge, i.e., to identify the ecological traits that influence species' temperature-size responses, enabling us to predict the direction and intensity of a species' reaction to rising temperatures more accurately.
Keywords: butterflies, shrinking body size, museum specimens, climate change
Procedia PDF Downloads 12
219 The Effects of the GAA15 (Gaelic Athletic Association 15) on Lower Extremity Injury Incidence and Neuromuscular Functional Outcomes in Collegiate Gaelic Games: A 2 Year Prospective Study
Authors: Brenagh E. Schlingermann, Clare Lodge, Paula Rankin
Abstract:
Background: Gaelic football, hurling, and camogie are highly popular field games in Ireland. Research into the epidemiology of injury in Gaelic games has revealed that approximately three quarters of the injuries in these games occur in the lower extremity. These injuries can have player, team, and institutional impacts due to multiple factors, including financial burden and time lost from competition. Research has shown it is possible to record injury data consistently within the GAA through a closed online recording system known as the GAA injury surveillance database. It has been established that determining the incidence of injury is the first step of injury prevention. The goals of this study were to create a dynamic GAA15 injury prevention programme addressing five key components/goals: avoiding positions associated with a high risk of injury, enhancing flexibility, enhancing strength, optimizing plyometrics, and addressing sports-specific agilities. These key components are internationally recognized through the Prevent Injury and Enhance Performance (PEP) programme, which has been shown to reduce ACL injuries by 74%. In national Gaelic games, the programme is known as the GAA15, devised from the principles of the PEP. No injury prevention strategies have been published on this cohort in Gaelic games to date. This study investigates the effects of the GAA15 on injury incidence and neuromuscular function in Gaelic games. Methods: A total of 154 players (mean age 20.32 ± 2.84) were recruited from the GAA teams within the Institute of Technology Carlow (ITC). Preseason and post-season testing involved two objective screening tests: the Y Balance Test and the Three Hop Test. Practical workshops, with ongoing liaison, were provided to the coaches on the implementation of the GAA15.
The programme was performed before every training session and game, and the existing GAA injury surveillance database was accessed by the college's sports rehabilitation athletic therapist to monitor players' injuries. Retrospective analysis of the ITC clinic records was performed in conjunction with the database analysis as a means of tracking injuries that may have been missed. The effects of the programme were analysed by comparing the intervention group's Y Balance and Three Hop Test scores to those of an age- and gender-matched control group. Results: Year 1 results revealed significant increases in neuromuscular function as a result of the GAA15. Y Balance Test scores for the intervention group increased in both the posterolateral (p=.005 and p=.001) and posteromedial reach directions (p=.001 and p=.001). A decrease in performance was determined for the Three Hop Test (p=.039). Overall, twenty-five injuries were reported during the season, resulting in an injury rate of 3.00 injuries/1000 hrs of participation: 1.25 injuries/1000 hrs of training and 4.25 injuries/1000 hrs of match play. Non-contact injuries accounted for 40% of the injuries sustained. Year 2 results are pending and expected in April 2016. Conclusion: It is envisaged that implementation of the GAA15 will continue to reduce the risk of injury and improve neuromuscular function in collegiate Gaelic games athletes.
Keywords: GAA15, Gaelic games, injury prevention, neuromuscular training
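The exposure-normalised rates reported above follow the standard definition of injuries per 1000 hours of participation. A minimal sketch of that calculation follows; the total exposure of roughly 8,333 hours is back-calculated from the reported 25 injuries and 3.00 injuries/1000 hrs, and is an assumption for illustration only.

```python
def injury_rate_per_1000h(injuries, exposure_hours):
    """Injury incidence rate per 1000 hours of exposure."""
    return injuries / exposure_hours * 1000.0

# Exposure back-calculated from the reported overall figures (approximate).
exposure = 25 / 3.00 * 1000          # ~8333 h of total participation
rate = injury_rate_per_1000h(25, exposure)
print(f"{rate:.2f} injuries / 1000 h")
```

The same function reproduces the split rates when given training and match-play exposure hours separately.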
Procedia PDF Downloads 339
218 Effect of Natural and Urban Environments on the Perception of Thermal Pain – Experimental Research Using Virtual Environments
Authors: Anna Mucha, Ewa Wojtyna, Anita Pollak
Abstract:
The environment in which an individual resides, and which they observe, may play a meaningful role in well-being and related constructs. Contact with nature may positively influence individuals, improving mood and psychophysical sensations, for example through pain relief. Conversely, urban settings dominated by concrete elements might lead to mood decline and heightened stress levels. A similar pattern may apply to the perception of virtual environments. However, this topic requires further exploration, especially in the context of relationships with pain. These considerations served as the basis for formulating and executing the experimental research outlined here within the realm of environmental psychology, leveraging new technologies, notably virtual reality (VR), which is progressively gaining prominence in the domain of mental health. The primary objective was to investigate the impact of a simulated virtual environment, mirroring a natural setting abundant in greenery, on the perception of acute pain induced by thermal stimuli (high temperature), encompassing intensity, unpleasantness, and pain tolerance. Comparative analyses were conducted between the virtual natural environment (intentionally constructed in the likeness of a therapeutic garden), a virtual urban environment, and a control group devoid of virtual projections. Secondary objectives aimed to determine the mutual relationships among variables such as positive and negative emotions, preferences regarding virtual environments, sense of presence, and restorative experience in the context of the perception of the presented virtual environments and the induced thermal pain. The study encompassed 126 physically healthy Polish adults, with 42 individuals in each of the three comparative groups. Oculus Rift VR technology and the TSA-II neurosensory analyzer facilitated the experiment.
Alongside demographic data, participants' subjective feelings concerning virtual reality and pain were evaluated using the Visual Analogue Scale (VAS), the original Restorative Experience in the Virtual World questionnaire (Doświadczenie Regeneracji w Wirtualnym Świecie), and an adapted Slater-Usoh-Steed (SUS) questionnaire. The results of statistical and psychometric analyses, such as Kruskal-Wallis tests, Wilcoxon tests, and contrast analyses, underscored the positive impact of the virtual natural environment on individual pain perception and mood. The virtual natural environment outperformed the virtual urban environment and the control group without virtual projection, particularly on subjective pain components such as intensity and unpleasantness. Variables such as restorative experience, sense of presence, and virtual environment preference also proved pivotal in pain perception and in alterations of the pain tolerance threshold, contingent on specific conditions. This implies considerable application potential for virtual natural environments across diverse realms of psychology and related fields, among others as a supportive analgesic approach and a form of relaxation following psychotherapeutic sessions.
Keywords: environmental psychology, nature, acute pain, emotions, virtual reality, virtual environments
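Kruskal-Wallis tests of the kind mentioned above compare ranked outcomes across the three groups without assuming normality. Below is a minimal stdlib sketch of the H statistic (without tie correction and without the chi-squared p-value lookup) applied to hypothetical VAS pain-intensity ratings; the data are invented for illustration, not the study's results.

```python
from itertools import chain

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic for k independent samples (no tie correction
    term; equal values receive their average rank)."""
    pooled = sorted(chain.from_iterable(groups))
    rank = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        rank[pooled[i]] = (i + 1 + j) / 2     # average of ranks i+1 .. j
        i = j
    n = len(pooled)
    h = sum(sum(rank[v] for v in g) ** 2 / len(g) for g in groups)
    return 12.0 / (n * (n + 1)) * h - 3 * (n + 1)

# Hypothetical VAS pain-intensity ratings (0-10) per condition.
nature = [2, 3, 3, 4, 2, 3]
urban = [5, 6, 5, 7, 6, 5]
control = [6, 7, 6, 8, 7, 7]
print(f"H = {kruskal_wallis_h(nature, urban, control):.2f}")
```

The statistic is then compared against a chi-squared distribution with k − 1 degrees of freedom; in practice, a library routine such as scipy.stats.kruskal handles both the ranking and the p-value.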
Procedia PDF Downloads 64
217 The Influence of Human Movement on the Formation of Adaptive Architecture
Authors: Rania Raouf Sedky
Abstract:
Adaptive architecture relates to buildings specifically designed to adapt to their residents and their environments. When designing a biologically adaptive system, observing how living creatures in nature constantly adapt to different external and internal stimuli can be a great inspiration. The issue is not just how to create a system that is capable of change but also how to find the quality of change and determine the incentive to adapt. The research examines the possibilities of transforming spaces using the human body as an active tool. The research also aims to design and build an effective dynamic structural system that can be applied on an architectural scale, and to integrate these elements into the creation of a new adaptive system that allows us to conceive a new way to design, build, and experience architecture in a dynamic manner. The main objective was to address the possibility of a reciprocal transformation between the user and the architectural element, so that the architecture can adapt to the user as the user adapts to the architecture. The motivation is the desire to address the psychological benefits of an environment that can respond and thus empathize with human emotions through its ability to adapt to the user. Adaptive kinematic structures have been discussed in architectural research for more than a decade, and they have proven their effectiveness as responsive and adaptive structures and in their contribution to 'smart architecture'. A wide range of strategies has been used in building complex kinetic and robotic system mechanisms to achieve convertibility and adaptability in engineering and architecture. One of the main contributions of this research is to explore how the physical environment can change its shape to accommodate different spatial displays based on the movement of the user's body. The main focus is on the relationship between materials, shape, and interactive control systems.
The intention is to develop a scenario in which the user can move and the structure interacts without any physical contact. This soft form of shape-shifting and interaction control technology will provide new possibilities for enriching human-environment interactions. How can we imagine a space that understands its users through physical gestures and visual expressions and responds accordingly? How can we imagine a space whose interaction depends not only on preprogrammed operations but on real-time feedback from its users? The research also raises some important questions for the future: what would be the appropriate structure to exhibit physical interaction with the dynamic world? The research highlights the interface between remote sensing and a responsive environment to explore the possibility of an interactive architecture that adapts and responds to user movements. This study concludes with a strong belief in the future of responsive kinetic structures: we envision that they will improve on current structures and radically change the way spaces are experienced, with obvious advantages in terms of energy performance and the ability to adapt to the needs of users.
Keywords: adaptive architecture, interactive architecture, responsive architecture, tensegrity
Procedia PDF Downloads 160
216 Creation of a Trust-Wide, Cross-Speciality, Virtual Teaching Programme for Doctors, Nurses and Allied Healthcare Professionals
Authors: Nelomi Anandagoda, Leanne J. Eveson
Abstract:
During the COVID-19 pandemic, the surge in in-patient admissions across the medical directorate of a district general hospital necessitated the implementation of an incident rota. Conscious of the impact on training and professional development, the idea of developing a virtual teaching programme was conceived. The programme initially aimed to provide junior doctors, specialist nurses, pharmacists, and allied healthcare professionals from medical specialties, and those re-deployed from other specialties (e.g., ophthalmology, GP, surgery, psychiatry), with the knowledge and skills to manage the deteriorating patient with COVID-19. The programme was later developed to incorporate the general internal medicine curriculum. To facilitate continuing medical education whilst maintaining social distancing during this period, a virtual platform was used to deliver teaching to junior doctors across two large district general hospitals and two community hospitals. Teaching sessions were recorded and uploaded to a common platform, providing a resource for participants to catch up on and re-watch teaching sessions, making strides towards reducing discrimination against the professional development of less-than-full-time trainees, and thus creating a learning environment that is inclusive and accessible to adult learners in a self-directed manner. The negative impact of the pandemic on the well-being of healthcare professionals is well documented. To support the multi-disciplinary team, the virtual teaching programme evolved to include sessions on well-being, resilience, and work-life balance. Providing teaching for learners across the multi-disciplinary team (MDT) has been an eye-opening experience. By challenging the concept that learners should only be taught within their own peer groups, the authors have fostered a greater appreciation of the strengths of the MDT and showcased the immense wealth of expertise available within the trust.
The inclusive nature of the teaching and the ease of joining a virtual teaching session have facilitated the dissemination of knowledge across the MDT, thus improving patient care on the frontline. The weekly teaching programme has been running for over eight months, with ongoing engagement, interest, and participation. As described above, the teaching programme has evolved to accommodate the needs of its learners. It has received excellent feedback, with an appreciation of its inclusive, multi-disciplinary, and holistic nature. The COVID-19 pandemic provided a catalyst to rapidly develop novel methods of working and training and widened access and exposure to the virtual technologies available to large organisations. By merging pedagogical expertise and technology, the authors have created an effective online learning environment. Although the authors do not propose to replace face-to-face teaching altogether, this model of virtual, multidisciplinary, cross-site teaching has proven to be a great leveler. It has made high-quality teaching accessible to learners of different confidence levels, grades, specialties, and working patterns.
Keywords: cross-site, cross-speciality, inter-disciplinary, multidisciplinary, virtual teaching
Procedia PDF Downloads 170
215 Computer Aided Design Solution Based on Genetic Algorithms for FMEA and Control Plan in Automotive Industry
Authors: Nadia Belu, Laurenţiu Mihai Ionescu, Agnieszka Misztal
Abstract:
The automotive industry is one of the most important industries in the world, concerning not only the economy but also world culture. In the present financial and economic context, this field faces new challenges posed by the current crisis: companies must maintain product quality and deliver on time and at a competitive price in order to achieve customer satisfaction. Two of the quality management techniques most recommended by the specific standards of the automotive industry for product development are Failure Mode and Effects Analysis (FMEA) and the Control Plan. FMEA is a methodology for risk management and quality improvement aimed at identifying potential causes of failure of products and processes, quantifying them by risk assessment, ranking the problems identified according to their importance, and determining and implementing related corrective actions. Companies use Control Plans, realized using the results from the FMEA, to evaluate a process or product for strengths and weaknesses and to prevent problems before they occur. Control Plans are written descriptions of the systems used to control and minimize product and process variation. In addition, Control Plans specify the process monitoring and control methods (for example, Special Controls) used to control Special Characteristics. In this paper, we propose a computer-aided solution based on Genetic Algorithms to reduce the effort of drafting the reports (the FMEA analysis and the Control Plan) required for a product launch, and to improve the development of knowledge in teams for future projects. The solution allows the design team to enter the data required for the FMEA. The actual analysis is performed using Genetic Algorithms to find an optimum between the RPN risk factor and the cost of production. A feature of Genetic Algorithms is that they can be used as a means of finding solutions to multi-criteria optimization problems.
In our case, the three specific FMEA risk factors are considered together with the production cost to be reduced. The analysis tool generates final reports for all FMEA processes. The data obtained in the FMEA reports are automatically integrated with the other entered parameters in the Control Plan. The solution is implemented as an application running on an intranet across two servers: one hosting the analysis and plan-generation engine, the other hosting the database where the initial parameters and results are stored. The results can then be used as starting solutions in the synthesis of other projects. The solution was applied to the welding, laser-cutting, and bending processes used to manufacture bus chassis. Its advantages are the efficient elaboration of documents in the current project, by automatically generating the FMEA and Control Plan reports through multi-criteria optimization of production, and the building of a solid knowledge base for future projects. The solution we propose is a cheap alternative to other solutions on the market, as it is implemented with Open Source tools.
Keywords: automotive industry, FMEA, control plan, automotive technology
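The RPN-versus-cost search the abstract describes can be sketched as a small genetic algorithm over candidate sets of corrective actions. This is a minimal illustrative sketch, not the authors' implementation: the action data, base RPN, weights, and GA parameters below are all assumptions made up for the example.

```python
import random

random.seed(0)

# Hypothetical corrective actions: each reduces the total RPN
# (severity x occurrence x detection) but adds production cost.
# All numbers are illustrative, not taken from the paper.
ACTIONS = [
    {"rpn_reduction": 120, "cost": 500},
    {"rpn_reduction": 80,  "cost": 200},
    {"rpn_reduction": 60,  "cost": 150},
    {"rpn_reduction": 200, "cost": 900},
    {"rpn_reduction": 40,  "cost": 100},
]
BASE_RPN = 600            # total RPN before any corrective action
W_RPN, W_COST = 1.0, 0.2  # weights trading residual risk against cost

def fitness(genome):
    """Weighted sum of residual RPN and total cost (lower is better)."""
    rpn = BASE_RPN - sum(a["rpn_reduction"] for a, g in zip(ACTIONS, genome) if g)
    cost = sum(a["cost"] for a, g in zip(ACTIONS, genome) if g)
    return W_RPN * max(rpn, 0) + W_COST * cost

def evolve(pop_size=20, generations=50, mutation=0.1):
    # Each genome is a 0/1 vector: which corrective actions to apply.
    pop = [[random.randint(0, 1) for _ in ACTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)                     # lower fitness first
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(ACTIONS))  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < mutation else g for g in child]
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    return best, fitness(best)

best, score = evolve()
print(best, score)
```

With a real FMEA, the single weighted objective would be replaced by the paper's multi-criteria formulation over the three risk factors and cost; the weighted sum is used here only to keep the sketch short.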
Procedia PDF Downloads 406
214 Deep Learning-Based Classification of 3D CT Scans with Real Clinical Data: Impact of Image Format
Authors: Maryam Fallahpoor, Biswajeet Pradhan
Abstract:
Background: Artificial intelligence (AI) serves as a valuable tool in mitigating the scarcity of human resources required for the evaluation and categorization of vast quantities of medical imaging data. When AI operates with optimal precision, it minimizes the demand for human interpretations and, thereby, reduces the burden on radiologists. Among various AI approaches, deep learning (DL) stands out as it obviates the need for feature extraction, a process that can impede classification, especially with intricate datasets. The advent of DL models has ushered in a new era in medical imaging, particularly in the context of COVID-19 detection. Traditional 2D imaging techniques exhibit limitations when applied to volumetric data, such as Computed Tomography (CT) scans. Medical images predominantly exist in one of two formats: neuroimaging informatics technology initiative (NIfTI) and digital imaging and communications in medicine (DICOM). Purpose: This study aims to employ DL for the classification of COVID-19-infected pulmonary patients and normal cases based on 3D CT scans while investigating the impact of image format. Material and Methods: The dataset used for model training and testing consisted of 1245 patients from IranMehr Hospital. All scans shared a matrix size of 512 × 512, although they exhibited varying slice numbers. Consequently, after loading the DICOM CT scans, image resampling and interpolation were performed to standardize the slice count. All images underwent cropping and resampling, resulting in uniform dimensions of 128 × 128 × 60. Resolution uniformity was achieved through resampling to 1 mm × 1 mm × 1 mm, and image intensities were confined to the range of (−1000, 400) Hounsfield units (HU). For classification purposes, positive pulmonary COVID-19 involvement was designated as 1, while normal images were assigned a value of 0. Subsequently, a U-net-based lung segmentation module was applied to obtain 3D segmented lung regions. 
The pre-processing stage included normalization, zero-centering, and shuffling. Four distinct 3D CNN models (ResNet152, ResNet50, DenseNet169, and DenseNet201) were employed in this study. Results: The findings revealed that the segmentation technique yielded superior results for DICOM images, which could be attributed to the potential loss of information during the conversion of the original DICOM images to NIfTI format. Notably, ResNet152 and ResNet50 exhibited the highest accuracy at 90.0%, and the same models achieved the best F1 score at 87%. ResNet152 also secured the highest Area under the Curve (AUC) at 0.932. Regarding sensitivity and specificity, DenseNet201 achieved the highest values at 93% and 96%, respectively. Conclusion: This study underscores the capacity of deep learning to classify COVID-19 pulmonary involvement using real 3D hospital data. The results underscore the significance of employing DICOM-format 3D CT images alongside appropriate pre-processing techniques when training DL models for COVID-19 detection. This approach enhances the accuracy and reliability of diagnostic systems for COVID-19 detection.
Keywords: deep learning, COVID-19 detection, NIfTI format, DICOM format
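The intensity pre-processing the abstract describes (clipping to a Hounsfield-unit window, normalization, zero-centering) can be sketched as follows. The function name and the toy input are assumptions for illustration only, and the spatial resampling to 128 × 128 × 60 at 1 mm isotropic spacing is taken as already done upstream.

```python
import numpy as np

def preprocess_ct(volume_hu, hu_window=(-1000, 400)):
    """Clip a CT volume to a Hounsfield-unit window, scale to [0, 1],
    and zero-center. The resampling to 128 x 128 x 60 is assumed to
    have been performed beforehand (e.g. with SimpleITK or
    scipy.ndimage); this sketch covers only the intensity steps."""
    lo, hi = hu_window
    vol = np.clip(volume_hu.astype(np.float32), lo, hi)  # HU window
    vol = (vol - lo) / (hi - lo)                         # scale to [0, 1]
    vol -= vol.mean()                                    # zero-center
    return vol

# Toy volume standing in for a resampled CT scan.
vol = np.random.default_rng(0).uniform(-2000, 1000, size=(8, 8, 4))
out = preprocess_ct(vol)
print(out.shape, float(out.min()), float(out.max()))
```

Shuffling, the remaining step, is applied across scans at training time rather than within a volume, so it is omitted here.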
Procedia PDF Downloads 88