Search results for: edge intelligence

394 Optimizing the Readability of Orthopaedic Trauma Patient Education Materials Using ChatGPT-4

Authors: Oscar Covarrubias, Diane Ghanem, Christopher Murdock, Babar Shafiq

Abstract:

Introduction: ChatGPT is an advanced language AI tool designed to understand and generate human-like text. The aim of this study is to assess the ability of ChatGPT-4 to rewrite orthopaedic trauma patient education materials at the recommended 6th-grade level. Methods: Two independent reviewers accessed ChatGPT-4 (chat.openai.com) and gave identical instructions to simplify the readability of provided text to a 6th-grade level. All trauma-related articles by the Orthopaedic Trauma Association (OTA) and American Academy of Orthopaedic Surgeons (AAOS) were sequentially provided. The academic grade level was determined using the Flesch-Kincaid Grade Level (FKGL) and Flesch Reading Ease (FRE). Paired t-tests and Wilcoxon rank-sum tests were used to compare the FKGL and FRE between the ChatGPT-4 revised and original text. The intraclass correlation coefficient (ICC) was used to assess inter-rater variability in ChatGPT-4 generated text between the two reviewers. Results: ChatGPT-4 significantly reduced FKGL and increased FRE scores in the OTA (FKGL: 5.7±0.5 compared to the original 8.2±1.1, FRE: 76.4±5.7 compared to the original 65.5±6.6, p < 0.001) and AAOS articles (FKGL: 5.8±0.8 compared to the original 8.9±0.8, FRE: 76±5.5 compared to the original 56.7±5.9, p < 0.001). On average, 14.6% of OTA and 28.6% of AAOS articles required at least two revisions by ChatGPT-4 to achieve a 6th-grade reading level. The ICC demonstrated poor reliability for FKGL (OTA 0.24, AAOS 0.45) and moderate reliability for FRE (OTA 0.61, AAOS 0.73). Conclusion: This study provides a novel, simple, and efficient method of using language AI to optimize the readability of patient education content, which may require only the surgeon's final proofreading. This method would likely be as effective for other medical specialties.
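
Both readability scores quoted above are closed-form functions of average sentence length and average syllables per word. The sketch below computes them with the standard Flesch formulas; the syllable counter is a crude vowel-group heuristic, an assumption made for illustration, whereas the study would have used a dedicated readability tool.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels; dedicated tools
    # use dictionaries and better rules, so treat this as approximate.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> tuple[float, float]:
    """Return (FKGL, FRE) using the standard Flesch formulas."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    n_syll = sum(count_syllables(w) for w in words)
    wps = n_words / sentences          # words per sentence
    spw = n_syll / n_words             # syllables per word
    fkgl = 0.39 * wps + 11.8 * spw - 15.59
    fre = 206.835 - 1.015 * wps - 84.6 * spw
    return fkgl, fre

fkgl, fre = readability("The bone is broken. The doctor will fix it with a small metal plate.")
print(f"FKGL = {fkgl:.1f}, FRE = {fre:.1f}")
```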

Keywords: artificial intelligence, AI, ChatGPT, patient education, readability, trauma education

Procedia PDF Downloads 52
393 The Intersection of Art and Technology: Innovations in Visual Communication Design

Authors: Sareh Enjavi

Abstract:

In recent years, the field of visual communication design has seen a significant shift in the way that art is created and consumed, with the advent of new technologies like virtual reality, augmented reality, and artificial intelligence. This paper explores the ways in which technology is changing the landscape of visual communication design, and how designers are incorporating new technological tools into their artistic practices. The primary objective of this research paper is to investigate the ways in which technology is influencing the creative process of designers and artists in the field of visual communication design. The paper also aims to examine the challenges and limitations that arise from the intersection of art and technology in visual communication design, and to identify strategies for overcoming these challenges. Drawing on examples from a range of fields, including advertising, fine art, and digital media, this paper highlights the exciting innovations that are emerging as artists and designers use technology to push the boundaries of traditional artistic expression. The paper argues that embracing technological innovation is essential for the continued evolution of visual communication design. By exploring the intersection of art and technology, designers can create new and exciting visual experiences that engage and inspire audiences in new ways. The research also contributes to the theoretical and methodological understanding of the intersection of art and technology, a topic that has gained significant attention in recent years. Ultimately, this paper emphasizes the importance of embracing innovation and experimentation in the field of visual communication design and highlights the exciting developments that are emerging as a result.

Keywords: visual communication design, art and technology, virtual reality, interactive art, creative process

Procedia PDF Downloads 89
392 A Virtual Reality Simulation Tool for Reducing the Risk of Building Content during Earthquakes

Authors: Ali Asgary, Haopeng Zhou, Ghassem Tofighi

Abstract:

The use of virtual reality (VR), augmented reality (AR), and extended reality technologies for training and education has increased in recent years as more hardware and software tools have become available and accessible to larger groups of users. Similarly, the applications of these technologies in earthquake-related training and education are on the rise. Several studies have reported promising results for the use of VR and AR for evacuation behaviour and training in earthquake situations. They simulate the impacts that an earthquake has on buildings and their contents, and how building occupants and users can find safe spots or open paths to the outside. Considering that a considerable number of earthquake injuries and fatalities are linked to the behaviour of building contents, our goal is to use these technologies to reduce the impacts of building contents on people. Building on our artificial intelligence (AI) based indoor earthquake risk assessment application, which enables users to use their mobile device to assess the risks associated with building contents during earthquakes, we develop a virtual reality application to demonstrate the behaviour of different building contents during earthquakes, their associated moving, spreading, falling, and collapsing risks, and their risk mitigation methods. We integrate realistic seismic models and building content behaviour, with and without risk mitigation measures, in a virtual reality environment. The application can be used for training of architects, interior design experts, and building users to enhance the indoor safety of buildings that must withstand earthquakes. This paper describes and demonstrates the application's development background, structure, components, and usage.

Keywords: virtual reality, earthquake damage, building content, indoor risks, earthquake risk mitigation, interior design, unity game engine, oculus

Procedia PDF Downloads 61
391 Microfluidic Plasmonic Bio-Sensing of Exosomes by Using a Gold Nano-Island Platform

Authors: Srinivas Bathini, Duraichelvan Raju, Simona Badilescu, Muthukumaran Packirisamy

Abstract:

A bio-sensing method, based on the plasmonic property of gold nano-islands, has been developed for the detection of exosomes in a clinical setting. The position of the gold plasmon band in the UV-visible spectrum depends on the size and shape of the gold nanoparticles as well as on the surrounding environment. By adsorbing or binding various chemical entities, the gold plasmon band will shift toward longer wavelengths, and the shift is proportional to the concentration. Exosomes transport cargoes of molecules and genetic materials to proximal and distal cells. Presently, the standard method for their isolation and quantification from body fluids is ultracentrifugation, which is not a practical method to implement in a clinical setting. Thus, a versatile and cutting-edge platform is required to selectively detect and isolate exosomes for further analysis at the clinical level. The new sensing protocol makes use of a specially synthesized polypeptide (Vn96) instead of antibodies to capture and quantify the exosomes from different media, by binding the heat shock proteins of exosomes. The protocol has been established and optimized using a glass substrate, in order to facilitate the next stage, namely the transfer of the protocol to a microfluidic environment. After each step of the protocol, the UV-Vis spectrum was recorded and the position of the gold Localized Surface Plasmon Resonance (LSPR) band was measured. The sensing process was modelled, taking into account the characteristics of the nano-island structure, prepared by thermal convection and annealing. The optimal molar ratios of the most important chemical entities involved in the detection of exosomes were calculated as well. Indeed, it was found that the results of the sensing process depend on two major steps: the molar ratio of streptavidin to biotin-PEG-Vn96 and, in the final step, the capture of exosomes by the biotin-PEG-Vn96 complex. The microfluidic device designed for the sensing of exosomes consists of a glass substrate sealed by a PDMS layer that contains the channel and a collecting chamber. In the device, the solutions of linker, cross-linker, etc., are pumped over the gold nano-islands, and an Ocean Optics spectrometer is used to measure the position of the Au plasmon band at each step of the sensing. The experiments have shown that the shift of the Au LSPR band is proportional to the concentration of exosomes and, thereby, exosomes can be accurately quantified. An important advantage of the method is the ability to discriminate between exosomes of different origins.
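
Since the reported LSPR response is linear in concentration, quantification reduces to a straight-line calibration and its inversion. The sketch below illustrates that step; the shift values, concentrations, and units are invented for illustration and are not the study's data.

```python
import numpy as np

# Hypothetical calibration data: LSPR band red-shift (nm) measured for
# known exosome concentrations (particles/mL); values are illustrative only.
concentration = np.array([1e8, 5e8, 1e9, 5e9, 1e10])
shift_nm = np.array([0.8, 2.1, 3.9, 9.8, 18.5])

# The abstract reports the shift is proportional to concentration, so a
# straight-line fit through the calibration points gives the sensor response.
slope, intercept = np.polyfit(concentration, shift_nm, 1)

# Invert the calibration to quantify an unknown sample from its band shift.
unknown_shift = 6.2
estimated_conc = (unknown_shift - intercept) / slope
print(f"estimated concentration ~ {estimated_conc:.2e} particles/mL")
```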

Keywords: exosomes, gold nano-islands, microfluidics, plasmonic biosensing

Procedia PDF Downloads 150
390 Analysis of Truck Drivers’ Distraction on Crash Risk

Authors: Samuel Nderitu Muchiri, Tracy Wangechi Maina

Abstract:

Truck drivers face a myriad of challenges in their profession, and enhancements in logistics effectiveness can be pivotal in propelling economic development. The specific objective of the study was to assess the influence of driver distraction on crash risk. The study is significant as it elucidates best practices that truck drivers can embrace in an effort to enhance road safety. These include amalgamating behaviours that enable drivers to successfully execute multifaceted functions such as finding and following routes, evading collisions, monitoring speed, adhering to road regulations, and evaluating vehicle systems' conditions. The analysis involved an empirical review of ten previous studies related to the research topic. The articles revealed that driver distraction plays a substantial role in road accidents and other crucial road safety incidents across the globe. Africa depends immensely on the freight transport sector to facilitate supply chain operations. Several studies indicate that drivers who operate primarily on rural roads, such as those found in Sub-Saharan Africa, have an increased propensity to engage in distracting activities such as cell phone usage while driving. The findings also identified the need for digitalization in truck driving operations, including carrier management techniques such as fatigue management, artificial intelligence, and automated functions like cell phone usage controls. The recommendations can aid policymakers and commercial truck carriers in deepening their understanding of driver distraction and enforcing mitigations to foster road safety.

Keywords: truck drivers, distraction, digitalization, crash risk, road safety

Procedia PDF Downloads 22
389 Cai Guo-Qiang: A Chinese Artist at the Cutting-Edge of Global Art

Authors: Marta Blavia

Abstract:

Magiciens de la terre, organized in 1989 by the Centre Pompidou, became 'the first worldwide exhibition of contemporary art' by presenting artists from Western and non-Western countries, including three Chinese artists. For the first time, the West turned its eyes to other countries not as exotic sources of inspiration, but as places where contemporary art was also being created. One year later, Chine: demain pour hier was inaugurated as the first Chinese avant-garde group exhibition in the West. Among the artists included was Cai Guo-Qiang who, like many other Chinese artists, had left his home country in the eighties in pursuit of greater creative freedom. By exploring non-Western artistic perspectives, both landmark exhibitions questioned the predominance of the Eurocentric vision in the construction of art history. But more than anything else, these exhibitions laid the groundwork for the rise of the so-called phenomenon of 'global contemporary art'. At the same time, 1989 was also a turning point in Chinese art history. Because of the Tiananmen student protests, the Chinese government undertook a series of measures to cut down any kind of avant-garde artistic activity after a decade of relative openness. During the eighties, and especially after the Tiananmen crackdown, some important artists began to leave China to move overseas, such as Xu Bing and Ai Weiwei (USA); Chen Zhen and Huang Yong Ping (France); or Cai Guo-Qiang (Japan). After emigrating abroad, overseas Chinese artists began to develop projects in accordance with their new environments and audiences as well as to appear in numerous international exhibitions. With their creations, which moved freely between a variety of Eastern and Western art sources, these artists were crucial agents in the emergence of global contemporary art. Like other overseas Chinese artists, Cai Guo-Qiang's career took off during the 1990s and early 2000s, right at the moment when the Western art world started to look beyond itself. Little by little, he developed a very personal artistic language that redefines Chinese ideas, symbols, and traditional materials in a new world order marked by globalization. Cai Guo-Qiang participated in many of the exhibitions that contributed to shaping global contemporary art: Encountering the Others (1992); the 45th Venice Biennale (1993); Inside Out: New Chinese Art (1997); and the 48th Venice Biennale (1999), where he recreated the monumental Chinese social realist work Rent Collection Courtyard, which earned him the Golden Lion Award. By examining the different stages of Cai Guo-Qiang's artistic path as well as the transnational dimensions of his creations, this paper aims at offering a comprehensive survey of the construction of the discourse of global contemporary art.

Keywords: Cai Guo-Qiang, Chinese artists overseas, emergence global art, transnational art

Procedia PDF Downloads 266
388 Design and Characterization of Ecological Materials Based on Demolition and Concrete Waste, Casablanca (Morocco)

Authors: Mourad Morsli, Mohamed Tahiri, Azzedine Samdi

Abstract:

Cities are the urbanized territories most prone to the consumption of resources (materials, energy). In Morocco, the economic capital Casablanca is one of them, with its 4M inhabitants and its 60% share of the economic and industrial activity of the kingdom. In the absence of a legal framework in force, urban development has favored the generation of millions of tons of demolition and construction waste scattered in open spaces, causing a significant nuisance to the environment and citizens. Hence, the main objective of our work is to valorize concrete waste. The representative wastes are mainly concrete and fired clay bricks, ceramic tiles, marble panels, gypsum, and scrap metal. The work carried out includes: geolocation with a combination of artificial intelligence, GIS, and Google Earth, which allowed the estimation of the quantity of these wastes per site; then the sorting, crushing, grinding, and physicochemical characterization of the collected samples, which allowed the definition of exploitation routes for each extracted fraction for integrated management of the said wastes. In the present work, we proceeded to exploit the fractions obtained after sieving the representative samples to incorporate them in the manufacture of new ecological construction materials. The prepared formulations have been tested and characterized for physical criteria (specific surface, resistance to flexion and compression) and appearance (cracks, deformation). We will present in detail the main results of our research work and also describe the specific properties of each material developed.

Keywords: demolition and construction waste, GIS combination software, inert waste recovery, ecological materials, Casablanca, Morocco

Procedia PDF Downloads 113
387 Social-Cognitive Aspects of Interpretation: Didactic Approaches in Language Processing and English as a Second Language Difficulties in Dyslexia

Authors: Schnell Zsuzsanna

Abstract:

Background: The interpretation of written texts, or language processing in the visual domain, in other words, atypical reading ability, also known as dyslexia, is an ever-growing phenomenon in today's societies and educational communities. The much-researched problem affects cognitive abilities and, coupled with normal intelligence, typically manifests as difficulties in the differentiation of sounds and orthography and in the holistic processing of written words. The factors of susceptibility are varied: social, cognitive-psychological, and linguistic factors interact with each other. Methods: The research explains the psycholinguistics of dyslexia on the basis of several empirical experiments and demonstrates how the domain-general abilities of inhibition, retrieval from the mental lexicon, priming, phonological processing, and visual modality transfer affect successful language processing and interpretation. Interpretation of visual stimuli is hindered, and the problem seems to be embedded in a sociocultural, psycholinguistic, and cognitive background. This makes the picture even more complex, suggesting that understanding and resolving the issues of dyslexia has to be interdisciplinary, aided by several disciplines in the humanities and social sciences, and should be researched from an empirical approach, where the practical, educational corollaries can be analyzed on an applied basis. Aim and applicability: The lecture sheds light on the applied, cognitive aspects of interpretation, the social-cognitive traits of language processing, and the mental underpinnings of cognitive interpretation strategies in different languages (namely, Hungarian and English), offering solutions with a few applied techniques for success in foreign language learning that can be useful advice for the developers of testing methodologies and measures across ESL teaching and testing platforms.

Keywords: dyslexia, social cognition, transparency, modalities

Procedia PDF Downloads 62
386 Dosimetric Comparison among Different Head and Neck Radiotherapy Techniques Using PRESAGE™ Dosimeter

Authors: Jalil ur Rehman, Ramesh C. Tailor, Muhammad Isa Khan, Jahnzeeb Ashraf, Muhammad Afzal, Geofferry S. Ibbott

Abstract:

Purpose: The purpose of this analysis was to investigate the dose distributions of different techniques (3D-CRT, IMRT, and VMAT) for head and neck cancer using a 3-dimensional dosimeter called the PRESAGE™ dosimeter. Materials and Methods: Computed tomography (CT) scans of the Radiological Physics Center (RPC) head and neck anthropomorphic phantom, with both the RPC standard insert and the PRESAGE™ insert, were acquired separately with a Philips CT scanner, and both CT scans were exported via DICOM to the Pinnacle version 9.4 treatment planning system (TPS). Each plan was delivered twice to the RPC phantom, first containing the RPC standard insert with TLD and film dosimeters, and then again containing the PRESAGE™ insert with the 3-D dosimeter, using a Varian TrueBeam linear accelerator. After irradiation, the standard insert, including point dose measurements (TLD) and planar Gafchromic® EBT film measurements, was read using the RPC standard procedure. The 3D dose distribution from the PRESAGE™ was read out with the Duke Midsized Optical Scanner dedicated to the RPC (DMOS-RPC). Dose-volume histograms (DVH) and mean and maximal doses for organs at risk were calculated and compared among the head and neck techniques. The prescription dose was the same for all head and neck radiotherapy techniques, 6.60 Gy/fraction. Beam profile comparison and gamma analysis were used to quantify agreement among the film measurement, the PRESAGE™ measurement, and the calculated dose distribution. Quality assurance of all plans was performed using the ArcCHECK method. Results: VMAT delivered lower mean and maximum doses to organs at risk (spinal cord, parotid) than IMRT and 3D-CRT. This dose distribution was verified by absolute dose distribution using the thermoluminescent dosimeter (TLD) system. The central axial, sagittal, and coronal planes were evaluated using 2D gamma map criteria (±5%/3 mm); the results were 99.82% (axial), 99.78% (sagittal), and 98.38% (coronal) for the VMAT plan, and the agreement between PRESAGE™ and Pinnacle was better than for the IMRT and 3D-CRT plans, excluding a 7 mm rim at the edge of the dosimeter. Profiles showed good agreement among film, PRESAGE™, and Pinnacle for all plans, and 3D gamma analysis was performed for the PTV and OARs, with VMAT and 3D-CRT showing better agreement than IMRT. Conclusion: VMAT delivered lower mean and maximal doses to organs at risk and better PTV coverage during head and neck radiotherapy. TLD, EBT film, and PRESAGE™ dosimeter results suggest that VMAT was better for the treatment of head and neck cancer than IMRT and 3D-CRT.
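
For readers unfamiliar with the ±5%/3 mm test quoted above, the gamma index combines a dose-difference criterion with a distance-to-agreement criterion, and a point passes when gamma ≤ 1. Below is a minimal 1-D, globally normalized sketch on toy profiles; clinical tools work on 2-D/3-D grids with interpolation, so this is illustrative only.

```python
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.05, dta=3.0):
    """1-D gamma index (global normalization) for dose profiles.

    dd  : dose-difference criterion as a fraction of the max reference dose
    dta : distance-to-agreement criterion in mm
    """
    d_norm = dd * d_ref.max()
    gammas = np.empty_like(d_eval)
    for i, (xe, de) in enumerate(zip(x_eval, d_eval)):
        dist2 = ((x_ref - xe) / dta) ** 2
        dose2 = ((d_ref - de) / d_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

# Toy profiles (mm, arbitrary dose units); pass rate = fraction with gamma <= 1
x = np.linspace(-50, 50, 201)
ref = np.exp(-(x / 25) ** 2)
meas = np.exp(-((x - 1.0) / 25) ** 2) * 1.02   # 1 mm shift, +2% dose
g = gamma_1d(x, ref, x, meas)
print(f"pass rate: {100 * (g <= 1).mean():.1f}%")
```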

Keywords: RPC, 3D-CRT, IMRT, VMAT, EBT2 film, TLD, PRESAGE™

Procedia PDF Downloads 364
385 Winning the Future of Education in Africa through Project Base Learning: How the Implementation of PBL Pedagogy Can Transform Africa’s Educational System from Theory Base to Practical Base in School Curriculum

Authors: Bismark Agbemble

Abstract:

This paper examines how project-based learning (PBL) can be implemented in the educational sphere of Africa. The paper explores PBL as a pedagogical approach to bridge the divide between theoretical knowledge and its application within school curriculums. Because learning can be contextualized and embodied, the paper argues that PBL creates an opportunity for students to work on projects that are of academic relevance in their local settings. It presents PBL's development of critical thinking, problem-solving, cooperation, and communication, which is vital in preparing young citizens for the 21st-century revolution. In addition, the paper stresses the possibility that PBL could become a stimulus to creativity and innovation, wherein learning becomes driven by intrinsic motivation. The paper advocates for a holistic approach based on teachers' professional development with the provision of adequate infrastructural facilities and resource allocation, thus ensuring the success and sustainability of PBL in African education systems. In the end, the paper positions PBL as a transformative educational methodology that has great potential to help shape an African generation that is prepared for a great future.

Keywords: student centered pedagogy, constructivist learning theory, self-directed learning, active exploration, real world challenges, STEM, 21st century skills, curriculum design, classroom management, project base learning curriculum, global intelligence, social and communication skills, transferable skills, critical thinking, investigatable learning, life skills

Procedia PDF Downloads 30
384 An Intelligent Prediction Method for Annular Pressure Driven by Mechanism and Data

Authors: Zhaopeng Zhu, Xianzhi Song, Gensheng Li, Shuo Zhu, Shiming Duan, Xuezhe Yao

Abstract:

Accurate calculation of wellbore pressure is of great significance for preventing wellbore risk during drilling. The traditional mechanism model requires many iterative solving procedures in the calculation process, which reduces calculation efficiency and makes it difficult to meet the demands of dynamic wellbore pressure control. In recent years, many scholars have introduced artificial intelligence algorithms into wellbore pressure calculation, which significantly improves the calculation efficiency and accuracy of wellbore pressure. However, due to the 'black box' property of intelligent algorithms, existing intelligent calculation models of wellbore pressure struggle to perform outside the scope of their training data and overreact to data noise, often producing abnormal calculation results. In this study, the multiphase flow mechanism is embedded into the objective function of the neural network model as a constraint condition, and an intelligent prediction model of wellbore pressure under this constraint is established based on more than 400,000 sets of pressure measurement while drilling (MPD) data. The multiphase flow constraint makes the predictions of the neural network model more consistent with the distribution law of wellbore pressure, which overcomes the black-box attribute of the neural network model to some extent. Concretely, the accuracy on the independent test data set is further improved, and abnormal calculated values essentially disappear. This method is a prediction method driven jointly by MPD data and the multiphase flow mechanism, and it is the main way to predict wellbore pressure accurately and efficiently in the future.
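
The core idea, embedding the flow mechanism in the objective function, amounts to adding a physics-residual penalty to the data-fitting loss. The PyTorch sketch below shows that structure; the network size, the lam weight, the assumed input layout, and the hydrostatic-plus-friction stand-in for the multiphase-flow model are illustrative assumptions, not the authors' formulation.

```python
import torch
import torch.nn as nn

# Minimal sketch of a mechanism-constrained loss: the data term fits measured
# pressures, while a penalty term pushes predictions toward a simplified
# hydrostatic-plus-friction relation. This placeholder stands in for the
# paper's multiphase-flow model.
def mechanism_pressure(depth_m, density_kgm3, grad_fric):
    g = 9.81
    return density_kgm3 * g * depth_m + grad_fric * depth_m  # Pa

model = nn.Sequential(nn.Linear(4, 64), nn.Tanh(), nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
mse = nn.MSELoss()
lam = 0.1  # weight of the mechanism constraint (a tuning choice)

def loss_fn(x, p_measured):
    p_pred = model(x)
    # x columns (assumed): depth, mud density, friction gradient, flow rate
    p_phys = mechanism_pressure(x[:, 0:1], x[:, 1:2], x[:, 2:3])
    return mse(p_pred, p_measured) + lam * mse(p_pred, p_phys)

# One illustrative optimization step on random stand-in data.
x = torch.rand(32, 4); p = torch.rand(32, 1)
opt.zero_grad(); loss = loss_fn(x, p); loss.backward(); opt.step()
```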

Keywords: multiphase flow mechanism, pressure while drilling data, wellbore pressure, mechanism constraints, combined drive

Procedia PDF Downloads 155
383 Signs, Signals and Syndromes: Algorithmic Surveillance and Global Health Security in the 21st Century

Authors: Stephen L. Roberts

Abstract:

This article offers a critical analysis of the rise of syndromic surveillance systems for the advanced detection of pandemic threats within contemporary global health security frameworks. The article traces the iterative evolution and ascendancy of three such novel syndromic surveillance systems for the strengthening of health security initiatives over the past two decades: 1) the Program for Monitoring Emerging Diseases (ProMED-mail); 2) the Global Public Health Intelligence Network (GPHIN); and 3) HealthMap. This article demonstrates how each newly introduced syndromic surveillance system has become increasingly oriented towards the integration of digital algorithms into core surveillance capacities, to continually harvest and forecast upon ever-expanding sets of digital, open-source data potentially indicative of forthcoming pandemic threats. This article argues that the increased centrality of the algorithm within these next-generation syndromic surveillance systems produces a new and distinct form of infectious disease surveillance for the governing of emergent pathogenic contingencies. Conceptually, the article also shows how the rise of this algorithmic mode of infectious disease surveillance produces divergences in the governmental rationalities of global health security, leading to the rise of an algorithmic governmentality within contemporary contexts of Big Data and these surveillance systems. Empirically, this article demonstrates how this new form of algorithmic infectious disease surveillance has been rapidly integrated into diplomatic, legal, and political frameworks to strengthen the practice of global health security, producing subtle yet distinct shifts in the outbreak notification and reporting transparency of states, increasingly scrutinized by the algorithmic gaze of syndromic surveillance.

Keywords: algorithms, global health, pandemic, surveillance

Procedia PDF Downloads 157
382 Potassium-Phosphorus-Nitrogen Detection and Spectral Segmentation Analysis Using Polarized Hyperspectral Imagery and Machine Learning

Authors: Nicholas V. Scott, Jack McCarthy

Abstract:

Military, law enforcement, and counter-terrorism organizations are often tasked with target detection and image characterization of scenes containing explosive materials in various types of environments where light-scattering intensity is high. Mitigation of this photonic noise using classical digital filtration and signal processing can be difficult. This is partially due to the lack of robust image processing methods for photonic noise removal, which strongly influences high-resolution target detection and machine learning-based pattern recognition. Such analysis is crucial to the delivery of reliable intelligence. Polarization filters are a possible method for ambient glare reduction: by allowing only certain modes of the electromagnetic field to be captured, they provide strong scene contrast. An experiment was carried out utilizing a polarization lens attached to a hyperspectral imaging camera for the purpose of exploring the degree to which an imaged polarized scene of a potassium, phosphorus, and nitrogen mixture allows for improved target detection and image segmentation. Preliminary imagery results based on the application of machine learning algorithms, including competitive leaky learning and distance metric analysis, to polarized hyperspectral imagery suggest that polarization filters provide a slight advantage in image segmentation. The results of this work have implications for understanding the presence of explosive material in dry, desert areas where reflective glare is a significant impediment to scene characterization.
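
As a rough illustration of the segmentation step, the sketch below clusters pixel spectra with a leaky variant of competitive learning: the winning prototype moves strongly toward each sample while all prototypes leak toward it slightly, which prevents dead units. The update rule, rates, and toy data are assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def leaky_competitive_segmentation(pixels, k=3, lr=0.1, leak=0.01, epochs=5, seed=0):
    """Cluster hyperspectral pixel spectra with leaky competitive learning.

    pixels: (n_pixels, n_bands) array of spectra.
    """
    rng = np.random.default_rng(seed)
    protos = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(epochs):
        for x in pixels[rng.permutation(len(pixels))]:
            d = np.linalg.norm(protos - x, axis=1)      # distance metric
            w = d.argmin()                              # winning prototype
            protos += leak * (x - protos)               # small leak for all units
            protos[w] += (lr - leak) * (x - protos[w])  # extra pull on the winner
    labels = np.linalg.norm(pixels[:, None, :] - protos[None], axis=2).argmin(axis=1)
    return labels, protos

# Toy "cube": 100 pixels x 20 spectral bands drawn from two spectral shapes.
bands = np.linspace(0, 1, 20)
cube = np.vstack([np.outer(np.ones(50), bands) + 0.05 * np.random.randn(50, 20),
                  np.outer(np.ones(50), bands[::-1]) + 0.05 * np.random.randn(50, 20)])
labels, _ = leaky_competitive_segmentation(cube, k=2)
print(labels)
```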

Keywords: explosive material, hyperspectral imagery, image segmentation, machine learning, polarization

Procedia PDF Downloads 116
381 Study of Ion Density Distribution and Sheath Thickness in Warm Electronegative Plasma

Authors: Rajat Dhawan, Hitendra K. Malik

Abstract:

Electronegative plasmas, comprising electrons, positive ions, and negative ions, are advantageous for their expanding applications in industry. In plasma cleaning, plasma etching, and plasma deposition processes, electronegative plasmas are preferred because of the relatively lower potential developed on the surface of the material under investigation. The presence of negative ions also avoids irregularities in etching shapes and enhances material processing during fabrication. Understanding the interaction of a metallic conducting surface with the plasma is essential for these applications. A metallic conducting probe immersed in a plasma results in the formation of a thin layer of charged species around the probe, called a sheath. The density of the ions embedded on the surface of the material and the sheath thickness are the important parameters for surface-plasma interaction. The sheath thickness indicates the extent of the plasma region affected by the conducting surface/probe. Knowledge of the ion density in the sheath region is advantageous in plasma nitriding, and the ion temperature is equally important as it strongly influences the thickness of the modified layer during surface-plasma interaction. In the present work, we consider a negatively biased metallic probe immersed in a warm electronegative plasma. For this system, we adopt the continuity equation and momentum transfer equation for both the positive and negative ions, whereas the electrons are described by a Boltzmann distribution, and finally we use Poisson's equation. We assume spherical geometry for a small probe radius. Poisson's equation, together with the continuity and momentum transfer equations and proper boundary conditions, reveals the behaviour of the potential surrounding the conducting metallic probe and, in turn, yields the density profiles of the charged species and, most importantly, the thickness of the sheath. All calculations are carried out keeping in mind the well-known Bohm sheath criterion. We found that the positive ion density decreases with an increase in positive ion temperature, whereas it increases with higher negative ion temperature. The positive ion density decreases as we move away from the center of the probe and is found to show a discontinuity at a particular distance from the center of the probe. The distance where the discontinuity occurs is designated as the sheath edge, i.e., the point where the sheath ends. These results are beneficial for industrial applications, as the density of ions embedded on a material surface is strongly affected by the temperature of the plasma species, which has a drastic influence on surface properties, i.e., hardness, corrosion resistance, etc., of the materials.
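
For orientation, the governing system described above takes roughly the following form; the notation and the steady-state, isothermal-pressure assumptions here are illustrative, not the authors' exact formulation.

```latex
% Schematic model equations, assuming Boltzmann electrons and steady-state
% warm-ion fluid equations; the +/- subscript denotes positive/negative ions.
\begin{align*}
  n_e &= n_{e0}\,\exp\!\left(\frac{e\phi}{k_B T_e}\right)
      && \text{(Boltzmann electrons)}\\
  \nabla\cdot(n_\pm \mathbf{v}_\pm) &= 0
      && \text{(continuity, each ion species)}\\
  m_\pm n_\pm (\mathbf{v}_\pm\!\cdot\!\nabla)\mathbf{v}_\pm
      &= \pm\, e\, n_\pm \mathbf{E} - \nabla p_\pm
      && \text{(momentum, warm ions)}\\
  \frac{1}{r^2}\frac{d}{dr}\!\left(r^2\frac{d\phi}{dr}\right)
      &= -\frac{e}{\varepsilon_0}\,\bigl(n_+ - n_- - n_e\bigr)
      && \text{(Poisson, spherical probe)}
\end{align*}
```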

Keywords: electronegative plasmas, plasma surface interaction, positive ion density, sheath thickness

Procedia PDF Downloads 116
380 An Analytical Metric and Process for Critical Infrastructure Architecture System Availability Determination in Distributed Computing Environments under Infrastructure Attack

Authors: Vincent Andrew Cappellano

Abstract:

In the early phases of critical infrastructure system design, translating distributed computing requirements to an architecture carries risk, given the multitude of approaches (e.g., cloud, edge, fog). In many systems, a single requirement for system uptime/availability is used to encompass the system's intended operations. However, architected systems may meet those availability requirements only during normal operations and not during component failure, or during outages caused by adversary attacks on critical infrastructure (e.g., physical, cyber). System designers lack a structured method to evaluate availability requirements against candidate system architectures through deep degradation scenarios (i.e., from normal operations all the way down to significant damage of communications or physical nodes). This increases the risk of poor selection of a candidate architecture due to the absence of insight into true performance for systems that must operate as a piece of critical infrastructure. This research effort proposes a process to analyze critical infrastructure system availability requirements and a candidate set of system architectures, producing a metric assessing these architectures over a spectrum of degradations to aid in selecting appropriately resilient architectures. To accomplish this effort, a set of simulation and evaluation efforts are undertaken that will process, in an automated way, a set of sample requirements into a set of potential architectures where system functions and capabilities are distributed across nodes. Nodes and links will have specific characteristics and, based on sampled requirements, contribute to the overall system functionality, such that as they are impacted/degraded, the impacted functional availability of the system can be determined. A machine learning reinforcement-based agent will structurally impact the nodes, links, and characteristics (e.g., bandwidth, latency) of a given architecture to provide an assessment of system functional uptime/availability under these scenarios. By varying the intensity of the attack and related aspects, we can create a structured method of evaluating the performance of candidate architectures against each other, yielding a metric rating their resilience to these attack types/strategies. Through multiple simulation iterations, sufficient data will exist to compare this availability metric, and an architectural recommendation against the baseline requirements, with existing multi-factor computing architectural selection processes. It is intended that this additional data will improve the matching of resilient critical infrastructure system requirements to the correct architectures and implementations that will support improved operation during times of system degradation due to failures and infrastructure attacks.
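
A minimal sketch of the evaluation loop is shown below: represent the architecture as a graph, degrade it node by node, and record a client-to-server availability curve whose shape can serve as the resilience metric. The random victim selection stands in for the reinforcement learning agent (which would pick the most damaging node instead), and the graph, client/server split, and sizes are invented for illustration.

```python
import random
import networkx as nx

def availability(g, clients, servers):
    """Fraction of client nodes that can still reach at least one server."""
    reachable = sum(any(nx.has_path(g, c, s) for s in servers if s in g)
                    for c in clients if c in g)
    return reachable / len(clients)

random.seed(1)
g = nx.random_geometric_graph(40, 0.3)   # stand-in candidate architecture
clients = list(range(30))
servers = [35, 36, 37, 38, 39]

# Degrade the architecture one node at a time and record the curve.
curve = []
for _ in range(15):
    victim = random.choice(list(g.nodes))
    g.remove_node(victim)
    curve.append(availability(g, clients, servers))

# Availability vs. depth of degradation: compare curves across candidate
# architectures to rank their resilience.
print(curve)
```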

Keywords: architecture, resiliency, availability, cyber-attack

Procedia PDF Downloads 75
379 Performance Estimation of Small Scale Wind Turbine Rotor for Very Low Wind Regime Condition

Authors: Vilas Warudkar, Dinkar Janghel, Siraj Ahmed

Abstract:

The rapid development experienced by India requires a huge amount of energy. Actual supply capacity additions have been consistently lower than the targets set by the government. According to the World Bank, 40% of residences are without electricity. In the 12th five-year plan, 30 GW of grid-interactive renewable capacity is planned, of which 17 GW is wind, 10 GW is solar, 2.1 GW is small hydro, and the rest is covered by biogas. Renewable energy (RE) and energy efficiency (EE) not only meet environmental and energy security objectives but can also play a crucial role in reducing chronic power shortages. In remote areas or areas with a weak grid, wind energy can be used for charging batteries or can be combined with a diesel engine to save fuel whenever wind is available. According to IEC 61400-1, India belongs to class IV wind conditions, so it is not possible to set up large-scale wind turbines everywhere. The best choice, therefore, is a small-scale wind turbine at lower hub height that still yields good annual energy production (AEP). Based on the wind characteristics available at MANIT Bhopal, a rotor for a small-scale wind turbine is designed. Various airfoil data are reviewed for the selection of the airfoil in the blade profile. An airfoil suited to low wind conditions, i.e., low Reynolds number, is selected based on the coefficients of lift and drag and the angle of attack. For the design of the rotor blade, standard Blade Element Momentum (BEM) theory is implemented. The performance of the blade is estimated using BEM theory, in which the axial and angular induction factors are optimized using an iterative technique. Rotor performance is estimated for the designed blade specifically for low wind conditions. The power production of the rotor is determined at different wind speeds for a particular pitch angle of the blade. At a pitch of 15° and a velocity of 5 m/s, the rotor gives a good cut-in speed of 2 m/s and produces around 350 W. The tip speed ratio of the blade is taken as 6.5, for which the coefficient of performance of the rotor is calculated to be 0.35, an acceptable value for a small-scale wind turbine. The Simple Load Model (SLM, IEC 61400-2) is also discussed to improve the structural strength of the rotor. In the SLM, the edgewise and flapwise moments, which cause bending stress at the root of the blade, are considered. The various load cases mentioned in IEC 61400-2 are calculated and checked against the partial safety factors for the wind turbine blade.
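
The iterative BEM step mentioned above solves for the axial (a) and tangential (a') induction factors at each blade element. A minimal sketch of that fixed-point iteration follows; the constant lift/drag coefficients, geometry, and relaxation factor are illustrative assumptions in place of real airfoil polars and the paper's rotor dimensions.

```python
import numpy as np

def bem_induction(r, R=1.5, B=3, chord=0.1, tsr=6.5, v=5.0,
                  cl=1.0, cd=0.01, iters=100, relax=0.3):
    """Iterate the axial (a) and tangential (ap) induction factors for one
    blade element using classic BEM relations."""
    omega = tsr * v / R                       # rotor speed from tip speed ratio
    sigma = B * chord / (2 * np.pi * r)       # local solidity
    a, ap = 0.3, 0.0
    for _ in range(iters):
        phi = np.arctan2((1 - a) * v, (1 + ap) * omega * r)  # inflow angle
        cn = cl * np.cos(phi) + cd * np.sin(phi)             # normal coefficient
        ct = cl * np.sin(phi) - cd * np.cos(phi)             # tangential coefficient
        a_new = 1.0 / (4 * np.sin(phi) ** 2 / (sigma * cn) + 1)
        ap_new = 1.0 / (4 * np.sin(phi) * np.cos(phi) / (sigma * ct) - 1)
        a += relax * (a_new - a)              # under-relaxation for stability
        ap += relax * (ap_new - ap)
    return a, ap

a, ap = bem_induction(r=1.0)
print(f"a = {a:.3f}, a' = {ap:.4f}")
```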

Keywords: annual energy production, Blade Element Momentum Theory, low wind Conditions, selection of airfoil

Procedia PDF Downloads 318
378 Endoscopic Stenting of the Main Pancreatic Duct in Patients With Pancreatic Fluid Collections After Pancreas Transplantation

Authors: Y. Teterin, S. Suleymanova, I. Dmitriev, P. Yartcev

Abstract:

Introduction: One of the most common complications after pancreas transplantation is pancreatic fluid collections (PFCs), which are often complicated not only by infection and subsequent dysfunction of the pancreatoduodenal graft (PDG) but also by a rather high mortality rate among recipients. Drainage is not always effective and often requires repeated open surgical interventions, which worsens the outcome of the surgery. Percutaneous drainage of PFCs combined with endoscopic stenting of the main pancreatic duct of the pancreatoduodenal graft (MPDPDG) has shown high efficiency in the treatment of PFCs. Aims & Methods: From 01.01.2012 to 31.12.2021, 64 transplantations of the PDG were performed at the Sklifosovsky Research Institute for Emergency Medicine. In 11 cases (17.2%), the early postoperative period was complicated by the formation of PFCs. Of these, 7 patients underwent percutaneous drainage of pancreatic necrosis with high efficiency and did not require additional methods of treatment. In the remaining 4 patients, drainage was ineffective, which was an indication for endoscopic stenting of the MPDPDG; these patients made up the study group. Among them were 3 men and 1 woman, and the mean age of the patients was 36.4 years. PFCs in these patients formed on days 1, 12, 18, and 47 after PDG transplantation. We used a gastroscope to stent the MPDPDG, owing to the anatomical features of the location of the duodenoduodenal anastomosis after PDG transplantation. Through the endoscope channel, selective catheterization of the MPDPDG was performed using a catheter and a guidewire, followed by its contrasting with a water-soluble contrast agent. The localization of the defect in the PDG duct system was determined from the extravasation of the contrast. After that, a plastic pancreatic stent, 7 Fr in diameter and 7 cm in length, was installed over the guidewire. The stent was positioned so that its proximal edge completely covered the defect zone while the distal end lay in the intestinal lumen. Results: In all patients, pancreatography of the PDG revealed extravasation of contrast in the area of the isthmus and body of the pancreas, which required stenting of the MPDPDG. In 1 (25%) case, the patient had a dislocation of the stent into the intestinal lumen (grade III according to Clavien-Dindo (2009)); this patient underwent repeated endoscopic stenting of the MPDPDG. On average, 23 days after endoscopic stenting of the MPDPDG, the drainage tubes were removed, and after approximately 40 days, all patients were discharged in satisfactory condition with follow-up endocrinologist and surgeon consultations. Pancreatic stents were removed after 6 months ± 7 days. Conclusion: Endoscopic stenting of the main pancreatic duct of the donor pancreas is a highly effective and minimally invasive method for the treatment of PFCs after transplantation of the pancreatoduodenal complex.

Keywords: pancreas transplantation, endoscopy surgery, diabetes, stenting, main pancreatic duct

Procedia PDF Downloads 67
377 The Features of the Synergistic Approach in Marketing Management to Regional Level

Authors: Evgeni Baratashvili, Anzor Abralava, Rusudan Kutateladze, Nino Pailodze, Irma Makharashvili, Larisa Takalandze

Abstract:

Synergy, as a neologism, is reflected in the modern sciences. It can be found in various fields of science, including the humanities and the technical sciences, among them biology and medicine, philology, economics, etc. Synergy is the surplus of the total effect achieved by groups consolidated around one common idea, through the concerted application of their combined tools, over the effect of the separate, independent actions of those groups. In the conditions of a market economy, and in the terms of the new communication terminology, synergy acts successfully on management and marketing, as well as on the defense of the purity of the native language. The works of the well-known scientist and public figure Academician I. Prangishvili are especially valuable in this respect; in our opinion, entropy research in our country is linked to his name. The current qualitative changes in the modern economy show that a great number of factors and issues have been regrouped; they exert a great influence and even define economic development. The declining capacity of the traditional resources of economic growth is related to the exhaustion of their physical potential and their approach to the limit, as well as to their reduced effectiveness, which at the same time increases expenditures. This means that the leading element in the economic growth model must be an innovative process system of products and services. In our opinion, the system mentioned above is distinguished by the synergistic approach. It should be noted that the main components of the innovative system are technological, scientific and scientific-technical, social-organizational, managerial, and cognitive changes. All of them are reflected in scientific works and inventions in the proper proportions, in know-how, and in material sources; at every stage they create the reproduction cycle. Innovations differ from each other in technology, origin, design, novelty and quality, subject-content structure, the spread of economic processes, and the impact of their level of distribution. We have presented a generalized statement of an innovative approach, which is not a single act of innovation but a targeted system of development, implementation, reconciliation-exploitation, production, diffusion, and commercialization of novelties. Innovative approaches should be considered as the creation of novelties, an in-depth process of creativity, and an innovative alternative for the realization of innovative and entrepreneurial efforts and measures, in order to meet the requirements of the permanent process.

Keywords: economic development, leading process, neological term, synergy

Procedia PDF Downloads 169
376 A Use Case-Oriented Performance Measurement Framework for AI and Big Data Solutions in the Banking Sector

Authors: Yassine Bouzouita, Oumaima Belghith, Cyrine Zitoun, Charles Bonneau

Abstract:

A performance measurement framework (PMF) is an essential tool in any organization to assess the performance of its processes. It guides businesses to stay on track with their objectives and benchmark themselves against the market. With the growing trend of digital transformation of business processes, led by innovations in artificial intelligence (AI) and Big Data applications, developing a mature system capable of capturing the impact of digital solutions across different industries has become a necessity. Based on the conducted research, no such system has been developed in academia or industry. In this context, this paper covers a variety of methodologies on performance measurement, overviews the major AI and Big Data applications in the banking sector, and covers an exhaustive list of relevant metrics. Consequently, this paper is of interest to both researchers and practitioners. From an academic perspective, it offers a comparative analysis of the reviewed performance measurement frameworks. From an industry perspective, it offers exhaustive research, from market leaders, into the major applications of AI and Big Data technologies across the different departments of an organization. Moreover, it suggests a standardized classification model with a well-defined structure of intelligent digital solutions. The aforementioned classification is mapped to a centralized library that contains an indexed collection of potential metrics for each application. This library is arranged in a manner that facilitates the rapid search and retrieval of relevant metrics. This proposed framework is meant to guide professionals in identifying the most appropriate AI and Big Data applications that should be adopted. Furthermore, it will help them meet their business objectives through understanding the potential impact of such solutions on the entire organization.
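
A minimal sketch of how such an indexed metric library might be structured is shown below; the taxonomy keys, metric names, and dimensions are invented for illustration and are not the framework's actual classification.

```python
# Hypothetical index: applications classified by (department, solution type,
# use case), each mapping to candidate metrics with an assessment dimension.
METRIC_LIBRARY = {
    ("risk", "ml_scoring", "credit_default_prediction"): [
        {"metric": "AUC-ROC", "dimension": "model quality"},
        {"metric": "expected credit loss delta", "dimension": "business impact"},
    ],
    ("operations", "nlp", "document_processing_automation"): [
        {"metric": "straight-through processing rate", "dimension": "efficiency"},
        {"metric": "average handling time", "dimension": "efficiency"},
    ],
}

def find_metrics(keyword: str):
    """Keyword search over the classification keys, mimicking rapid retrieval."""
    return {k: v for k, v in METRIC_LIBRARY.items()
            if any(keyword in part for part in k)}

print(find_metrics("nlp"))
```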

Keywords: AI and Big Data applications, impact assessment, metrics, performance measurement

Procedia PDF Downloads 176
375 Recent Developments in the Application of Deep Learning to Stock Market Prediction

Authors: Shraddha Jain Sharma, Ratnalata Gupta

Abstract:

Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with their prominent features, and the significant problems or issue domain that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to correctly predict based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.

Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume

Procedia PDF Downloads 63
374 Study and Simulation of a Dynamic System Using Digital Twin

Authors: J.P. Henriques, E. R. Neto, G. Almeida, G. Ribeiro, J.V. Coutinho, A.B. Lugli

Abstract:

Industry 4.0, or the Fourth Industrial Revolution, is transforming the relationship between people and machines. In this scenario, technologies such as Cloud Computing, the Internet of Things, Augmented Reality, Artificial Intelligence, and Additive Manufacturing, among others, are making industries and devices increasingly intelligent. One of the most powerful technologies of this new revolution is the Digital Twin, which allows the virtualization of a real system or process. In this context, the present paper addresses the linear and nonlinear dynamic study of a didactic level plant using a Digital Twin. In the first part of the work, the level plant is identified at a fixed operating point by using the method of least squares. The linearized model is embedded in a Digital Twin using Automation Studio® from Famic Technologies. Then, in order to validate the usage of the Digital Twin in the linearized study of the plant, the dynamic response of the real system is compared to that of the Digital Twin. Furthermore, in order to develop the nonlinear model on a Digital Twin, the didactic level plant is identified by using the method proposed by Hammerstein. Different steps are applied to the plant, and from the Hammerstein algorithm, the nonlinear model is obtained for all operating ranges of the plant. As in the linear approach, the nonlinear model is embedded in the Digital Twin, and the dynamic response is compared to the real system at different operating points. Finally, and importantly, from the practical results obtained, one can conclude that the usage of a Digital Twin to study dynamic systems is extremely useful in the industrial environment, taking into account that it is possible to develop and tune controllers by using the virtual model of the real system.
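
As a sketch of the nonlinear identification step, a Hammerstein model (a static nonlinearity followed by linear dynamics) can be fitted by least squares once the nonlinearity is expanded in a polynomial basis, so the regression stays linear in the parameters. The model order, basis, and synthetic level-plant data below are illustrative assumptions, not the didactic plant's actual dynamics.

```python
import numpy as np

def identify_hammerstein(u, y, poly_order=3):
    """Least-squares fit of the Hammerstein structure
        y[k] = a*y[k-1] + sum_i b_i * u[k-1]**i
    i.e. a polynomial input nonlinearity followed by first-order dynamics."""
    phi = np.column_stack([y[:-1]] +
                          [u[:-1] ** i for i in range(1, poly_order + 1)])
    theta, *_ = np.linalg.lstsq(phi, y[1:], rcond=None)
    return theta  # [a, b1, b2, b3]

# Synthetic level-plant-like data with a square-root outflow nonlinearity.
rng = np.random.default_rng(0)
u = rng.uniform(0.2, 1.0, 500)          # pump command
y = np.zeros(500)                       # tank level
for k in range(499):
    y[k + 1] = 0.9 * y[k] + 0.3 * np.sqrt(u[k]) + 0.005 * rng.standard_normal()

print(identify_hammerstein(u, y))
```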

Keywords: industry 4.0, digital twin, system identification, linear and nonlinear models

Procedia PDF Downloads 121
373 Investigation of FOXM1 Gene Expression in Breast Cancer and Its Relationship with miR-216b-5p Expression Level

Authors: Ramin Mehdiabadi, Neda Menbari, Mohammad Nazir Menbari

Abstract:

As a pressing public health concern, breast cancer stands as the predominant oncological diagnosis and principal cause of cancer-related mortality among women globally, accounting for 11.7% of new cancer incidences and 6.9% of cancer-related deaths. The annual figures indicate that approximately 230,480 women are diagnosed with breast cancer in the United States alone, with 39,520 succumbing to the disease. While developed economies have reported a deceleration in both incidence and mortality rates across various forms of cancer, including breast cancer, emerging and low-income economies manifest a contrary escalation, largely attributable to lifestyle-mediated risk factors such as tobacco usage, physical inactivity, and high caloric intake. Breast cancer is distinctly characterized by molecular heterogeneity, manifesting in specific subtypes delineated by biomarkers—Estrogen Receptors (ER), Progesterone Receptors (PR), and Human Epidermal Growth Factor Receptor 2 (HER2). These subtypes, comprising Luminal A, Luminal B, HER2-enriched, triple-negative/basal-like, and normal-like, necessitate nuanced, subtype-specific therapeutic regimens, thereby challenging the applicability of generalized treatment protocols. Within this molecular complexity, the transcription factor Forkhead Box M1 (FoxM1) has garnered attention as a significant driver of cellular proliferation, tumorigenesis, metastatic progression, and treatment resistance in a spectrum of human malignancies, including breast cancer. Concurrently, microRNAs (miRs), specifically miR-216b-5p, have been identified as post-transcriptional gene expression regulators and potential tumor suppressors. The overarching objective of this academic investigation is to explicate the multifaceted interrelationship between FoxM1 and miR-216b-5p across the disparate molecular subtypes of breast cancer. Employing a methodologically rigorous, interdisciplinary research design that incorporates cutting-edge molecular biology techniques, sophisticated bioinformatics analytics, and exhaustive meta-analyses of extant clinical data, this scholarly endeavor aims to unveil novel biomarker-specific therapeutic pathways. By doing so, this research is positioned to make a seminal contribution to the advancement of personalized, efficacious, and minimally toxic treatment paradigms, thus profoundly impacting the global efforts to ameliorate the burden of breast cancer.

Keywords: breast cancer, FoxM1, microRNAs, miR-216b-5p, gene expression

Procedia PDF Downloads 41
372 Synthesized Doped TiO2 Photocatalysts for Mineralization of Quinalphos from Aqueous Streams

Authors: Nidhi Sharotri, Dhiraj Sud

Abstract:

Water pollution by pesticides constitutes a serious ecological problem due to their potential toxicity and bioaccumulation. The widespread use of pesticides in industry and agriculture, along with their resistance to natural decomposition, biodegradation, and chemical and photochemical degradation under typical environmental conditions, has resulted in the emergence of these chemicals and their transformation products in natural water. Among advanced oxidation processes (AOPs), heterogeneous photocatalysis using TiO2 as the photocatalyst appears to be the most promising destructive technology for the mineralization of pollutants in aquatic streams. Among the various semiconductors (TiO2, ZnO, CdS, FeTiO3, MnTiO3, SrTiO3, and SnO2), TiO2 has proven to be the most efficient photocatalyst for environmental applications due to its biological and chemical inertness, high photoreactivity, non-toxicity, and photostability. Semiconductor photocatalysts are characterized by an electronic band structure in which the valence band and conduction band are separated by a band gap, i.e., a region of forbidden energy. Semiconductor-based photocatalysts produce e-/h+ pairs, which have been employed for the degradation of organic pollutants. The present paper focuses on the modification of the TiO2 photocatalyst in order to shift its absorption edge towards longer wavelengths and make it active under natural light. Semiconductor TiO2 photocatalysts were prepared by doping with an anion (N), a cation (Mn), and double doping (Mn, N) using a greener approach. Titanium isopropoxide is used as the titania precursor, and ethanedithiol, hydroxylamine hydrochloride, and manganous chloride are used as the sulphur, nitrogen, and manganese precursors, respectively. The synthesized doped TiO2 nanomaterials are characterized for surface morphology (SEM, TEM), crystallinity (XRD), and optical properties (absorption spectra and band gap). EPR data confirm the substitutional incorporation of Mn2+ in the TiO2 lattice. The doping influences the phase transformation between the anatase and rutile crystal phases, and corresponding changes in the absorption spectrum were observed. The effect of variations in reaction parameters such as solvent, reaction time, and calcination temperature on the yield, surface morphology, and optical properties was also investigated. The TEM studies show that the particle size of the nanomaterials varies from 10-50 nm. The calculated band gap of the nanomaterials varies from 2.30-2.60 eV. The photocatalytic degradation of the organophosphate pesticide quinalphos has been investigated by studying the changes in the UV absorption spectrum, and promising results were obtained under visible light. Complete mineralization of quinalphos occurred, as HPLC studies recorded no intermediates after 8 h of degradation.
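
For context on the reported 2.30-2.60 eV values, a band gap can be estimated from the optical absorption edge via E(eV) ≈ 1240/λ(nm). The edge wavelengths in the sketch below are illustrative values chosen to bracket that range, not the measured data.

```python
# Band gap from the absorption edge via E(eV) ~ 1240 / wavelength(nm).
# Edge wavelengths are hypothetical, bracketing the reported 2.30-2.60 eV.
samples = {"N-TiO2": 477, "Mn-TiO2": 516, "Mn,N-TiO2": 539}
for name, edge_nm in samples.items():
    print(f"{name}: Eg ~ {1240 / edge_nm:.2f} eV")
```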

Keywords: quinalphos, doped-TiO2, mineralization, EPR

Procedia PDF Downloads 305
371 Studying Second Language Development from a Complex Dynamic Systems Perspective

Authors: L. Freeborn

Abstract:

This paper discusses the application of complex dynamic system theory (DST) to the study of individual differences in second language development. This transdisciplinary framework allows researchers to view the trajectory of language development as a dynamic, non-linear process. A DST approach views language as multi-componential, consisting of multiple complex systems and nested layers. These multiple components and systems continuously interact and influence each other at both the macro- and micro-level. Dynamic systems theory aims to explain and describe the development of the language system, rather than make predictions about its trajectory. Such a holistic and ecological approach to second language development allows researchers to include various research methods from neurological, cognitive, and social perspectives. A DST perspective would involve in-depth analyses as well as mixed methods research. To illustrate, a neurobiological approach to second language development could include non-invasive neuroimaging techniques such as electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) to investigate areas of brain activation during language-related tasks. A cognitive framework would further include behavioural research methods to assess the influence of intelligence and personality traits, as well as individual differences in foreign language aptitude, such as phonetic coding ability and working memory capacity. Exploring second language development from a DST approach would also benefit from including perspectives from the field of applied linguistics, regarding the teaching context, second language input, and the role of affective factors such as motivation. In this way, applying mixed research methods from neurobiological, cognitive, and social approaches would enable researchers to have a more holistic view of the dynamic and complex processes of second language development.

Keywords: dynamic systems theory, mixed methods, research design, second language development

Procedia PDF Downloads 111
370 Reducing the Imbalance Penalty Through Artificial Intelligence Methods in Geothermal Production Forecasting: A Case Study for Turkey

Authors: Hayriye Anıl, Görkem Kar

Abstract:

Rich in renewable energy resources, Turkey is among the countries with strong potential in geothermal energy production, owing to its high installed capacity, low cost, and sustainability. Rising imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand when the production forecasts submitted to the day-ahead market are inadequate. A better production forecast reduces the imbalance penalties of market participants and improves the balance of the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Natural Electricity Generation, which has a high installed geothermal capacity, was forecast for the first one and two weeks of March; the imbalance penalties were then calculated from these forecasts and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and a dataset created by extracting new features from it through feature engineering. According to the results, Support Vector Regression outperformed the other traditional machine learning models and exhibited the best performance. In addition, the estimation results on the feature-engineered dataset showed lower error rates than those on the basic dataset. It was concluded that the imbalance penalty estimated for the selected organization is lower than the actual imbalance penalty, making the forecasting approach both optimal and profitable.
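
As a rough illustration of the kind of pipeline the abstract describes, the sketch below builds lagged features from an hourly generation series (a simple form of the feature engineering mentioned) and fits a Support Vector Regression model. The synthetic data, column names, lag choices, and hyperparameters are assumptions for illustration, not the authors' setup.

```python
# A minimal sketch (not the authors' code): lag-based feature engineering on
# an hourly generation series, then Support Vector Regression.
import numpy as np
import pandas as pd
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = pd.date_range("2022-03-01", periods=24 * 21, freq="h")
gen = pd.Series(50 + 5 * np.sin(np.arange(len(hours)) * 2 * np.pi / 24)
                + rng.normal(0, 1, len(hours)), index=hours, name="mwh")

df = gen.to_frame()
for lag in (1, 24, 168):            # previous hour, previous day, previous week
    df[f"lag_{lag}"] = gen.shift(lag)
df["hour"] = df.index.hour          # simple calendar feature
df = df.dropna()

train, test = df[:-24 * 7], df[-24 * 7:]   # hold out the final week
x_cols = [c for c in df.columns if c != "mwh"]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(train[x_cols], train["mwh"])
pred = model.predict(test[x_cols])
print("MAE (MWh):", round(mean_absolute_error(test["mwh"], pred), 3))
```

The imbalance penalty comparison described in the abstract would then amount to pricing the gap between `pred` and the actual generation under the day-ahead market's penalty rules.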

Keywords: machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting

Procedia PDF Downloads 84
369 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics

Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur

Abstract:

Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, negating potential AI benefits. A prime example is specialized industrial controllers operated by custom software, which complicates the process of connecting them to an Information Technology (IT) based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously obtain images of the controllers' Human Machine Interfaces (HMIs). We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
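
A minimal sketch of the extraction step described above, under assumed region coordinates and a numeric readout: crop the streaming-data region of an HMI frame, binarize it to cope with lighting, and pass it to Tesseract via pytesseract. This is illustrative only, not the authors' implementation.

```python
# Rough sketch (not the paper's pipeline): segment one HMI region from a
# camera frame and OCR the streaming value. Region coordinates, file name,
# and the digit whitelist are illustrative assumptions.
import cv2
import pytesseract

frame = cv2.imread("hmi_frame.png")            # one frame from the camera feed
x, y, w, h = 120, 80, 200, 60                  # assumed streaming-data region
roi = frame[y:y + h, x:x + w]

# Pre-process for realistic shop-floor lighting: grayscale, upscale, Otsu binarize.
gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
gray = cv2.resize(gray, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Restrict OCR to a single line of digits, as typical HMI readouts are numeric.
text = pytesseract.image_to_string(
    binary, config="--psm 7 -c tessedit_char_whitelist=0123456789.")
print("OCR value:", text.strip())
```

The meta-data matching step could then, for instance, reject OCR readings that fall outside the physical range implied by the fixed labels surrounding the region.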

Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics

Procedia PDF Downloads 87
368 The Impact of Artificial Intelligence on Pharmacy and Pharmacology

Authors: Mamdouh Milad Adly Morkos

Abstract:

Despite having the greatest rates of mortality and morbidity in the world, low- and middle-income (LMIC) nations trail high-income nations in terms of the number of clinical trials, the number of qualified researchers, and the amount of research information specific to their people. Efforts to address health inequities and apply precision medicine may be hampered by a lack of local genomic data, clinical pharmacology and pharmacometrics competence, and training opportunities. These issues can be addressed through health care infrastructure development, including data gathering and well-designed clinical pharmacology training in LMICs. International cooperation focused on enhancing education and infrastructure and promoting locally motivated clinical trials and research will be advantageous. This paper outlines various instances where clinical pharmacology knowledge could be put to use, including pharmacogenomic opportunities that could lead to better clinical guideline recommendations. Examples of how clinical pharmacology training can be successfully implemented in LMICs are also provided, including clinical pharmacology and pharmacometrics training programmes in Africa and a Tanzanian researcher's personal experience while on a training sabbatical in the United States. These training initiatives will benefit from advocacy for clinical pharmacologists' employment prospects and career development pathways, which are gradually becoming acknowledged and established in LMICs. The advancement of training and research infrastructure to increase clinical pharmacologists' expertise in LMICs would be extremely beneficial, given the significant role they have to play in global health.

Keywords: electromagnetic solar system, nano-material, nano pharmacology, pharmacovigilance, quantum theory, clinical simulation, education, pharmacology, simulation, virtual learning, low- and middle-income, clinical pharmacology, pharmacometrics, career development pathways

Procedia PDF Downloads 50
367 A Methodology to Integrate Data in the Company Based on the Semantic Standard in the Context of Industry 4.0

Authors: Chang Qin, Daham Mustafa, Abderrahmane Khiat, Pierre Bienert, Paulo Zanini

Abstract:

Nowadays, companies face many challenges in the process of digital transformation, which can be a complex and costly undertaking. Digital transformation involves the collection and analysis of large amounts of data, which creates challenges around data management and governance. Companies are also challenged to integrate data from multiple systems and technologies. Despite these pains, companies still pursue digitalization because, by embracing advanced technologies, they can improve efficiency, quality, decision-making, and customer experience while also creating new business models and revenue streams. This paper focuses on the issue of data being stored in silos with differing schemas and structures. Conventional approaches to this issue rely on data warehousing, data integration tools, data standardization, and business intelligence tools. However, these approaches primarily address the grammar and structure of the data and neglect the importance of semantic modeling and semantic standardization, which are essential for achieving data interoperability. In this work, the challenge of data silos in Industry 4.0 is addressed by developing a semantic modeling approach compliant with Asset Administration Shell (AAS) models as an efficient standard for communication in Industry 4.0. The paper highlights how our approach can facilitate the data mapping process and semantic lifting according to existing industry standards such as ECLASS and other industrial dictionaries. It also incorporates Asset Administration Shell technology to model and map the company's data and utilizes a knowledge graph for data storage and exploration.
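
To make the idea of semantic lifting concrete, here is a minimal sketch using rdflib: a raw silo record is mapped onto RDF triples whose predicates carry ECLASS-style concept identifiers, yielding a queryable knowledge graph. All IRIs, the concept code, and the record itself are illustrative assumptions, not artifacts of the paper's AAS models.

```python
# Minimal sketch of semantic lifting (not the paper's implementation):
# map a raw column/value pair onto RDF triples whose predicates carry an
# ECLASS-style semantic identifier, then store them in a queryable graph.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.com/factory#")          # assumed namespace
ECLASS = Namespace("http://example.com/eclass/")       # assumed namespace

# Raw silo record and a hand-written mapping from local column names to
# concept identifiers (in practice maintained in an industrial dictionary).
record = {"asset_id": "press_01", "temp_C": 73.5}
column_to_concept = {"temp_C": ECLASS["0173-1-02-AAB663"]}  # hypothetical code

g = Graph()
asset = EX[record["asset_id"]]
g.add((asset, RDF.type, EX.Asset))
for column, value in record.items():
    if column in column_to_concept:
        # The predicate now carries standardized meaning, not a local name.
        g.add((asset, column_to_concept[column], Literal(value)))

print(g.serialize(format="turtle"))
```

Once lifted this way, records from different silos that map to the same concept identifier become directly joinable in the knowledge graph, regardless of their original column names.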

Keywords: data interoperability in industry 4.0, digital integration, industrial dictionary, semantic modeling

Procedia PDF Downloads 70
366 Finite Element Modeling of Mass Transfer Phenomenon and Optimization of Process Parameters for Drying of Paddy in a Hybrid Solar Dryer

Authors: Aprajeeta Jha, Punyadarshini P. Tripathy

Abstract:

Drying technologies for various food processing operations share an inevitable linkage with energy, cost and environmental sustainability. Hence, solar drying of food grains has become an imperative choice to combat the dual challenges of meeting the high energy demand for drying and addressing the climate change scenario. However, the performance and reliability of solar dryers depend heavily on sunshine duration and climatic conditions; they therefore offer limited control over drying conditions and have lower efficiencies. Solar drying technology supported by a photovoltaic (PV) power plant and a hybrid-type solar air collector can potentially overcome these disadvantages. For the development of such robust hybrid dryers, optimization of process parameters becomes critical to ensure the quality and shelf-life of paddy grains. Investigation of the moisture distribution profile within the grains is necessary to avoid over-drying or under-drying of food grains in a hybrid solar dryer. Computational simulations based on finite element modeling can serve as a potential tool for providing better insight into moisture migration during the drying process. Hence, the present work aims to optimize the process parameters and to develop a 3-dimensional (3D) finite element model (FEM) for predicting the moisture profile in paddy during solar drying. COMSOL Multiphysics was employed to develop the 3D finite element model. Furthermore, optimization of the process parameters (power level, air velocity and moisture content) was done using response surface methodology in Design-Expert software. A 3D finite element model for predicting moisture migration in a single kernel at every time step was developed and validated with experimental data. The mean absolute error (MAE), mean relative error (MRE) and standard error (SE) were found to be 0.003, 0.0531 and 0.0007, respectively, indicating close agreement of the model with experimental results. Furthermore, the optimized process parameters for drying paddy were found to be a power level of 700 W, an air velocity of 2.75 m/s and a moisture content of 13% (wb), with an optimum temperature, milling yield and drying time of 42°C, 62% and 86 min, respectively, and a desirability of 0.905. These optimized conditions can be used to dry paddy in a PV-integrated solar dryer in order to attain maximum uniformity, quality and yield of product. PV-integrated hybrid solar dryers can be employed as a potential and cutting-edge drying technology alternative for sustainable energy and food security.
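
The paper's 3D model was built in COMSOL, which cannot be reproduced here; as a simplified stand-in, the sketch below steps Fick's second law for a spherical kernel with an explicit finite-difference scheme to show how a moisture profile M(r, t) is advanced in time. The diffusivity, kernel radius, and boundary moisture are assumed round numbers, not the study's fitted values.

```python
# Simplified stand-in for the FEM (not the authors' COMSOL model): Fick's
# second law in a sphere, dM/dt = D * (d2M/dr2 + (2/r) * dM/dr), solved with
# an explicit finite-difference scheme. All physical values are assumptions.
import numpy as np

D = 1.0e-10          # assumed effective moisture diffusivity, m^2/s
R = 1.0e-3           # assumed equivalent kernel radius, 1 mm
nr = 50
dr = R / (nr - 1)
r = np.linspace(0.0, R, nr)
dt = 0.2 * dr**2 / D                 # well inside explicit-scheme stability
M = np.full(nr, 0.25)                # initial moisture, dry basis (assumed)
M_surface = 0.13                     # assumed equilibrium with drying air

t, t_end = 0.0, 86 * 60              # simulate the 86 min optimum drying time
while t < t_end:
    Mn = M.copy()
    # Interior nodes: central differences for both derivative terms.
    M[1:-1] = Mn[1:-1] + D * dt * (
        (Mn[2:] - 2 * Mn[1:-1] + Mn[:-2]) / dr**2
        + (2.0 / r[1:-1]) * (Mn[2:] - Mn[:-2]) / (2 * dr))
    M[0] = M[1]          # symmetry condition at the kernel centre
    M[-1] = M_surface    # surface held at equilibrium with drying air
    t += dt

print(f"centre moisture after {t_end / 60:.0f} min: {M[0]:.3f} (d.b.)")
```

Comparing such a predicted profile against measured moisture at each time step is the validation exercise behind the MAE, MRE, and SE figures reported above.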

Keywords: finite element modeling, moisture migration, paddy grain, process optimization, PV integrated hybrid solar dryer

Procedia PDF Downloads 123
365 A Study on the Effects of a Mindfulness Training on Managers: The Case of the Malian Company for the Development of Textile

Authors: Aboubacar Garba Konte, Wei Jun, Li Xiaohui

Abstract:

Nowadays, companies are facing increasing pressure. The market environment changes more frequently than ever. Therefore, managers have to develop their agility, their performance and their capacity for innovation. Most companies look for managerial innovations to develop in their employees qualities such as motivation, commitment, creativity, autonomy, or even the ability to adapt to change and manage intense pressure. On a more collective level, companies are looking for teams that are able to organize, communicate and develop a form of collective intelligence based on cooperation and solidarity. Among the many managerial innovations currently developing, mindfulness is drawing the attention of a growing number of companies (Google, Apple, Sony, ING, among others), which have implemented mindfulness-based programs. Although the concept of mindfulness and its effects have been the subject of in-depth research in psychology, research on mindfulness in the field of management is still in its infancy, and it is necessary to evaluate its contribution to organizations. The purpose of this research is to evaluate the effects of a mindfulness training programme among the managers of the Malian Company for the Development of Textile (CMDT). We conducted a case study on their experience and their managerial practices. In addition, we discuss the innovative nature of mindfulness in terms of managerial practice. The results show significant positive effects on two major skills identified by managers as posing significant difficulties in their daily work: their ability to supervise a team of employees, with all that this implies in terms of interpersonal skills, and their ability to organize and prioritize their activities. In addition, the research methodology sheds light on the innovative nature of mindfulness in a favorable organizational environment.

Keywords: mindfulness, manager, managerial innovation, relational skills, organization and prioritization

Procedia PDF Downloads 81