Search results for: assistive technologies
1120 Pharmacological Activities and Potential Uses of Cyperus Rotundus: A Review
Authors: Arslan Masood Pirzada, Muhammad Naeem, Hafiz Haider Ali, Muhammad Latif, Aown Sammar Raza, Asad Hussain Bukhari, Muhammad Saqib, Muhammad Ijaz
Abstract:
Cyperus rotundus (Cyperaceae), a medicinal herb, is traditionally used as a home remedy for the treatment of various clinical conditions such as diarrhea, diabetes, fever, inflammation, malaria, and stomach and bowel disorders. It is now one of the most widespread, troublesome, and economically damaging agronomic weeds, growing wild in various tropical and sub-tropical regions of the world. The tubers and rhizomes of Cyperus rotundus possess a high concentration of active ingredients in the form of essential oils, phenolic acids, ascorbic acid, and flavonoids, which are responsible for its remedial properties. The exploitation of any medicinal plant application depends on comprehensive information about the therapeutic potential of the plant. Researchers have evaluated and characterized the significance of Cyperus rotundus as an anti-androgenic, anti-bacterial, anti-cancerous, anti-convulsant, anti-diabetic, anti-diarrheal, anti-genotoxic, anti-inflammatory, anti-lipidemic, anti-malarial, anti-mutagenic, anti-obesity, anti-oxidant, anti-uropathogenic, hepato-, cardio-, and neuroprotective, and nootropic agent. This paper provides a broad review summarizing the current state of knowledge about the chemical constituents, potential economic uses, and therapeutic aspects of Cyperus rotundus, which will aid the development of bioethanol and modern herbal medicine through the latest technologies and promote the use of this plant in the cure of many clinical disorders.
Keywords: purple nutsedge, chemical composition, economic uses, therapeutic values, future directions
Procedia PDF Downloads 515
1119 Challenges and Pedagogical Strategies in Teaching Chemical Bonding: Perspectives from Moroccan Educators
Authors: Sara Atibi, Azzeddine Atibi, Salim Ahmed, Khadija El Kababi
Abstract:
The concept of chemical bonding is fundamental in chemistry education, ubiquitous in school curricula, and essential to numerous topics in the field. Mastery of this concept enables students to predict and explain the physical and chemical properties of substances. However, chemical bonding is often regarded as one of the most complex concepts for secondary and higher education students to comprehend, due to the underlying complex theory and the use of abstract models. Teachers also encounter significant challenges in conveying this concept effectively. This study aims to identify the difficulties and alternative conceptions faced by Moroccan secondary school students in learning about chemical bonding, as well as the pedagogical strategies employed by teachers to overcome these obstacles. A survey was conducted involving 150 Moroccan secondary school physical science teachers, using a structured questionnaire comprising closed, open-ended, and multiple-choice questions. The results reveal frequent student misconceptions concerning topics such as the octet rule, molecular geometry, and molecular polarity. Contributing factors to these misconceptions include the abstract nature of the concepts, the use of models, and teachers' difficulties in explaining certain aspects of chemical bonding. The study proposes improvements for teaching chemical bonding, such as integrating information and communication technologies (ICT), diversifying pedagogical tools, and considering students' pre-existing conceptions. These recommendations aim to assist teachers, curriculum developers, and textbook authors in making chemistry more accessible and in addressing students' misconceptions.
Keywords: chemical bonding, alternative conceptions, chemistry education, pedagogical strategies
Procedia PDF Downloads 26
1118 Deep Reinforcement Learning-Based Computation Offloading for 5G Vehicle-Aware Multi-Access Edge Computing Network
Authors: Ziying Wu, Danfeng Yan
Abstract:
Multi-Access Edge Computing (MEC) is one of the key technologies of the future 5G network. By deploying edge computing centers at the edge of the wireless access network, computation tasks can be offloaded to edge servers rather than the remote cloud server to meet the requirements of 5G low-latency and high-reliability application scenarios. Meanwhile, with the development of Internet of Vehicles (IoV) technology, various delay-sensitive and compute-intensive in-vehicle applications continue to appear. Compared with traditional internet services, these computation tasks have higher processing priority and lower delay requirements. In this paper, we design a 5G-based Vehicle-Aware Multi-Access Edge Computing Network (VAMECN) and formulate a joint optimization problem of minimizing total system cost. To solve this problem, a deep reinforcement learning-based joint computation offloading and task migration optimization (JCOTM) algorithm is proposed, considering the influence of multiple factors such as concurrent computation tasks, the distribution of system computing resources, and network communication bandwidth. The mixed-integer nonlinear programming problem is described as a Markov Decision Process. Experiments show that, compared with other computation offloading policies, our proposed algorithm can effectively reduce task processing delay and equipment energy consumption, optimize the computation offloading and resource allocation schemes, and improve system resource utilization.
Keywords: multi-access edge computing, computation offloading, 5th generation, vehicle-aware, deep reinforcement learning, deep q-network
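The offloading decision above is formulated as a Markov Decision Process. As a loose illustration of that idea only (not the JCOTM algorithm itself, which uses a deep Q-network over a much richer state space), the following toy tabular Q-learning sketch learns when to offload; the states, actions, and cost numbers are all invented for illustration:

```python
import random

# Toy tabular Q-learning sketch of an offloading decision modeled as an MDP.
# All numbers (costs, state space, learning rates) are illustrative
# assumptions, not parameters from the JCOTM algorithm.

STATES = [(size, chan) for size in ("small", "large") for chan in ("good", "bad")]
ACTIONS = ("local", "offload")

def cost(state, action):
    """Illustrative delay+energy cost: offloading pays off for large tasks
    on a good channel; local execution is cheaper otherwise."""
    size, chan = state
    if action == "local":
        return 2.0 if size == "small" else 8.0
    return 1.0 if chan == "good" else 9.0

def train(episodes=5000, alpha=0.1, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        # Epsilon-greedy over costs: explore sometimes, otherwise pick min-Q.
        if rng.random() < epsilon:
            a = rng.choice(ACTIONS)
        else:
            a = min(ACTIONS, key=lambda act: q[(s, act)])
        # One-step update: move Q toward the observed cost.
        q[(s, a)] += alpha * (cost(s, a) - q[(s, a)])
    return q

q = train()
policy = {s: min(ACTIONS, key=lambda a: q[(s, a)]) for s in STATES}
```

The learned policy offloads large tasks when the channel is good and keeps work local otherwise, matching the cost structure assumed above.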
Procedia PDF Downloads 120
1117 Academic Staff Perspective of Adoption of Augmented Reality in Teaching Practice to Support Students Learning Remotely in a Crisis Time in Higher Education
Authors: Ebtisam Alqahtani
Abstract:
The purpose of this study is to investigate academic staff perspectives on using Augmented Reality (AR) in teaching practice to support students learning remotely during the COVID-19 pandemic. The study adopted the DTPB theoretical model to guide the identification of the key factors that could motivate academic staff to use, or not use, AR in teaching practice. A mixed-methods design was adopted for a better understanding of the study problem. A survey was completed by 851 academic staff, followed by interviews with 20 academic staff. Statistical analyses were used to assess the survey data, and thematic analysis was used to assess the interview data. The findings indicate that 75% of academic staff were aware of AR as a pedagogical tool and agreed on its potential benefits for teaching and learning practice. However, only 36% of academic staff use it in teaching and learning practice, and most of them agree with most of the potential barriers to adopting AR in educational environments. In addition, 91% are planning to use it in the future; the most important motivating factors are the COVID-19 pandemic, hedonic motivation, and academic staff attitude. The perceptions of academic staff differed according to the universities they attended, the faculties they worked in, and their gender. This study offers further empirical support for the DTPB model, as well as recommendations, based on the findings, to help higher education institutions implement technology in their educational environments. A study of the necessity of AR technologies in the time of COVID-19 is unprecedented; the contribution is therefore both theoretical and practical.
Keywords: higher education, academic staff, AR technology as pedagogical tool, teaching and learning practice, benefits of AR, barriers to adopting AR, motivating factors to adopt AR
Procedia PDF Downloads 130
1116 Statistical Analysis and Optimization of a Process for CO2 Capture
Authors: Muftah H. El-Naas, Ameera F. Mohammad, Mabruk I. Suleiman, Mohamed Al Musharfy, Ali H. Al-Marzouqi
Abstract:
CO2 capture and storage technologies play a significant role in the control of climate change through the reduction of carbon dioxide emissions into the atmosphere. The present study evaluates and optimizes CO2 capture through a process in which carbon dioxide is passed into pH-adjusted high-salinity water and reacted with sodium chloride to form a precipitate of sodium bicarbonate. This process is based on a modified Solvay process with higher CO2 capture efficiency, higher sodium removal, and a higher pH level, without the use of ammonia. The process was tested in a semi-batch bubble column reactor and was optimized using response surface methodology (RSM). CO2 capture efficiency and sodium removal were optimized in terms of the major operating parameters based on four levels and variables in a Central Composite Design (CCD). The operating parameters were gas flow rate (0.5–1.5 L/min), reactor temperature (10–50 °C), buffer concentration (0.2–2.6%), and water salinity (25–197 g NaCl/L). The experimental data were fitted to a second-order polynomial using multiple regression and analyzed using analysis of variance (ANOVA). The optimum values of the selected variables were obtained using a response optimizer. The optimum conditions were tested experimentally using desalination reject brine with salinity ranging from 65,000 to 75,000 mg/L. The CO2 capture efficiency in 180 min was 99%, and the maximum sodium removal was 35%. The experimental and predicted values were within the 95% confidence interval, which demonstrates that the developed model can successfully predict the capture efficiency and sodium removal using the modified Solvay method.
Keywords: CO2 capture, water desalination, response surface methodology, bubble column reactor
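The second-order polynomial fit at the heart of RSM can be sketched in a few lines. The example below fits a one-variable quadratic by ordinary least squares via the normal equations and locates its stationary point; the temperature/efficiency numbers are made up for illustration and are not the study's data, which involved four factors rather than one:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_quadratic(xs, ys):
    """Least-squares fit of y = b0 + b1*x + b2*x^2 via the normal equations."""
    X = [[1.0, x, x * x] for x in xs]
    XtX = [[sum(row[r] * row[c] for row in X) for c in range(3)]
           for r in range(3)]
    Xty = [sum(X[i][r] * ys[i] for i in range(len(X))) for r in range(3)]
    return solve(XtX, Xty)

# Illustrative response surface: capture efficiency peaking at T = 30 (made up).
temps = [10.0, 20.0, 30.0, 40.0, 50.0]
eff = [60.0, 85.0, 95.0, 85.0, 60.0]
b0, b1, b2 = fit_quadratic(temps, eff)
t_opt = -b1 / (2 * b2)  # stationary point of the fitted quadratic
```

With a negative quadratic coefficient the stationary point is a maximum, which is how a response optimizer reads the optimum off the fitted surface.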
Procedia PDF Downloads 288
1115 Design and Implementation of a Software Platform Based on Artificial Intelligence for Product Recommendation
Authors: Giuseppina Settanni, Antonio Panarese, Raffaele Vaira, Maurizio Galiano
Abstract:
Nowadays, artificial intelligence is used successfully in academia and industry for its ability to learn from large amounts of data. In particular, in recent years, the use of machine learning algorithms in the field of e-commerce has spread worldwide. In this research study, a prototype software platform was designed and implemented to suggest to users the products most suitable for their needs. The platform includes a chatbot and a recommender system based on artificial intelligence algorithms that provide suggestions and decision support to the customer. Recommendation systems perform the important function of automatically filtering and personalizing information, helping users manage the information overload to which they are exposed on a daily basis. Recently, international research has experimented with machine learning technologies with the aim of increasing the potential of traditional recommendation systems. Specifically, support vector machine algorithms have been implemented, combined with natural language processing techniques that allow users to interact with the system, express their requests, and receive suggestions. Interested users can access the web platform over the internet using a computer, tablet, or mobile phone, register, provide the necessary information, and view the products that the system deems most appropriate for them. The platform also integrates a dashboard that gives intuitive, simple access to its various functions. The artificial intelligence algorithms were implemented and trained on historical data collected from user browsing. Finally, the testing phase allowed validation of the implemented model, which will be further tested by letting customers use it.
Keywords: machine learning, recommender system, software platform, support vector machine
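The filtering-and-ranking step of a recommender can be illustrated very compactly with cosine similarity between a user profile and item feature vectors. This is a minimal sketch only; the catalogue, feature axes, and scores below are invented, and the paper's platform combines SVMs with NLP, which is not reproduced here:

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors (0.0 if either is zero)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Feature axes (price_sensitivity, tech_level, portability) are assumptions.
catalogue = {
    "budget laptop": (0.9, 0.4, 0.6),
    "gaming desktop": (0.2, 0.9, 0.1),
    "tablet": (0.6, 0.5, 0.9),
}

def recommend(user_profile, items, top_n=2):
    """Rank catalogue items by similarity to the user profile."""
    ranked = sorted(items, key=lambda name: cosine(user_profile, items[name]),
                    reverse=True)
    return ranked[:top_n]

picks = recommend((0.8, 0.4, 0.7), catalogue)
```

A price-sensitive user who values portability is matched to the budget laptop and the tablet rather than the gaming desktop.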
Procedia PDF Downloads 134
1114 A Review on Benzo(a)pyrene Emission Factors from Biomass Combustion
Authors: Franziska Klauser, Manuel Schwabl, Alexander Weissinger, Christoph Schmidl, Walter Haslinger, Anne Kasper-Giebl
Abstract:
Benzo(a)pyrene (BaP) is the most widely investigated representative of the Polycyclic Aromatic Hydrocarbons (PAH) as well as one of the most toxic compounds in this group. Since 2013, the European Union has applied a limit value for the BaP concentration in ambient air, set to a yearly average of 1 ng m-3. Several reports show that in some regions, even where the impact of industry and traffic is minor, this threshold is regularly exceeded. This is taken as proof that biomass combustion for heating purposes contributes significantly to BaP pollution. Several investigations of the BaP emission behavior of biomass combustion furnaces have already been carried out, mostly focusing on a particular aspect such as the influence of wood type, operation mode, or technology type. However, a comprehensive view of the emission patterns of BaP from biomass combustion, aggregating the values determined in recent studies, has not been presented so far. Combining the determined values allows a better understanding of the BaP emission behavior of biomass combustion. In this work, the review conclusions are drawn from the combined outcomes of different publications. Two examples showed that technical progress leads to 10- to 100-fold lower BaP emissions from modern furnaces compared to old technologies of an equivalent type. It was also indicated that operation with pellets or wood chips exhibits clearly lower BaP emission factors than operation with log wood, although the BaP emission level of automatic furnaces is strongly affected by the mode of operation. This work delivers an overview of BaP emission factors for different biomass combustion appliances, operation modes, and fuel and wood types. The main impact factors are identified, and suggestions for low-BaP-emission biomass combustion are derived. As one result, the fields of investigation concerning BaP emissions from biomass combustion that appear most important to clarify are suggested.
Keywords: benzo(a)pyrene, biomass, combustion, emission, pollution
Procedia PDF Downloads 358
1113 C-eXpress: A Web-Based Analysis Platform for Comparative Functional Genomics and Proteomics in Human Cancer Cell Line, NCI-60 as an Example
Authors: Chi-Ching Lee, Po-Jung Huang, Kuo-Yang Huang, Petrus Tang
Abstract:
Background: Recent advances in high-throughput research technologies, such as next-generation sequencing and multi-dimensional liquid chromatography, make it possible for the first time to dissect the complete transcriptome and proteome in a single run. However, it is almost impossible for many laboratories to handle and analyze these "big" data without the support of a bioinformatics team. We aimed to provide a web-based analysis platform that lets users with only limited bio-computing knowledge study functional genomics and proteomics. Method: We use NCI-60 as an example dataset to demonstrate the power of the web-based analysis platform and data delivery system. C-eXpress takes a simple text file containing standard NCBI gene or protein IDs and expression levels (rpkm or fold) as the input file and generates a distribution map of gene/protein expression levels in a heatmap diagram organized by color gradients. The diagram is hyperlinked to a dynamic HTML table that allows users to filter the datasets based on various gene features. A dynamic summary chart is generated automatically after each filtering step. Results: We implemented an integrated database containing pre-defined annotations such as gene/protein properties (ID, name, length, MW, pI); pathways based on KEGG and GO biological process; subcellular localization based on GO cellular component; and functional classification based on GO molecular function, kinase, peptidase, and transporter. Multiple ways of sorting columns and rows are also provided for comparative analysis and visualization of multiple samples.
Keywords: cancer, visualization, database, functional annotation
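The color-gradient encoding behind a heatmap diagram of this kind reduces to mapping each expression value onto a color scale. A minimal sketch, assuming a simple green-to-red gradient and an illustrative fixed value range (the actual scale and palette used by C-eXpress are not specified here):

```python
def expression_to_hex(value, vmin=0.0, vmax=10.0):
    """Map an expression level (e.g. rpkm or fold change on a fixed scale)
    to a hex colour on a green-to-red gradient. The scale bounds and the
    palette are illustrative assumptions, not C-eXpress's actual settings."""
    # Clamp to the display range, then normalize to [0, 1].
    t = (min(max(value, vmin), vmax) - vmin) / (vmax - vmin)
    red = round(255 * t)        # high expression -> red
    green = round(255 * (1 - t))  # low expression -> green
    return "#{:02x}{:02x}00".format(red, green)
```

Each cell of the heatmap table can then carry its colour as an inline style, with the hyperlinked HTML table handling the filtering separately.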
Procedia PDF Downloads 619
1112 Effective Water Purification by Impregnated Carbon Nanotubes
Authors: Raviteja Chintala
Abstract:
Water shortages in many areas of the world have greatly increased the demand for efficient methods of producing drinking water, so the purification of water using cost-effective and efficient methods is a challenging field of research. In this regard, reverse osmosis membrane desalination of both seawater and inland brackish water is currently being deployed in various locations around the world. In the present work, an attempt is made to integrate these existing technologies with a novel method, wherein carbon nanotubes prepared at the lab scale replace the activated carbon tubes used traditionally. This has been shown to enhance the efficiency of the water filter, effectively neutralizing most of the organic impurities, and furthermore ensures a reduction in TDS. Carbon nanotubes have a wide range of applications, such as composite reinforcements, field emitters, sensors, energy storage and energy conversion devices, and catalyst support phases, because of their unusual mechanical, electrical, thermal, and structural properties. In particular, the large specific surface area, as well as the high chemical and thermal stability, makes the carbon nanotube an attractive adsorbent in wastewater treatment, effective in eliminating harmful media from water. In this work, the candle soot method has been employed for the preparation of carbon nanotubes, which were mixed with activated charcoal in different compositions. The effect of changing the composition is monitored using a TDS meter: as the proportion of nano-carbon increases, the TDS of the water gradually decreases. To enhance the lifetime of the carbon filter, the nanotubes are provided with a larger surface area.
Keywords: TDS (total dissolved solids), carbon nanotubes, water, candle soot
Procedia PDF Downloads 340
1111 The Development of Student Core Competencies through the STEM Education Opportunities in Classroom
Authors: Z. Dedovets, M. Rodionov
Abstract:
The goal of the modern education system is to prepare students to adapt to ever-changing life situations. They must be able to acquire required knowledge independently; apply such knowledge in practice to solve various problems by using modern technologies; think critically and creatively; use information competently; be communicative; work in a team; and develop their own moral values, intellect, and cultural awareness. As a result, the status of education significantly increases, and new requirements for its quality have been formed. In recent years, the competency-based approach in education has attracted significant interest. This approach strengthens the applied and practical character of school education and leads to the formation of the key student competencies that define success in later life. In this article, the authors focus on a range of key competencies (educational, informational, and communicative) and on the possibility of developing such competencies via STEM education. This research shows the change in students' attitudes towards scientific disciplines such as mathematics, general science, technology, and engineering as a result of STEM education. A two-stage analysis of questionnaires completed by students of Forms II to IV in the Republic of Trinidad and Tobago allowed the authors to categorize students into two levels representing their attitudes to the various disciplines. The significance of the differences between the selected levels was confirmed using Pearson's chi-squared test. In summary, the analysis of the obtained data makes it possible to conclude that STEM education has great potential for the development of core student competencies and encourages a positive student attitude towards the above-mentioned scientific disciplines.
Keywords: STEM, science, technology, engineering, mathematics, students' competency, Pearson's chi-squared test
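Pearson's chi-squared test, as used above to compare attitude levels, is easy to sketch for a contingency table. The counts below are hypothetical stand-ins (the study's actual data are not reproduced here):

```python
def chi_squared(table):
    """Pearson's chi-squared statistic for a contingency table given as a
    list of rows; returns the statistic and the degrees of freedom."""
    row_totals = [sum(r) for r in table]
    col_totals = [sum(c) for c in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

# Hypothetical counts of students at the positive- vs negative-attitude
# level before and after STEM instruction (invented for illustration).
stat, dof = chi_squared([[30, 70],
                         [55, 45]])
```

With one degree of freedom, a statistic above the 5% critical value of about 3.84 indicates a significant difference between the two groups, which is the kind of conclusion drawn in the study.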
Procedia PDF Downloads 387
1110 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory
Authors: Roy. H. A. Lindelauf
Abstract:
Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdictions of and defenses against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There exists a growing body of literature on CKCs. Most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate their resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied abundantly in the operations research and game theory domains; think, for instance, of resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, and there are players each of whom is capable of executing a subset of the tasks. Additionally, task interdependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of his scarce resources (what, where, and how to monitor) against an attacker scheduling a CKC. The study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques
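The glove-game reasoning above can be made concrete with a tiny example. In a glove game, players hold left (L) or right (R) gloves and a coalition's value is its number of matched pairs; the Shapley value (each player's average marginal contribution over all arrival orders) is one cooperative solution concept that could serve as a monitoring-priority metric. The player set below is invented for illustration and is not taken from the presentation:

```python
from itertools import permutations
from math import factorial

def glove_value(coalition, gloves):
    """Value of a coalition: number of matched left/right glove pairs."""
    lefts = sum(1 for p in coalition if gloves[p] == "L")
    return min(lefts, len(coalition) - lefts)

def shapley(gloves):
    """Shapley value by enumerating all arrival orders (fine for small games)."""
    players = sorted(gloves)
    value = {p: 0.0 for p in players}
    for order in permutations(players):
        seen = set()
        for p in order:
            before = glove_value(seen, gloves)
            seen.add(p)
            value[p] += glove_value(seen, gloves) - before
    n_orders = factorial(len(players))
    return {p: v / n_orders for p, v in value.items()}

# Two left-glove holders and one right-glove holder: the scarce right
# glove captures most of the value (2/3 versus 1/6 for each left glove).
phi = shapley({"a": "L", "b": "L", "c": "R"})
```

The scarce-resource player receives the largest share, mirroring how a defender might concentrate monitoring on the agents whose capabilities are hardest for the attacker to replace.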
Procedia PDF Downloads 142
1109 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression
Authors: J. S. Saini, P. P. K. Sandhu
Abstract:
The multi-hop nature of Wireless Mesh Networks and the rapid growth of throughput demands result in multi-channel, multi-radio structures in mesh networks, but co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a broad collection of networking technologies and techniques that guarantee the ability of a network to make desired services available with predictable results. QoS can be directed at a network interface, towards a specific server's or router's performance, or at specific applications. Due to interference among the various transmissions, QoS routing in multi-hop wireless networks is a formidable task; in a multi-channel wireless network, two transmissions using the same channel may interfere with each other. This paper considers the Destination-Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimized path. The proposed technique also utilizes Lempel-Ziv-Welch (LZW) lossless data compression and intra-cluster data aggregation to enhance the communication between the source and the destination. Clustering makes it possible to aggregate multiple packets and locate a single route using the clusters, improving intra-cluster data aggregation, while LZW lossless compression reduces the data packet size so that less energy is consumed, thus increasing the network QoS. MATLAB has been used to evaluate the effectiveness of the projected technique. The comparative analysis has shown that the proposed technique outperforms the existing techniques.
Keywords: WMNs, QoS, flooding, collision avoidance, LZW, congestion control
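The LZW step itself is a textbook dictionary coder and is easy to sketch. The following is a minimal reference implementation of the compression used to shrink packet payloads (the clustering and DSDV routing parts of the scheme are not reproduced); the sample packet content is invented:

```python
def lzw_compress(data):
    """Textbook LZW: emit dictionary codes for the longest known prefixes."""
    table = {bytes([i]): i for i in range(256)}  # seed with all single bytes
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in table:
            w = wc                      # extend the current match
        else:
            out.append(table[w])        # emit code for the longest match
            table[wc] = len(table)      # learn the new phrase
            w = bytes([byte])
    if w:
        out.append(table[w])
    return out

def lzw_decompress(codes):
    """Inverse of lzw_compress, rebuilding the dictionary on the fly."""
    table = {i: bytes([i]) for i in range(256)}
    w = table[codes[0]]
    out = [w]
    for code in codes[1:]:
        # The special case handles a code emitted before it is fully defined.
        entry = table[code] if code in table else w + w[:1]
        out.append(entry)
        table[len(table)] = w + entry[:1]
        w = entry
    return b"".join(out)

packet = b"TOBEORNOTTOBEORTOBEORNOT"
codes = lzw_compress(packet)
```

For repetitive payloads the code stream is shorter than the input, which is exactly the property the routing scheme exploits to save transmission energy; in a real deployment the codes would still need a fixed- or variable-width bit packing before going on the air.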
Procedia PDF Downloads 340
1108 VISSIM Modeling of Driver Behavior at Connecticut Roundabouts
Authors: F. Clara Fang, Hernan Castaneda
Abstract:
The Connecticut Department of Transportation (ConnDOT) has constructed four roundabouts in the State of Connecticut within the past ten years. VISSIM traffic simulation software was utilized to analyze these roundabouts during their design phase. The queue lengths and levels of service observed in the field appear to be better than predicted by the VISSIM model. The objectives of this project are to identify the VISSIM input variables most critical to accurate modeling, recommend VISSIM calibration factors, and provide other recommendations for modeling roundabout traffic operations. Traffic data were collected at these roundabouts using Miovision Technologies. Cameras were set up to capture vehicle circulating activity and entry behavior over two weekdays. A large sample of field data was analyzed to achieve accurate and statistically significant results. The data extracted from the videos include vehicle circulating speed; critical gap, estimated by the Maximum Likelihood Method; peak hour volume; follow-up headway; travel time; and vehicle queue length. A VISSIM simulation of the existing roundabouts was built to compare both the queue length and travel time predicted by the simulation with those measured in the field. The research investigated a variety of simulation parameters as calibration factors for describing driver behavior at roundabouts. Among them, the critical gap is the most effective calibration variable in roundabout simulation; it has a significant impact on queue length, particularly at higher volumes. The results will improve the design of future roundabouts in Connecticut and provide decision-makers with insights into the relationship between various choices and future performance.
Keywords: driver critical gap, roundabout analysis, simulation, VISSIM modeling
Procedia PDF Downloads 292
1107 IoT-Based Interactive Patient Identification and Safety Management System
Authors: Jonghoon Chun, Insung Kim, Jonghyun Lim, Gun Ro
Abstract:
We believe that it is possible to reduce patient safety accidents by displaying correct medical records and prescription information through interactive patient identification. Our system is based on smart bands worn by patients; these bands communicate with hybrid gateways that understand both BLE and WiFi communication protocols. Through the convergence of Bluetooth Low Energy (BLE), one of the short-range wireless communication technologies, and hybrid gateway technology, we implement an 'Intelligent Patient Identification and Location Tracking System' to prevent the medical errors that frequently occur in medical institutions. Based on big data and IoT technology using MongoDB, smart bands (with BLE and NFC functions), and hybrid gateways, we develop a system that enables two-way communication between medical staff and hospitalized patients and stores the patients' location information at minute-level granularity. Based on the precise information provided by big data systems, such as the location tracking and movement of in-hospital patients wearing smart bands, our findings include the fact that a patient-specific location tracking algorithm can more efficiently operate the HIS (Hospital Information System) and other related systems. Through the system, patients can always be correctly identified using identification tags. In addition, the system automatically determines whether the patient is scheduled for a medical service in the system used by the medical institution, and presents the appropriateness of the medical treatment and the medical information (medical record and prescription information) on screen and by voice. This work was supported in part by the Korea Technology and Information Promotion Agency for SMEs (TIPA) grant funded by the Korean Small and Medium Business Administration (No. S2410390).
Keywords: BLE, hybrid gateway, patient identification, IoT, safety management, smart band
Procedia PDF Downloads 311
1106 Smart Technology Work Practices to Minimize Job Pressure
Authors: Babar Rasheed
Abstract:
Organizations are in a continuous effort to increase their yield and to retain their associates and employees. Technology is considered an integral part of attaining appropriate work practices, a good work environment, and employee engagement. Unconsciously, advanced practices such as working from home and personalized intranets are disturbing the employee work-life balance, which ultimately increases the psychological pressure on employees. The smart work practice is to develop business models and organizational practices with enhanced employee engagement and minimal waste of organizational resources, together with persistent revenue and a positive contribution to global societies. The need for smart work practices arises from the increasing employee turnover rate, the global economic recession, unnecessary job pressure, the growing contingent workforce, and advances in technology. Current practices are not elastic enough to handle the globally changing work environment and organizational competition, and they mechanically cause many reciprocal problems between employees and organizations. There is a conscious understanding among business sectors that smart work practices are needed to deal with the challenges of the new century and to address the relevant concerns. This paper aims to endorse customized, smart work practice tools, along with a knowledge framework, to manage the growing concerns of employee engagement, the use of technology, and organizational concerns and challenges for the business. This includes a smart management information system to address the essential concerns of employees, combined with a framework to extract the best possible ways to allocate company resources and re-align only the required efforts to adopt the best possible strategy for controlling potential risks.
Keywords: employee engagement, management information system, psychological pressure, current and future HR practices
Procedia PDF Downloads 185
1105 Non-Invasive Data Extraction from Machine Display Units Using Video Analytics
Authors: Ravneet Kaur, Joydeep Acharya, Sudhanshu Gaur
Abstract:
Artificial Intelligence (AI) has the potential to transform manufacturing by improving shop floor processes such as production, maintenance, and quality. However, industrial datasets are notoriously difficult to extract in a real-time, streaming fashion, negating potential AI benefits. A prime example is specialized industrial controllers that are operated by custom software, which complicates the process of connecting them to an Information Technology (IT)-based data acquisition network. Security concerns may also limit direct physical access to these controllers for data acquisition. To connect the Operational Technology (OT) data stored in these controllers to an AI application in a secure, reliable, and available way, we propose a novel Industrial IoT (IIoT) solution in this paper. In this solution, we demonstrate how video cameras can be installed on a factory shop floor to continuously capture images of the controller HMIs. We propose image pre-processing to segment the HMI into regions of streaming data and regions of fixed meta-data. We then evaluate the performance of multiple Optical Character Recognition (OCR) technologies, such as Tesseract and Google Vision, in recognizing the streaming data, and test them on typical factory HMIs under realistic lighting conditions. Finally, we use the meta-data to match the OCR output with the temporal, domain-dependent context of the data to improve the accuracy of the output. Our IIoT solution enables reliable and efficient data extraction, which will improve the performance of subsequent AI applications.
Keywords: human machine interface, industrial internet of things, internet of things, optical character recognition, video analytics
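One small piece of the context-matching step can be illustrated directly: once the fixed meta-data region tells us a field is numeric (say, a temperature readout), common OCR digit confusions can be repaired. The confusion map and sample reading below are assumptions for illustration, not the paper's actual post-processing rules:

```python
# Characters OCR engines commonly confuse with digits (assumed map).
CONFUSIONS = {"O": "0", "o": "0", "l": "1", "I": "1", "S": "5", "B": "8"}

def correct_numeric_field(raw):
    """Clean an OCR string that meta-data says should be numeric: keep
    digits, signs, and decimal points; map common confusions to digits;
    drop everything else as noise."""
    out = []
    for ch in raw:
        if ch.isdigit() or ch in ".-+":
            out.append(ch)
        elif ch in CONFUSIONS:
            out.append(CONFUSIONS[ch])
        # any other character is treated as noise for a numeric field
    return "".join(out)

reading = correct_numeric_field("2l.O5")  # e.g. a misread "21.05"
```

Knowing the expected field type from the segmented meta-data is what licenses these substitutions; applying them to free text would corrupt it.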
Procedia PDF Downloads 111
1104 Analysis of Organizational Hybrid Agile Methods Environments: Frameworks, Benefits, and Challenges
Authors: Majid Alsubaie, Hamed Sarbazhosseini
Abstract:
Many working environments have experienced increased uncertainty due to the fast-moving and unpredictable world. IT systems development projects, in particular, face several challenges because of their rapidly changing environments and emerging technologies. Information technology organizations within these contexts adapt their systems development methodology and new software approaches to address this issue. One of these methodologies is the Agile method, which has gained huge attention in recent years. However, due to failure rates in IT projects, there is an increasing demand for the use of hybrid Agile methods among organizations. The scarce research in the area means that organizations do not have solid evidence-based knowledge for the use of hybrid Agile. This research was designed to provide further insights into the development of hybrid Agile methods within systems development projects, including how frameworks and processes are used and what benefits and challenges are gained and faced as a result of hybrid Agile methods. This paper presents how three organizations (two government and one private) use hybrid Agile methods in their Agile environments. The data was collected through interviews and a review of relevant documents. The results indicate that these organizations do not predominantly use pure Agile. Instead, they are waterfall organizations by virtue of the nature and complexity of their systems, with Agile used underneath as the delivery model. The PRINCE2 Agile framework, SAFe, Scrum, and Kanban were the models and frameworks identified. This study also found that customer satisfaction and the ability to build quickly are the most frequently perceived benefits of using hybrid Agile methods. In addition, team resistance and scope changes are the common challenges identified by research participants in their working environments.
The findings can help in understanding Agile environmental conditions and projects, which in turn can lead to better success rates and customer satisfaction. Keywords: agile, hybrid, IT systems, management, success rate, technology
Procedia PDF Downloads 108
1103 The Application of Participatory Social Media in Collaborative Planning: A Systematic Review
Authors: Yujie Chen , Zhen Li
Abstract:
In the context of planning transformation, how to promote public participation in the formulation and implementation of collaborative planning has been a focal issue of discussion. However, existing studies have often been case-specific or focused on a specific design field, leaving the role of participatory social media (PSM) in urban collaborative planning generally questioned. A systematic database search was conducted in December 2019. Articles and projects were eligible if they reported a quantitative empirical study applying participatory social media in the collaborative planning process (a prospective, retrospective, experimental, or longitudinal study, or collective actions in planning practices). Twenty studies and seven projects were included in the review. Findings showed that social media are generally applied in the fields of public spatial behavior, transportation behavior, and community planning, with new technologies and new datasets. PSM has provided a new platform for participatory design, decision analysis, and collaborative negotiation, and is most widely used in participatory design. The findings extracted several existing forms of PSM. PSM mainly acts in three roles: a language of decision-making for communication, a study mode for spatial evaluation, and a decision agenda for interactive decision support. Three areas for optimizing PSM were recognized: improving the scale of participation, improving grass-roots organization, and promoting politics. However, participants could basically only provide information and comments through PSM in the collaborative planning process; therefore, the issues of low data response rates, poor spatial data quality, and participation sustainability deserve more attention and solutions. Keywords: participatory social media, collaborative planning, planning workshop, application mode
Procedia PDF Downloads 136
1102 The Role of Heat Pumps in the Decarbonization of European Regions
Authors: Domenico M. Mongelli, Michele De Carli, Laura Carnieletto, Filippo Busato
Abstract:
Europe's dependence on imported fossil fuels has been particularly highlighted by the Russian invasion of Ukraine. Limiting this dependency with a massive replacement of fossil fuel boilers with heat pumps for building heating is the goal of this work. Therefore, with the aim of diversifying energy sources and evaluating the potential use of heat pump technologies for residential buildings with a view to decarbonization, the quantitative reduction in the consumption of fossil fuels was investigated in all regions of Europe through the use of heat pumps. First, a general overview of energy consumption in buildings in Europe has been assessed. The consumption of buildings has been addressed to the different uses (heating, cooling, DHW, etc.) as well as the different sources (natural gas, oil, biomass, etc.). The analysis has been done in order to provide a baseline at the European level on the current consumptions and future consumptions, with a particular interest in the future increase of cooling. A database was therefore created on the distribution of residential energy consumption linked to air conditioning among the various energy carriers (electricity, waste heat, gas, solid fossil fuels, liquid fossil fuels, and renewable sources) for each region in Europe. Subsequently, the energy profiles of various European cities representative of the different climates are analyzed in order to evaluate, in each European climatic region, which energy coverage can be provided by heat pumps in replacement of natural gas and solid and liquid fossil fuels for air conditioning of the buildings, also carrying out the environmental and economic assessments for this energy transition operation. This work aims to make an innovative contribution to the evaluation of the potential for introducing heat pump technology for decarbonization in the air conditioning of buildings in all climates of the different European regions.Keywords: heat pumps, heating, decarbonization, energy policies
Procedia PDF Downloads 131
1101 Unlocking the Genetic Code: Exploring the Potential of DNA Barcoding for Biodiversity Assessment
Authors: Mohammed Ahmed Ahmed Odah
Abstract:
DNA barcoding is a crucial method for assessing and monitoring species diversity amidst escalating threats to global biodiversity. The author explores DNA barcoding's potential as a robust and reliable tool for biodiversity assessment. It begins with a comprehensive review of existing literature, delving into the theoretical foundations, methodologies and applications of DNA barcoding. The suitability of various DNA regions, like the COI gene, as universal barcodes is extensively investigated. Additionally, the advantages and limitations of different DNA sequencing technologies and bioinformatics tools are evaluated within the context of DNA barcoding. To evaluate the efficacy of DNA barcoding, diverse ecosystems, including terrestrial, freshwater and marine habitats, are sampled. Extracted DNA from collected specimens undergoes amplification and sequencing of the target barcode region. Comparison of the obtained DNA sequences with reference databases allows for the identification and classification of the sampled organisms. Findings demonstrate that DNA barcoding accurately identifies species, even in cases where morphological identification proves challenging. Moreover, it sheds light on cryptic and endangered species, aiding conservation efforts. The author also investigates patterns of genetic diversity and evolutionary relationships among different taxa through the analysis of genetic data. This research contributes to the growing knowledge of DNA barcoding and its applicability for biodiversity assessment. The advantages of this approach, such as speed, accuracy and cost-effectiveness, are highlighted, along with areas for improvement. 
By unlocking the genetic code, DNA barcoding enhances our understanding of biodiversity, supports conservation initiatives and informs evidence-based decision-making for the sustainable management of ecosystems.Keywords: DNA barcoding, biodiversity assessment, genetic code, species identification, taxonomic resolution, next-generation sequencing
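The identification step the abstract describes (comparing an obtained barcode sequence against a reference database) can be sketched minimally as a nearest-reference search by percent identity. The reference sequences and the 90% threshold below are invented placeholders, not real COI barcodes or an accepted cut-off.

```python
# Hypothetical sketch: assign a specimen to the reference species whose
# barcode shares the highest percent identity with the query sequence.
# The reference sequences below are invented placeholders, not real barcodes.
REFERENCE_DB = {
    "Species A": "ATGCGTACGTTAGC",
    "Species B": "ATGCGTTCGTTAGG",
    "Species C": "TTGCAAACGATAGC",
}

def percent_identity(a, b):
    """Fraction of matching positions over the shared (equal-length) region."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

def identify(query, db, threshold=0.9):
    """Return (best species, identity) if identity meets the threshold, else None."""
    name, seq = max(db.items(), key=lambda kv: percent_identity(query, kv[1]))
    ident = percent_identity(query, seq)
    return (name, ident) if ident >= threshold else None
```

Real barcoding pipelines align sequences (e.g., against BOLD or GenBank references) before scoring; the ungapped comparison here only illustrates the principle.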
Procedia PDF Downloads 27
1100 The Quest for Institutional Independence to Advance Police Pluralism in Ethiopia
Authors: Demelash Kassaye Debalkie
Abstract:
The primary objective of this study is to report the factors that are significantly impeding the Ethiopian police's ability to provide quality services to the people. Policing in Ethiopia started in the medieval period; however, modern policing was introduced in place of vigilantism in the early 1940s. The progress made since the police became modernized is, however, contested when viewed from the standpoint of officers' development and 21st-century technologies. The police in Ethiopia are struggling to free themselves from any form of political interference by the government and to remain loyal to impartiality, equity, and justice in enforcing the law. Moreover, the institutional competence of the police in Ethiopia is currently losing the power derived from the constitution as a legitimate enforcement agency, due to a political landscape that encourages ethnic-based politics. According to studies, the impact of ethnic politics has been a significant challenge for the police in controlling conflicts between ethnic groups. The study used qualitative techniques, and data was gathered from purposely selected key informants. The findings indicate that governments in the past decades were skeptical about establishing a constitutional police force in the country. This has certainly been one of the challenges of pluralizing the police: building police-community relations based on trust. The study conducted to uncover these obstructions finally reports that the government's commitment to forming a non-partisan, functionally decentralized, and operationally demilitarized police force is minimal and appalling. Governments mainly intend to formulate the missions of the police in accordance with their own interests and political will to remain in power.
The study therefore urges policymakers, law enforcement officials, and the government in power to revise the policies and working procedures already in operation, so as to strengthen the police in Ethiopia on the basis of public participation and engagement. Keywords: community, constitution, Ethiopia, law enforcement
Procedia PDF Downloads 87
1099 Meanings and Concepts of Standardization in Systems Medicine
Authors: Imme Petersen, Wiebke Sick, Regine Kollek
Abstract:
In systems medicine, high-throughput technologies produce large amounts of data on different biological and pathological processes, including (disturbed) gene expressions, metabolic pathways and signaling. The large volume of data of different types, stored in separate databases and often located at different geographical sites have posed new challenges regarding data handling and processing. Tools based on bioinformatics have been developed to resolve the upcoming problems of systematizing, standardizing and integrating the various data. However, the heterogeneity of data gathered at different levels of biological complexity is still a major challenge in data analysis. To build multilayer disease modules, large and heterogeneous data of disease-related information (e.g., genotype, phenotype, environmental factors) are correlated. Therefore, a great deal of attention in systems medicine has been put on data standardization, primarily to retrieve and combine large, heterogeneous datasets into standardized and incorporated forms and structures. However, this data-centred concept of standardization in systems medicine is contrary to the debate in science and technology studies (STS) on standardization that rather emphasizes the dynamics, contexts and negotiations of standard operating procedures. Based on empirical work on research consortia that explore the molecular profile of diseases to establish systems medical approaches in the clinic in Germany, we trace how standardized data are processed and shaped by bioinformatics tools, how scientists using such data in research perceive such standard operating procedures and which consequences for knowledge production (e.g. modeling) arise from it. Hence, different concepts and meanings of standardization are explored to get a deeper insight into standard operating procedures not only in systems medicine, but also beyond.Keywords: data, science and technology studies (STS), standardization, systems medicine
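The data-centred notion of standardization described above (retrieving and combining heterogeneous datasets into standardized, incorporated forms) can be illustrated with a minimal sketch that maps records from differently named source fields onto one standard schema. The field names and aliases are hypothetical, not those of any systems-medicine consortium or bioinformatics tool.

```python
# Hypothetical sketch: map heterogeneous records from separate databases
# onto one standard schema before integration. All field names and aliases
# below are invented for illustration.
SCHEMA_ALIASES = {
    "patient_id": ["patient_id", "pid", "subject"],
    "gene": ["gene", "gene_symbol", "hgnc"],
    "expression": ["expression", "expr", "fpkm"],
}

def standardize(record):
    """Return a record keyed by the standard field names, dropping unknown fields."""
    out = {}
    for standard, aliases in SCHEMA_ALIASES.items():
        for alias in aliases:
            if alias in record:
                out[standard] = record[alias]
                break  # first matching alias wins
    return out

rows = [
    {"pid": "P01", "gene_symbol": "TP53", "expr": 12.4},
    {"subject": "P02", "hgnc": "BRCA1", "fpkm": 3.1},
]
standardized = [standardize(r) for r in rows]
```

The STS critique summarized in the abstract is precisely that such mappings look mechanical but embed negotiated choices (which alias wins, which fields are dropped).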
Procedia PDF Downloads 342
1098 Digital Tools in Education and Online Learning in the Field of Accounting
Authors: Marina Ercegović, Mateja Brozović, Nikolina Dečman
Abstract:
The extent of using digital technologies in teaching has definitely intensified during the pandemic, leading to the replacement of traditional learning with online learning. The experiences through the pandemic have shown that not all fields of study and all levels of education are equally suitable for the implementation of digital tools and online learning. It is generally expected that students at higher levels of study have better digital competences and are therefore more equipped and prepared to participate in online education or traditional education in classrooms that include the use of digital tools. Accounting as a field of study has good predispositions to be suitable for the use of digital tools and online learning: it can usually be taught remotely, while modern accounting also incorporates the use of different digital tools. The goals of the research are: 1) to systematize the results of the existing literature regarding the use of digital tools and online learning in education, with a special emphasis on teaching accounting, 2) to analyze the current level of digital competences of accounting students in Croatia, 3) to investigate the current attitudes of accounting students in Croatia regarding the use of digital tools in education, as well as the advantages and disadvantages of online learning, and 4) to compare the results of the research conducted in 2024/2025 with the same research conducted in 2021/2022. In addition to the literature review, a primary research using an online questionnaire was conducted among accounting students in Croatia. The sample included students enrolled in the university or professional study program related to accounting and finance, or accounting and auditing. The original research was conducted in 2021/2022, i.e. 
during the pandemic, when students had to suddenly transition from traditional learning to online learning, mostly without proper preparation and planning, which might have negatively affected students' attitudes towards online learning and digital tools. This is why the research was repeated in 2024/2025, to compare the results and to explore whether there are any significant differences. Keywords: digital tools, accounting, online learning, education
Procedia PDF Downloads 5
1097 Global Culture Museums: Bridging Societies, Fostering Understanding, and Preserving Heritage
Authors: Hossam Hegazi
Abstract:
Global culture museums play a pivotal role in fostering cross-cultural connections, enhancing mutual understanding, and safeguarding the rich tapestry of cultural heritage. These institutions serve as dynamic bridges, facilitating the exchange of ideas and values among diverse societies. One of the primary functions of global culture museums is to connect people from different backgrounds. By showcasing the artistic expressions, traditions, and historical artifacts of various civilizations, these museums create a shared space for dialogue. Visitors are afforded the opportunity to explore and appreciate the nuances of cultures different from their own, promoting a sense of global interconnectedness. Moreover, these museums contribute significantly to mutual understanding. Through interactive exhibits, innovative technologies, and educational programs, they offer immersive experiences that transcend linguistic and geographical barriers. Visitors gain insights into the customs, beliefs, and lifestyles of others, fostering empathy and appreciation for cultural diversity. Preserving cultural heritage stands as another key objective of global culture museums. By housing and curating artifacts, artworks, and historical items, these institutions play a crucial role in safeguarding the collective memory of humanity. This preservation effort ensures that future generations have access to the cultural legacies that have shaped societies across the globe. In conclusion, global culture museums serve as dynamic hubs that bring people together, promote understanding, and safeguard the wealth of human cultural heritage. Their impact extends beyond the walls of exhibition halls, contributing to a more interconnected and culturally enriched world.Keywords: global culture museums, cross-cultural connections, mutual understanding, societal dialogue
Procedia PDF Downloads 31
1096 Review of Numerical Models for Granular Beds in Solar Rotary Kilns for Thermal Applications
Authors: Edgar Willy Rimarachin Valderrama, Eduardo Rojas Parra
Abstract:
Thermal energy from solar radiation is widely present in power plants, food drying, chemical reactors, heating and cooling systems, water treatment processes, hydrogen production, and others. In the case of power plants, one of the technologies available to transform solar energy into thermal energy is by solar rotary kilns where a bed of granular matter is heated through concentrated radiation obtained from an arrangement of heliostats. Numerical modeling is a useful approach to study the behavior of granular beds in solar rotary kilns. This technique, once validated with small-scale experiments, can be used to simulate large-scale processes for industrial applications. This study gives a comprehensive classification of numerical models used to simulate the movement and heat transfer for beds of granular media within solar rotary furnaces. In general, there exist three categories of models: 1) continuum, 2) discrete, and 3) multiphysics modeling. The continuum modeling considers zero-dimensional, one-dimensional and fluid-like models. On the other hand, the discrete element models compute the movement of each particle of the bed individually. In this kind of modeling, the heat transfer acts during contacts, which can occur by solid-solid and solid-gas-solid conduction. Finally, the multiphysics approach considers discrete elements to simulate grains and a continuous modeling to simulate the fluid around particles. This classification allows to compare the advantages and disadvantages for each kind of model in terms of accuracy, computational cost and implementation.Keywords: granular beds, numerical models, rotary kilns, solar thermal applications
Procedia PDF Downloads 43
1095 Investigation on Reducing the Bandgap in Nanocomposite Polymers by Doping
Authors: Sharvare Palwai, Padmaja Guggilla
Abstract:
Smart materials, also called responsive materials, undergo reversible physical or chemical changes in their properties as a consequence of small environmental variations. They can respond to single or multiple stimuli such as stress, temperature, moisture, electric or magnetic fields, light, or chemical compounds. Hence, smart materials are the basis of many applications, including biosensors and transducers, particularly electroactive polymers. As polymers exhibit good flexibility, high transparency, easy processing, and low cost, they are promising sensor materials. Polyvinylidene Fluoride (PVDF), being a ferroelectric polymer, exhibits piezoelectric and pyroelectric properties. Pyroelectric materials convert heat directly into electricity, while piezoelectric materials convert mechanical energy into electricity. These characteristics make PVDF useful in biosensor devices and batteries. The influence of nanoparticle fillers such as Lithium Tantalate (LiTaO₃/LT), Potassium Niobate (KNbO₃/PN), and Zinc Titanate (ZnTiO₃/ZT) in polymer films will be studied comprehensively. Developing advanced and cost-effective biosensors is pivotal to realizing the full potential of polymer-based wireless sensor networks, which will further enable new types of self-powered applications. Finally, the sensory elements will be designed from the nanocomposite films with the best set of properties and tested for their performance as electric generators under laboratory conditions. By characterizing the materials' optical properties and investigating the effects of doping on the bandgap energies, the science of next-generation biosensor technologies can be advanced. Keywords: polyvinylidene fluoride, PVDF, lithium tantalate, potassium niobate, zinc titanate
Procedia PDF Downloads 136
1094 Social and Digital Transformation of the Saudi Education System: A Cyberconflict Analysis
Authors: Mai Alshareef
Abstract:
The Saudi government considers the modernisation of the education system a critical component of the national development plan, Saudi Vision 2030; however, this sudden reform creates tension amongst Saudis. This study examines, first, how the social and digital education reform is perceived by stakeholders and the general Saudi public and, second, the influence of information and communication technologies (ICTs) on ethnoreligious conflict in Saudi Arabia. The study employs Cyberconflict theory to examine conflicts in both the real world and cyberspace. The findings are based on a qualitative case study methodology that uses netnography, an analysis of 3,750 Twitter posts, and semi-structured interviews with 30 individuals, including key actors in the Saudi education sector and Twitter activists, during 2019/2020. The methods utilised are guided by thematic analysis to map an understanding of the factors that influence societal conflicts in Saudi Arabia, which in this case include religious, national, and gender identity. Elements of Cyberconflict theory are used to better understand how conflicting groups build their identities in connection to their ethnic, religious, and cultural differences and competing national identities. The findings correspond to the ethnoreligious components of Cyberconflict theory. Twitter became a battleground for liberals, conservatives, the Saudi public, and elites, and it was used in a novel way to influence public opinion and to challenge the media monopoly. Opposing groups relied heavily on a discourse of exclusion and inclusion and showed ethnic and religious affiliations, national identity, and chauvinism. The findings add to existing knowledge in the cyberconflict field of study, and they also reveal outcomes that are critical to the Saudi national context. Keywords: education, cyberconflict, Twitter, national identity
Procedia PDF Downloads 175
1093 Assessing Significance of Correlation with Binomial Distribution
Authors: Vijay Kumar Singh, Pooja Kushwaha, Prabhat Ranjan, Krishna Kumar Ojha, Jitendra Kumar
Abstract:
Present-day high-throughput genomic technologies, NGS/microarrays, are producing large volumes of data that require improved analysis methods to make sense of the data. The correlation between genes and samples has been regularly used to gain insight into many biological phenomena including, but not limited to, co-expression/co-regulation, gene regulatory networks, clustering and pattern identification. However, the presence of outliers and violation of the assumptions underlying Pearson correlation are frequent and may distort the actual correlation between genes and lead to spurious conclusions. Here, we report a method to measure the strength of association between genes. The method assumes that the expression values of a gene are Bernoulli random variables whose outcome depends on the sample being probed. The method considers two genes as uncorrelated if the number of samples with the same outcome for both genes (Ns) is equal to the expected number (Es). The extent of correlation depends on how far Ns deviates from Es. The method does not assume normality for the parent population, is fairly unaffected by the presence of outliers, can be applied to qualitative data, and uses the binomial distribution to assess the significance of association. At this stage, we would not claim superiority of the method over other existing correlation methods, but our method could be another way of calculating correlation in addition to the existing ones. The method uses the binomial distribution, which has not been used for this purpose before, to assess the significance of association between two variables. We are evaluating the performance of our method on NGS/microarray data, which is noisy and pierced by outliers, to see if our method can differentiate between spurious and actual correlation.
While working with the method, it has not escaped our notice that the method could also be generalized to measure the association of more than two variables, which has proven difficult with existing methods. Keywords: binomial distribution, correlation, microarray, outliers, transcriptome
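The core of the method described in the abstract can be sketched as follows: represent each gene's expression as Bernoulli (0/1) outcomes per sample, count the samples where both genes give the same outcome (Ns), compare with the count expected under independence (Es), and use an exact binomial tail probability for significance. The one-sided upper tail below is an illustrative choice, not necessarily the authors' exact formulation.

```python
from math import comb

# Sketch under stated assumptions: x and y are 0/1 outcome vectors for two
# genes across the same samples (e.g., expression above/below some cut-off).
def binomial_association(x, y):
    """Return (Ns, Es, upper-tail p-value) for the agreement count."""
    n = len(x)
    p1 = sum(x) / n
    p2 = sum(y) / n
    # Probability the two genes agree on one sample if they are independent.
    p_same = p1 * p2 + (1 - p1) * (1 - p2)
    ns = sum(a == b for a, b in zip(x, y))   # observed agreements, Ns
    es = n * p_same                          # expected agreements, Es
    # Exact upper-tail probability P(X >= Ns) for X ~ Binomial(n, p_same).
    p_value = sum(comb(n, k) * p_same**k * (1 - p_same)**(n - k)
                  for k in range(ns, n + 1))
    return ns, es, p_value
```

For a perfectly co-expressed pair over six samples with p1 = p2 = 0.5, Ns = 6 against Es = 3 and the tail probability is (1/2)^6, signalling a departure from independence.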
Procedia PDF Downloads 416
1092 MRI Quality Control Using Texture Analysis and Spatial Metrics
Authors: Kumar Kanudkuri, A. Sandhya
Abstract:
Typically, in an MRI clinical setting, there are several protocols run, each indicated for a specific anatomy and disease condition. However, these protocols, or the parameters within them, can change over time due to changes in the recommendations of physician groups, updates to the software, or the availability of new technologies. Most of the time, the changes are made by the MRI technologist to account for time, coverage, physiological, or Specific Absorption Rate (SAR) reasons. However, giving proper guidelines to MRI technologists is important so that they do not change parameters that negatively impact image quality. Typically, a standard American College of Radiology (ACR) MRI phantom is used for Quality Control (QC) in order to guarantee that the primary objectives of MRI are met. The visual evaluation of quality depends on the operator/reviewer and might vary among operators, as well as for the same operator at different times. Therefore, overcoming these constraints is essential for a more impartial evaluation of quality, which makes the quantitative estimation of image quality (IQ) metrics very important for MRI quality control. To solve this problem, we propose a robust, open-source, and automated MRI quality control tool. The automatic analysis tool designed and developed for measuring MRI image quality (IQ) metrics, such as Signal to Noise Ratio (SNR), Signal to Noise Ratio Uniformity (SNRU), Visual Information Fidelity (VIF), Feature Similarity (FSIM), Gray Level Co-occurrence Matrix (GLCM), slice thickness accuracy, slice position accuracy, and high-contrast spatial resolution, provided a good accuracy assessment. A standardized quality report has been generated that incorporates the metrics that impact diagnostic quality. Keywords: ACR MRI phantom, MRI image quality metrics, SNRU, VIF, FSIM, GLCM, slice thickness accuracy, slice position accuracy
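As an illustration of the simplest metric in the list, SNR for a phantom image can be estimated as the mean intensity of a uniform signal region divided by the standard deviation of a background (air) region. This is a hedged sketch, not the tool's implementation; the ROI pixel intensities below are invented values, not real phantom data.

```python
from statistics import mean, stdev

# Minimal sketch of one listed metric: SNR from two regions of interest.
def snr(signal_roi, background_roi):
    """SNR = mean signal intensity / standard deviation of background noise."""
    return mean(signal_roi) / stdev(background_roi)

# Invented pixel intensities for a uniform phantom region and an air region.
signal = [980, 1005, 995, 1010, 990, 1000]
background = [8, 12, 10, 9, 11, 10]

print(round(snr(signal, background), 1))
```

ACR-style QC defines the ROI placement precisely (e.g., a large central ROI for signal); the two flat lists here stand in for those pixel samples.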
Procedia PDF Downloads 173
1091 Tokyo Skyscrapers: Technologically Advanced Structures in Seismic Areas
Authors: J. Szolomicki, H. Golasz-Szolomicka
Abstract:
The architectural and structural analysis of selected high-rise buildings in Tokyo is presented in this paper. The capital of Japan is the most densely populated city in the world and, moreover, is located in one of the most active seismic zones. The combination of these factors has resulted in the creation of sophisticated designs and innovative engineering solutions, especially in the field of design and construction of high-rise buildings. Foreign architectural studios (such as Jean Nouvel, Kohn Pedersen Fox Associates, and Skidmore, Owings & Merrill) which specialize in the design of skyscrapers played a major role in the development of technological ideas and architectural forms for such extraordinary engineering structures. Among the projects completed by them, there are examples of high-rise buildings that set precedents for future development. An essential aspect influencing the design of high-rise buildings is the necessity to take into consideration their dynamic reaction to earthquakes and to counteract wind vortices. The need to control the motions of these buildings, induced by earthquake and wind forces, led to the development of various methods and devices for dissipating the energy which occurs during such phenomena. Currently, Japan is a global leader in seismic technologies that safeguard high-rise structures against seismic influence. Due to these achievements, the most modern skyscrapers in Tokyo are able to withstand earthquakes with a magnitude of over seven on the Richter scale. The damping devices applied are either passive, which do not require an additional power supply, or active, which suppress the reaction with the input of extra energy. In recent years, hybrid dampers have also been used, with an additional active element to improve the efficiency of passive damping. Keywords: core structures, damping system, high-rise building, seismic zone
Procedia PDF Downloads 175