Search results for: real volume
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7739


6629 Numerical Simulation of Large-Scale Landslide-Generated Impulse Waves With a Soil‒Water Coupling Smooth Particle Hydrodynamics Model

Authors: Can Huang, Xiaoliang Wang, Qingquan Liu

Abstract:

Soil‒water coupling is an important process in landslide-generated impulse wave (LGIW) problems, accompanied by large soil deformation, strong interface coupling, and three-dimensional effects. Smoothed particle hydrodynamics (SPH), a meshless particle method, has great advantages in dealing with complex interfaces and multiphase coupling problems. This study presents an improved soil‒water coupled model for simulating LGIW problems based on the open-source code DualSPHysics (v4.0). To overcome the low efficiency of modeling real large-scale LGIW problems, graphics processing unit (GPU) acceleration is implemented in this code. An experimental example, a subaerial landslide-generated water wave, is simulated to demonstrate the accuracy of the model. Then the Huangtian LGIW, a real large-scale event, is modeled to reproduce the entire disaster chain, including landslide dynamics, fluid‒solid interaction, and surge wave generation. A convergence analysis shows that a particle spacing of 5.0 m provides a converged landslide deposit and surge wave for this example. The numerical simulation results are in good agreement with the limited field survey data. The Huangtian application provides a typical reference for large-scale LGIW assessments, supplying reliable information on landslide dynamics, interface coupling behavior, and surge wave characteristics.
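The kernel-weighted summation at the core of any SPH solver can be sketched in a few lines. The snippet below is a minimal one-dimensional illustration with a cubic-spline kernel; it is not the DualSPHysics soil‒water formulation, and the particle spacing, mass, and smoothing length are arbitrary illustrative values.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    # Standard 1-D cubic-spline smoothing kernel W(r, h) with support 2h.
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1-D normalisation constant
    w = np.where(q <= 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q <= 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

def sph_density(x, mass, h):
    # Density at each particle by kernel-weighted summation over neighbours.
    dx = x[:, None] - x[None, :]
    return (mass[None, :] * cubic_spline_kernel(dx, h)).sum(axis=1)

x = np.linspace(0.0, 1.0, 101)   # evenly spaced particles (illustrative)
m = np.full_like(x, 0.01)        # uniform particle mass (illustrative)
rho = sph_density(x, m, h=0.02)
```

With uniform spacing the summation recovers the expected density of 1.0 in the interior, while the boundary particles show the usual kernel-support deficiency.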

Keywords: soil‒water coupling, landslide-generated impulse wave, large-scale, SPH

Procedia PDF Downloads 64
6628 Statistically Accurate Synthetic Data Generation for Enhanced Traffic Predictive Modeling Using Generative Adversarial Networks and Long Short-Term Memory

Authors: Srinivas Peri, Siva Abhishek Sirivella, Tejaswini Kallakuri, Uzair Ahmad

Abstract:

Effective traffic management and infrastructure planning are crucial for the development of smart cities and intelligent transportation systems. This study addresses the challenge of data scarcity by generating realistic synthetic traffic data using the PeMS-Bay dataset, improving the accuracy and reliability of predictive modeling. Advanced synthetic data generation techniques, including TimeGAN, GaussianCopula, and PAR Synthesizer, are employed to produce synthetic data that replicates the statistical and structural characteristics of real-world traffic. Future integration of Spatial-Temporal Generative Adversarial Networks (ST-GAN) is planned to capture both spatial and temporal correlations, further improving data quality and realism. The performance of each synthetic data generation model is evaluated against real-world data to identify the best models for accurately replicating traffic patterns. Long Short-Term Memory (LSTM) networks are utilized to model and predict complex temporal dependencies within traffic patterns. This comprehensive approach aims to pinpoint areas with low vehicle counts, uncover underlying traffic issues, and inform targeted infrastructure interventions. By combining GAN-based synthetic data generation with LSTM-based traffic modeling, this study supports data-driven decision-making that enhances urban mobility, safety, and the overall efficiency of city planning initiatives.
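The LSTM networks used for temporal modelling rest on a simple gated recurrence. A minimal sketch of a single LSTM cell step in plain NumPy follows; the weight values and layer sizes are arbitrary illustrative choices, not the study's trained traffic model.

```python
import numpy as np

def lstm_step(x, h_prev, c_prev, W, U, b):
    # One forward step of a standard LSTM cell.
    # Gates are computed jointly then split: input, forget, candidate, output.
    z = W @ x + U @ h_prev + b               # pre-activations, shape (4*H,)
    H = h_prev.shape[0]
    i = 1.0 / (1.0 + np.exp(-z[:H]))         # input gate
    f = 1.0 / (1.0 + np.exp(-z[H:2*H]))      # forget gate
    g = np.tanh(z[2*H:3*H])                  # candidate cell state
    o = 1.0 / (1.0 + np.exp(-z[3*H:]))       # output gate
    c = f * c_prev + i * g                   # new cell state
    h = o * np.tanh(c)                       # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 3, 4                                  # input and hidden sizes (toy)
W = rng.normal(scale=0.1, size=(4 * H, D))
U = rng.normal(scale=0.1, size=(4 * H, H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for t in range(5):                           # unroll over a short sequence
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
```

In practice such cells are stacked and trained with a deep learning framework; the sketch only shows the recurrence that lets the model carry traffic context across time steps.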

Keywords: GAN, long short-term memory, synthetic data generation, traffic management

Procedia PDF Downloads 28
6627 Social Networks Global Impact on Protest Movements and Human Rights Activism

Authors: Marcya Burden, Savonna Greer

Abstract:

In the wake of social unrest around the world, protest movements have been captured like never before. As protest movements have evolved, so too have their visibility and sources of coverage. Long gone are the days when print media offered our only glimpse into the action surrounding a protest. Now, with social networks such as Facebook, Instagram, and Snapchat, we have access to real-time video footage of protest movements and human rights activism that can reach millions of people within seconds. This research paper investigated statistical usage data from various social media platforms in the areas of human rights activism and protest movements, drawing parallels with past forms of media coverage. The research demonstrates that social networks are extremely important to protest movements and human rights activism. With over 2.9 billion users across social media networks globally, these platforms are at the heart of most recent protests and human rights activism. The research traces the paradigm shift from the Selma March of 1965 to the more recent protests of Ferguson in 2014, Ni Una Menos in 2015, and End SARS in 2018. The findings demonstrate that today almost anyone may use their social networks to become a protest movement leader or human rights activist. From a student to an 80-year-old professor, the possibility of reaching billions of people all over the world is limitless. Findings show that 82% of the world's internet population is on social networks, spending 1 in every 5 online minutes there, and over 65% of Americans believe social media highlights important issues. Thus, there is no need to have a formalized group of people or even to be known online. A person simply needs to be engaged on their respective social media networks (Facebook, Twitter, Instagram, Snapchat) regarding any cause they are passionate about. Information can be exchanged in real time around the world, and a successful protest can begin.

Keywords: activism, protests, human rights, networks

Procedia PDF Downloads 95
6626 Visualization of Quantitative Thresholds in Stocks

Authors: Siddhant Sahu, P. James Daniel Paul

Abstract:

Technical analysis, comprising various technical indicators, is a holistic way of representing the price movement of stocks in the market. Various forms of indicators have evolved from the primitive ones of past decades, and there have been many attempts to introduce volume as a major determinant of strong patterns in market forecasting. The law of demand defines the relationship between volume and price, and most traders are familiar with the volume game. Adding the time dimension to the law of demand provides a different visualization of the theory. In doing so, it was found that the market exhibits different thresholds for different companies, and these thresholds have a significant influence on price. This article attempts to determine these thresholds using three-dimensional graphs in order to optimize portfolios. It also emphasizes the importance of volume as a key factor in determining and predicting strong price movements and bullish and bearish markets. It uses a comprehensive data set of major companies that form a major chunk of the Indian automotive sector, which serves as the illustration.
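One simple way to picture a volume threshold is as a rolling quantile that a session's traded volume must cross. The sketch below flags such crossings on synthetic data; it is only an illustration of the idea, not the authors' three-dimensional method, and the window and quantile parameters are assumptions.

```python
import numpy as np

def volume_thresholds(volume, window=20, quantile=0.95):
    # Flag sessions whose volume exceeds a trailing rolling quantile,
    # a simple stand-in for a per-company volume threshold.
    flags = np.zeros(len(volume), dtype=bool)
    for t in range(window, len(volume)):
        threshold = np.quantile(volume[t - window:t], quantile)
        flags[t] = volume[t] > threshold
    return flags

rng = np.random.default_rng(1)
vol = rng.lognormal(mean=10.0, sigma=0.3, size=250)  # synthetic daily volumes
vol[100] *= 5.0                                      # inject one volume spike
flags = volume_thresholds(vol)
```

A session flagged this way would be a candidate for the strong price movements the abstract associates with threshold crossings.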

Keywords: technical analysis, expert system, law of demand, stocks, portfolio analysis, Indian automotive sector

Procedia PDF Downloads 317
6625 Security Issues on Smart Grid and Blockchain-Based Secure Smart Energy Management Systems

Authors: Surah Aldakhl, Dafer Alali, Mohamed Zohdy

Abstract:

The next generation of electricity grid infrastructure, known as the "smart grid," integrates smart information and communication technology (ICT) into existing grids in order to alleviate the drawbacks of existing one-way grid systems. The efficiency and dependability of future power systems are anticipated to increase significantly thanks to the smart grid, especially given the demand for renewable energy sources. However, the security of the smart grid's cyber infrastructure is a growing concern as a result of the interconnection of major power plants through communication networks. Because cyber-attacks can destroy energy data, beginning with the leakage of grid members' personal information, they can result in serious incidents such as huge outages and the destruction of power network infrastructure. We therefore propose a secure smart energy management system based on blockchain as a remedy for this problem. The power transmission and distribution system may be transformed by the inclusion of optical fiber sensors and blockchain technology in smart grids. While optical fiber sensors allow real-time monitoring and management of electrical energy flow, blockchain offers a secure platform that safeguards the smart grid against cyber-attacks and unauthorized access. Additionally, this integration makes it possible to see how energy is produced, distributed, and used in real time, increasing transparency. This strategy offers advantages in terms of improved security, efficiency, dependability, and flexibility in energy management. An in-depth analysis of the advantages and drawbacks of combining blockchain technology with optical fiber sensors is provided in this paper.
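The tamper-evidence property that motivates blockchain-based energy management can be illustrated with a minimal hash chain: each block commits to its predecessor's hash, so altering any earlier record invalidates everything after it. The meter readings and field names below are hypothetical.

```python
import hashlib
import json

def make_block(data, prev_hash):
    # A block stores its payload plus the previous block's hash; its own
    # hash covers both, so any tampering breaks the chain downstream.
    body = {"data": data, "prev": prev_hash}
    block = dict(body)
    block["hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    return block

def chain_is_valid(chain):
    # Recompute every hash and check every back-link.
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev": block["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = [make_block({"kwh": 0.0}, prev_hash="0" * 64)]  # genesis block
for reading in [12.4, 13.1, 11.8]:   # hypothetical smart-meter readings
    chain.append(make_block({"kwh": reading}, chain[-1]["hash"]))
```

A production smart-grid ledger adds consensus, signatures, and peer replication on top of this basic structure; the sketch only shows why altering stored energy data is detectable.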

Keywords: smart grids, blockchain, fiber optic sensor, security

Procedia PDF Downloads 120
6624 Analytical Modelling of the Moment-Rotation Behavior of Top and Seat Angle Connection with Stiffeners

Authors: Merve Sagiroglu

Abstract:

Earthquake-resistant steel structure design requires taking into account the behavior of beam-column connections in addition to the basic properties of the structure, such as material and geometry. Beam-column connections play an important role in the behavior of frame systems, and accounting for connection behavior in the analysis and design of steel frames is important because it captures the actual behavior of the frames. The behavior of the connections should therefore be well understood. The most important force transmitted by connections in a structural system is the moment, and rotational deformation is customarily expressed as a function of the moment in the connection. The moment-rotation curve is thus the best expression of the behavior of a beam-to-column connection. Designed connections produce various moment-rotation curves depending on the connection elements and their placement, and the only way to obtain such a curve directly is through full-scale experiments. Experiments on some connections have been carried out and collected in a databank, and models have been formed from this databank to express connection behavior. In this study, theoretical work was carried out to model the real behavior of top and seat angle connections with stiffeners. Two stiffeners in the top and seat angles increase the stiffness of the connection, and two stiffeners in the beam web prevent local buckling in this beam-to-column connection. Mathematical models were developed using the authors' database of previous beam-to-column connection experiments. Using the test data, analytical expressions were developed to obtain the moment-rotation curve for connection details whose test data are not available. The connection was dimensioned in various shapes, and the effect of the dimensions of the connection elements on the behavior was examined.

Keywords: top and seat angle connection, stiffener, moment-rotation curves, analytical study

Procedia PDF Downloads 180
6623 Bank, Stock Market Efficiency and Economic Growth: Lessons for ASEAN-5

Authors: Tan Swee Liang

Abstract:

This paper estimates the association of bank and stock market efficiency with real per capita GDP growth by examining panel data across three different regions using the Panel-Corrected Standard Errors (PCSE) regression developed by Beck and Katz (1995). Data from five economies in ASEAN (Singapore, Malaysia, Thailand, the Philippines, and Indonesia), five economies in Asia (Japan, China, Hong Kong SAR, South Korea, and India), and seven economies in the OECD (Australia, Canada, Denmark, Norway, Sweden, the United Kingdom, and the United States), between 1990 and 2017, are used. The empirical findings suggest, first, that for Asia-5 a high bank net interest margin means greater bank profitability, hence spurring economic growth. Second, for OECD-7, low bank overhead costs (as a share of total assets) may reflect weak competition and weak investment in providing superior banking services, hence dampening economic growth. Third, the stock market turnover ratio has a negative association with OECD-7 economic growth but a positive association with Asia-5, which suggests that the relationship between liquidity and growth is ambiguous. Lastly, for ASEAN-5, high bank overhead costs (as a share of total assets) may suggest that expenses have not been channelled efficiently into income-generating activities. One practical implication of the findings is that policy makers should take the necessary measures toward financial liberalisation policies that boost growth through the efficiency channel, so that funds are allocated efficiently through the financial system between the financial and real sectors.
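The Beck and Katz PCSE estimator itself is compact enough to sketch: pooled OLS coefficients whose covariance matrix is corrected for contemporaneous correlation across panels. The data below are synthetic, and the panel is assumed balanced with rows stacked unit-major; this is an illustration of the estimator, not the paper's specification.

```python
import numpy as np

def pcse_ols(y, X, n_units, n_periods):
    # Pooled OLS with Beck-Katz panel-corrected standard errors.
    # Rows are stacked unit-major: unit 0's T periods, then unit 1's, ...
    beta = np.linalg.solve(X.T @ X, X.T @ y)
    resid = (y - X @ beta).reshape(n_units, n_periods)
    sigma = resid @ resid.T / n_periods          # N x N cross-unit covariance
    omega = np.kron(sigma, np.eye(n_periods))    # contemporaneous correlation
    xtx_inv = np.linalg.inv(X.T @ X)
    cov = xtx_inv @ X.T @ omega @ X @ xtx_inv    # sandwich covariance
    return beta, np.sqrt(np.diag(cov))

rng = np.random.default_rng(2)
N, T = 5, 28                      # e.g. 5 economies, 28 years (toy values)
x = rng.normal(size=N * T)        # a single synthetic regressor
y = 1.0 + 0.5 * x + rng.normal(scale=0.5, size=N * T)
X = np.column_stack([np.ones(N * T), x])
beta, se = pcse_ols(y, X, N, T)
```

The correction leaves the coefficients identical to pooled OLS; only the standard errors change, which is why PCSE is popular for small-N, moderate-T macro panels like the ones studied here.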

Keywords: financial development, banking system, capital markets, economic growth

Procedia PDF Downloads 139
6622 Selective Extraction of Lithium from Native Geothermal Brines Using Lithium-ion Sieves

Authors: Misagh Ghobadi, Rich Crane, Karen Hudson-Edwards, Clemens Vinzenz Ullmann

Abstract:

Lithium is recognized as the critical energy metal of the 21st century, comparable in importance to coal in the 19th century and oil in the 20th century, and often termed 'white gold'. Current global demand for lithium, estimated at 0.95-0.98 million metric tons (Mt) of lithium carbonate equivalent (LCE) annually in 2024, is projected to rise to 1.87 Mt by 2027 and 3.06 Mt by 2030. Despite anticipated short-term stability in supply and demand, meeting the forecasted 2030 demand will require the lithium industry to develop an additional capacity of 1.42 Mt of LCE annually, exceeding current planned and ongoing efforts. Brine resources constitute nearly 65% of global lithium reserves, underscoring the importance of exploring lithium recovery from underutilized sources, especially geothermal brines. However, conventional lithium extraction from brine deposits faces challenges due to its time-intensive process, low efficiency (30-50% lithium recovery), unsuitability for low lithium concentrations (<300 mg/l), and notable environmental impacts. Addressing these challenges, direct lithium extraction (DLE) methods have emerged as promising technologies capable of economically extracting lithium even from low-concentration brines (>50 mg/l) with high recovery rates (75-98%). However, most studies (70%) have predominantly focused on synthetic rather than native (natural/real) brines, with limited application of these approaches in real-world case studies or industrial settings. This study aims to bridge this gap by investigating a geothermal brine sample collected from a real case-study site in the UK. A Mn-based lithium-ion sieve (LIS) adsorbent was synthesized and employed to selectively extract lithium from the sample brine. Adsorbents with a Li:Mn molar ratio of 1:1 demonstrated superior lithium selectivity and adsorption capacity.
Furthermore, the pristine Mn-based adsorbent was modified through transition-metal doping, resulting in enhanced lithium selectivity and adsorption capacity. The modified adsorbent exhibited a higher separation factor for lithium over major co-existing cations such as Ca, Mg, Na, and K, with separation factors exceeding 200. The adsorption behaviour was well described by the Langmuir model, indicating monolayer adsorption, and the kinetics followed a pseudo-second-order mechanism, suggesting chemisorption at the solid surface. Thermodynamically, negative ΔG° values and positive ΔH° and ΔS° values were observed, indicating the spontaneity and endothermic nature of the adsorption process.
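Langmuir fits of the kind reported above are commonly done on the linearised form Ce/qe = Ce/qmax + 1/(K·qmax). The sketch below recovers the isotherm parameters from synthetic equilibrium data; the qmax and K values are illustrative assumptions, not the study's measurements.

```python
import numpy as np

def fit_langmuir(Ce, qe):
    # Linearised Langmuir isotherm: Ce/qe = Ce/qmax + 1/(K*qmax).
    # A straight-line fit of Ce/qe against Ce yields qmax and K.
    slope, intercept = np.polyfit(Ce, Ce / qe, 1)
    qmax = 1.0 / slope
    K = slope / intercept
    return qmax, K

# Synthetic equilibrium data from a known isotherm
# (assumed qmax = 30 mg/g, K = 0.2 L/mg; Ce in mg/L, qe in mg/g).
Ce = np.array([1.0, 2.0, 5.0, 10.0, 20.0, 50.0])
qe = 30.0 * 0.2 * Ce / (1.0 + 0.2 * Ce)
qmax, K = fit_langmuir(Ce, qe)
```

With noise-free data the fit returns the generating parameters exactly; with real isotherm data the same regression gives the monolayer capacity qmax and affinity constant K that the abstract's Langmuir analysis refers to.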

Keywords: adsorption, critical minerals, DLE, geothermal brines, geochemistry, lithium, lithium-ion sieves

Procedia PDF Downloads 46
6621 Exergy Analysis of a Green Dimethyl Ether Production Plant

Authors: Marcello De Falco, Gianluca Natrella, Mauro Capocelli

Abstract:

CO₂ capture and utilization (CCU) is a promising approach to reducing GHG (greenhouse gas) emissions, and many technologies in this field are attracting attention. However, since CO₂ is a very stable compound, its utilization as a reagent is energy intensive. As a consequence, it is unclear whether CCU processes allow for a net reduction of environmental impacts from a life cycle perspective and whether these solutions are sustainable. Among the tools for quantifying the real environmental benefits of CCU technologies, exergy analysis is the most scientifically rigorous. The exergy of a system is the maximum work obtainable as the system is brought into equilibrium with its reference environment through a series of reversible processes in which the system can interact only with that environment. In other words, exergy is an "opportunity for doing work", and in real processes it is destroyed by entropy generation. Exergy-based analysis is useful for evaluating the thermodynamic inefficiencies of processes, for understanding and locating the main consumption of fuels or primary energy, for comparing different process configurations, and for detecting ways to reduce the energy penalties of a process. In this work, an exergy analysis is performed for a process producing dimethyl ether (DME) from green hydrogen generated by an electrolysis unit and pure CO₂ captured from flue gas. The model simulates the behavior of all units composing the plant (electrolyzer, carbon capture section, DME synthesis reactor, purification step), with the scope of quantifying performance indices based on the Second Law of Thermodynamics and identifying the points of entropy generation. A plant optimization strategy is then proposed to maximize exergy efficiency.
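For a flowing stream, the exergy definition above reduces to ex = (h − h0) − T0·(s − s0) relative to the dead state (T0, p0). A minimal numerical sketch for an ideal-gas stream follows; the stream conditions and air-like property values are illustrative assumptions, not the plant model's streams.

```python
import numpy as np

def flow_exergy(T, p, T0=298.15, p0=101.325, cp=1.005, R=0.287):
    # Physical flow exergy of an ideal-gas stream, in kJ/kg:
    #   ex = (h - h0) - T0*(s - s0)
    # with ideal-gas enthalpy and entropy differences.
    dh = cp * (T - T0)                            # h - h0
    ds = cp * np.log(T / T0) - R * np.log(p / p0) # s - s0
    return dh - T0 * ds

# Hypothetical hot, pressurised air stream: 500 K, 500 kPa.
ex = flow_exergy(T=500.0, p=500.0)
```

By construction the exergy is zero at the dead state, and summing such terms over every stream of the electrolyzer, capture, synthesis, and purification sections is what lets the analysis locate where the work potential is destroyed.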

Keywords: green DME production, exergy analysis, energy penalties, exergy efficiency

Procedia PDF Downloads 257
6620 Face Recognition Using Eigen Faces Algorithm

Authors: Shweta Pinjarkar, Shrutika Yawale, Mayuri Patil, Reshma Adagale

Abstract:

Face recognition is a technique that can be applied to a wide variety of problems, such as image and film processing, human-computer interaction, and criminal identification. This has motivated researchers to develop computational models for identifying faces that are easy and simple to implement. This work demonstrates a face recognition system on an Android device using eigenfaces. The system can be used as the basis for developing recognition of human identity. Test images and training images are taken directly with the camera on the Android device, and the test results show that the system produces high accuracy. The goal is to implement a model for a particular face and distinguish it from a large number of stored faces. The face recognition system detects faces in pictures taken by a web camera or digital camera, and these images are then checked against the training image dataset based on descriptive features. The algorithm can further be extended to recognize the facial expressions of a person, and recognition can be carried out under widely varying conditions, such as a frontal view, a scaled frontal view, or subjects with spectacles. The algorithm models real-time varying lighting conditions. The implemented system is able to perform real-time face detection and face recognition, and can give feedback by displaying a window with the subject's information from the database and sending an e-mail notification to interested institutions via the Android application.
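The eigenface method itself is a principal component analysis of the training set: faces are projected onto the leading eigenvectors of the centred data, and recognition is nearest-neighbour matching in that low-dimensional space. The sketch below uses random stand-in "images"; the image size and the 10-component truncation are arbitrary choices, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical training set: 20 face images of 16x16 pixels, flattened to rows.
faces = rng.normal(size=(20, 256))

mean_face = faces.mean(axis=0)
A = faces - mean_face                      # centre the training data
# The eigenfaces are the principal components of the centred training set.
U, S, Vt = np.linalg.svd(A, full_matrices=False)
eigenfaces = Vt[:10]                       # keep the 10 leading eigenfaces

def project(image):
    # Represent a face by its weights on the eigenface basis.
    return eigenfaces @ (image - mean_face)

def nearest_identity(image, gallery):
    # Classify by nearest neighbour in eigenface-weight space.
    w = project(image)
    dists = [np.linalg.norm(w - project(g)) for g in gallery]
    return int(np.argmin(dists))

idx = nearest_identity(faces[7], faces)    # should match itself
```

On a real gallery the projection compresses each face to a handful of weights, which is what makes the method light enough to run on an Android device.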

Keywords: face detection, face recognition, eigen faces, algorithm

Procedia PDF Downloads 361
6619 Consequences of Transformation of Modern Monetary Policy during the Global Financial Crisis

Authors: Aleksandra Szunke

Abstract:

Monetary policy is an important pillar of the economy, directly affecting the condition of the banking sector. Depending on the strategy, it may both support the functioning of banking institutions and limit their excessively risky activities. The literature includes a large number of publications characterizing the initiatives implemented by central banks during the global financial crisis and the potential effects of non-standard monetary policy instruments. However, the empirical evidence about their effects and real consequences for the financial markets is still not final. Even before the escalation of instability, Bernanke, Reinhart, and Sack (2004) analyzed the effectiveness of various unconventional monetary tools in lowering long-term interest rates in the United States and Japan. The results largely confirmed the effectiveness of the zero-interest-rate policy and Quantitative Easing (QE) in reducing long-term interest rates. Japan, considered the precursor of QE policy, also conducted research on the consequences of the non-standard instruments implemented to restore the country's financial stability. Although the literature on the effectiveness of Quantitative Easing in Japan is extensive, it does not clearly establish whether it brought permanent effects. The main aim of the study is to identify the implications of the non-standard monetary policy implemented by selected central banks (the Federal Reserve System, the Bank of England, and the European Central Bank), paying particular attention to the consequences in three areas: the size of the money supply, the financial markets, and the real economy.

Keywords: consequences of modern monetary policy, quantitative easing policy, banking sector instability, global financial crisis

Procedia PDF Downloads 478
6618 The Underestimation of Cultural Risk in the Execution of Megaprojects

Authors: Alan Walsh, Peter Walker, Michael Ellis

Abstract:

There is a real danger that both practitioners and researchers considering risks associated with megaprojects ignore or underestimate the impacts of cultural risk. The paper investigates the potential impacts of a failure to achieve cultural unity between the principal actors executing a megaproject. The principal relationships include those between the principal contractors and the project stakeholders, or between the project stakeholders and their principal advisors, Western consultants. This study confirms that cultural dissonance between these parties can delay or disrupt megaproject execution and examines why cultural issues should be prioritized as a significant risk factor in megaproject delivery. The paper addresses the practical impacts and potential mitigation measures that may reduce cultural dissonance in a megaproject's delivery. This information is retrieved from ongoing case studies of live infrastructure megaprojects in Europe and the Middle East's GCC states, from the Western consultants' perspective. The collaborating researchers each have at least 30 years of construction experience and are engaged in architecture, project management, and contracts management, dealing with megaprojects in Europe or the GCC. After examining the cultural interfaces they have observed during the execution of megaprojects, they conclude that, globally, culture significantly influences efficient delivery. The study finds that cultural risk is ever-present where different nationalities co-manage megaprojects and that cultural conflict poses a real threat to the timely delivery of megaprojects. The study indicates that the higher the cultural distance between the principal actors, the more pronounced the risk, with the risk of cultural dissonance more prominent in GCC megaprojects. The findings support a more culturally aware and cohesive team approach and recommend cross-cultural training to mitigate the effects of cultural disparity.

Keywords: cultural risk underestimation, cultural distance, megaproject characteristics, megaproject execution

Procedia PDF Downloads 106
6617 Graph Neural Network-Based Classification for Disease Prediction in Health Care Heterogeneous Data Structures of Electronic Health Record

Authors: Raghavi C. Janaswamy

Abstract:

In the healthcare sector, heterogeneous data elements such as patients, diagnoses, symptoms, conditions, observation text from physician notes, and prescriptions form the essentials of the Electronic Health Record (EHR). In most systems, the data, in the form of clear text and images, are stored or processed in a relational format. However, the intrinsic structure restrictions and complex joins of relational databases limit their widespread utility. In this regard, the design and development of realistic mappings and deep connections as real-time objects offer unparalleled advantages. Herein, a graph neural network-based classification of EHR data has been developed. Patient conditions are predicted as a node classification task using graph-based open-source EHR data from the Synthea database, stored in TigerGraph. The Synthea dataset is leveraged because it is voluminous and closely represents real-time data. The graph model is built from the heterogeneous EHR data using Python modules: pyTigerGraph to fetch nodes and edges from the TigerGraph database, PyTorch to tensorize the nodes and edges, and PyTorch Geometric (PyG) to train the Graph Neural Network (GNN), adopting self-supervised learning techniques with autoencoders to generate node embeddings and eventually perform node classification using those embeddings. The model predicts patient conditions ranging from common to rare. The outcome is expected to open up opportunities for data querying toward better predictions and accuracy.
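The core operation a GNN applies to such a graph can be written out directly: each layer mixes every node's features with its neighbours' via a normalised adjacency matrix. The sketch below implements one graph-convolution layer in plain NumPy on a toy four-node graph; it is an illustration of the layer type, not the paper's PyG autoencoder pipeline or the Synthea data.

```python
import numpy as np

def gcn_layer(A, H, W):
    # One graph-convolution layer: H' = ReLU(D^-1/2 (A + I) D^-1/2 H W).
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    d = A_hat.sum(axis=1)                        # node degrees
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))       # symmetric normalisation
    return np.maximum(0.0, D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W)

# Toy heterogeneous graph: 4 nodes (say 2 patients, 2 conditions), 3 features.
A = np.array([[0, 0, 1, 1],
              [0, 0, 1, 0],
              [1, 1, 0, 0],
              [1, 0, 0, 0]], dtype=float)
rng = np.random.default_rng(4)
H0 = rng.normal(size=(4, 3))                     # initial node features
W = rng.normal(size=(3, 2))                      # learnable weights (random here)
H1 = gcn_layer(A, H0, W)                         # embeddings after one layer
```

Stacking a few such layers and attaching a classifier head to the final embeddings is what turns this into the node-classification task the abstract describes.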

Keywords: electronic health record, graph neural network, heterogeneous data, prediction

Procedia PDF Downloads 86
6616 Intelligent Crop Circle: A Blockchain-Driven, IoT-Based, AI-Powered Sustainable Agriculture System

Authors: Mishak Rahul, Naveen Kumar, Bharath Kumar

Abstract:

Conceived as a high-end engine to revolutionise sustainable agri-food production, the Intelligent Crop Circle (ICC) aims to incorporate the Internet of Things (IoT), blockchain technology, and artificial intelligence (AI) to bolster resource efficiency, prevent waste, increase the volume of production, and deliver sustainable solutions with long-term ecosystem conservation as the guiding principle. The operating principle of the ICC relies on bringing together multidisciplinary bottom-up collaborations between producers, researchers, and consumers. Key elements of the framework include IoT-based smart sensors for sensing soil moisture, temperature, humidity, nutrients, and air quality, which provide timely data at short intervals; blockchain technology for data storage on a private chain, which maintains data integrity, traceability, and transparency; and AI-based predictive analysis, which predicts resource utilisation, plant growth, and environmental conditions. These data and AI insights are built into the ICC platform, whose resulting Decision Support System (DSS) aids decision-making and is delivered through an easy-to-use mobile app or web-based interface. Farmers are assumed to use this decision-making aid, backed by logic informed by the data pool. Building on data already available in farm management systems, the ICC platform is easily interoperable with other IoT devices. ICC facilitates real-time connections and information sharing between users, including farmers, researchers, and industrial partners, enabling them to cooperate in farming innovation and knowledge exchange.
Moreover, ICC supports sustainable practice in agriculture by integrating gamification techniques to motivate adopters; deploying VR technologies to model and visualise 3D farm environments and conditions, framing field scenarios with VR headsets and real-time 3D engines; and leveraging edge technologies to facilitate secure and fast communication and collaboration between the users involved. Through blockchain-based marketplaces, ICC offers traceability from farm to fork, that is, from producer to consumer. It empowers informed decision-making through tailor-made recommendations generated by AI-driven analysis, and it democratises technology, enabling small-scale and resource-limited farmers to have their voices heard. It connects with traditional knowledge, brings together multi-stakeholder interactions, and establishes a participatory ecosystem that incentivises continuous growth and development towards more sustainable agro-ecological food systems. This integrated approach leverages the power of emerging technologies to provide sustainable solutions for a resilient food system, ensuring sustainable agriculture worldwide.

Keywords: blockchain, internet of things, artificial intelligence, decision support system, virtual reality, gamification, traceability, sustainable agriculture

Procedia PDF Downloads 44
6615 Human-Centred Data Analysis Method for Future Design of Residential Spaces: Coliving Case Study

Authors: Alicia Regodon Puyalto, Alfonso Garcia-Santos

Abstract:

This article presents a method to analyze the use of indoor spaces based on data analytics obtained from built-in digital devices. The study uses the data generated by in-place devices, such as smart locks, Wi-Fi routers, and electrical sensors, to gain additional insights into space occupancy, user behaviour, and comfort. Those devices, originally installed to facilitate remote operations, report data through the internet, which the research uses to analyze information on the real-time human use of spaces. Using an in-place Internet of Things (IoT) network enables a faster, more affordable, seamless, and scalable way to analyze building interior spaces without incorporating external data collection systems such as dedicated sensors. The methodology is applied to a real case study of coliving: a residential building of 3000 m², 7 floors, and 80 users in the centre of Madrid. The case study applies the method to classify the IoT devices and to assess, clean, and analyze the collected data within the analysis framework. The information is collected remotely through the devices' different platforms. The first step is to curate the data and understand what insights each device can provide according to the objectives of the study; this generates an analysis framework that can be scaled up for future building assessment, even beyond the residential sector. The method adjusts the parameters to be analyzed to the dataset available in each building's IoT network. The research demonstrates how human-centred data analytics can improve the future spatial design of indoor spaces.
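One of the simplest analytics such in-place devices enable is an occupancy timeline derived from entry and exit events. The sketch below assumes a hypothetical smart-lock log with "in"/"out" events; it illustrates the kind of derivation the method performs, not the Madrid building's actual data or platforms.

```python
from datetime import datetime

def occupancy_timeline(events):
    # Derive a running occupancy count from timestamped entry/exit events,
    # as a smart lock might report them ("in" = entry, "out" = exit).
    count, timeline = 0, []
    for ts, kind in sorted(events):          # process in chronological order
        count += 1 if kind == "in" else -1
        timeline.append((ts, count))
    return timeline

events = [                                   # hypothetical smart-lock log
    (datetime(2021, 5, 1, 8, 0), "in"),
    (datetime(2021, 5, 1, 8, 30), "in"),
    (datetime(2021, 5, 1, 9, 15), "out"),
    (datetime(2021, 5, 1, 18, 0), "in"),
]
timeline = occupancy_timeline(events)
```

Aggregating such timelines by floor or room over weeks is what yields the occupancy and behaviour insights the study feeds back into spatial design.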

Keywords: in-place devices, IoT, human-centred data-analytics, spatial design

Procedia PDF Downloads 197
6614 Solving a Micromouse Maze Using an Ant-Inspired Algorithm

Authors: Rolando Barradas, Salviano Soares, António Valente, José Alberto Lencastre, Paulo Oliveira

Abstract:

This article reviews Ant Colony Optimization, a nature-inspired algorithm, and its implementation in the Scratch/m-Block programming environment. Ant Colony Optimization belongs to the Swarm Intelligence family of algorithms, a subset of biologically inspired algorithms. The starting problem is a maze in which one needs to find a path to the center and return to the starting position, much as an ant looks for a path to a food source and returns to its nest. Beginning with the implementation of a simple wall-follower simulator, the proposed solution uses a dynamic graphical interface that allows young students to observe the ants’ movement while the algorithm optimizes the routes to the maze’s center. Interface usability, data structures, and the conversion of algorithmic language to Scratch syntax were among the details addressed during this implementation. This gives young students an easier way to understand the computational concepts of sequences, loops, parallelism, data, events, and conditionals, as all of these are used throughout the implemented algorithms. Future work includes simulations with real contest mazes and two different pheromone update methods, comparing the results with the optimized routes of the winners of each edition of the contest. It will also include the creation of a Digital Twin relating the virtual simulator to a real micromouse in a full-size maze. The first test results show that the algorithm found the same optimized solutions as the winners of each edition of the Micromouse contest, making it a good solution for maze pathfinding.
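The pheromone mechanism described above can be sketched in a few lines; this is a simplified grid-maze version in Python (the paper's implementation is in Scratch/m-Block) with cell-based pheromone, evaporation, and length-weighted deposits, so shorter routes accumulate more pheromone over the iterations:

```python
import random

# 0 = free cell, 1 = wall; an illustrative 5x5 maze, not a contest maze.
MAZE = [
    [0, 0, 0, 1, 0],
    [1, 1, 0, 1, 0],
    [0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
START, GOAL = (0, 0), (4, 4)

def neighbours(cell):
    r, c = cell
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 5 and 0 <= nc < 5 and MAZE[nr][nc] == 0:
            yield (nr, nc)

def ant_walk(pher):
    """One ant: pheromone-weighted walk from START to GOAL (None if stuck)."""
    path, seen = [START], {START}
    while path[-1] != GOAL:
        options = [n for n in neighbours(path[-1]) if n not in seen]
        if not options:
            return None
        nxt = random.choices(options, [pher[n] for n in options])[0]
        path.append(nxt)
        seen.add(nxt)
    return path

def aco(n_ants=20, n_iter=30, rho=0.5, q=10.0):
    pher = {(r, c): 1.0 for r in range(5) for c in range(5)}
    best = None
    for _ in range(n_iter):
        paths = [p for p in (ant_walk(pher) for _ in range(n_ants)) if p]
        for cell in pher:                      # evaporation
            pher[cell] *= (1 - rho)
        for p in paths:                        # deposit: shorter path, more pheromone
            for cell in p:
                pher[cell] += q / len(p)
        for p in paths:
            if best is None or len(p) < len(best):
                best = p
    return best

random.seed(1)
best = aco()
print(len(best) - 1)  # number of steps in the best route found
```

A contest solver would add the return trip and flood-fill distance heuristics; this sketch only shows the colony dynamics that the students observe in the simulator.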

Keywords: nature inspired algorithms, scratch, micromouse, problem-solving, computational thinking

Procedia PDF Downloads 126
6613 Evaluation of the Safety and Performance of Blood Culture Practices Using BD Safety-Lokᵀᴹ Blood Collection Sets in the Emergency Room

Authors: Jeonghyun Chang, Taegeun Lee, Heungsup Sung, Yoon-Seon Lee, Youn-Jung Kim, Mi-Na Kim

Abstract:

Background: Safety devices have been introduced to improve the safety and performance of blood culture practice. BD Vacutainer® Safety-Lokᵀᴹ blood collection sets with a pre-attached holder (Safety-Lok) (BD, USA) were evaluated in the emergency room (ER) of a tertiary care hospital. Methods: From April to June 2017, interns and nurses in the ER were surveyed on blood culture practices with a questionnaire before and after 2 or 3 weeks of experience with Safety-Lok. All of them participated in a 1-hour hands-on workshop combined with video education prior to the initial survey. The blood volume, positivity rate, and contamination rate of Safety-Lok-drawn (SD) blood cultures were compared to those of overall blood cultures. Results: Eighteen interns and 30 nurses were enrolled. In the initial survey, interns reported higher rates of needlestick incidents (27.8%) and of carrying blood-filled syringes with needles (88.9%), and a lower rate of vacutainer use (38.9%), than nurses (13.3%, 53.3%, and 60.0%, respectively). Interns were more willing than nurses to use safety devices (88.9% vs. 40.0%). The numbers of overall and SD blood cultures were 9,053 and 555, respectively. While the mean blood volume of aerobic bottles overall was 2.6±2.1 mL, SD blood cultures yielded 5.0±3.0 mL in aerobic bottles and 6.0±3.0 mL in anaerobic bottles. Positivity and contamination rates were 6.5% and 0.72% for SD blood cultures and 6.2% and 0.3% for overall blood cultures. Conclusions: The introduction of the safety device would encourage healthcare workers to collect adequate blood volumes and lead to safer practices in the ER.

Keywords: blood culture, needlestick, safety device, volume

Procedia PDF Downloads 207
6612 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making

Authors: I. Arockiarani

Abstract:

The focus of the paper is to furnish entropy measures for neutrosophic sets and neutrosophic soft sets; entropy quantifies the uncertainty that permeates discourse and systems. Various characterizations of the entropy measures are derived. We further exemplify the concept by applying the entropy measures to several real-time decision-making problems.

Keywords: entropy measure, Hausdorff distance, neutrosophic set, soft set

Procedia PDF Downloads 257
6611 Simultaneous Removal of Phosphate and Ammonium from Eutrophic Water Using Dolochar Based Media Filter

Authors: Prangya Ranjan Rout, Rajesh Roshan Dash, Puspendu Bhunia

Abstract:

With the aim of enhancing nutrient (ammonium and phosphate) removal from eutrophic wastewater at reduced cost, a novel media-based multi-stage bio-filter with a drop aeration facility was developed in this work. The bio-filter was packed with a discarded sponge iron industry by-product, dolochar, primarily to remove phosphate via a physicochemical approach. In the multi-stage bio-filter, drop aeration was achieved as the gravity-fed wastewater percolated through the filter media and dropped from stage to stage. Ammonium present in the wastewater was adsorbed by the filter media and the biomass grown on it and was subsequently converted to nitrate through biological nitrification under the aerobic conditions realized by drop aeration. The performance of the bio-filter in treating real eutrophic wastewater was monitored for a period of about 2 months. The influent phosphate concentration was in the range of 16-19 mg/L, and the ammonium concentration was in the range of 65-78 mg/L. The average nutrient removal efficiencies observed during the study period were 95.2% for phosphate and 88.7% for ammonium, with mean final effluent concentrations of 0.91 and 8.74 mg/L, respectively. Furthermore, the subsequent release of nutrients from the saturated filter media after completion of the treatment process was examined in this study, and thin-layer funnel analytical tests revealed the slow nutrient-release nature of the spent dolochar, recommending its potential agricultural application. Thus, the bio-filter displays immense prospects for treating real eutrophic wastewater: it significantly decreases nutrient levels, keeps effluent nutrient concentrations within the permissible limits and, more importantly, facilitates the conversion of waste materials into usable ones.
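The reported efficiencies can be cross-checked from the concentration figures, taking the midpoints of the influent ranges as representative values (a simplification; the paper's averages are computed over the full monitoring period):

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a nutrient across the bio-filter."""
    return 100.0 * (c_in - c_out) / c_in

# Midpoints of the reported influent ranges (mg/L) vs. mean effluent values.
phosphate = removal_efficiency(17.5, 0.91)   # influent 16-19, effluent 0.91
ammonium = removal_efficiency(71.5, 8.74)    # influent 65-78, effluent 8.74
print(round(phosphate, 1), round(ammonium, 1))  # 94.8 87.8
```

The midpoint estimates (94.8% and 87.8%) sit close to the time-averaged efficiencies reported in the abstract (95.2% and 88.7%), as expected.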

Keywords: ammonium removal, phosphate removal, multi-stage bio-filter, dolochar

Procedia PDF Downloads 194
6610 Numerical Method for Fin Profile Optimization

Authors: Beghdadi Lotfi

Abstract:

In the present work, a numerical method is proposed to optimize the thermal performance of finned surfaces. The two-dimensional temperature distribution on the longitudinal section of the fin is calculated by resorting to the finite volume method. The heat flux dissipated by a fin of generic profile is compared with the heat flux removed by a rectangular-profile fin of the same length and volume. In this study, a finite volume method for quadrilateral unstructured meshes is developed to predict the two-dimensional steady-state solutions of the conduction equation, in order to determine the sinusoidal parameter values that optimize the fin effectiveness. In this scheme, based on integration around the polygonal control volume, the derivatives in the conduction equation are converted into closed line integrals by a formulation analogous to the Stokes theorem. The numerical results show good agreement with analytical results. To demonstrate the accuracy of the method, the absolute and root-mean-square errors versus the grid size are examined quantitatively.
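As a sanity check for such finite-volume solutions, the one-dimensional closed-form result for a straight rectangular fin with an adiabatic tip is a standard benchmark. A minimal sketch with illustrative parameters (an aluminium fin in air; the values are not taken from the paper):

```python
from math import sqrt, tanh

def fin_heat_rate(h, k, P, A_c, L, theta_b):
    """Heat dissipated by a straight rectangular fin with an adiabatic tip:
    q = sqrt(h P k A_c) * theta_b * tanh(mL), with m = sqrt(h P / (k A_c))."""
    m = sqrt(h * P / (k * A_c))
    return sqrt(h * P * k * A_c) * theta_b * tanh(m * L)

def fin_effectiveness(h, k, P, A_c, L, theta_b):
    """Fin heat rate relative to the heat rate of the bare base area."""
    return fin_heat_rate(h, k, P, A_c, L, theta_b) / (h * A_c * theta_b)

# h: convection coefficient, k: conductivity, P: perimeter, A_c: cross-section,
# L: fin length, theta_b: base excess temperature (all SI units).
eps = fin_effectiveness(h=25.0, k=200.0, P=0.11, A_c=2.5e-4, L=0.05, theta_b=60.0)
print(round(eps, 1))  # 21.0
```

A two-dimensional finite-volume solver of the kind described above should reproduce this effectiveness in the thin-fin limit, which makes the closed form useful for the error study mentioned in the abstract.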

Keywords: Stokes theorem, unstructured grid, heat transfer, complex geometry, effectiveness

Procedia PDF Downloads 268
6609 Buffer Allocation and Traffic Shaping Policies Implemented in Routers Based on a New Adaptive Intelligent Multi Agent Approach

Authors: M. Taheri Tehrani, H. Ajorloo

Abstract:

In this paper, an intelligent multi-agent framework is developed for each router, in which agents have two vital functionalities, traffic shaping and buffer allocation, and are positioned at the ports of the routers. With the traffic-shaping functionality, agents shape the forwarded traffic by dynamic, real-time allocation of the token generation rate in a Token Bucket algorithm; with the buffer-allocation functionality, agents share their buffer capacity with each other based on their needs and the conditions of the network. This dynamic and intelligent framework gives some ports the opportunity to perform better under bursty and busier conditions. The agents act intelligently based on a Reinforcement Learning (RL) algorithm and consider the effective parameters in their decision process. As RL is limited in how many parameters it can consider in its decision process due to the volume of calculations, we utilize our novel method, which applies Principal Component Analysis (PCA) to the RL input and thereby enables the algorithm to consider as many parameters as needed. Compared with our previous work, where traffic shaping was done without any sharing or dynamic allocation of buffer size per port, this implementation yields a lower packet drop rate across the whole network, particularly at the source routers. These methods are implemented in our previously proposed intelligent simulation environment to allow better comparison of the performance metrics. The results obtained from this simulation environment show efficient and dynamic utilization of resources in terms of bandwidth and the buffer capacities pre-allocated to each port.
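The Token Bucket shaper the agents control can be sketched compactly; here the token generation rate is a fixed constructor argument, whereas in the paper's framework an RL agent would adjust it dynamically and in real time:

```python
class TokenBucket:
    """Token Bucket shaper: tokens accrue at `rate` per second up to `capacity`;
    a packet of `size` tokens is forwarded only if enough tokens are available."""

    def __init__(self, rate, capacity):
        self.rate = rate            # token generation rate (the RL-tuned knob)
        self.capacity = capacity    # burst size
        self.tokens = capacity
        self.last = 0.0

    def allow(self, size, now):
        # Refill according to elapsed time, capped at the bucket capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size:
            self.tokens -= size
            return True
        return False

bucket = TokenBucket(rate=100.0, capacity=200.0)   # 100 tokens/s, burst of 200
print(bucket.allow(150, now=0.0))   # True: burst absorbed by the full bucket
print(bucket.allow(150, now=0.0))   # False: only 50 tokens left
print(bucket.allow(150, now=1.5))   # True: tokens accrued over 1.5 s
```

Raising or lowering `rate` per port, as the agents do, trades shaping strictness against drop probability under bursts.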

Keywords: principal component analysis, reinforcement learning, buffer allocation, multi-agent systems

Procedia PDF Downloads 518
6608 Real-world Characterization of Treatment Intensified (Add-on to Metformin) Adults with Type 2 Diabetes in Pakistan: A Multi-center Retrospective Study (Converge)

Authors: Muhammad Qamar Masood, Syed Abbas Raza, Umar Yousaf Raja, Imran Hassan, Bilal Afzal, Muhammad Aleem Zahir, Atika Shaheer

Abstract:

Background: Cardiovascular disease (CVD) is a major burden among people with type 2 diabetes (T2D), with 1 in 3 reported to have CVD. Understanding real-world clinical characteristics and prescribing patterns could therefore help improve care. Objective: The CONVERGE (Cardiovascular Outcomes and Value in the Real world with GLP-1RAs) study characterized demographics and medication usage patterns in the overall treatment-intensified (add-on to metformin) T2D population. The data were further divided into subgroups {dipeptidyl peptidase-4 inhibitors (DPP-4is), sulfonylureas (SUs), insulins, glucagon-like peptide-1 receptor agonists (GLP-1RAs), and sodium-glucose cotransporter-2 inhibitors (SGLT-2is)} according to the latest prescribed antidiabetic agent (ADA) in India/Pakistan/Thailand. Here, we report findings from Pakistan. Methods: This multi-center retrospective study utilized data from medical records between 13-Sep-2008 (post-market approval of GLP-1RAs) and 31-Dec-2017 in adults (≥18 years old). The data were collected from five centers located in major cities of Pakistan (Karachi, Lahore, Islamabad, and Multan): National Hospital, Aga Khan University Hospital, Diabetes Endocrine Clinic Lahore, Shifa International Hospital, and Mukhtar A Sheikh Hospital Multan. Data were collected at the start of the medical record and at 6 or 12 months prior to baseline, depending on variable type, and analyzed descriptively. Results: Overall, 1,010 patients were eligible. At baseline, the overall mean (SD) age was 51.6 (11.3) years, T2D duration was 2.4 (2.6) years, HbA1c was 8.3% (1.9), and 35% had received ≥1 CVD medication in the year before baseline. The most frequently prescribed ADAs post-metformin were DPP-4is and SUs (~63%). Only 6.5% received GLP-1RAs, and SGLT-2is were not available in Pakistan during the study period. Overall, it took a mean of 4.4 years and 5 years to initiate GLP-1RAs and SGLT-2is, respectively.
In comparison to other subgroups, more patients in the GLP-1RA subgroup received ≥3 types of ADA (58%) and ≥1 CVD medication (64%), and they had a higher body mass index (37 kg/m²). Conclusions: Utilization of GLP-1RAs and SGLT-2is was low, took longer to initiate, and did not occur before multiple other ADAs had been tried. This may be due to a lack of evidence for the CV benefits of these agents during the study period. The planned phase 2 of the CONVERGE study can provide more insights into utilization of, and barriers to prescribing, GLP-1RAs and SGLT-2is in Pakistan after 2018.

Keywords: type 2 diabetes, GLP-1RA, treatment intensification, cardiovascular disease

Procedia PDF Downloads 60
6607 A Comparative Approach to the Concept of Incarnation of God in Hinduism and Christianity

Authors: Cemil Kutluturk

Abstract:

This is a comparative study of the incarnation of God in Hinduism and Christianity. After presenting their basic ideas on the concept of the incarnation of God, the main similarities and differences between the two traditions are examined with references from their sacred texts. In Hinduism, the term avatara denotes the incarnation of God. The word avatara is derived from ava (down) and tri (to cross, to save, to attain); thus, avatara means to come down or to descend. Although an avatara is commonly considered an appearance of any deity on earth, the term refers particularly to the descents of Vishnu. According to Hinduism, God becomes an avatara in every age, entering into diverse wombs for the sake of establishing righteousness. On the Christian side, the word incarnation means enfleshment. In Christianity, it is believed that the Logos or Word, the Second Person of the Trinity, assumed human nature. Incarnation refers both to the act of God becoming a human being and to the result of this act, namely the permanent union of the divine and human natures in the one Person of the Word. When the doctrines of incarnation and avatara are compared, some similarities and differences emerge. The basic similarity is that in both doctrines the incarnate God is not bound by the laws of nature as human beings are. Both reveal God’s personal love and concern and emphasize loving devotion, and their entry into the world is generally accompanied by extraordinary signs. In both cases, the descent of God allows human beings to ascend to God. On the other hand, there are some distinctions between the two religious traditions. For instance, according to Hinduism there are many and repeated avataras, while Christ comes only once; this reflects the respective cyclic and linear worldviews of the two religions.
Another difference is that in Hinduism avataras are real and perfect, while in Christianity Christ is also real yet subject to human imperfections, except sin. While Christ has never been thought of as a partial incarnation, in Hinduism there are both partial and full avataras. A further difference is that while the purpose of Christ is primarily ultimate salvation, not every avatara grants ultimate liberation; some come only to save a devotee from a specific predicament.

Keywords: Avatara, Christianity, Hinduism, incarnation

Procedia PDF Downloads 256
6606 Alternative Robust Estimators for the Shape Parameters of the Burr XII Distribution

Authors: Fatma Zehra Doğru, Olcay Arslan

Abstract:

In this paper, we propose alternative robust estimators for the shape parameters of the Burr XII distribution. We provide a small simulation study and a real data example to illustrate the performance of the proposed estimators over the ML and the LS estimators.
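The Burr XII set-up can be sketched numerically. The following minimal illustration assumes the standard two-shape-parameter form of the distribution and shows only inverse-CDF sampling and a crude maximum-likelihood fit by grid search; the paper's robust M-estimators, which would replace the log-likelihood with a bounded criterion, are not reproduced here:

```python
import math
import random

def burr12_loglik(xs, c, k):
    """Log-likelihood of the Burr XII density f(x) = c k x^(c-1) / (1 + x^c)^(k+1)."""
    return sum(math.log(c * k) + (c - 1) * math.log(x)
               - (k + 1) * math.log1p(x ** c) for x in xs)

def burr12_sample(n, c, k, rng):
    """Inverse-CDF sampling from F(x) = 1 - (1 + x^c)^(-k)."""
    return [((1 - rng.random()) ** (-1 / k) - 1) ** (1 / c) for _ in range(n)]

rng = random.Random(7)
xs = burr12_sample(500, c=2.0, k=3.0, rng=rng)

# Crude ML fit by grid search over the two shape parameters.
grid = [i / 4 for i in range(2, 21)]          # 0.5, 0.75, ..., 5.0
c_hat, k_hat = max(((c, k) for c in grid for k in grid),
                   key=lambda p: burr12_loglik(xs, *p))
```

With clean data the grid-search ML estimates land near the true (c, k); the paper's point is that under contamination ML and LS degrade, motivating the robust alternatives.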

Keywords: burr xii distribution, robust estimator, m-estimator, least squares

Procedia PDF Downloads 428
6605 Laboratory Testing Regime for Quantifying Soil Collapsibility

Authors: Anne C. Okwedadi, Samson Ng’ambi, Ian Jefferson

Abstract:

Collapsible soils undergo a radical rearrangement of their particles when triggered by water, stress, or vibration, causing a loss of volume. This loss of volume, as seen in foundation failures, has caused millions of dollars’ worth of damage to public facilities and infrastructure and thus has an adverse effect on society and people. Despite these consequences and the several studies that are available, more research is still required into soil collapsibility. Discerning the pedogenesis (formation) of soils and investigating the combined effects of the different geological soil properties are key to elucidating and quantifying soil collapsibility. This study presents a novel laboratory testing regime, to be undertaken on soil samples, in which the effects of soil type, compactive variables (moisture content, density, void ratio, degree of saturation), and loading are analyzed. It is anticipated that the results will be useful in mapping the trend of the combined effects and will thus form the basis for evaluating the collapsibility, or collapse potential, of soils encountered in construction where volume loss problems are attributed to collapse.

Keywords: collapsible soil, geomorphological process, soil collapsibility properties, soil test

Procedia PDF Downloads 471
6604 Disrupted or Discounted Cash Flow: Impact of Digitisation on Business Valuation

Authors: Matthias Haerri, Tobias Huettche, Clemens Kustner

Abstract:

This article discusses the impact of digitisation on business valuation. In order to become and remain ‘digital’, investments are necessary whose return on investment (ROI) often remains vague. This uncertainty is problematic for a valuation that relies on predictable cash flows, fixed capital structures, and the steady state. Digitisation does not make company valuation impossible, but traditional approaches must be reconsidered. The authors identify four areas that are changing: (1) Tools instead of intuition. In the future, company valuation will be neither art nor science, but craft. This requires not intuition, but experience and good tools; digital valuation tools beyond Excel will therefore gain in importance. (2) Real-time instead of deadline. At present, company valuations are carried out on a case-by-case basis for a specific key date. This will change with digitalisation and the introduction of web-based valuation tools. Company valuations can then not only be carried out faster and more efficiently but also be offered more frequently: instead of calculating the value for a past key date, current, real-time valuations can be produced. (3) Predictive planning instead of analysis of the past. Past data will still be needed in the future, but its use will no longer be limited to monovalent time series or key-figure analyses. The images of ‘black swans’ and the ‘turkey illusion’ have made clear that we build forecasts on too few data points from the past and underestimate the power of chance; predictive planning can help here. (4) Convergence instead of residual value. Digital transformation shortens the lifespan of viable business models. If companies want to live forever, they have to change forever. For company valuation, this means that the business model valid on the valuation date has only a limited service life.
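The ‘convergence instead of residual value’ point can be made concrete with a toy calculation: value only the explicit forecast years and drop the perpetuity terminal value. A minimal sketch (the numbers are illustrative, not from the article):

```python
def dcf_value(cash_flows, r):
    """Present value of explicit cash flows only, with no perpetuity residual,
    mirroring the idea that digitisation caps a business model's lifespan."""
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))

# Five years of 100 at a 10% discount rate, after which the model expires.
print(round(dcf_value([100] * 5, 0.10), 2))  # 379.08
```

Under a classic Gordon-growth residual, the same cash flows would be worth far more; the gap between the two figures is exactly the valuation risk the article attributes to shortened business-model lifespans.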

Keywords: business valuation, corporate finance, digitisation, disruption

Procedia PDF Downloads 133
6600 When You Change The Business Model ~ You Change The World

Authors: H. E. Amb. Terry Earthwind Nichols

Abstract:

Over the years, Ambassador Nichols observed that successful companies all have one thing in common: belief in people. His observations of people in many companies, industries, and countries also led to one conclusion: groups of achievers far exceed the expectations and timelines of their superiors. His experience with achieving this has brought forth a model for the 21st century that will not only exceed companies’ expectations but also set visions for the future of business globally. It is time for a real discussion about the future of work and the business model that will set the example for the world. Methodologies: in-person observations over 40 years, with Ambassador Nichols present during the observations; audio-visual observations from TV, cinema, social media (YouTube, etc.), and various news outlets; and reading the autobiographies of successful leaders of the last 75 years who led their companies from a distinct perspective: your people are your commodity. Major findings: people who believe in the leader’s vision for the company remain excited about its future and want to do anything in their power to ethically achieve that vision. People who achieve regularly in groups, divisions, or companies live more healthfully, lowering both sick time off and on-the-job accidents; cannot wait to get to work, feeding off the high energy present in these companies; and are fully respected and supported, resulting in near-zero attrition. Simply put, they do not “burn out”. Conclusion: To the author’s best knowledge, 20th-century practices in business are no longer valid, and people will no longer work in those environments. The average worker in the post-COVID world is better educated than 50 years ago and, most importantly, has real-time information about any subject and can stream injustices as they happen.
The Consortium Model is precisely the model for the evolution of both humankind and business in the 21st century.

Keywords: business model, future of work, people, paradigm shift, business management

Procedia PDF Downloads 78
6602 Impact of Unusual Dust Event on Regional Climate in India

Authors: Kanika Taneja, V. K. Soni, Kafeel Ahmad, Shamshad Ahmad

Abstract:

A severe dust storm generated by a western disturbance over north Pakistan and adjoining Afghanistan affected the north-west region of India between May 28 and 31, 2014, resulting in significant reductions in air quality and visibility. The air quality of the affected region degraded drastically: the PM10 concentration peaked at a very high value of around 1018 μgm-3 during the dust storm hours of May 30, 2014 at New Delhi. The present study depicts the aerosol optical properties monitored during the dust days using a ground-based multi-wavelength sky radiometer over the National Capital Region of India. A high Aerosol Optical Depth (AOD) at 500 nm of 1.356 ± 0.19 was observed at New Delhi, while the Angstrom exponent (alpha) dropped to 0.287 on May 30, 2014. The variations in the Single Scattering Albedo (SSA) and the real n(λ) and imaginary k(λ) parts of the refractive index indicated that the dust event shifted the optical state towards stronger absorption. The single scattering albedo, refractive index, volume size distribution, and asymmetry parameter (ASY) values suggested that dust aerosols predominated over anthropogenic aerosols in the urban environment of New Delhi. The large reduction in the radiative flux at the surface level caused significant cooling at the surface. Direct Aerosol Radiative Forcing (DARF) was calculated using a radiative transfer model for the dust period. A consistent increase in surface cooling was evident, with the surface forcing ranging from -31 Wm-2 to -82 Wm-2, the atmospheric heating from 15 Wm-2 to 92 Wm-2, and the forcing at the top of the atmosphere from -2 Wm-2 to 10 Wm-2.
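The partition of the forcing between the surface and the atmosphere follows from a simple identity: the forcing absorbed in the atmosphere equals the forcing at the top of the atmosphere minus that at the surface. A small sketch using peak values quoted above:

```python
def atmospheric_forcing(toa, surface):
    """Aerosol forcing absorbed in the atmosphere (W m^-2):
    TOA forcing minus surface forcing."""
    return toa - surface

# Peak values reported for the dust event (W m^-2).
print(atmospheric_forcing(10, -82))  # 92: strong atmospheric heating
```

A strongly negative surface forcing combined with near-zero TOA forcing, as observed here, is the classic signature of absorbing dust: the energy removed from the surface is deposited in the aerosol layer.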

Keywords: aerosol optical properties, dust storm, radiative transfer model, sky radiometer

Procedia PDF Downloads 377
6601 Interpreting Possibilities: Teaching Without Borders

Authors: Mira Kadric

Abstract:

The proposed paper deals with a newly developed approach to interpreter training, combining traditional didactics with a new element. The fundamental principle of the approach is taken from theatre pedagogy (Augusto Boal’s Theatre of the Oppressed) and includes a discussion of social power relations. From the point of view of the sociology of education, this implies strengthening students’ individual potential for self-determination on a number of levels, especially in view of the present increase in social responsibility. This knowledge constitutes a starting point and basis for the process of self-determined action. This takes place in the context of a creative didactic policy that identifies didactic goals, provides clear sequences of content, specifies interdisciplinary methods, and examines their practical adequacy, ultimately serving not only individual translators and interpreters but all parties involved. The goal of the presented didactic model is to promote independent work and problem-solving strategies; this helps develop creative potential and self-confident behaviour. It also conveys realistic knowledge of professional reality and thus of the real socio-political and professional parameters involved. As well as addressing fundamental questions relevant to Translation and Interpreting Studies, this also serves to refine an interdisciplinary didactic approach that simulates interpreting reality and illustrates processes and strategies that can take place in real life. The idea is illustrated in more detail with methods taken from Augusto Boal’s Theatre of the Oppressed, including examples from (dialogue) interpreting teaching based on recordings made in a seminar in the summer term of 2014.

Keywords: augusto boal, didactic model, interpreting teaching, theatre of the oppressed

Procedia PDF Downloads 432
6600 Advanced Mouse Cursor Control and Speech Recognition Module

Authors: Prasad Kalagura, B. Veeresh kumar

Abstract:

We constructed an interface system that allows a paralyzed user to interact with a computer with almost full functional capability. A real-time tracking algorithm is implemented based on adaptive skin detection and motion analysis. Mouse clicking is activated by the user's eye blink, detected through a sensor, and the keyboard function is implemented with a voice recognition kit.

Keywords: embedded ARM7 processor, mouse pointer control, voice recognition

Procedia PDF Downloads 578