Search results for: tide data
21371 Tools for Analysis and Optimization of Standalone Green Microgrids
Authors: William Anderson, Kyle Kobold, Oleg Yakimenko
Abstract:
Green microgrids, which use mostly renewable energy (RE) for generation, are complex systems with inherent nonlinear dynamics. Among the variety of available optimization tools, only a few adequately consider this complexity. This paper evaluates the applicability of two somewhat similar optimization tools tailored for standalone RE microgrids and also assesses a machine learning tool for performance prediction that can enhance the reliability of any chosen optimization tool. It shows that one of these microgrid optimization tools has certain advantages over the other and presents a detailed routine for preparing input data to simulate RE microgrid behavior. The paper also shows how neural-network-based predictive modeling can be used to validate and forecast solar power generation from weather time series data, which improves the overall quality of standalone RE microgrid analysis.
Keywords: microgrid, renewable energy, complex systems, optimization, predictive modeling, neural networks
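As an illustration of the kind of predictive modeling the abstract describes, the following is a minimal sketch of a neural-network regressor that forecasts solar output from weather features; the synthetic data, feature set, and network size are assumptions for illustration, not details from the paper.

```python
# Minimal sketch: neural-network forecast of solar output from weather features.
# Synthetic data, feature set, and network size are illustrative assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
irradiance = rng.uniform(0, 1000, 2000)            # W/m^2
temperature = rng.uniform(-5, 35, 2000)            # deg C
cloud_cover = rng.uniform(0, 1, 2000)              # fraction
X = np.column_stack([irradiance, temperature, cloud_cover])
# toy ground truth: output rises with irradiance, falls with cloud cover
y = 0.15 * irradiance * (1 - 0.7 * cloud_cover) + rng.normal(0, 5, 2000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16),
                                   max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("held-out R^2:", r2_score(y_test, model.predict(X_test)))
```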
Procedia PDF Downloads 282
21370 A Wall Law for Two-Phase Turbulent Boundary Layers
Authors: Dhahri Maher, Aouinet Hana
Abstract:
The presence of bubbles in the boundary layer introduces corrections into the log law, which must be taken into account. In this work, a logarithmic wall law was presented for bubbly two-phase flows. The wall law was based on the postulation of additional turbulent viscosity associated with bubble wakes in the boundary layer, and it contained an empirical constant accounting both for shear-induced turbulence interaction and for bubble non-linearity; this constant was deduced from experimental data. The wall friction prediction achieved with the wall law was compared to experimental data in the case of a turbulent boundary layer developing on a vertical flat plate in the presence of millimetric bubbles. Very good agreement between the experimental and numerical wall friction predictions was verified. The agreement was especially noticeable at low void fractions, where bubble-induced turbulence plays a significant role.
Keywords: bubbly flows, log law, boundary layer, CFD
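For reference, the classical single-phase logarithmic wall law that such corrections modify can be written as below; the additive term ΔB stands generically for the bubble-induced modification, since the paper's empirical constant is not given in the abstract.

```latex
u^{+} = \frac{1}{\kappa}\,\ln y^{+} + B + \Delta B,
\qquad \kappa \approx 0.41,\quad B \approx 5.0
```

Here u⁺ and y⁺ are the velocity and wall distance in wall units, and ΔB = 0 recovers the single-phase law.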
Procedia PDF Downloads 278
21369 Secure Intelligent Information Management by Using a Framework of Virtual Phones-On Cloud Computation
Authors: Mohammad Hadi Khorashadi Zadeh
Abstract:
Many new applications and internet services have emerged since the advent of mobile networks and devices. However, these applications have problems of security, management, and performance in business environments. Cloud systems provide information transfer, management facilities, and security for virtual environments. Therefore, an innovative internet service and a business model are proposed in the present study for creating a secure and consolidated environment for managing the mobile information of organizations based on cloud virtual phone (CVP) infrastructures. Using this method, users can run Android and web applications in the cloud, which enhances performance by connecting to other CVP users and increases privacy. It is possible to combine the CVP with distributed protocols and central control, which mimics the behavior of human societies. This mix helps in dealing with sensitive data on mobile devices and facilitates data management with less application overhead.
Keywords: BYOD, mobile cloud computing, mobile security, information management
Procedia PDF Downloads 317
21368 Recurrent Neural Networks for Classifying Outliers in Electronic Health Record Clinical Text
Authors: Duncan Wallace, M-Tahar Kechadi
Abstract:
In recent years, Machine Learning (ML) approaches have been successfully applied to the analysis of patient symptom data in the context of disease diagnosis, at least where such data is well codified. However, much of the data present in Electronic Health Records (EHR) is unlikely to prove suitable for classic ML approaches. Furthermore, as these data are widely spread across both hospitals and individuals, a decentralized, computationally scalable methodology is a priority. The focus of this paper is to develop a method to predict outliers in an out-of-hours healthcare provision center (OOHC). In particular, our research is based upon the early identification of patients who have underlying conditions which will cause them to repeatedly require medical attention. An OOHC acts as an ad-hoc provider of triage and treatment, where interactions occur without recourse to a full medical history of the patient in question. Medical histories relating to patients contacting an OOHC may reside in several distinct EHR systems in multiple hospitals or surgeries, which are unavailable to the OOHC in question. As such, although a local solution is optimal for this problem, it follows that the data under investigation is incomplete, heterogeneous, and comprised mostly of noisy textual notes compiled during routine OOHC activities. Through the use of Deep Learning methodologies, the aim of this paper is to provide the means to identify patient cases, upon initial contact, which are likely to relate to such outliers. To this end, we compare the performance of Long Short-Term Memory, Gated Recurrent Units, and combinations of both with Convolutional Neural Networks. A further aim of this paper is to elucidate the discovery of such outliers by examining the exact terms which provide a strong indication of positive and negative case entries. While free text is the principal data extracted from EHRs for classification, EHRs also contain normalized features. Although the specific demographic features treated within our corpus are relatively limited in scope, we examine whether it is beneficial to include such features among the inputs to our neural network, or whether these features are more successfully exploited in conjunction with a different form of classifier. Here, we compare the performance of randomly generated regression trees and support vector machines and determine the extent to which our classification program can be improved by using either of these machine learning approaches in conjunction with the output of our Recurrent Neural Network application. The output of our neural network is also used to help determine the most significant lexemes present within the corpus for identifying high-risk patients. By combining the confidence of our classification program for lexemes within true positive and true negative cases with the inverse document frequency of the lexemes related to these cases, we can determine which features act as the primary indicators of frequent-attender and non-frequent-attender cases, providing a human-interpretable appreciation of how our program classifies cases.
Keywords: artificial neural networks, data-mining, machine learning, medical informatics
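As a hedged sketch of the kind of recurrent text classifier the abstract compares, the following Keras model classifies integer-encoded clinical notes as frequent-attender or not; the vocabulary size, sequence length, layer sizes, and placeholder data are illustrative assumptions, not the authors' configuration.

```python
# Minimal sketch of an LSTM classifier for clinical free text.
# Vocabulary size, sequence length, and layer sizes are illustrative assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE, MAX_LEN = 20000, 200       # assumed vocabulary and note length
model = Sequential([
    Embedding(VOCAB_SIZE, 64),         # integer-encoded tokens -> vectors
    LSTM(64),                          # a GRU or CNN+LSTM stack could be swapped in
    Dense(1, activation="sigmoid"),    # frequent attender vs. not
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# x: integer-encoded notes, y: outlier labels (random placeholders here)
x = np.random.randint(0, VOCAB_SIZE, size=(128, MAX_LEN))
y = np.random.randint(0, 2, size=(128,))
model.fit(x, y, epochs=1, batch_size=32)
```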
Procedia PDF Downloads 131
21367 Improving Road Infrastructure Safety Management Through Statistical Analysis of Road Accident Data. Case Study: Streets in Bucharest
Authors: Dimitriu Corneliu-Ioan, Gheorghe Frațilă
Abstract:
Romania has one of the highest rates of road deaths among European Union Member States, and there is a concern that the country will not meet its goal of "zero deaths" by 2050. The European Union also aims to halve the number of people seriously injured in road accidents by 2030. Therefore, there is a need to improve road infrastructure safety management in Romania. The aim of this study is to analyze road accident data through statistical methods to assess the current state of road infrastructure safety in Bucharest. The study also aims to identify trends and make forecasts regarding serious road accidents and their consequences. The objective is to provide insights that can help prioritize measures to increase road safety, particularly in urban areas. The research utilizes statistical analysis methods, including exploratory analysis and descriptive statistics. Databases from the Traffic Police and the Romanian Road Authority are analyzed using Excel. Road risks are compared with the main causes of road accidents to identify correlations. The study emphasizes the need for better quality and more diverse collection of road accident data for effective analysis in the field of road infrastructure engineering. The findings highlight the importance of prioritizing measures to improve road safety in urban areas, where serious accidents and their consequences are more frequent. There is a correlation between the measures ordered by road safety auditors and the main causes of serious accidents in Bucharest. The study also reveals the significant social costs of road accidents, amounting to approximately 3% of GDP, emphasizing the need for collaboration between local and central administrations in allocating resources for road safety. This research contributes to a clearer understanding of the current road infrastructure safety situation in Romania, providing critical insights that can aid decision-makers in allocating resources efficiently and cooperating institutionally to achieve sustainable road safety. The data used in this study were collected from the Traffic Police and the Romanian Road Authority, and processing involved exploratory analysis and descriptive statistics in Excel. The analysis allows for a better understanding of the factors contributing to the current road safety situation and helps inform managerial decisions to eliminate or reduce road risks. To improve road safety, cooperation between local and central administrations towards joint financial efforts is important. This research highlights the need for statistical data processing methods to substantiate managerial decisions in road infrastructure management and emphasizes the importance of improving the quality and diversity of road accident data collection. The findings provide a critical perspective on the current road safety situation in Romania and offer insights for identifying appropriate solutions to reduce the number of serious road accidents in the future.
Keywords: road death rate, strategic objective, serious road accidents, road safety, statistical analysis
Procedia PDF Downloads 85
21366 Rotterdam in Transition: A Design Case for a Low-Carbon Transport Node in Lombardijen
Authors: Halina Veloso e Zarate, Manuela Triggianese
Abstract:
The urban challenges posed by rapid population growth, climate adaptation, and sustainable living have compelled Dutch cities to reimagine their built environment and transportation systems. As a pivotal contributor to CO₂ emissions, the transportation sector in the Netherlands demands innovative solutions for the transition to low-carbon mobility. This study investigates the potential of transit-oriented development (TOD) as a strategy for achieving carbon reduction and sustainable urban transformation. Focusing on the Lombardijen station area in Rotterdam, which is targeted for significant densification, this paper presents a design-oriented exploration of a low-carbon transport node. By employing a research-by-design methodology, the study delves into multifaceted factors and scales, aiming to propose future scenarios for Lombardijen. A robust design framework emerges from a synthesis of existing literature, applied research, and practical insights. To inform this framework, governmental data concerning the built environment and material embodied carbon are harnessed. However, the restricted access to crucial datasets, such as property ownership information from the cadastre and embodied carbon data from De Nationale Milieudatabase, underscores the need for improved data accessibility, especially during the concept design phase. The findings of this research contribute fundamental insights not only to the Lombardijen case but also to TOD studies across Rotterdam's 13 nodes and similar global contexts. Spatial data related to property ownership facilitated the identification of potential densification sites, underscoring its importance for informed urban design decisions. Additionally, the paper highlights the disparity between the essential role of embodied carbon data in environmental assessments for building permits and its limited accessibility due to proprietary barriers. Although this study lays the groundwork for sustainable urbanization through TOD-based design, it acknowledges an area of future research worthy of exploration: the socio-economic dimension. Given the complex socio-economic challenges inherent in the Lombardijen area, extending beyond spatial constraints, a comprehensive approach demands the integration of mobility infrastructure expansion, land-use diversification, programmatic enhancements, and climate adaptation. While the paper adopts a TOD lens, it refrains from an in-depth examination of issues concerning equity and inclusivity, opening doors for subsequent research to address these aspects, which are crucial for holistic urban development.
Keywords: Rotterdam Zuid, transport oriented development, carbon emissions, low-carbon design, cross-scale design, data-supported design
Procedia PDF Downloads 84
21365 Information Exchange Process Analysis between Authoring Design Tools and Lighting Simulation Tools
Authors: Rudan Xue, Annika Moscati, Rehel Zeleke Kebede, Peter Johansson
Abstract:
Successful building simulation and analysis inevitably require information exchange between multiple building information modeling (BIM) software packages. BIM information exchange based on IFC is widely used. However, Industry Foundation Classes (IFC) files are not always reliable, and information can get lost when using different software for modeling and simulations. In this research, interviews with lighting simulation experts and a case study provided by a company producing lighting devices have been the research methods used to identify the necessary steps and data for successful information exchange between lighting simulation tools and authoring design tools. Model creation, information exchange, and model simulation have been identified as key aspects for the success of information exchange. The paper concludes with recommendations for improved information exchange and more reliable simulations that take all the needed parameters into consideration.
Keywords: BIM, data exchange, interoperability issues, lighting simulations
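As a small illustration of checking what survives an IFC-based exchange, the sketch below uses the open-source ifcopenshell library (which the paper does not itself mention, so this is only one possible tooling choice) to list light fixtures and their exported property sets; the file name is a placeholder.

```python
# Sketch: inspect lighting-related entities in an IFC file with ifcopenshell.
# File name and entity choices are illustrative, not the paper's test set.
import ifcopenshell
import ifcopenshell.util.element

model = ifcopenshell.open("office_lighting.ifc")   # placeholder path
for fixture in model.by_type("IfcLightFixture"):
    psets = ifcopenshell.util.element.get_psets(fixture)  # property sets, if exported
    print(fixture.GlobalId, fixture.Name, list(psets.keys()))
```

Comparing the listed property sets before and after a round trip between authoring and simulation tools is one way to spot the information loss the abstract describes.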
Procedia PDF Downloads 242
21364 Climate Change and Sustainable Development among Agricultural Communities in Tanzania; An Analysis of Southern Highland Rural Communities
Authors: Paschal Arsein Mugabe
Abstract:
This paper examines sustainable development planning in the context of environmental concerns in rural areas of Tanzania. It challenges mainstream approaches to development, focusing instead upon transformative action for environmental justice. The goal is to help shape future sustainable development agendas in local government, international agencies, and civil society organisations. Research methods: The approach of the study is geographical but also involves various transdisciplinary elements, particularly from development studies, sociology and anthropology, management, geography, agriculture, and environmental science. The research methods included thematic and questionnaire interviews, and participatory tools such as focus group discussions, participatory research appraisal, and expert interviews for primary data. Secondary data were gathered through the analysis of land use/cover data and official documents on climate, agriculture, marketing, and health. Several earlier studies made in the area also provided an important reference base. Findings: The findings show that agricultural sustainability in Tanzania appears likely to deteriorate as a consequence of climate change. Noteworthy differences in impacts across households are also present, both by district and by income category. Food security cannot be explained by climate as the only influencing factor; a combination of the economic, political, and socio-cultural contexts of the community is crucial. In conclusion, it is worth knowing how people understand the relationship between climate change and their livelihoods.
Keywords: agriculture, climate change, environment, sustainable development
Procedia PDF Downloads 325
21363 Gravity and Magnetic Survey, Modeling and Interpretation in the Blötberget Iron-Oxide Mining Area of Central Sweden
Authors: Ezra Yehuwalashet, Alireza Malehmir
Abstract:
The Blötberget mining area in central Sweden, part of the Bergslagen mineral district, has been well known for its various types of mineralization, particularly iron-oxide deposits, since the 1600s. To shed light on the host rock structures and the depth extent and tonnage of the mineral deposits, and to support the deep mineral exploration potential in the study area, new ground gravity data and existing aeromagnetic data (from the Geological Survey of Sweden) were used for interpretation and modelling. A major boundary separating a gravity low from a gravity high in the southern part of the study area is noticeable and likely represents a fault boundary separating two different lithological units. Gravity data and modeling offer a possible new target area southeast of the known mineralization while suggesting an excess high-density region down to 800 m depth.
Keywords: gravity, magnetics, ore deposit, geophysics
Procedia PDF Downloads 66
21362 Prediction of Damage to Cutting Tools in an Earth Pressure Balance Tunnel Boring Machine EPB TBM: A Case Study L3 Guadalajara Metro Line (Mexico)
Authors: Silvia Arrate, Waldo Salud, Eloy París
Abstract:
The wear of cutting tools is one of the most decisive elements when planning tunneling works, programming maintenance stops, and keeping an optimum stock of spare parts as the excavation evolves. Being able to predict the behavior of cutting tools can give a very competitive advantage in terms of costs and excavation performance, optimized to the needs of the TBM itself. The remarkable evolution of data science in recent years makes it possible to analyze the key and most critical machinery parameters in order to know how the cutting head is performing against the excavated ground. Taking Metro Line 3 of Guadalajara in Mexico as a case study, the feasibility of using Specific Energy versus data science applied to parameters of Torque, Penetration, and Contact Force, among others, is developed to predict the behavior and status of the cutting tools. The results obtained through both techniques are analyzed and verified against the wear and the field situations observed during the excavation in order to determine their effectiveness and predictive capacity. In conclusion, the possibilities and improvements offered by digital tools and programmed calculation algorithms for analyzing the wear of cutting head elements, compared to purely empirical methods, allow early detection of possible damage to cutting tools, which is reflected in optimized excavation performance and a significant improvement in costs and deadlines.
Keywords: cutting tools, data science, prediction, TBM, wear
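For context, a commonly used definition of specific energy in rotary excavation is Teale's formulation, sketched below; whether the authors use this exact form is not stated in the abstract, so it is offered only as a reference point.

```latex
SE \;=\; \frac{F}{A} \;+\; \frac{2\pi N T}{A \cdot ROP}
```

Here F is the thrust (contact force), A the excavated cross-sectional area, N the cutterhead rotation speed, T the torque, and ROP the rate of penetration; rising SE under constant ground conditions is one indicator of tool wear.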
Procedia PDF Downloads 49
21361 Landslide Hazard Zonation Using Satellite Remote Sensing and GIS Technology
Authors: Ankit Tyagi, Reet Kamal Tiwari, Naveen James
Abstract:
Landslides are the major geo-environmental problem of the Himalaya because of its high ridges, steep slopes, deep valleys, and complex system of streams. They are mainly triggered by rainfall and earthquakes and cause severe damage to life and property. In Uttarakhand, the Tehri reservoir rim area, situated in the Lesser Himalaya of the Garhwal hills, was selected for landslide hazard zonation (LHZ). The study utilized different types of data, including geological maps, topographic maps from the Survey of India, Landsat 8 imagery, and Cartosat DEM data. This paper presents the use of a weighted overlay method in LHZ using fourteen causative factors. The data layers generated and co-registered were slope, aspect, relative relief, soil cover, intensity of rainfall, seismic ground shaking, seismic amplification at surface level, lithology, land use/land cover (LULC), normalized difference vegetation index (NDVI), topographic wetness index (TWI), stream power index (SPI), drainage buffer, and reservoir buffer. Seismic analysis is performed using peak horizontal acceleration (PHA) intensity and amplification factors in the evaluation of the landslide hazard index (LHI). Several digital image processing techniques, such as topographic correction, NDVI, and supervised classification, were widely used in the process of terrain factor extraction. Lithological features, LULC, drainage patterns, lineaments, and structural features are extracted using digital image processing techniques. Colour, tone, topography, and stream drainage patterns from the imagery are used to analyse geological features. The slope map, aspect map, and relative relief are created using Cartosat DEM data. DEM data is also used for the detailed drainage analysis, which includes TWI, SPI, drainage buffer, and reservoir buffer. In the weighted overlay method, the comparative importance of the causative factors is obtained from experience. In this method, after multiplying the influence factor by the corresponding rating of a particular class, the result is reclassified, and the LHZ map is prepared. Further, based on the land-use map developed from remote sensing images, a landslide vulnerability study for the study area is carried out and presented in this paper.
Keywords: weighted overlay method, GIS, landslide hazard zonation, remote sensing
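A minimal sketch of the weighted overlay computation described above: each causative-factor raster is reclassified to a rating, multiplied by its influence weight, and summed into a landslide hazard index. The weights, rating breaks, and two-factor example are illustrative assumptions; the study itself uses fourteen factors.

```python
# Sketch: weighted overlay of reclassified factor rasters into an LHI grid.
# Weights and rating breaks are illustrative, not the study's values.
import numpy as np

slope_deg = np.array([[5.0, 25.0], [40.0, 60.0]])        # toy slope raster
rainfall_mm = np.array([[800.0, 1200.0], [1600.0, 2000.0]])

def reclassify(raster, breaks):
    """Map raster values to ratings 1..len(breaks)+1 by threshold."""
    return np.digitize(raster, breaks) + 1

layers = {
    "slope": (reclassify(slope_deg, [15, 30, 45]), 0.30),     # (ratings, weight)
    "rainfall": (reclassify(rainfall_mm, [1000, 1500]), 0.20),
    # ... the remaining twelve factors in the actual study
}

lhi = sum(weight * ratings for ratings, weight in layers.values())
print(lhi)   # higher index -> higher landslide hazard class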
Procedia PDF Downloads 133
21360 The Influence of Guided and Independent Training Toward Teachers’ Competence to Plan Early Childhood Education Learning Program
Authors: Sofia Hartati
Abstract:
This research is aimed at describing training in early childhood education programs empirically, describing teachers' ability to plan lessons empirically, and acquiring and analyzing empirical data on the influence of guided and independent training on teachers' competence in planning early childhood learning programs. The method used is experimental. Data were collected from a population of 76 early childhood educators in the Tunjung Teja sub-district through a random sampling technique, grouped into 38 people in an experiment class and 38 people in a control class. The technique used for data collection is a test. The result of the research shows that there is a significant influence of guided training on teachers' ability to plan an early childhood learning program. Guided training has been proven to improve the ability to understand the planning of a learning program. This ability, a part of teachers' pedagogic competence, comprises 1) determining the characteristics and competence of students prior to learning; 2) formulating the objectives of the learning; 3) selecting materials and their sequence; 4) selecting teaching methods; 5) determining the means or learning media; and 6) selecting an evaluation strategy. The results also describe a difference in competence level: teachers who joined the guided training scored relatively higher than teachers who joined the independent training. Guided training is an effective way to improve the knowledge and competence of early childhood educators.
Keywords: competence, planning, teachers, training
Procedia PDF Downloads 264
21359 DWDM Network Implementation in the Honduran Telecommunications Company "Hondutel"
Authors: Tannia Vindel, Carlos Mejia, Damaris Araujo, Carlos Velasquez, Darlin Trejo
Abstract:
DWDM (Dense Wavelength Division Multiplexing) is in constant growth around the world, driven by consumer demand. From its inception arises the need for a system that enables the expansion of the communications of an entire nation and improves the computing trends of its society according to its customs and geographical location. The Honduran Company of Telecommunications (HONDUTEL) provides internet and data transport services over PDH and SDH technology, which represents, in the Republic of Honduras C.A., the viable option for the consumer in terms of purchase value and ease of acquisition; however, it lacks efficiency in terms of technological advancement and represents an obstacle that limits long-term socio-economic development in comparison with other countries in the region, as well as the ability to establish competition between the telecommunications companies engaged in this business. For that reason, we propose to adopt a technology already implemented in Europe and apply it in our country to provide broadband data transfer, namely DWDM. In this way, we will have a stable, high-quality service that will allow us to compete in this globalized world. Once implemented, the DWDM is built upon existing resources, such as the equipment in use, and will open a new stage, providing a business image for the Republic of Honduras C.A. as a nation and ensuring data transport and broadband internet in a meaningful way. This benefits, in the first instance, existing customers and all the public and private institutions that require such services.
Keywords: demultiplexers, light detectors, multiplexers, optical amplifiers, optical fibers, PDH, SDH
Procedia PDF Downloads 263
21358 A Study on Automotive Attack Database and Data Flow Diagram for Concretization of HEAVENS: A Car Security Model
Authors: Se-Han Lee, Kwang-Woo Go, Gwang-Hyun Ahn, Hee-Sung Park, Cheol-Kyu Han, Jun-Bo Shim, Geun-Chul Kang, Hyun-Jung Lee
Abstract:
In recent years, with the advent of smart cars and the expansion of the market, the announcement of 'Adventures in Automotive Networks and Control Units' at the DEFCON 21 conference in 2013 revealed that cars are not safe from hacking. As a result, the HEAVENS model, which considers not only the functional safety of the vehicle but also its security, has been suggested. However, the HEAVENS model only presents a simple process; there are no detailed procedures and activities for each process, making it difficult to apply to actual vehicle security vulnerability checks. In this paper, we propose an automotive attack database that systematically summarizes attack vectors, attack types, and vulnerable vehicle models to prepare for various car hacking attacks, together with data flow diagrams that can detect various vulnerabilities, and we suggest a way to materialize the HEAVENS model.
Keywords: automotive security, HEAVENS, car hacking, security model, information security
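A minimal sketch of what a record in such an attack database might look like; the field names and the example entry are assumptions for illustration, not the authors' schema.

```python
# Sketch: one possible record structure for an automotive attack database.
# Field names are illustrative, not the schema proposed in the paper.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class AttackRecord:
    attack_vector: str                  # e.g. "OBD-II port", "cellular", "Bluetooth"
    attack_type: str                    # e.g. "CAN message injection"
    vulnerable_models: List[str] = field(default_factory=list)
    year_reported: Optional[int] = None

db = [
    AttackRecord("OBD-II port", "CAN message injection",
                 ["example-model-A"], 2013),
]
print([r.attack_type for r in db if r.attack_vector == "OBD-II port"])
```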
Procedia PDF Downloads 362
21357 Relationship between Growth of Non-Performing Assets and Credit Risk Management Practices in Indian Banks
Authors: Sirus Sharifi, Arunima Haldar, S. V. D. Nageswara Rao
Abstract:
The study attempts to analyze the impact of the credit risk management practices of Indian scheduled commercial banks on their non-performing assets (NPAs). Data on credit risk practices were collected by administering a questionnaire to risk managers/executives at different banks. Data on NPAs (from 2012 to 2016) are sourced from Prowess, a database compiled by the Centre for Monitoring Indian Economy (CMIE). The model was estimated using a cross-sectional regression method. As expected, the findings suggest that there is a negative relationship between credit risk management and NPA growth in Indian banks. The study has implications for Indian banks given their high level of losses and the implementation of Basel III norms by the central bank, i.e., the Reserve Bank of India (RBI). It provides evidence on credit risk management in Indian banks and its relationship with the non-performing assets held by them.
Keywords: credit risk, identification, Indian Banks, NPAs, ownership
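A hedged sketch of the kind of cross-sectional regression the abstract describes, relating NPA growth to a questionnaire-based credit-risk-practice score; the variable names, the control, and the synthetic data are illustrative assumptions.

```python
# Sketch: cross-sectional OLS of NPA growth on a credit risk practice score.
# Variable names and data are placeholders, not the study's dataset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
crm_score = rng.uniform(1, 5, size=40)            # questionnaire-based score
bank_size = rng.normal(0, 1, size=40)             # illustrative control
npa_growth = 0.10 - 0.03 * crm_score + 0.01 * bank_size + rng.normal(0, 0.02, 40)

X = sm.add_constant(np.column_stack([crm_score, bank_size]))
result = sm.OLS(npa_growth, X).fit()
print(result.summary())   # a negative coefficient on crm_score mirrors the finding
```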
Procedia PDF Downloads 408
21356 Optimization of Reaction Parameters' Influences on Production of Bio-Oil from Fast Pyrolysis of Oil Palm Empty Fruit Bunch Biomass in a Fluidized Bed Reactor
Authors: Chayanoot Sangwichien, Taweesak Reungpeerakul, Kyaw Thu
Abstract:
Oil palm mills in Southern Thailand produce a large amount of solid biomass waste. Lignocellulosic biomass is the main source for the production of biofuel, which can be blended with or used as an alternative to fossil fuels. Biomass is composed of three main constituents: cellulose, hemicellulose, and lignin. Thermochemical conversion processes are applied to produce biofuel from biomass, and pyrolysis is the best way to thermochemically convert biomass into pyrolytic products (bio-oil, gas, and char). Operating parameters play an important role in optimizing the product yields from fast pyrolysis of biomass. The present work concerns the modeling of reaction kinetics parameters for fast pyrolysis of empty fruit bunch in a fluidized bed reactor. A global kinetic model is used to predict the product yields from fast pyrolysis of empty fruit bunch. The reaction temperature and vapor residence time are the parameters that mainly affect the product yields of EFB pyrolysis; their effects are considered at reaction temperatures in the range of 450-500˚C and at vapor residence times of up to 2 s. The optimum simulated bio-oil yield of 53 wt.% was obtained at a reaction temperature and vapor residence time of 450˚C and 2 s, and of 500˚C and 1 s, respectively. The simulated data are in good agreement with the reported experimental data and can be applied to the design of experimental work on the fast pyrolysis of biomass.
Keywords: kinetics, empty fruit bunch, fast pyrolysis, modeling
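A hedged sketch of a global kinetic scheme of the kind the abstract invokes: biomass decomposes through three parallel first-order Arrhenius reactions into gas, bio-oil, and char. The rate parameters below are placeholders, not the paper's fitted EFB values, so the computed yields are illustrative only.

```python
# Sketch: three parallel first-order reactions, biomass -> gas / bio-oil / char.
# Arrhenius parameters are placeholders, not fitted EFB values.
import numpy as np
from scipy.integrate import solve_ivp

R = 8.314                                   # J/(mol K)
A = np.array([1.3e8, 2.0e8, 1.1e7])         # pre-exponential factors, 1/s
E = np.array([140e3, 133e3, 121e3])         # activation energies, J/mol
T = 723.15                                  # 450 C in kelvin
k = A * np.exp(-E / (R * T))                # rate constants at T

def rhs(t, y):
    biomass, gas, oil, char = y
    return [-k.sum() * biomass, k[0] * biomass, k[1] * biomass, k[2] * biomass]

sol = solve_ivp(rhs, (0.0, 2.0), [1.0, 0.0, 0.0, 0.0])   # 2 s residence time
print("mass fractions at t=2 s (gas, oil, char):", sol.y[1:, -1])
```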
Procedia PDF Downloads 215
21355 Nutritional Status of Morbidly Obese Patients Prior to Bariatric Surgery
Authors: Azadeh Mottaghi, Reyhaneh Yousefi, Saeed Safari
Abstract:
Background: Bariatric surgery is widely proposed as the most effective approach to mitigating the growing pace of morbid obesity. As bariatric surgery candidates suffer from pre-existing nutritional deficiencies, it is of great importance to assess the nutritional status of candidates before surgery in order to establish appropriate nutritional interventions. Objectives: The present study assessed and presents baseline data on nutritional status among candidates for bariatric surgery. Methods: A cross-sectional analysis of pre-surgery data was conducted on 170 morbidly obese patients undergoing bariatric surgery between October 2017 and February 2018. Dietary intake data (evaluated through a 147-item food frequency questionnaire), anthropometric measures, and biochemical parameters were assessed. Results: Participants included 145 females and 25 males with an average age of 37.3 ± 10.2 years, a BMI of 45.7 ± 6.4 kg/m², and a reported total of 72.3 ± 22.2 kg excess body weight. The most common nutritional deficiencies concerned iron, ferritin, transferrin, albumin, vitamin B12, and vitamin D, whose prevalence in the study population was 6.5%, 6.5%, 3%, 2%, 17.6%, and 66%, respectively. Mean energy, protein, fat, and carbohydrate intakes were 3887.3 ± 1748.3 kcal/day, 121.6 ± 57.1, 144.1 ± 83.1, and 552.4 ± 240.5 g/day, respectively. The study population consumed lower levels of iron, calcium, folic acid, and vitamin B12 compared to the Dietary Reference Intake (DRI) recommendations (2%, 26%, 2.5%, and 13%, respectively). Conclusion: Given the poor dietary quality of bariatric surgery candidates, leading to nutritional deficiencies pre-operatively, close monitoring and tailored supplementation pre- and post-bariatric surgery are required.
Keywords: bariatric surgery, food frequency questionnaire, obesity, nutritional status
Procedia PDF Downloads 172
21354 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service
Authors: Lai Wenfang
Abstract:
This study describes how to use artificial intelligence (AI) technology to build a user-oriented platform for integrated archival services. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information and communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, or blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies' staff which record catalogues violate the transfer or destruction rules, but also use the model to find details hidden in the catalogues and advise NAA staff on whether the records should be kept, shortening the auditing time. The platform keeps all users' browsing trails so that it can predict what kinds of archives a user might be interested in, recommend search terms through visualization, and inform users of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a lot of time marking or removing personal data, classified data, etc., before archives are provided. To upgrade the archives access service process, the platform will use text recognition patterns to black out such data automatically; staff only need to correct errors and upload the final version, and as the platform learns, accuracy will increase. In short, the purpose of the platform is to advance the government's digital transformation and implement the vision of a service-oriented smart government.
Keywords: artificial intelligence, natural language processing, machine learning, visualization
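As a toy illustration of the automatic blackout step described above, the sketch below masks simple personal-data patterns in text with regular expressions; the patterns shown (an ID-number shape and a phone-number shape) are invented examples, whereas the NAA platform would use trained text recognition models.

```python
# Toy sketch: regex-based blackout of personal-data patterns in catalogue text.
# The patterns are invented examples; a production system would use trained models.
import re

PATTERNS = [
    re.compile(r"\b[A-Z]\d{9}\b"),          # hypothetical national ID shape
    re.compile(r"\b09\d{2}-\d{6}\b"),       # hypothetical phone number shape
]

def black_out(text: str, mask: str = "█") -> str:
    for pattern in PATTERNS:
        text = pattern.sub(lambda m: mask * len(m.group()), text)
    return text

print(black_out("Contact A123456789 at 0912-345678 for the record."))
```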
Procedia PDF Downloads 174
21353 Object Recognition System Operating from Different Type Vehicles Using Raspberry and OpenCV
Authors: Maria Pavlova
Abstract:
Nowadays, it is possible to mount a camera on different vehicles such as a quadcopter, train, airplane, etc. The camera can also be the input sensor in many different systems, which means that object recognition, as an inseparable part of monitoring and control, can be a key part of the most intelligent systems. The aim of this paper is to focus on the object recognition process during vehicle movement. While the vehicle moves, the camera takes pictures of the environment without storing them in a database. If the camera detects a special object (for example, a human or an animal), the system saves the picture and sends it to the work station in real time. This functionality will be very useful in emergency or security situations where it is necessary to find a specific object. In another application, the camera can be mounted at a crossroad with few pedestrians; if one or more persons approach the road, the traffic lights turn green and they can cross. This paper presents a system that solves the aforementioned problems. The architecture of the object recognition system includes the camera, a Raspberry platform, a GPS system, a neural network, software, and a database. The camera in the system takes the pictures. Object recognition is done in real time using the OpenCV library and the Raspberry microcontroller. An additional feature of this library is the ability to attach the GPS coordinates of the captured object's position. The results of these processes are sent to a remote station, so we can know the location of the specific object. With a neural network, we can train the module to solve problems using incoming data and to be part of a bigger intelligent system. The present paper focuses on the design and integration of image recognition as a part of smart systems.
Keywords: camera, object recognition, OpenCV, Raspberry
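A minimal sketch of the person-detection step such a system could run on the Raspberry platform using OpenCV's built-in HOG pedestrian detector; GPS tagging and transmission to the work station are only stubbed as comments, since those parts depend on hardware not shown here.

```python
# Sketch: detect people in camera frames with OpenCV's HOG pedestrian detector.
# GPS tagging and transmission to the remote station are stubbed as comments.
import cv2

hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)                      # camera mounted on the vehicle
for _ in range(1000):                          # bounded loop for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
    if len(boxes) > 0:
        cv2.imwrite("detection.jpg", frame)    # save only when a person is found
        # here: attach GPS coordinates and send the image to the work station
cap.release()
```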
Procedia PDF Downloads 218
21352 A Study on an Evacuation Test to Measure Delay Time in Using an Evacuation Elevator
Authors: Kyungsuk Cho, Seungun Chae, Jihun Choi
Abstract:
Elevators are examined as one of the evacuation methods in super-tall buildings. However, data on the use of elevators for evacuation during a fire are extremely scarce. Therefore, a test to measure delay time in using an evacuation elevator was conducted. In the test, the time taken to get on and off an elevator was measured, and cases in which people gave up boarding when the capacity of the elevator was exceeded were also taken into consideration. 170 men and women participated in the test, 130 of whom were young people (20-50 years old) and 40 of whom were senior citizens (over 60 years old). The capacity of the elevator was 25 people, and it travelled between the 2nd and 4th floors. A video recording device was used to analyze the test. An elevator in an ordinary building, not a super-tall building, was used to measure delay times in getting on and off. In order to minimize interference from other elements, the elevator platforms on the 2nd and 4th floors were partitioned off. The elevator travelled between the 2nd and 4th floors, where people got on and off. If fewer than 20 people got on the empty elevator, the data were excluded. If the elevator carrying 10 passengers stopped and fewer than 10 new passengers got on, the data were excluded. Boarding of an empty elevator was observed 49 times. The average number of passengers was 23.7; it took 14.98 seconds for the passengers to get on the empty elevator, and the load factor was 1.67 N/s. It took the passengers, whose average number was 23.7, 10.84 seconds to get off the elevator, and the unload factor was 2.33 N/s. When an elevator's capacity is exceeded, the excess passengers have to get off; the time taken for this and the probability of the case were measured in the test. Capacity was exceeded in 37% of boardings. As the number of people who gave up boarding increased, the load factor of the ride decreased. When 1 person gave up boarding, the load factor was 1.55 N/s; this case was observed 10 times, 12.7% of the total. When 2 people gave up boarding, the load factor was 1.15 N/s; this case was observed 7 times, 8.9% of the total. When 3 people gave up boarding, the load factor was 1.26 N/s; this case was observed 4 times, 5.1% of the total. When 4 people gave up boarding, the load factor was 1.03 N/s; this case was observed 5 times, 6.3% of the total. Getting-on and getting-off time data for people who can walk freely were obtained from the test. In addition, quantitative results were obtained on the relation between the number of people giving up boarding and the time taken to get on. This work was supported by the National Research Council of Science & Technology (NST) grant by the Korea government (MSIP) (No. CRC-16-02-KICT).
Keywords: evacuation elevator, super tall buildings, evacuees, delay time
Procedia PDF Downloads 177
21351 Investigating the Effect of Brand Equity on Competitive Advantage in the Banking Industry
Authors: Rohollah Asadian Kohestani, Nazanin Sedghi
Abstract:
As the number of banks and financial institutions working in Iran has significantly increased, attracting and retaining customers and encouraging them to continually use modern banking services have become important and vital issues. Therefore, there would be serious competition without a deep understanding of consumers and a fit between banking services and their needs in the current economic conditions of Iran. It should be noted that concepts such as 'brand equity' are defined from the consumers' point of view; however, brand equity is also a focus for shareholders, competitors, and other beneficiaries of a firm, in addition to the bank and its consumers. This study examines the impact of brand equity on competitive advantage in the banking industry, as intensive competition between the brands of different banks leads to more attention being paid to the brands. The research is based on Aaker's model, examining the impact of four dimensions of brand equity on the competitive advantage of private banks in Behshahr city. It is applied research carried out with a descriptive method of data analysis. Data collection was done using a literature review and a questionnaire. Simple random sampling was used for bank staff, while questionnaires were distributed among the staff and consumers of five private banks: Tejarat, Mellat, Refah K., Ghavamin, and Tose'e Ta'avon. Results show that there is a significant relationship between brand equity and competitive advantage. In this research, SPSS 16 and LISREL 8.5 software, as well as different methods of descriptive and inferential statistics, were employed for analyzing data and testing hypotheses.
Keywords: brand awareness, brand loyalty, brand equity, competitive advantage
Procedia PDF Downloads 138
21350 VANETs Geographic Routing Protocols: A Survey
Authors: Ramin Karimi
Abstract:
Vehicular ad hoc networks (VANETs) are among the most common highly mobile wireless ad hoc networks, and routing in them has attracted much attention during the last few years. A VANET is characterized by high node mobility and specific topology patterns. Moreover, these networks encounter a significant loss rate and very short communication durations. In vehicular ad hoc networks, one of the challenges is the routing of data due to high-speed mobility and the changing topology of vehicles. Geographic routing protocols are becoming popular due to the advancement and availability of GPS devices. Delay-tolerant networks (DTNs) are a class of networks that enable communication where connectivity issues exist, such as sparse or intermittent connectivity, high latency, long delays, high error rates, asymmetric data rates, and even the absence of end-to-end connectivity. In this paper, we review the existing geographic routing protocols for VANETs and also provide a qualitative comparison of them.
Keywords: vehicular ad hoc networks, mobility, geographic routing, delay tolerant networks
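As an illustration of the family of protocols surveyed, greedy geographic forwarding (the core of protocols such as GPSR) picks, at each hop, the neighbor geographically closest to the destination; the sketch below shows that selection rule with made-up vehicle coordinates.

```python
# Sketch: greedy geographic forwarding - pick the neighbor closest to the
# destination, or report a local maximum if no neighbor improves on us.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(me, neighbors, dest):
    best = min(neighbors, key=lambda n: dist(n, dest), default=None)
    if best is None or dist(best, dest) >= dist(me, dest):
        return None          # local maximum: a recovery mode would take over
    return best

# made-up vehicle positions
print(next_hop((0, 0), [(50, 10), (30, -40), (-20, 5)], (200, 0)))
```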
Procedia PDF Downloads 520
21349 The Effects of Corporate Governance on Firm’s Financial Performance: A Study of Family and Non-family Owned Firms in Pakistan
Authors: Saad Bin Nasir
Abstract:
This research will examine the impact of corporate governance on firm performance in family and non-family owned firms in Pakistan. For the purpose of this research, the corporate governance mechanisms included, namely board size, board composition, leadership structure, and board meetings, are taken as independent variables, and firm performance is taken as the dependent variable, measured by return on assets and return on equity. Firm size and firm age are taken as control variables. Secondary data will be collected from the audited annual reports of companies, and a panel data regression model will be applied to check the impact of corporate governance on firm performance.
Keywords: board size, board composition, leadership structure, board meetings, firm performance, family and non-family owned firms
Procedia PDF Downloads 373
21348 Time Series Modelling and Prediction of River Runoff: Case Study of Karkheh River, Iran
Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh
Abstract:
The rainfall-runoff phenomenon is a chaotic and complex outcome of nature which requires sophisticated modelling and simulation methods for explanation and use. Time series modelling allows runoff data analysis and can be used as a forecasting tool. In this paper, an attempt is made to model river runoff data and predict the future behavioural pattern of the river based on past observations of annual river runoff. The runoff analysis and prediction are done using an ARIMA model. For evaluating the efficiency of prediction of hydrological events such as rainfall and runoff, the applicable statistical formulae are used. The good agreement between predicted and observed river runoff, in terms of the coefficient of determination (R²), shows that ARIMA (4,1,1) is the suitable model for predicting Karkheh River runoff in Iran.
Keywords: time series modelling, ARIMA model, river runoff, Karkheh River, CLS method
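A hedged sketch of fitting the ARIMA(4,1,1) model named above with statsmodels; the synthetic series stands in for the Karkheh annual runoff record, which is not given in the abstract.

```python
# Sketch: fit ARIMA(4,1,1) to an annual runoff series and forecast ahead.
# The series here is synthetic; the study uses Karkheh River observations.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
runoff = 200 + np.cumsum(rng.normal(0, 5, size=50))   # synthetic annual series

result = ARIMA(runoff, order=(4, 1, 1)).fit()
print(result.summary())
print(result.forecast(steps=5))    # five-year-ahead runoff forecast
```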
Procedia PDF Downloads 341
21347 Outreach Intervention Addressing Crack Cocaine Addiction in Users with Co-Occurring Opioid Use Disorder
Authors: Louise Penzenstadler, Tiphaine Robet, Radu Iuga, Daniele Zullino
Abstract:
Context: The outpatient clinic of the psychiatric addiction service of Geneva University Hospital has been providing support to individuals affected by various narcotics for 30 years. However, the increasing consumption of crack cocaine in Geneva has presented a new challenge for the healthcare system. Research Aim: The aim of this research is to evaluate the impact of an outreach intervention on crack cocaine addiction in users with co-occurring opioid use disorder. Methodology: The research utilizes a combination of quantitative and qualitative retrospective data analysis to evaluate the effectiveness of the outreach intervention. Findings: The data collected from October 2023 to December 2023 show that the outreach program successfully made 1,071 contacts with drug users and led to 15 new requests for care and enrollment in treatment. Patients expressed high satisfaction with the intervention, citing easy and rapid access to treatment and social support. Theoretical Importance: This research contributes to the understanding of the challenges and specific needs of a complex group of drug users who face severe health problems. It highlights the importance of outreach interventions in establishing trust, connecting users with care, and facilitating medication-assisted treatment for opioid addiction. Data Collection: Data was collected through the outreach program's interactions with drug users, including street outreach interventions and presence at locations frequented by users. Patient satisfaction surveys were also utilized. Analysis Procedures: The collected data was analyzed using both quantitative and qualitative methods. The quantitative analysis involved examining the number of contacts made, new requests for care, and treatment enrollment. The qualitative analysis focused on patient satisfaction and their perceptions of the intervention. Questions Addressed: The research addresses the following questions: What is the impact of an outreach intervention on crack cocaine addiction in users with co-occurring opioid use disorder? How effective is the outreach program in connecting drug users with care and initiating medication-assisted treatment? Conclusion: The outreach program has proven to be an effective intervention in establishing trust with crack users, connecting them with care, and initiating medication-assisted treatment for opioid addiction. It has also highlighted the importance of addressing the specific challenges faced by this group of drug users.
Keywords: crack addiction, outreach treatment, peer intervention, polydrug use
Procedia PDF Downloads 64
21346 Young People and Their Parents Accessing Their Digital Health Data via a Patient Portal: The Ethical and Legal Implications
Authors: Pippa Sipanoun, Jo Wray, Kate Oulton, Faith Gibson
Abstract:
Background: With rapidly evolving digital health innovation, there is a need for digital health transformation that is accessible and sustainable, and that demonstrates utility for all stakeholders while maintaining data safety. Great Ormond Street Hospital for Children aimed to future-proof the hospital by transitioning to an electronic patient record (EPR) system with a tethered patient portal (MyGOSH) in April 2019. The MyGOSH patient portal enables patients aged 12 years or older (with their parents' consent) to access their digital health data. This includes access to results, documentation, and appointments, which facilitates communication with their care team. As part of the Going Digital Study, conducted between 2018 and 2021, data were collected from a sample of all relevant stakeholders before and after EPR and MyGOSH implementation. Data collection reach was wide and included the hospital legal and ethics teams. Aims: This study aims to understand the ethical and legal implications of young people and their parents accessing their digital health data. Methods: A focus group was conducted. Recruited participants were members of the Great Ormond Street Hospital Paediatric Bioethics Centre. Participants included expert and lay members of the Committee from a variety of professional and academic disciplines. Written informed consent was provided by all participants (n=7). The focus group was recorded, transcribed verbatim, and analyzed using thematic analysis. Results: Six themes were identified: access, competence and capacity - granting access to the system; inequalities in access resulting in inequities; burden, uncertainty and responding to change - managing expectations; documenting, risks and data safety; engagement, empowerment and understanding - how to use and manage personal information; and legal considerations and obligations. Discussion: If healthcare professionals are to empower young people to be more engaged in their care, the importance of including them in decisions about their health is paramount, especially as they approach the age of becoming the consenter for treatment. Complexities exist in assessing competence or capacity when granting system access, in disclosing sensitive information, and in maintaining confidentiality. Difficulties are also present in managing clinician burden, managing user expectations whilst providing an equitable service, and managing data in a way that meets professional and legal requirements. Conclusion: EPR and tethered-portal implementation at Great Ormond Street Hospital for Children was not only timely, given the need for a rapid transition to remote consultations during the COVID-19 pandemic, which would not have been possible had EPR/MyGOSH not been implemented, but also integral to the digital health revolution required in healthcare today. This study is highly relevant to understanding the complexities around young people and their parents accessing their digital health data and, although the focus of this research related to portal use and access, the findings translate to young people in the wider digital health context. Ongoing support is required for all relevant stakeholders following MyGOSH patient portal implementation to navigate the ethical and legal complexities. Continued commitment is needed to balance the benefits and burdens, promote inclusion and equity, and ensure portal utility for patient benefit, whilst maintaining an individualized approach to care.
Keywords: patient portal, young people and their parents, ethical, legal
Procedia PDF Downloads 114
21345 Delivering Safer Clinical Trials; Using Electronic Healthcare Records (EHR) to Monitor, Detect and Report Adverse Events in Clinical Trials
Authors: Claire Williams
Abstract:
Randomised controlled trials (RCTs) of efficacy are still perceived as the gold standard for the generation of evidence, and whilst advances in data collection methods are well developed, this progress has not been matched for the reporting of adverse events (AEs). Assessment and reporting of AEs in clinical trials are fraught with human error and inefficiency and are extremely time and resource intensive. Recent research into the quality of AE reporting during clinical trials concluded that it is substandard and inconsistent. Investigators commonly send sponsors reports that are incorrectly categorised and lack critical information, which can complicate the detection of valid safety signals. In our presentation, we will describe an electronic data capture system designed to support clinical trial processes by reducing the resource burden on investigators, improving overall trial efficiencies, and making trials safer for patients. This proprietary technology was developed using expertise proven in the delivery of the world's first prospective, phase 3b real-world trial, 'The Salford Lung Study', which enabled robust safety monitoring and reporting processes to be accomplished by the remote monitoring of patients' EHRs. This technology enables safety alerts that are pre-defined by the protocol to be detected from the data extracted directly from the patient's EHR. Based on study-specific criteria, which are created from the standard definition of a serious adverse event (SAE) and the safety profile of the medicinal product, the system alerts the investigator or study team to the safety alert. Each safety alert requires a clinical review by the investigator or delegate; examples of the types of alerts include hospital admission, death, hepatotoxicity, neutropenia, and acute renal failure. This is achieved in near real time; safety alerts can be reviewed along with any additional information available to determine whether they meet the protocol-defined criteria for reporting or withdrawal. This active surveillance technology helps reduce the resource burden of the more traditional methods of AE detection for investigators and study teams and can help eliminate reporting bias. Integration of multiple healthcare data sources enables much more complete and accurate safety data to be collected as part of a trial and can also provide an opportunity to evaluate a drug's safety profile long-term, in post-trial follow-up. By utilising this robust and proven method for safety monitoring and reporting, much higher-risk patient cohorts can be enrolled in trials, thus promoting inclusivity and diversity. Broadening eligibility criteria and adopting more inclusive recruitment practices in the later stages of drug development will increase the ability to understand the medicinal product's risk-benefit profile across the patient population that is likely to use the product in clinical practice. Furthermore, this ground-breaking approach to AE detection not only provides sponsors with better-quality safety data for their products, but also reduces the resource burden on the investigator and study teams. With the data taken directly from the source, trial costs are reduced, minimal data validation is required, and near real-time reporting enables safety concerns and signals to be detected more quickly than in a traditional RCT.
Keywords: more comprehensive and accurate safety data, near real-time safety alerts, reduced resource burden, safer trials
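A toy sketch of the kind of protocol-defined alert rule such a system evaluates against EHR data; the record fields and thresholds below are invented for illustration and are not the product's actual criteria.

```python
# Toy sketch: flag protocol-defined safety alerts from EHR-style records.
# Field names and thresholds are invented for illustration.
ALERT_RULES = {
    "hospital_admission": lambda r: r.get("admitted") is True,
    "neutropenia": lambda r: r.get("neutrophils_10e9_per_L", 10) < 1.0,
    "acute_renal_failure": lambda r: r.get("creatinine_umol_per_L", 0) > 354,
}

def detect_alerts(record: dict) -> list:
    return [name for name, rule in ALERT_RULES.items() if rule(record)]

ehr_record = {"admitted": False, "neutrophils_10e9_per_L": 0.7,
              "creatinine_umol_per_L": 120}
print(detect_alerts(ehr_record))   # -> ['neutropenia'], queued for clinical review
```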
Procedia PDF Downloads 85
21344 Machine Learning and Internet of Thing for Smart-Hydrology of the Mantaro River Basin
Authors: Julio Jesus Salazar, Julio Jesus De Lama
Abstract:
The fundamental objective of hydrological studies applied to the engineering field is to determine the statistically consistent volumes or water flows that, in each case, allow us to size or design a series of elements or structures to effectively manage and develop a river basin. To determine these values, there are several ways of working within the framework of traditional hydrology: (1) study each of the factors that influence the hydrological cycle, (2) study the historical behavior of the hydrology of the area, (3) study the historical behavior of hydrologically similar zones, and (4) other studies (rain simulators or experimental basins). Of course, this range of studies in a given basin is very varied and complex and presents the difficulty of collecting the data in real time. In this complex space, the study of the variables can only be mastered by collecting and transmitting data to decision centers through the Internet of Things and artificial intelligence. Thus, this research work implemented the learning project of the sub-basin of the Shullcas river in the Andean basin of the Mantaro river in Peru. The sensor firmware to collect and communicate hydrological parameter data was programmed and tested in similar basins of the European Union. The machine learning application was programmed to choose the algorithms that lead to the best solution for the determination of the rainfall-runoff relationship captured in the different polygons of the sub-basin. Tests were carried out in the mountains of Europe and in the sub-basins of the Shullcas river (Huancayo) and the Yauli river (Jauja), at altitudes close to 5000 m.a.s.l., giving the following conclusions: to guarantee correct communication, the distance between devices should not exceed 15 km. To minimize the energy consumption of the devices and avoid collisions between packets, distances should range between 5 and 10 km; in this way, the transmission power can be reduced and a higher bitrate can be used. If the communication elements of the network devices (Internet of Things) installed in the basin do not have good visibility between them, the distance should be reduced to the range of 1-3 km. The energy efficiency of the Atmel microcontrollers present in Arduino boards is not adequate to meet the requirements of system autonomy. To increase the autonomy of the system, it is recommended to use low-consumption systems, such as ultra-low-power ARM Cortex microcontrollers, together with high-performance DC-to-DC converters. The machine learning system has begun learning the Shullcas system to generate the best hydrology of the sub-basin, and this will improve as the machine learning models and the data accumulated in the big data store are reconciled every second. This will provide services to each of the applications of the complex system, returning the best data on the determined flows.
Keywords: hydrology, internet of things, machine learning, river basin
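A hedged sketch of the rainfall-runoff learning step described above: a regressor trained on rainfall and catchment features to predict flow. The features, model choice, and synthetic data are illustrative; the project itself selects among algorithms automatically.

```python
# Sketch: learn a rainfall-runoff relation with a random forest regressor.
# Features and data are placeholders for the sensor streams from the basin.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
rain_mm = rng.gamma(2.0, 5.0, size=500)              # rainfall sensor stream
antecedent_mm = rng.gamma(2.0, 8.0, size=500)        # prior-day wetness proxy
X = np.column_stack([rain_mm, antecedent_mm])
flow_m3s = 0.05 * rain_mm + 0.02 * antecedent_mm + rng.normal(0, 0.2, 500)

model = RandomForestRegressor(n_estimators=200, random_state=0)
print(cross_val_score(model, X, flow_m3s, cv=5, scoring="r2").mean())
```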
Procedia PDF Downloads 160
21343 Evaluation of the Effect of IMS on the Social Responsibility in the Oil and Gas Production Companies of National Iranian South Oil Fields Company (NISOC)
Authors: Kamran Taghizadeh
Abstract:
This study was aimed at evaluating the effect of an IMS, comprising an occupational health system, an environmental management system, and a safety and health system, on social responsibility (a case study of NISOC's oil and gas production companies). The study's objectives include evaluating the IMS situation and its effect on social responsibility, in addition to providing appropriate solutions based on the study's hypotheses as a basis for the future. Data collection was carried out by library and field studies as well as a questionnaire. Stratified random sampling was the sampling method, and a sample of 285 employees, together with the collected questionnaire data, was analyzed by inferential statistical methods using SPSS software. Finally, the results of the regression and the fitted model at a significance level of 5% confirmed all hypotheses, meaning that the IMS and its components have a significant effect on social responsibility.
Keywords: social responsibility, integrated management, oil and gas production companies, regression
Procedia PDF Downloads 256
21342 A Narrative of Nationalism in Mainstream Media: The US, China, and COVID-19
Authors: Rachel Williams, Shiqi Yang
Abstract:
Our research explores the influence nationalism has had on media coverage of the COVID-19 pandemic as it relates to China in the United States, through an inclusive qualitative analysis of two US news networks, Fox News and CNN. In total, the transcripts of sixteen videos uploaded to YouTube, each with more than 100,000 views, were gathered for data processing. Co-occurrence networks generated by KH Coder illuminate the themes and narratives underpinning the reports from Fox News and CNN. The results of in-depth content analysis with keywords suggest that the pandemic has been framed in an ethnopopulist nationalist manner, although to varying degrees between networks. Specifically, the authors found that Fox News is more likely to report hypotheses or statements as fact; by contrast, CNN is more likely to quote data and statements from official institutions. Future research into how nationalist narratives have developed in China and in other US news coverage, using a more systematic and quantitative method, could expand on these findings.
Keywords: nationalism, media studies, us and china, COVID-19, social media, communication studies
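As a small illustration of the co-occurrence analysis performed with KH Coder, the sketch below counts word pairs that appear in the same sentence, which is the raw material of a co-occurrence network; the two toy sentences stand in for the transcript corpus.

```python
# Sketch: sentence-level word co-occurrence counting, as in KH Coder's networks.
# The two toy sentences stand in for the transcript corpus.
from collections import Counter
from itertools import combinations

sentences = [
    "pandemic coverage frames china as a rival",
    "official data on the pandemic cited by network",
]

pairs = Counter()
for s in sentences:
    words = sorted(set(s.split()))
    pairs.update(combinations(words, 2))

for (w1, w2), n in pairs.most_common(5):
    print(w1, w2, n)     # edges of the co-occurrence network
```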
Procedia PDF Downloads 58