Search results for: cloud inference
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 966

126 Load Forecasting in Microgrid Systems with R and Cortana Intelligence Suite

Authors: F. Lazzeri, I. Reiter

Abstract:

Energy production optimization has traditionally been very important for utilities in order to optimize resource consumption. However, load forecasting is a challenging task, as there are a large number of relevant variables that must be considered, and several strategies have been used to deal with this complex problem. This is especially true in microgrids, where many elements have to adjust their performance depending on future generation and consumption conditions. The goal of this paper is to present a solution for short-term load forecasting in microgrids, based on three machine learning experiments developed in R and web services built and deployed with different components of Cortana Intelligence Suite: Azure Machine Learning, a fully managed cloud service that enables users to easily build, deploy, and share predictive analytics solutions; SQL Database, a Microsoft database service for app developers; and Power BI, a suite of business analytics tools to analyze data and share insights. Our results show that Boosted Decision Tree and Fast Forest Quantile regression methods can be very useful for predicting hourly short-term consumption in microgrids; moreover, we found that for these types of forecasting models, weather data (temperature, wind, humidity, and dew point) can play a crucial role in improving the accuracy of the forecasting solution. Data cleaning and feature engineering methods performed in R and different types of machine learning algorithms (Boosted Decision Tree, Fast Forest Quantile, and ARIMA) are presented, and results and performance metrics discussed.
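As a rough illustration of the modelling step described above (a generic scikit-learn stand-in, not the authors' Azure Machine Learning experiments; the synthetic data and column names are invented), a boosted quantile regression on lagged load plus weather features might look like this:

```python
# Minimal sketch: boosted quantile regression on lagged load and weather.
# Stand-in for the Azure ML experiments; all data below are synthetic.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

idx = pd.date_range("2016-01-01", periods=24 * 90, freq="h")
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "temperature": 20 + 5 * np.sin(np.arange(len(idx)) / 24),
    "wind": rng.gamma(2.0, 2.0, len(idx)),
    "humidity": rng.uniform(30, 90, len(idx)),
    "dew_point": 10 + rng.normal(0, 1, len(idx)),
}, index=idx)
df["load"] = (100 + 10 * np.sin(np.arange(len(idx)) * 2 * np.pi / 24)
              + 0.5 * df["temperature"] + rng.normal(0, 2, len(idx)))

df["load_lag24"] = df["load"].shift(24)  # same hour, previous day
df["hour"] = df.index.hour
data = df.dropna()
features = ["load_lag24", "hour", "temperature", "wind", "humidity", "dew_point"]
train, test = data.iloc[:-168], data.iloc[-168:]  # hold out the last week

# loss="quantile" yields a quantile regressor, loosely analogous to the
# Fast Forest Quantile experiment mentioned in the abstract.
model = GradientBoostingRegressor(loss="quantile", alpha=0.5)
model.fit(train[features], train["load"])
mae = np.mean(np.abs(model.predict(test[features]) - test["load"]))
print(f"hold-out MAE: {mae:.2f}")
```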

Keywords: time-series, feature engineering methods for forecasting, energy demand forecasting, Azure Machine Learning

Procedia PDF Downloads 278
125 Evaluation of Surface Roughness Condition Using App Roadroid

Authors: Diego de Almeida Pereira

Abstract:

The roughness index of a road is considered the most important parameter regarding pavement quality, as it has a close relation to the comfort and safety of road users. This condition can be established by means of a functional evaluation of pavement surface deviations, measured by the International Roughness Index (IRI), an index that emerged from the international evaluation of pavements coordinated by the World Bank; in Brazil, the current limit value for the acceptance of roads is 2.7 m/km. This work makes use of the e.IRI parameter, obtained by the Roadroid app for smartphones running the Android operating system. The application was chosen for its practical user interaction, its own cloud data storage, and the support given to universities all around the world. Data were collected for six months, once each month. The studies began in March 2018, in the rainy season, when precipitation worsens road conditions, which also provided the opportunity to follow the damage and the quality of the interventions performed. About 350 kilometers of sections of four federal highways were analyzed: BR-020, BR-040, BR-060, and BR-070, which connect the Federal District (where Brasilia is located) and its surroundings, chosen for their economic and tourist importance; two of them are under federal administration and two under private concession. Like much of the road network, the analyzed stretches are surfaced with Hot Mix Asphalt (HMA). Thus, the present research offers a contrastive discussion between the comfort and safety conditions of roads under private concession, on which users pay a fee to the concessionaires to travel on a road that meets the minimum requirements for usage, and the quality of service offered on the roads under Federal Government jurisdiction. Finally, data collected by the National Department of Transport Infrastructure (DNIT) by means of a laser profilometer are contrasted with data obtained by Roadroid, checking the applicability, practicality, and cost-effectiveness of the approach, considering the app's limitations.
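To make the acceptance criterion concrete, here is a small sketch (all segment values invented) of flagging stretches against the 2.7 m/km limit and checking agreement between the app and a reference profilometer:

```python
# Toy sketch: flag road segments against the Brazilian 2.7 m/km limit
# and compare app-based e.IRI with reference profilometer values.
# All numbers below are invented for illustration.
import numpy as np

e_iri = np.array([1.9, 2.4, 3.1, 2.8, 2.2])    # app-based e.IRI, m/km
ref_iri = np.array([2.0, 2.5, 3.0, 2.9, 2.1])  # reference IRI, m/km

LIMIT = 2.7  # m/km, acceptance limit cited in the abstract
print("segments over limit:", np.where(e_iri > LIMIT)[0])

# Pearson correlation as a simple agreement check between methods
r = np.corrcoef(e_iri, ref_iri)[0, 1]
print(f"correlation app vs reference: {r:.3f}")
```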

Keywords: roadroid, international roughness index, Brazilian roads, pavement

Procedia PDF Downloads 65
124 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia

Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman

Abstract:

Through progress in pavement design, a design method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed. Saudi Arabia's road and highway network is currently evolving as a result of increasing traffic volume, and the MEPDG is therefore being implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires the calibration of distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for calibration of the MEPDG in Central Saudi Arabia. Thus, the first goal is data collection for the design of flexible pavement from the local conditions of the Riyadh region. Since the collected data must be converted into input data, the main goal of this paper is the analysis of the collected data. The data analysis includes processing of: truck classification, traffic growth factor, Annual Average Daily Truck Traffic (AADTT), Monthly Adjustment Factors (MAFi), Vehicle Class Distribution (VCD), truck hourly distribution factors, Axle Load Distribution Factors (ALDF), the number of axle types (single, tandem, and tridem) per truck class, cloud cover percent, and the road sections selected for the local calibration. Detailed descriptions of the input parameters are given in this paper, providing an approach for successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.
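As a sketch of how two of these inputs are derived from raw classified counts (the counts below are invented, not the Riyadh-region data), the vehicle class distribution and monthly adjustment factors reduce to simple ratios:

```python
# Sketch: deriving two MEPDG traffic inputs from classified truck counts.
# All counts are invented; real inputs would come from count stations.
import numpy as np

# Annual truck counts by class (ten illustrative bins)
class_counts = np.array([1200, 5400, 800, 300, 9800, 6700, 450, 300, 150, 90])
vcd = class_counts / class_counts.sum()  # Vehicle Class Distribution
print("VCD:", np.round(vcd, 3))

# Monthly Adjustment Factors: monthly truck volume relative to the
# average month, so the twelve factors average to 1.0.
monthly_volume = np.array([80, 82, 90, 95, 100, 104,
                           108, 110, 101, 96, 88, 86], dtype=float)
maf = monthly_volume / monthly_volume.mean()
print("MAF:", np.round(maf, 2))
```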

Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh

Procedia PDF Downloads 208
123 A Structuring and Classification Method for Assigning Application Areas to Suitable Digital Factory Models

Authors: R. Hellmuth

Abstract:

The method of factory planning has changed considerably, especially with regard to planning the factory building itself. Factory planning has the task of designing products, plants, processes, organization, areas, and the building of a factory. Regular restructuring is becoming more important in order to maintain the competitiveness of a factory. Restrictions on new building areas, shorter life cycles of products and production technology, as well as a VUCA world (Volatility, Uncertainty, Complexity, and Ambiguity) lead to more frequent restructuring measures within a factory. A digital factory model is the planning basis for rebuilding measures and becomes an indispensable tool. Furthermore, digital building models are increasingly being used in factories to support facility management and manufacturing processes. The main research question of this paper is, therefore: What kind of digital factory model is suitable for the different areas of application during the operation of a factory? First, different types of digital factory models are investigated, and their properties and usability for use cases are analysed. The scope of the investigation covers point cloud models, building information models, and photogrammetry models, as well as versions of these enriched with sensor data; it is examined which digital models allow a simple integration of sensor data and where the differences lie. Subsequently, possible application areas of digital factory models are determined by means of a survey, and the respective digital factory models are assigned to the application areas. Finally, an application case from maintenance is selected and implemented with the help of the appropriate digital factory model. It is shown how a completely digitalized maintenance process can be supported by a digital factory model through the provision of information. Among other purposes, the digital factory model is used for indoor navigation, information provision, and the display of sensor data. In summary, the paper presents a structuring of digital factory models that concentrates on the geometric representation of a factory building and its technical facilities. A practical application case is shown and implemented. Thus, the systematic selection of digital factory models with the corresponding application cases is evaluated.

Keywords: building information modeling, digital factory model, factory planning, maintenance

Procedia PDF Downloads 91
122 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud

Authors: Sharda Kumari, Saiman Shetty

Abstract:

Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation that have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation, which indicates that a human is present in the vehicle and can disable automation, usually while the trucks are not engaged in highway driving. Completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people in other sectors to do their duties with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Applying analytics to CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, and thus has significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.

Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation

Procedia PDF Downloads 77
121 Optimizing Energy Efficiency: Leveraging Big Data Analytics and AWS Services for Buildings and Industries

Authors: Gaurav Kumar Sinha

Abstract:

In an era marked by increasing concerns about energy sustainability, this research endeavors to address the pressing challenge of energy consumption in buildings and industries. This study delves into the transformative potential of AWS services in optimizing energy efficiency. The research is founded on the recognition that effective management of energy consumption is imperative for both environmental conservation and economic viability. Buildings and industries account for a substantial portion of global energy use, making it crucial to develop advanced techniques for analysis and reduction. This study sets out to explore the integration of AWS services with big data analytics to provide innovative solutions for energy consumption analysis. Leveraging AWS's cloud computing capabilities, scalable infrastructure, and data analytics tools, the research aims to develop efficient methods for collecting, processing, and analyzing energy data from diverse sources. The core focus is on creating predictive models and real-time monitoring systems that enable proactive energy management. By harnessing AWS's machine learning and data analytics capabilities, the research seeks to identify patterns, anomalies, and optimization opportunities within energy consumption data. Furthermore, this study aims to propose actionable recommendations for reducing energy consumption in buildings and industries. By combining AWS services with metrics-driven insights, the research strives to facilitate the implementation of energy-efficient practices, ultimately leading to reduced carbon emissions and cost savings. The integration of AWS services not only enhances the analytical capabilities but also offers scalable solutions that can be customized for different building and industrial contexts. The research also recognizes the potential for AWS-powered solutions to promote sustainable practices and support environmental stewardship.
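The AWS-specific services are not reproduced here, but the kind of anomaly screening the abstract describes can be sketched generically with a rolling z-score over meter readings; in a deployed pipeline the series would come from cloud storage rather than being generated locally:

```python
# Generic sketch of energy-consumption anomaly detection via rolling
# z-score; a local stand-in for the cloud analytics pipeline above.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
idx = pd.date_range("2023-01-01", periods=24 * 30, freq="h")
kwh = pd.Series(50 + 10 * np.sin(np.arange(len(idx)) * 2 * np.pi / 24)
                + rng.normal(0, 2, len(idx)), index=idx)
kwh.iloc[400] += 40  # inject one anomalous spike for illustration

roll = kwh.rolling(window=168, min_periods=24)  # one-week window
z = (kwh - roll.mean()) / roll.std()
anomalies = kwh[z.abs() > 4]
print(anomalies)  # timestamps flagged for proactive energy management
```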

Keywords: energy consumption analysis, big data analytics, AWS services, energy efficiency

Procedia PDF Downloads 38
120 Frequent Pattern Mining for Digenic Human Traits

Authors: Atsuko Okazaki, Jurg Ott

Abstract:

Some genetic diseases (‘digenic traits’) are due to the interaction between two DNA variants. For example, certain forms of Retinitis Pigmentosa (a genetic form of blindness) occur in the presence of two mutant variants, one in the ROM1 gene and one in the RDS gene, while the occurrence of only one of these mutant variants leads to a completely normal phenotype. Detecting such digenic traits by genetic methods is difficult. A common approach to finding disease-causing variants is to compare hundreds of thousands of variants between individuals with a trait (cases) and those without the trait (controls). Such genome-wide association studies (GWASs) have been very successful but hinge on genetic effects of single variants; that is, there should be a difference in allele or genotype frequencies between cases and controls at a disease-causing variant. Frequent pattern mining (FPM) methods offer an avenue for detecting digenic traits even in the absence of single-variant effects. The idea is to enumerate pairs of genotypes (genotype patterns), with each of the two genotypes originating from different variants that may be located at very different genomic positions. What is needed is for genotype patterns to be significantly more common in cases than in controls. Let Y = 2 refer to cases and Y = 1 to controls, with X denoting a specific genotype pattern. We are seeking association rules, ‘X → Y’, with high confidence, P(Y = 2|X), significantly higher than the proportion of cases, P(Y = 2), in the study. Clearly, generally available FPM methods are very suitable for detecting disease-associated genotype patterns. We used fpgrowth as the basic FPM algorithm and built a framework around it to enumerate high-frequency digenic genotype patterns and to evaluate their statistical significance by permutation analysis. Application to a published dataset on opioid dependence furnished results that could not be found with classical GWAS methodology. There were 143 cases and 153 healthy controls, each genotyped for 82 variants in eight genes of the opioid system. The aim was to find out whether any of these variants were disease-associated. Single-variant analysis did not lead to significant results. Application of our FPM implementation resulted in one significant (p < 0.01) genotype pattern, with both genotypes in the pattern being heterozygous and originating from two variants on different chromosomes. This pattern occurred in 14 cases and none of the controls. Thus, the pattern seems quite specific to this form of substance abuse and is also rather predictive of disease. An algorithm called Multifactor Dimensionality Reduction (MDR) was developed some 20 years ago and has been in use in human genetics ever since. MDR and our algorithm share some similar properties, but they are also very different in other respects. The main difference seems to be that our algorithm focuses on patterns of genotypes, while the main object of inference in MDR is the 3 × 3 table of genotypes at two variants.
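A minimal sketch of the pattern-search step (assuming mlxtend's fpgrowth and invented toy genotypes, not the authors' exact framework or data), followed by a label-permutation check of a pattern's confidence:

```python
# Toy sketch of frequent-pattern mining for digenic genotype patterns.
# Columns are one-hot indicators "variant=genotype"; data are invented.
import numpy as np
import pandas as pd
from mlxtend.frequent_patterns import fpgrowth

rng = np.random.default_rng(1)
n = 300
X = pd.DataFrame({
    "v1=het": rng.random(n) < 0.3,
    "v2=het": rng.random(n) < 0.3,
    "v3=hom": rng.random(n) < 0.2,
})
y = (X["v1=het"] & X["v2=het"]) | (rng.random(n) < 0.4)  # True = case

# Enumerate genotype patterns frequent among the cases only.
patterns = fpgrowth(X[y], min_support=0.05, use_colnames=True)
pairs = patterns[patterns["itemsets"].apply(len) == 2]  # digenic patterns

def confidence(items, X, y):
    mask = X[list(items)].all(axis=1)
    return y[mask].mean()  # P(case | pattern)

for items in pairs["itemsets"]:
    obs = confidence(items, X, y)
    # Permutation p-value: how often shuffled labels give confidence
    # at least as high as observed.
    perm = [confidence(items, X,
                       pd.Series(rng.permutation(y.to_numpy()), index=y.index))
            for _ in range(999)]
    p = (1 + sum(c >= obs for c in perm)) / 1000
    print(set(items), f"confidence={obs:.2f} p={p:.3f}")
```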

Keywords: digenic traits, DNA variants, epistasis, statistical genetics

Procedia PDF Downloads 104
119 Democratic Information Behavior of Social Scientists and Policy Makers in India

Authors: Mallikarjun Vaddenkeri, Suresh Jange

Abstract:

This study reports the results of an information behaviour survey of faculty members and research scholars in various social science departments at universities (sample of 300) and of Members of the Legislative Assembly and Council (sample of 216) in Karnataka State, India. The results reveal that 29.3% and 20.3% of social scientists indicated medium and high levels of awareness of primary sources, with primary journals rated at scale levels 5 and 9, respectively. The use of primary journals by social scientists is found to be 28% at level 4, while 24% of the respondents reported a medium level of use of primary conference proceedings at level 5. Similar patterns appear for secondary information sources at scale levels 8 and 9, particularly for dictionaries (31.0% and 5.0%), encyclopaedias (22.3% and 6.3%), indexing periodicals (7.0% and 15.3%), and abstracting periodicals (5.7% and 20.7%). For searching journal literature available in CD-ROM versions, keywords (43.7%), followed by keywords with logical operators (39.7%), have been used to find the required information. Statistical inference reveals rejection of the null hypothesis 'there is no association between the designation of the respondents and awareness of primary information resources'. Regarding the educational qualifications of Legislative members, more than half possess a graduate degree (57.4%), 16.7% possess a post-graduate degree, 26.8% possess a degree in law, and just 1.8% possess a post-graduate degree in law. About 42.6% rated the importance of information required to discharge their duties and responsibilities as a policy maker at scale 8, as a scholar (27.8%) at scale 6, as a politician (64.8%) at scale 10, and as a councillor (51.9%) at scale 8. The information agencies/sources most often contacted for useful information are the staff of the Karnataka State Legislative Library, radio programmes, television programmes, and newspapers. Information is quite often obtained by sending assistants to libraries (35.2%) or by personally visiting the information source (64.8%). The null hypothesis 'there is no association between Members of the Legislature and opinion on the usefulness of the resources of the Karnataka State Legislature Library' is accepted using an F (ANOVA) test. The study concludes with a recommendation to revamp the existing library system in its structure, adopt the latest technologies, and educate and train social scientists and legislators in using these resources in the interest of academic work, government policy, and national decision making.
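The association test mentioned is a standard contingency-table test; as a generic illustration (with invented counts, not the study's data), it can be run as:

```python
# Illustrative chi-square test of association between respondent
# designation and awareness level; the counts below are invented.
from scipy.stats import chi2_contingency

table = [[40, 55, 25],   # faculty: low / medium / high awareness
         [70, 33, 12]]   # research scholars
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")  # small p -> reject H0
```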

Keywords: information use behaviour, government information, searching behaviour, policy makers

Procedia PDF Downloads 115
118 Monitoring Deforestation Using Remote Sensing and GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud-free and did not belong to the dry, leafless season. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR - Red) / (NIR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
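The NDVI computation and change comparison are direct array operations; a minimal sketch (the band values below are invented; in practice they come from the co-registered satellite scenes):

```python
# Minimal NDVI and change-detection sketch; `nir`/`red` stand for the
# near-infrared and red bands of two co-registered scenes.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    # NDVI = (NIR - Red) / (NIR + Red); guard against divide-by-zero
    return (nir - red) / np.clip(nir + red, 1e-6, None)

# Invented 2x2 reflectance values for two acquisition dates
nir_t1 = np.array([[0.50, 0.60], [0.55, 0.40]])
red_t1 = np.array([[0.10, 0.10], [0.12, 0.20]])
nir_t2 = np.array([[0.45, 0.30], [0.50, 0.20]])
red_t2 = np.array([[0.12, 0.25], [0.13, 0.30]])

delta = ndvi(nir_t2, red_t2) - ndvi(nir_t1, red_t1)
print("mean NDVI change:", delta.mean())  # negative -> biomass loss
```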

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1154
117 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle. It stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques in the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and the associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud- and aerosol-free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (NIR - Red) / (NIR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 374
116 Harvesting Energy from Lightning Strikes

Authors: Vaishakh Medikeri

Abstract:

Lightning, a marvelous, spectacular, and awesome phenomenon of nature, is one of the greatest energy sources left unharnessed through the ages. A single lightning bolt contains about 15 billion joules of energy. This huge amount of energy cannot be harnessed completely, but it can be partially. This paper proposes to harness the energy from lightning strikes. Across the globe, the frequency of lightning is 40-50 flashes per second, or about 1.4 billion flashes per year, each flash carrying an average energy of about 15 billion joules. When a lightning bolt strikes the ground, tremendous amounts of energy are transferred to the earth, propagating in the form of concentric circular energy waves. These waves have a frequency of about 7.83 Hz. Harvesting the lightning bolt directly seems impossible, but harvesting the energy waves produced by the lightning is considerably easier. This can be done using a tricoil energy harnesser, a new device that I have invented. We know that a lightning bolt seeks the path of minimum resistance down to the earth. For this, we can erect a lightning rod about 100 meters high and attach it to the tricoil energy harnesser, which contains three coils whose centers are collinear and which all lie parallel to the ground. The first coil has one of its ends connected to the lightning rod and the other end grounded. A secondary coil is wound on the first coil, with one end grounded and the other end pointing toward the ground, left unconnected, and placed slightly above the ground so that this end of the coil produces more intense currents, hence producing intense energy waves. The first coil produces very high magnetic fields and induces them in the second and third coils. Along with the magnetic fields induced by the first coil, the energy waves, which are currents, also flow through the second and third coils. The second and third coils are connected to a generator, which in turn is connected to a capacitor that stores the electrical energy. The first coil is placed in the middle of the second and third coils. The stored energy can be used for the transmission of electricity. This new technique of harnessing lightning strikes would be most efficient in places with a higher probability of lightning strikes. Since we are using a sufficiently long lightning rod, the probability of cloud-to-ground strikes is increased. If the proposed apparatus is implemented, it would be a great source of pure and clean energy.
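Taking the abstract's own figures at face value, the scale of the resource is simple arithmetic:

```python
# Back-of-envelope scale check using the figures quoted above.
flashes_per_year = 1.4e9   # global flashes per year
energy_per_flash = 15e9    # joules, average per bolt (as cited)
total = flashes_per_year * energy_per_flash
print(f"total energy: {total:.1e} J/yr")       # ~2.1e19 J/yr
# Even a 1% capture fraction would still be ~2.1e17 J/yr.
print(f"1% capture: {0.01 * total:.1e} J/yr")
```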

Keywords: generator, lightning rod, tricoil energy harnesser, harvesting energy

Procedia PDF Downloads 360
115 2016 Taiwan's 'Health and Physical Education Field of 12-Year Basic Education Curriculum Outline (Draft)' Reform and Its Implications

Authors: Hai Zeng, Yisheng Li, Jincheng Huang, Chenghui Huang, Ying Zhang

Abstract:

When children are strong, the country is strong; the development of children's basketball is thus a strategic advantage. Common forms of basketball equipment have had difficulty meeting the needs of teaching the game to young children. Developing teaching aids appropriate for children aged 3-6 is a way to break through the bottlenecks of teaching children's basketball and a critical path to making teaching more enjoyable, as well as a necessary requirement for the development of early-childhood basketball. Using literature review, questionnaires, focus group interviews, and comparative analysis, this study examines 12 kinds of basketball teaching aids used domestically and abroad (cloud computing MINI basketball, adjustable MINI basketball hoop, MINI basketball court, shooting-assist paw-print ball, dribble goggles, dribbling machine, cartoon shooting machine, rebounding machine, contact mat, elastic belt, ladder, and fitness ball), with respect to enjoyment and to improving young children's shooting technique, dribbling technique, offensive and defensive rebounding, and transition technique. The results show that using appropriate forms of basketball teaching aids can effectively improve children's enjoyment of the game and improve a given technique in a targeted way, with different types of aids enriching the content of children's basketball from different perspectives. It is recommended to produce aids based on children's colour psychology, in cartoon forms and from environmentally friendly materials, to increase research efforts on children's basketball aids, and to encourage physical education teachers to apply such aids.

Keywords: health and physical education field of curriculum outline, health fitness, sports and health curriculum reform, Taiwan, twelve years basic education

Procedia PDF Downloads 371
114 Shape Management Method of Large Structure Based on Octree Space Partitioning

Authors: Gichun Cha, Changgil Lee, Seunghee Park

Abstract:

The objective of this study is to construct a shape management method contributing to the safety of large structures. In Korea, research on shape management is scarce because the technology is newly attempted. Terrestrial Laser Scanning (TLS) is used for measurements of large structures. TLS provides an efficient way to actively acquire accurate point clouds of object surfaces or environments. The point clouds provide a basis for rapid modeling in industrial automation, architecture, construction, and the maintenance of civil infrastructure. TLS produces a huge amount of point cloud data, and registration, extraction, and visualization of the data require the processing of a massive amount of scan data. The octree can be applied to the shape management of large structures because the scan data are reduced in size while the data attributes are maintained. Octree space partitioning generates voxels of 3D space, and each voxel is recursively subdivided into eight sub-voxels. The point cloud of the scan data was converted to voxels and sampled. The experimental site is located at Sungkyunkwan University. The scanned structure is a steel-frame bridge, and the TLS used is a Leica ScanStation C10/C5. The scan data were condensed by 92%, and the octree model was constructed at a resolution of 2 millimeters. This study presents octree space partitioning for handling point clouds, creating a basis for the shape management of large structures such as double-deck tunnels, buildings, and bridges. The research is expected to improve the efficiency of structural health monitoring and maintenance. This work is financially supported by the 'U-City Master and Doctor Course Grant Program' and the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) (NRF-2015R1D1A1A01059291).
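A minimal recursive octree subdivision of a point cloud (a sketch of the partitioning idea only, not the study's processing chain) can be written as:

```python
# Sketch: recursive octree partitioning of a 3D point cloud. Each voxel
# splits into eight children until few points remain or a minimum size
# (e.g., the 2 mm resolution mentioned above) is reached.
import numpy as np

def octree(points, center, half, min_half=0.001, max_points=8):
    if len(points) <= max_points or half <= min_half:
        return {"center": center, "half": half, "points": points}
    children = []
    for dx in (-0.5, 0.5):
        for dy in (-0.5, 0.5):
            for dz in (-0.5, 0.5):
                c = center + half * np.array([dx, dy, dz])
                # half-open bounds so each point lands in one child
                mask = np.all((points >= c - half / 2)
                              & (points < c + half / 2), axis=1)
                if mask.any():
                    children.append(octree(points[mask], c, half / 2,
                                           min_half, max_points))
    return {"center": center, "half": half, "children": children}

pts = np.random.default_rng(0).random((5000, 3))  # toy cloud in [0,1)^3
tree = octree(pts, center=np.array([0.5] * 3), half=0.5)
```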

Keywords: 3D scan data, octree space partitioning, shape management, structural health monitoring, terrestrial laser scanning

Procedia PDF Downloads 277
113 Swift Rising Pattern of Emerging Construction Technology Trends in the Construction Management

Authors: Gayatri Mahajan

Abstract:

Modern Construction Technology (CT) includes a broad range of advanced techniques and practices spanning recent developments in material technology, design methods, quantity surveying, facility management, services, structural analysis and design, and other management education. Adoption of recent digital transformation technology is today's need to speed up business and is also the basis of construction improvement. Incorporating and practicing technologies such as cloud-based communication and collaboration solutions, mobile apps and 5G, 3D printing, BIM and digital twins, CAD/CAM, AR/VR, big data, IoT, wearables, blockchain, modular construction, offsite manufacturing, prefabrication, robotics, drones, and GPS-controlled equipment expedites progress in the construction industry (CI). The resources used are journal research articles, web searches, books, theses, reports/surveys, magazines, etc. The outline of the research organization for this study is framed at four distinct levels, relating to conceptualization, resources, innovative and emerging trends in the CI, and better methods for the completion of construction projects. The present study, conducted during 2020-2022, reveals that implementing these technologies improves the level of standards, planning, security, well-being, sustainability, and economics. Application uses, benefits, impact, advantages/disadvantages, limitations and challenges, and policies are discussed to provide information to architects and builders for the smooth completion of projects. The results show that construction technology trends vary from 4 to 15 for the CI and eventually reach 27 for Civil Engineering (CE). The perspective of the most recent innovations, trends, tools, challenges, and solutions is highly embraced in the field of construction. The incorporation of the above technologies during the COVID-19 pandemic and afterwards might lead to a focus on finding effective ways to adopt new-age technologies for the CI.

Keywords: BIM, drones, GPS, mobile apps, 5G, modular construction, robotics, 3D printing

Procedia PDF Downloads 83
112 Training for Digital Manufacturing: A Multilevel Teaching Model

Authors: Luís Rocha, Adam Gąska, Enrico Savio, Michael Marxer, Christoph Battaglia

Abstract:

The changes observed in recent years in the field of manufacturing and production engineering, popularly known as the "Fourth Industrial Revolution", utilize achievements in different areas of computer science, introducing new solutions at almost every stage of the production process; consider such concepts as mass customization, cloud computing, knowledge-based engineering, virtual reality, rapid prototyping, or virtual models of measuring systems. To effectively speed up the production process and make it more flexible, it is necessary to tighten the bonds connecting individual stages of the production process and to raise the awareness and knowledge of employees of individual sectors about the nature and specificity of work at other stages. It is important to discover and develop a suitable education method, adapted to the specificities of each stage of the production process, which becomes an extremely crucial issue for properly exploiting the potential of the Fourth Industrial Revolution. For this reason, the project "Train4Dim" (T4D) intends to develop comprehensive training material for digital manufacturing, including content for design, manufacturing, and quality control, with a focus on coordinate metrology and portable measuring systems. In this paper, the authors present an approach to using an active learning methodology for digital manufacturing. T4D's main objective is to develop a multi-degree (apprenticeship up to master's degree studies) educational approach that can be adapted to different teaching levels. The process of creating the underlying methodology is also described. The paper shares the steps taken to achieve the aims of the project (a training model for digital manufacturing): 1) surveying the stakeholders, 2) defining the learning aims, 3) producing all contents and curricula, 4) training the tutors, and 5) piloting, testing, and improving the courses.

Keywords: learning, Industry 4.0, active learning, digital manufacturing

Procedia PDF Downloads 74
111 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural Language Processing (NLP) has gained significant attention lately and has proved its ability to analyze and extract insights from unstructured text data in various languages. One of the most popular NLP applications is sentiment analysis, which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While there are several multilingual NLP models available for sentiment analysis, there is a need to investigate their effectiveness in different contexts and applications. In this study, we investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. The performance of several NLP models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers, is compared. The models are evaluated on several metrics, including accuracy, precision, recall, and F1 score, and their performance is compared across different categories of product reviews. To run the study, the dataset was preprocessed by cleaning and tokenizing the text data in multiple languages. Each model was then trained and tested using a cross-validation approach, randomly dividing the dataset into training and testing sets and repeating the process multiple times. A grid search approach was applied to optimize the hyperparameters of each model and to select the best-performing model for each category of product reviews and each language. The findings of this study provide insights into the effectiveness of different multilingual NLP models for multilingual sentiment analysis and their suitability for different languages and applications. The strengths and limitations of each model are identified, and recommendations are provided for selecting the most performant model based on the specific requirements of a project. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.
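The evaluation protocol (cross-validation plus grid search over hyperparameters) can be sketched with scikit-learn for one generic classifier; the hosted services named above would instead be called through their own SDKs, and the toy reviews below are invented:

```python
# Sketch of the cross-validated grid-search protocol on toy review text;
# stands in for evaluating any one sentiment model on labelled reviews.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV

reviews = ["great product", "terrible quality", "muy bueno", "très mauvais"]
labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative (invented toy data)

pipe = Pipeline([("tfidf", TfidfVectorizer()),
                 ("clf", LogisticRegression(max_iter=1000))])
grid = GridSearchCV(pipe,
                    {"tfidf__ngram_range": [(1, 1), (1, 2)],
                     "clf__C": [0.1, 1.0, 10.0]},
                    scoring="f1", cv=2)  # cv would be larger in practice
grid.fit(reviews, labels)
print(grid.best_params_, grid.best_score_)
```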

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 66
110 Transient Phenomena in a 100 W Hall Thrusters: Experimental Measurements of Discharge Current and Plasma Parameter Evolution

Authors: Clémence Royer, Stéphane Mazouffre

Abstract:

Nowadays, electric propulsion (EP) systems play a crucial role in space exploration missions due to their high specific impulse and long operational life. Hall thrusters are among the most mature EP technologies: gridless ion thrusters that have proved reliable and high-performing for decades in various space missions. Operation of a Hall thruster relies on electron emission from a cathode placed outside a hollow dielectric channel that includes an anode at the back. The negatively charged particles are trapped in a magnetic field and efficiently slowed down. Through collisions, the electron cloud ionizes xenon atoms. A large electric field is generated in the axial direction due to the low electron transverse mobility in the region of strong magnetic field. Positive particles are pulled out of the chamber at high velocity and are neutralized directly at the exhaust, a process that accelerates the spacecraft at a high specific impulse. While the Hall thruster's architecture and operating principle are relatively simple, the physics behind the thrust is complex and still partly unknown. Current and voltage oscillations, as well as electron properties, have been captured over a 30-minute period after ignition. The observed low-frequency oscillations exhibited specific frequency ranges, amplitudes, and stability patterns. Correlations between the oscillations and plasma characteristics were analyzed, and the impact of these instabilities on thruster performance, including thrust efficiency, has been evaluated as well. Moreover, strategies for mitigating and controlling these instabilities, such as filtering, have been developed. In this contribution, in addition to presenting a summary of the results obtained in the transient regime, we present and discuss recent advances in Hall thruster plasma discharge filtering and control.

Keywords: electric propulsion, Hall Thruster, plasma diagnostics, low-frequency oscillations

Procedia PDF Downloads 63
109 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan

Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu

Abstract:

This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data on cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking old Paiwan as an example, an aerial survey of a settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The obtained orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D scan of the building provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation, and the information from the 3D scanning of the slate house can be used for digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial survey technology serves to construct the topography of settlements with spatial information on buildings and landscape, while 3D scanning serves for small-scale records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of surveying and mapping aboriginal settlements and is very helpful for further preservation planning and the rejuvenation of aboriginal cultural heritage.

Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model

Procedia PDF Downloads 136
108 The Historical Background of Physical Changing Towards Ancient Mosques in Aceh, Indonesia

Authors: Karima Adilla

Abstract:

Aceh province, through which Islam is believed to have entered Indonesia in the 12th century before spreading throughout the archipelago and the rest of Southeast Asia, has several early Islamic mosques that still exist today. However, due to various circumstances, restorations and rehabilitations of those mosques have been carried out in several periods, against diverse backgrounds. Concerning this, the research examines the physical changes of three prominent historical mosques in Aceh Besar and Banda Aceh: the Indrapuri Mosque, the Baiturrahman Grand Mosque, and the Baiturrahim Mosque, built coinciding with the beginning of Islam's development in Aceh and regarded as eventful mosques. The existence of the Indrapuri Mosque, built on the remains of the Lamuri Kingdom's temple, is a historical trace showing that there was a Hindu-Buddhist civilization in Aceh before Islam entered and became the majority religion (about 98% of Aceh's total population). The Dutch colonization of Aceh also lies behind the existence of two famous mosques, the Baiturrahman Grand Mosque and the Baiturrahim Mosque, as the colonizers helped rebuild these two sacred mosques to quell the anger of the Acehnese people after their mosque was burnt by the Dutch. Interestingly, despite a long successive history, from the rise of Islam after the collapse of the Hindu-Buddhist kingdoms, through colonization and conflict in Aceh, and even the earthquake and tsunami disaster of 2004, these mosques still exist. Therefore, they have been considered silent historical witnesses. However, it was not merely these events that led the mosques to undergo several physical changes; economic, political, social, cultural, and religious factors were also highly influential. Instead of directly illustrating the physical changes of the three mosques, this research intends to identify the conditions under which their physical appearance changed continuously from the sultanate era, through the colonial period, to post-independence, in terms of architectural style, detail elements, and design philosophy, and to show how the remnant buildings act as a medium to bridge history. The framework uses qualitative research methods, collecting actual data on the mosques' physical changes through field studies, investigations, library studies, and interviews. This research aims to define every trace of the historical issues embedded in the physical changes of these mosques, as they are intertwined as pieces of historical proof. The result will reveal the characteristic interrelation between history, the mosques' architectural styles in certain periods, the background of the physical changes, and their impact. Eventually, this research will also explicate a clear inference of each mosque's role in representing history in Aceh Besar and Banda Aceh specifically, and Aceh generally, through architectural design concepts.

Keywords: Aceh ancient mosques, Aceh history, Islamic architecture, physical changing

Procedia PDF Downloads 116
107 Portable System for the Acquisition and Processing of Electrocardiographic Signals to Obtain Different Metrics of Heart Rate Variability

Authors: Daniel F. Bohorquez, Luis M. Agudelo, Henry H. León

Abstract:

Heart rate variability (HRV) is defined as the temporal variation between heartbeats or RR intervals (the distance between R waves in an electrocardiographic signal). This distance is currently a recognized biomarker. By analysing it, it is possible to assess the sympathetic and parasympathetic nervous systems, which are responsible for the regulation of the cardiac muscle. The analysis allows health specialists and researchers to diagnose various pathologies based on this variation. For the acquisition and analysis of HRV taken from a cardiac electrical signal, electronic equipment and analysis software that work independently are currently used. This complicates and delays the process of interpretation and diagnosis; with this delay, the health condition of patients can be put at greater risk, which can lead to untimely treatment. This document presents a single portable device capable of acquiring electrocardiographic signals and calculating a total of 19 HRV metrics, reducing the time required and resulting in timelier intervention. The device has an electrocardiographic signal acquisition card attached to a microcontroller capable of transmitting the cardiac signal wirelessly to a mobile device. In addition, a mobile application was designed to analyze the cardiac waveform. The device calculates the RR intervals and the different metrics, and the application allows a user to visualize the cardiac signal and the 19 metrics in real time. The information is exported to a cloud database for remote analysis. The study was performed under controlled conditions in the simulated hospital of the Universidad de la Sabana, Colombia. A total of 60 signals were acquired and analyzed. The device was compared against two reference systems, and the results show a strong level of correlation (r > 0.95, p < 0.05) between the 19 metrics compared. Therefore, the use of the portable system, evaluated in clinical scenarios controlled by medical specialists and researchers, is recommended for the evaluation of the condition of the cardiac system.
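Of the 19 metrics, the common time-domain ones are direct functions of the RR series; a minimal sketch (RR values invented, in milliseconds):

```python
# Sketch: three standard time-domain HRV metrics from RR intervals (ms).
import numpy as np

rr = np.array([812, 790, 805, 830, 795, 810, 788, 802], dtype=float)

sdnn = rr.std(ddof=1)                     # overall variability
diff = np.diff(rr)
rmssd = np.sqrt(np.mean(diff ** 2))       # short-term variability
pnn50 = 100 * np.mean(np.abs(diff) > 50)  # % successive diffs > 50 ms

print(f"SDNN={sdnn:.1f} ms, RMSSD={rmssd:.1f} ms, pNN50={pnn50:.1f}%")
```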

Keywords: biological signal analysis, heart rate variability (HRV), HRV metrics, mobile app, portable device

Procedia PDF Downloads 165
106 Enhanced Disk-Based Databases towards Improved Hybrid in-Memory Systems

Authors: Samuel Kaspi, Sitalakshmi Venkatraman

Abstract:

In-memory database systems are becoming popular due to the availability and affordability of sufficiently large RAM and processors in modern high-end servers with the capacity to manage large in-memory database transactions. While fast and reliable in-memory systems are still being developed to overcome cache misses, CPU/IO bottlenecks, and distributed transaction costs, disk-based data stores still serve as the primary persistence layer. In addition, with the recent growth in multi-tenancy cloud applications and the associated security concerns, many organisations consider the trade-offs and continue to require the fast and reliable transaction processing of disk-based database systems as an available choice. For these organizations, the only way of increasing throughput is by improving the performance of disk-based concurrency control. This warrants a hybrid database system with the ability to selectively apply enhanced disk-based data management within the context of in-memory systems, which would help improve overall throughput. The general view is that in-memory systems substantially outperform disk-based systems. We question this assumption and examine how a modified variation of access invariance, which we call enhanced memory access (EMA), can be used to allow very high levels of concurrency in the prefetching of data in disk-based systems. We demonstrate how this prefetching in disk-based systems can yield close to in-memory performance, which paves the way for improved hybrid database systems. This paper proposes a novel EMA technique and presents a comparative study between disk-based EMA systems and in-memory systems running on hardware configurations of equivalent power in terms of the number of processors and their speeds. The results of the experiments conducted clearly substantiate that, when used in conjunction with all concurrency control mechanisms, EMA can increase the throughput of disk-based systems to levels quite close to those achieved by in-memory systems. The promising results of this work show that enhanced disk-based systems facilitate improved hybrid data management within the broader context of in-memory systems.
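EMA itself is specific to this paper, but the underlying idea of overlapping disk reads by prefetching a transaction's data set before execution can be caricatured with a thread pool; this is a generic toy, not the authors' mechanism:

```python
# Toy illustration of concurrent prefetching: pages a transaction will
# touch are fetched in parallel before execution, hiding I/O latency.
# A generic caricature, not the paper's EMA algorithm.
import time
from concurrent.futures import ThreadPoolExecutor

def read_page(page_id: int) -> bytes:
    time.sleep(0.05)          # simulated disk latency per page
    return b"x" * 4096

def run_transaction(page_ids):
    with ThreadPoolExecutor(max_workers=8) as pool:
        pages = list(pool.map(read_page, page_ids))  # prefetch in parallel
    return sum(len(p) for p in pages)                # execute on cached pages

start = time.perf_counter()
run_transaction(range(16))
# ~2 latency batches instead of 16 serial reads
print(f"elapsed: {time.perf_counter() - start:.2f}s")
```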

Keywords: in-memory database, disk-based system, hybrid database, concurrency control

Procedia PDF Downloads 393
105 Quantifying Multivariate Spatiotemporal Dynamics of Malaria Risk Using Graph-Based Optimization in Southern Ethiopia

Authors: Yonas Shuke Kitawa

Abstract:

Background: Although malaria incidence has fallen sharply over the past few years, the rate of decline varies by district, time, and malaria type. Despite this downturn, malaria remains a major public health threat in various districts of Ethiopia. Consequently, the present study is aimed at developing a predictive model that helps to identify the spatio-temporal variation in malaria risk by multiple Plasmodium species. Methods: We propose a multivariate spatio-temporal Bayesian model to obtain a more coherent picture of the temporally varying spatial variation in disease risk. The spatial autocorrelation in such a dataset is typically modeled by a set of random effects that are assigned a conditional autoregressive prior distribution. However, the autocorrelation considered in such cases depends on a binary neighborhood matrix specified through the border-sharing rule. Here, we propose a graph-based optimization algorithm for estimating the neighborhood matrix that represents the spatial correlation by treating the areal units as the vertices of a graph and the neighbor relations as its edges. Furthermore, we used aggregated malaria counts from southern Ethiopia covering August 2013 to May 2019. Results: We found that precipitation, temperature, and humidity are positively associated with the malaria threat in the area, while the enhanced vegetation index, nighttime light (NTL), and distance from coastal areas are negatively associated. Moreover, nonlinear relationships were observed between malaria incidence and precipitation, temperature, and NTL. Additionally, lagged effects of temperature and humidity have a significant effect on malaria risk for either species. A more elevated risk of P. falciparum was observed following the rainy season, and unstable transmission of P. vivax was observed in the area. Finally, P. vivax risks are less sensitive to environmental factors than those of P. falciparum. Conclusion: Improved inference was gained by employing the proposed approach in comparison to the commonly used border-sharing rule. Additionally, different covariates were identified, including delayed effects, and elevated risks of either case were observed in districts in the central and western regions. As malaria transmission operates in a spatially continuous manner, a spatially continuous model should be employed when computationally feasible.
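The binary border-sharing matrix that the proposed algorithm refines can be assembled directly from a neighbour list; a sketch with hypothetical districts:

```python
# Sketch: building the binary border-sharing neighbourhood matrix W as a
# graph; the district names and adjacencies below are hypothetical.
import networkx as nx
import numpy as np

neighbours = {
    "A": ["B", "C"],
    "B": ["A", "C", "D"],
    "C": ["A", "B"],
    "D": ["B"],
}
G = nx.Graph((u, v) for u, vs in neighbours.items() for v in vs)
W = nx.to_numpy_array(G, nodelist=sorted(G.nodes))  # symmetric 0/1 matrix
print(W)
# A graph-based optimization would then rewire edges of G (rather than
# relying only on shared borders) to better capture spatial correlation.
```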

Keywords: disease mapping, MSTCAR, graph-based optimization algorithm, P. falciparum, P. vivax, weight matrix

Procedia PDF Downloads 50
104 Realizing Teleportation Using Black-White Hole Capsule Constructed by Space-Time Microstrip Circuit Control

Authors: Mapatsakon Sarapat, Mongkol Ketwongsa, Somchat Sonasang, Preecha Yupapin

Abstract:

Preliminary tests have been designed and performed on a space-time control circuit, using a two-level system circuit with a 4-5 cm diameter microstrip, to demonstrate realistic teleportation. The work begins by calculating the parameters that allow a circuit to use an alternating current (AC) at a specified frequency as the input signal. A method causes electrons to move along the circuit perimeter starting at the speed of light, with satisfactory results based on wave-particle duality. It is possible to establish supersonic speed (faster than light) for the electron cloud in the middle of the circuit, creating a timeline and a propulsive force as well. The timeline is formed by the cancellation of time stretching and shrinking in the relativistic regime, in which absolute time has vanished. In fact, both black holes and white holes are created from the time signals at the beginning, where the speed of the electrons approaches the speed of light. They entangle together like a capsule until they reach the point where they collapse and cancel each other out, which is controlled by the frequency of the circuit. Therefore, we can apply this method to large-scale circuits, such as potassium, from which the same method can be applied to form a system to teleport living things. In fact, the black hole is a hibernation-system environment that allows living things to live and travel to the teleportation destination, which can be controlled in position and time relative to the speed of light. When the capsule reaches its destination, the frequency is increased so that the black holes and white holes cancel each other out into a balanced environment; life can then safely teleport to the destination. The same system must therefore exist at both the origin and the destination, which could form a network. Moreover, the method could also be applied to space travel. The designed system will be tested on a small scale using a microstrip circuit system that can be created in the laboratory on a limited budget and used in both wired and wireless systems.

Keywords: quantum teleportation, black-white hole, time, timeline, relativistic electronics

Procedia PDF Downloads 54
103 A Strategy to Oil Production Placement Zones Based on Maximum Closeness

Authors: Waldir Roque, Gustavo Oliveira, Moises Santos, Tatiana Simoes

Abstract:

Increasing the oil recovery factor of an oil reservoir has long been a concern of the oil industry. Usually, production placement zones are defined after some analysis of geological and petrophysical parameters, with rock porosity, permeability, and oil saturation being of fundamental importance. In this context, the determination of hydraulic flow units (HFUs) represents an important step in the process of reservoir characterization, since it may provide specific regions in the reservoir with similar petrophysical and fluid flow properties and, in particular, techniques supporting the placement of production zones that favour the tracing of directional wells. An HFU is defined as a representative volume of the total reservoir rock in which petrophysical and fluid flow properties are internally consistent and predictably distinct from other reservoir rocks. Technically, an HFU is characterized as a rock region that exhibits flow zone indicator (FZI) points lying on a straight line of unit slope. The goal of this paper is to provide a trustworthy indication of oil production placement zones for the best-fit HFUs. The FZI cloud of points can be obtained from the reservoir quality index (RQI), a function of effective porosity and permeability. Considering log and core data, the HFUs are identified, and using the discrete rock type (DRT) classification, a set of connected cell clusters can be found; by means of a graph centrality metric, the maximum closeness (MaxC) cell is obtained for each cluster. Considering the MaxC cells as production zones, an extensive analysis based on several oil recovery factor and cumulative oil production simulations was done for the SPE Model 2 and the UNISIM-I-D synthetic fields, the latter built from public data available from the actual Namorado Field, Campos Basin, Brazil. The results show that the MaxC approach is technically feasible and very reliable for identifying high-performance production placement zones.
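Assuming the standard Amaefule-style definitions commonly used in HFU analysis (RQI = 0.0314*sqrt(k/phi), phi_z = phi/(1-phi), FZI = RQI/phi_z, with DRT often taken as round(2*ln(FZI) + 10.6)), which the abstract appears to follow, the classification step can be sketched as:

```python
# Sketch: FZI computation and DRT labelling for HFU classification,
# assuming the standard Amaefule-style definitions (k in mD, phi as a
# fraction); the porosity/permeability values below are invented.
import numpy as np

phi = np.array([0.12, 0.18, 0.22, 0.15])  # effective porosity
k = np.array([15.0, 120.0, 400.0, 40.0])  # permeability, mD

rqi = 0.0314 * np.sqrt(k / phi)           # reservoir quality index
phi_z = phi / (1 - phi)                   # normalized porosity
fzi = rqi / phi_z                         # flow zone indicator
drt = np.round(2 * np.log(fzi) + 10.6).astype(int)  # discrete rock type

for p, kk, f, d in zip(phi, k, fzi, drt):
    print(f"phi={p:.2f} k={kk:6.1f} mD  FZI={f:5.2f}  DRT={d}")
```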

Keywords: hydraulic flow unit, maximum closeness centrality, oil production simulation, production placement zone

Procedia PDF Downloads 304
102 Robust Inference with a Skew T Distribution

Authors: M. Qamarul Islam, Ergun Dogan, Mehmet Yazici

Abstract:

There is a growing body of evidence that non-normal data are more prevalent in nature than normal data. Examples can be quoted from, but are not restricted to, the areas of Economics, Finance, and Actuarial Science. The non-normality considered here is expressed in terms of the fat-tailedness and asymmetry of the relevant distribution. In this study, a skew t distribution that can be used to model data exhibiting inherently non-normal behavior is considered. This distribution has tails fatter than a normal distribution and also exhibits skewness. Although maximum likelihood estimates can be obtained by iteratively solving the likelihood equations, which are non-linear in form, this can be problematic in terms of convergence and in many other respects. Therefore, it is preferable to use the method of modified maximum likelihood, in which the estimates are derived by expressing the intractable non-linear likelihood equations in terms of standardized ordered variates and replacing the intractable terms by their linear approximations, obtained from the first two terms of a Taylor series expansion about the quantiles of the distribution. These estimates, called modified maximum likelihood estimates, are obtained in closed form. Hence, they are easy to compute and to manipulate analytically. In fact, the modified maximum likelihood estimates are asymptotically equivalent to maximum likelihood estimates. Even in small samples, the modified maximum likelihood estimates are found to be approximately the same as the maximum likelihood estimates obtained iteratively. It is shown in this study that the modified maximum likelihood estimates are not only unbiased but substantially more efficient than the commonly used moment estimates or least square estimates, which are known to be biased and inefficient in such cases. Furthermore, in conventional regression analysis it is assumed that the error terms are normally distributed, and hence the well-known least square method is considered a suitable and preferred method for making the relevant statistical inferences. However, a number of empirical studies have shown that non-normal errors are more prevalent, and even transforming and/or filtering techniques may not produce normally distributed residuals. Here, multiple linear regression models with random errors following a non-normal pattern are studied. Through an extensive simulation, it is shown that the modified maximum likelihood estimates of the regression parameters are plausibly robust to the distributional assumptions and to various data anomalies, as compared with the widely used least square estimates. Relevant tests of hypothesis are developed and explored for desirable properties in terms of their size and power. The tests based upon modified maximum likelihood estimates are found to be substantially more powerful than the tests based upon least square estimates. Several examples are provided from the areas of Economics and Finance where such distributions are interpretable in terms of the efficient market hypothesis with respect to asset pricing, portfolio selection, risk measurement, capital allocation, etc.
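To make the robustness claim concrete, the sketch below simulates a regression with skew-t errors (via Azzalini's construction: a skew-normal variate divided by sqrt(χ²ν/ν)) and contrasts ordinary least squares with a generic Huber M-estimator fitted by iteratively reweighted least squares. The Huber step is a stand-in for illustration only; it is not the paper's modified maximum likelihood estimator, whose closed-form expressions are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, df, alpha = 200, 5, 4.0  # sample size, t degrees of freedom, skewness

# Skew-t errors via Azzalini's construction: skew-normal / sqrt(chi2_df / df).
z = stats.skewnorm.rvs(alpha, size=n, random_state=rng)
v = stats.chi2.rvs(df, size=n, random_state=rng)
eps = z / np.sqrt(v / df)

X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
beta_true = np.array([2.0, 0.5])
y = X @ beta_true + eps

# Ordinary least squares: inefficient under fat-tailed, skewed errors.
beta_ls, *_ = np.linalg.lstsq(X, y, rcond=None)

# Huber M-estimator via iteratively reweighted least squares -- a generic
# robust stand-in, NOT the paper's modified maximum likelihood method.
beta = beta_ls.copy()
for _ in range(50):
    r = y - X @ beta
    s = np.median(np.abs(r)) / 0.6745  # robust scale estimate (MAD)
    wts = np.minimum(1.0, 1.345 * s / np.maximum(np.abs(r), 1e-12))
    Xw = X * wts[:, None]
    beta = np.linalg.solve(Xw.T @ X, Xw.T @ y)

print("true:", beta_true, "OLS:", beta_ls.round(3), "Huber:", beta.round(3))
```

On draws like these, the robust fit typically tracks the true coefficients more closely than least squares, mirroring in a generic way the efficiency gap the abstract reports for the modified maximum likelihood estimates.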

Keywords: least square estimates, linear regression, maximum likelihood estimates, modified maximum likelihood method, non-normality, robustness

Procedia PDF Downloads 383
101 Public-Private Partnership for Critical Infrastructure Resilience

Authors: Anjula Negi, D. T. V. Raghu Ramaswamy, Rajneesh Sareen

Abstract:

Road infrastructure is emphatically among the most critical infrastructure for the Indian economy. The country's road network of around 3.3 million km is the second largest in the world. Nationwide statistics released by the Ministry of Road Transport and Highways reveal that an accident happens every minute and a death every 3.7 minutes. This reported scale of harm is a matter of grave concern and economically represents a national loss of 3% of GDP. The Union Budget 2016-17 allocated USD 12 billion annually for the development and strengthening of roads, an increase of 56% over the previous year, thus highlighting the importance of roads as critical infrastructure. National highways represent only 1.7% of total road linkages yet carry over 40% of traffic. Further, trends analysed from 2002 to 2011 on national highways indicate that, in less than a decade, a 22% increase in accidents was reported but a 68% increase in deaths. The paramount inference is that accident severity has increased with time. Over these years, many measures have been taken to increase road safety, lessen damage to physical assets, and reduce vulnerabilities, building towards resilient road infrastructure. In the context of the national highway development program, policy makers proposed implementing around 20% of such road length on a PPP mode. These roads were taken up on high-density traffic considerations and for qualitative implementation. In order to understand the resilience impacts and safety parameters enshrined in the various PPP concession agreements executed with private sector partners, such highway-specific projects are appraised. This paper attempts to assess the safety measures taken and the possible reasons behind the increase in accident severity through these PPP case-study projects. It delves further into safety features to understand the policy measures adopted in these cases and introspects on the reasons for severity: whether an outcome of increased speeds, faulty road design and geometrics, driver negligence, or lack of discipline in keeping to lanes at increased speed. The assessment studies these aspects in pre-PPP and post-PPP project structures, based on literature review and opinion surveys with sectoral experts. Looking forward, the Ministry of Road Transport and Highways' estimate for strengthening the national highway network is USD 77 billion over the next five years. The outcome of this paper is an understanding of the resilience measures adopted and of possible options for an accessible and safe road network and its expansion, offered to policy makers for policy initiatives and funding allocation in securing critical infrastructure.

Keywords: national highways, policy, PPP, safety

Procedia PDF Downloads 233
100 Debris Flow Mapping Using Geographical Information System Based Model and Geospatial Data in Middle Himalayas

Authors: Anand Malik

Abstract:

The Himalayas, with their high tectonic activity, pose a great threat to human life and property. Climate change is another driver, triggering extreme events with a multi-fold effect on the high-mountain glacial environment: rock falls, landslides, debris flows, flash floods, and snow avalanches. One such extreme event, a cloudburst together with the breach of the moraine-dammed Chorabari Lake, occurred from June 14 to June 17, 2013, and triggered flooding of the Saraswati and Mandakini rivers in the Kedarnath Valley of Rudraprayag district of Uttarakhand state, India. As a result, a huge volume of water moving at high velocity created the catastrophe of the century, which resulted in the loss of a large number of human and animal lives and heavy damage to pilgrimage, tourism, agriculture, and property. A comprehensive assessment of debris flow hazards therefore requires GIS-based modeling using numerical methods. The aim of the present study is the analysis and mapping of debris flow movements using geospatial data with Flow-R (developed by a team at IGAR, University of Lausanne). The model is based on combined probabilistic and energetic algorithms for assessing the spreading of the flow together with maximum runout distances. The ASTER Digital Elevation Model (DEM) with 30 m x 30 m cell size (resolution) is used as the main geospatial data for preparing the runout assessment, while Landsat data are used to analyze land use/land cover change in the study area. The results for the study area show that the model can be applied with great accuracy, as it is very useful in delineating debris flow areas; the results are compared with existing landslide/debris flow maps. ArcGIS software is used to prepare runout susceptibility maps, which can be used in debris flow mitigation and future land use planning.
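For orientation, the following is a highly simplified, illustrative re-creation of the kind of spreading computation Flow-R performs: Holmgren-style multiple-flow-direction weights proportional to (tan β)^x, combined with a reach-angle runout cutoff. All parameter values and the synthetic DEM are assumptions for the demo; the real Flow-R implementation differs in many details.

```python
import numpy as np
from collections import deque

def spread(dem, src, x_exp=4.0, reach_angle_deg=11.0, cell=30.0):
    """Simplified Flow-R-style spreading on a DEM grid: downslope
    neighbours receive weight (tan beta)**x_exp, and propagation stops
    once total drop / total distance falls below the reach angle.
    Illustrative only -- not the actual Flow-R algorithm."""
    rows, cols = dem.shape
    prob = np.zeros_like(dem, dtype=float)
    prob[src] = 1.0
    z0 = dem[src]
    tan_reach = np.tan(np.radians(reach_angle_deg))
    q = deque([(src, 0.0)])  # (cell index, travelled distance)
    while q:
        (i, j), dist = q.popleft()
        nbrs, grads, steps = [], [], []
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < rows and 0 <= nj < cols:
                    step = cell * np.hypot(di, dj)
                    drop = dem[i, j] - dem[ni, nj]
                    if drop > 0:  # spread only downslope
                        nbrs.append((ni, nj))
                        grads.append(drop / step)
                        steps.append(step)
        if not nbrs:
            continue
        w = np.array(grads) ** x_exp
        w /= w.sum()
        for (ni, nj), wk, step in zip(nbrs, w, steps):
            d = dist + step
            # Reach-angle runout criterion from the source cell.
            if (z0 - dem[ni, nj]) / d < tan_reach:
                continue
            inc = prob[i, j] * wk
            if inc > prob[ni, nj]:
                prob[ni, nj] = inc
                q.append(((ni, nj), d))
    return prob

# Tiny synthetic DEM sloping to the south-east; source at the crest.
dem = np.add.outer(np.arange(20, 0, -1), np.arange(20, 0, -1)).astype(float) * 5
print(spread(dem, (0, 0)).round(3))
```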

Keywords: debris flow, geospatial data, GIS based modeling, Flow-R

Procedia PDF Downloads 250
99 Enhancing Healthcare Data Protection and Security

Authors: Joseph Udofia, Isaac Olufadewa

Abstract:

Every day, the size of Electronic Health Record data keeps increasing as new patients visit health practitioners and returning patients fulfil their appointments. As these data grow, so does their susceptibility to cyber-attacks from criminals waiting to exploit them. In the US, the damages from cyberattacks were estimated at $8 billion (2018), $11.5 billion (2019), and $20 billion (2021). These attacks usually involve the exposure of personally identifiable information (PII). Health data are considered PII, and their exposure carries significant impact. To this end, an enhancement of health policy and standards in relation to data security, especially between patients and their clinical providers, is critical to ensure ethical practices, confidentiality, and trust in the healthcare system. As clinical accelerators and applications that contain user data are adopted, it is expedient to review and revamp policies such as the Payment Card Industry Data Security Standard (PCI DSS), the Health Insurance Portability and Accountability Act (HIPAA), and the Fast Healthcare Interoperability Resources (FHIR) standard, all aimed at ensuring data protection and security in healthcare. FHIR caters to healthcare data interoperability, as data are shared across different systems, from customers to health insurers and care providers. The astronomical cost of implementation has deterred players in the space from ensuring compliance, leading to susceptibility to exfiltration and loss of protected health information (PHI). Though HIPAA hones in on the security and accuracy of PHI and PCI DSS on the security of payment card data, they intersect in the shared goal of protecting sensitive information in line with industry standards. With advancements in technology and the emergence of new tools, it is necessary to revamp these policies to address their complexity and ambiguity, the cost barrier, and the ever-increasing threats in cyberspace. Healthcare data in the wrong hands are a recipe for disaster, and their protection and security must be enhanced to protect the mental health of current and future generations.
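To ground the discussion, the sketch below shows a minimal FHIR R4 Patient resource (the interoperability format named above) and illustrative field-level encryption of the record at rest. It is a toy example under stated assumptions, not a compliance recipe: a production system would layer managed key storage (KMS/HSM), access controls, and audit logging on top, per HIPAA's Security Rule.

```python
import json
from cryptography.fernet import Fernet  # pip install cryptography

# A minimal FHIR R4 Patient resource -- the interoperable exchange format
# referenced above; all field values here are fictitious.
patient = {
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-05-14",
}

# Illustrative protection of PHI at rest using symmetric encryption.
key = Fernet.generate_key()
f = Fernet(key)
token = f.encrypt(json.dumps(patient).encode("utf-8"))
restored = json.loads(f.decrypt(token).decode("utf-8"))
assert restored == patient  # round-trips losslessly
```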

Keywords: cloud security, healthcare, cybersecurity, policy and standard

Procedia PDF Downloads 58
98 AI for Efficient Geothermal Exploration and Utilization

Authors: Velimir "monty" Vesselinov, Trais Kliplhuis, Hope Jasperson

Abstract:

Artificial intelligence (AI) is a powerful tool in the geothermal energy sector, aiding in both exploration and utilization. Identifying promising geothermal sites can be challenging due to limited surface indicators and the need for expensive drilling to confirm subsurface resources. Geothermal reservoirs can be located deep underground and exhibit complex geological structures, making traditional exploration methods time-consuming and imprecise. AI algorithms can analyze vast datasets of geological, geophysical, and remote sensing data, including satellite imagery, seismic surveys, geochemistry, and geology. Machine learning algorithms can identify subtle patterns and relationships within these data, potentially revealing hidden geothermal potential in areas previously overlooked. To address these challenges, a science-informed machine learning (SIML) technology has been developed. SIML methods differ from traditional ML techniques. In both cases, the ML models are trained to predict the spatial distribution of an output (e.g., pressure, temperature, heat flux) based on a series of inputs (e.g., permeability, porosity, etc.). Traditional ML relies on deep and wide neural networks (NNs) built from simple algebraic mappings to represent complex processes. In contrast, SIML neurons incorporate complex mappings (including constitutive relationships and physics/chemistry models). This results in ML models that have physical meaning and satisfy physical laws and constraints. The prototype of the developed software, called GeoTGO, is accessible through the cloud. Our software prototype demonstrates how different data sources can be made available for processing, executes demonstrative SIML analyses, and presents the results in tabular and graphic form.
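A minimal sketch of the idea, under our own assumptions rather than the GeoTGO code, is given below: a "science-informed neuron" in which a small learned mapping produces an effective permeability that is then passed through a fixed constitutive relationship (Darcy's law, q = −(k/μ)·∇P), so the output honors the physics by construction. The feature set and the weights are placeholders, not trained values.

```python
import numpy as np

MU = 1.0e-3  # fluid viscosity (Pa*s); an assumed constant for the demo

def darcy_flux(perm, grad_p, mu=MU):
    """Fixed constitutive mapping (Darcy's law): q = -(k / mu) * dP/dx.
    Because it is embedded in the model, every prediction satisfies it."""
    return -(perm / mu) * grad_p

def siml_forward(features, weights):
    """One 'science-informed neuron': a small learned map produces an
    effective permeability, which is pushed through the physics mapping
    instead of another generic algebraic activation."""
    log_k = features @ weights   # learned part (here: a linear map)
    k_eff = np.exp(log_k)        # exponent keeps permeability positive
    grad_p = features[:, -1]     # last feature: pressure gradient (Pa/m)
    return darcy_flux(k_eff, grad_p)

# Hypothetical inputs: [porosity, clay fraction, pressure gradient (Pa/m)].
X = np.array([[0.20, 0.05, -1.0e4],
              [0.12, 0.30, -0.5e4]])
w = np.array([-140.0, -10.0, 0.0])  # placeholder "trained" weights
print(siml_forward(X, w))           # predicted Darcy fluxes (m/s)
```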

Keywords: science-informed machine learning, artificial intelligence, exploration, utilization, hidden geothermal

Procedia PDF Downloads 19
97 Tri/Tetra-Block Copolymeric Nanocarriers as a Potential Ocular Delivery System of Lornoxicam: Experimental Design-Based Preparation, in-vitro Characterization and in-vivo Estimation of Transcorneal Permeation

Authors: Alaa Hamed Salama, Rehab Nabil Shamma

Abstract:

Introduction: Polymeric micelles that can deliver drugs to intended sites of the eye have recently attracted much scientific attention. The aim of this study was to investigate an aqueous-based formulation of drug-loaded polymeric micelles that holds significant promise for ophthalmic drug delivery. The study examined the synergistic performance of mixed polymeric micelles made of linear and branched poly(ethylene oxide)-poly(propylene oxide) for more effective encapsulation of lornoxicam (LX) as a hydrophobic model drug. Methods: The co-micellization process in 10% binary systems, combining different weight ratios of the highly hydrophilic poloxamers Synperonic® PE/P84 and Synperonic® PE/F127 with the hydrophobic poloxamine counterpart (Tetronic® T701), was investigated by means of photon correlation spectroscopy and cloud point measurements. The drug-loaded micelles were tested for their solubilizing capacity towards LX. Results: The results showed a sharp solubility increase from 0.46 mg/ml up to more than 4.34 mg/ml, reported as an approximately 136-fold increase. An optimized formulation was selected to achieve maximum drug-solubilizing power and clarity with the lowest possible particle size. The optimized formulation was characterized by ¹H NMR analysis, which revealed complete encapsulation of the drug within the micelles. Further investigation by histopathological and confocal laser studies revealed the non-irritant nature and good corneal penetrating power of the proposed nano-formulation. Conclusion: An LX-loaded polymeric nanomicellar formulation was fabricated, allowing easy application of the drug in the form of clear eye drops that do not cause blurred vision or discomfort, thus achieving high patient compliance.

Keywords: confocal laser scanning microscopy, histopathological studies, lornoxicam, micellar solubilization

Procedia PDF Downloads 432