Search results for: standardization artificial intelligence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2840

980 Identification of Suitable Sites for Rainwater Harvesting in Salt Water Intruded Area by Using Geospatial Techniques in Jafrabad, Amreli District, India

Authors: Pandurang Balwant, Ashutosh Mishra, Jyothi V., Abhay Soni, Padmakar C., Rafat Quamar, Ramesh J.

Abstract:

Seawater intrusion into coastal aquifers has become one of the major environmental concerns. Although it is a natural phenomenon, it can be induced by anthropogenic activities like excessive exploitation of groundwater, seacoast mining, etc. The geological and hydrogeological conditions, including groundwater heads and groundwater pumping patterns in coastal areas, also influence the magnitude of seawater intrusion. However, this problem can be mitigated by preventive measures like rainwater harvesting and artificial recharge. The present study is an attempt to identify suitable sites for rainwater harvesting in the salt-intrusion-affected area near the coastal aquifer of Jafrabad town, Amreli district, Gujarat, India. The physico-chemical water quality results show that, of the 25 groundwater samples collected from the study area, most were found to contain high concentrations of Total Dissolved Solids (TDS) with major fractions of Na and Cl ions. The Cl/HCO3 ratio was also found to be greater than 1, which indicates saltwater contamination in the study area. A geophysical survey was conducted at nine sites within the study area to explore the extent of seawater contamination. From the inverted resistivity sections, low-resistivity zones (<3 Ohm m) associated with seawater contamination were demarcated in the north block pit and south block pit of the NCJW mines, Mitiyala village, Lotpur, and Lunsapur village at depths of 33 m, 12 m, 40 m, 37 m, and 24 m, respectively. Geospatial techniques in combination with the Analytical Hierarchy Process (AHP), considering hydrogeological factors, geographical features, drainage pattern, water quality, and geophysical results for the study area, were employed to identify potential zones for rainwater harvesting. A rainwater harvesting suitability model was developed in ArcGIS 10.1 software, and a rainwater harvesting suitability map for the study area was generated. AHP in combination with weighted overlay analysis is an appropriate method to identify rainwater harvesting potential zones. The suitability map can further be used as a guidance map for the development of rainwater harvesting infrastructure in the study area, either for artificial groundwater recharge facilities or for direct use of harvested rainwater.
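
As a rough illustration of the AHP-plus-weighted-overlay workflow described above, the sketch below derives criterion weights from a hypothetical pairwise comparison matrix and applies them to stand-in reclassified raster layers; the study's actual criteria, expert judgments, and rasters are not given in the abstract.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for four criteria
# (e.g., rainfall, slope, drainage density, water quality) -- illustrative only.
A = np.array([
    [1,   3,   5,   7],
    [1/3, 1,   3,   5],
    [1/5, 1/3, 1,   3],
    [1/7, 1/5, 1/3, 1],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()              # AHP priority vector

# Consistency check (Saaty): CR < 0.1 is conventionally acceptable.
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # random consistency index
CR = CI / RI
print("weights:", weights, "CR:", CR)

# Weighted overlay: suitability = sum_i(w_i * reclassified_layer_i).
layers = np.random.randint(1, 6, size=(4, 100, 100))  # stand-in rasters scored 1-5
suitability = np.tensordot(weights, layers, axes=1)
```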

Keywords: analytical hierarchy process, groundwater quality, rainwater harvesting, seawater intrusion

Procedia PDF Downloads 174
979 An Easy Approach for Fabrication of Macroporous Apatite-Based Bone Cement Used as Potential Trabecular Bone Substitute

Authors: Vimal Kumar Dewangan, T. S. Sampath Kumar, Mukesh Doble, Viju Daniel Varghese

Abstract:

Apatite-based, i.e., calcium-deficient hydroxyapatite (CDHAp), bone cement is a well-known potential bone graft/substitute in orthopaedics due to its chemical similarity to natural bone mineral. Therefore, an easy approach was attempted to fabricate apatite-based (CDHAp) bone cement with improved injectability, bioresorbability, and macroporosity. In this study, the desired bone cement was developed by mixing the solid phase (consisting of wet-chemically synthesized nanocrystalline hydroxyapatite and commercially available (synthetic) tricalcium phosphate) and the liquid phase (consisting of a cement-binding accelerator with a few biopolymers in a dilute acidic solution), along with a liquid porogen (polysorbate) or, for comparison, a solid porogen (mannitol), at an optimized liquid-to-powder ratio. The fabricated cement sets within the clinically preferred setting time (≤20 minutes), is highly injectable (>70%), and is stable at physiological pH (~7.3-7.4). The CDHAp-phase bone cement was obtained by immersing the fabricated after-set cement in phosphate buffer solution and other similar artificial body fluids and incubating it at physiological conditions for seven days, as confirmed by X-ray diffraction and Fourier transform infrared spectroscopy analyses. The resulting synthetic apatite-based bone cement exhibits acceptable compressive strength (within the range of trabecular bone), with average interconnected pore sizes falling in the macropore range (~50-200 μm) inside the cement, as verified by scanning electron microscopy (SEM), mercury intrusion porosimetry, and micro-CT analysis techniques. It is also biodegradable (degrading ~19-22% within 10-12 weeks) when incubated in artificial body fluids under physiological conditions. The biocompatibility study of the bone cement, when incubated with MG63 cells, showed a significant increase in cell viability after the third day of incubation compared with the control, and the cells were well attached and spread completely on the surface of the bone cement, as confirmed by SEM and fluorescence microscopy analyses. Taken together, these results suggest that the developed synthetic macroporous apatite-based bone cement has the potential to become a promising trabecular bone substitute.

Keywords: calcium deficient hydroxyapatite, synthetic apatite-based bone cement, injectability, macroporosity, trabecular bone substitute

Procedia PDF Downloads 87
978 Medicompills Architecture: A Mathematically Precise Tool to Reduce the Risk of Diagnosis Errors in Precise Medicine

Authors: Adriana Haulica

Abstract:

Powered by machine learning, precise medicine is now tailored to use genetic and molecular profiling, with the aim of optimizing the therapeutic benefits for cohorts of patients. As the majority of machine learning algorithms come from heuristics, the outputs have contextual validity. This is not very restrictive, in the sense that medicine itself is not an exact science. Meanwhile, the progress made in molecular biology, bioinformatics, computational biology, and precise medicine, correlated with the huge amount of human biology data and the increase in computational power, opens new healthcare challenges. A more accurate diagnosis is needed, along with real-time treatments, by processing as much as possible of the available information. The purpose of this paper is to present a deeper vision for the future of artificial intelligence in precise medicine. In fact, current machine learning algorithms use standard mathematical knowledge, mostly Euclidean metrics and standard computation rules. The loss of information arising from classical methods prevents obtaining 100% evidence in the diagnosis process. To overcome these problems, we introduce MEDICOMPILLS, a new architectural concept tool for information processing in precise medicine that delivers diagnoses and therapy advice. This tool processes poly-field digital resources: global knowledge related to biomedicine in a direct or indirect manner, but also technical databases, Natural Language Processing algorithms, and strong class optimization functions. As the name suggests, the heart of this tool is a compiler. The approach is completely new, tailored for omics and clinical data. Firstly, the intrinsic biological intuition is different from the well-known “a needle in a haystack” approach usually used when machine learning algorithms have to process differential genomic or molecular data to find biomarkers. Also, even if the input is seized from various types of data, the working engine inside MEDICOMPILLS does not search for patterns as an integrative tool. This approach deciphers the biological meaning of input data down to the metabolic and physiological mechanisms, based on a compiler with grammars issued from bio-algebra-inspired mathematics. It translates input data into bio-semantic units with the help of contextual information, iteratively, until bio-logical operations can be performed on the basis of the “common denominator” rule. The rigorousness of MEDICOMPILLS comes from the structure of the contextual information on functions, built to be analogous to mathematical “proofs”. The major impact of this architecture is expressed in the high accuracy of the diagnosis. The output takes the form of a multiple-condition diagnosis, consisting of some main diseases along with unhealthy biological states, a format highly suitable for therapy proposals and disease prevention. The use of the MEDICOMPILLS architecture is highly beneficial for the healthcare industry. The expectation is to generate a strategic trend in precise medicine, making medicine more like an exact science and reducing the considerable risk of errors in diagnostics and therapies. The tool can be used by pharmaceutical laboratories for the discovery of new cures. It will also contribute to the better design of clinical trials and speed them up.

Keywords: bio-semantic units, multiple conditions diagnosis, NLP, omics

Procedia PDF Downloads 70
977 Scope of Rainwater Harvesting in Residential Plots of Dhaka City

Authors: Jubaida Gulshan Ara, Zebun Nasreen Ahmed

Abstract:

Urban flooding and drought have been major problems of Dhaka city, particularly in recent years. The continuous increase of the city's built-up area, which limits rainwater infiltration zones, is thought to be the main cause of the problem. Proper rainwater management, even at the individual plot level, might bring significant improvement in this regard. As residential use occupies a significant portion of the city surface, the scope of rainwater harvesting (RWH) in residential buildings is worth investigating. This paper reports on research that explored the scope of rainwater harvesting in residential plots with multifamily apartment buildings in Dhaka city. The research investigated the basics of RWH; contextual information, i.e., hydrogeological and meteorological data of Dhaka city; and the rules and legislation for residential building construction. The study also explored contemporary rainwater harvesting practices in local and international contexts. On the basis of this theoretical understanding, 21 sample case studies, in different phases of construction, were selected from seven different categories of plot sizes in different residential areas of Dhaka city. Primary data from the 21 case-study buildings were collected through a physical survey and design drawings, accompanied by a questionnaire survey. All necessary secondary data were gathered from published and other relevant sources. The collected primary and secondary data were used to calculate and analyze the RWH needs for each case study. The main findings have been compiled and compared to observe residential development trends with regard to building rainwater harvesting systems. The study found that, in multifamily apartment buildings of Dhaka city, the storage and recharge structure size for rainwater harvesting increases with the number of occupants and with the size of the plot. Hence, the demand-vs-supply ratio remains almost the same for different sizes of plots, and consequently, the size of the storage structure increases significantly in large-scale plots. It was found that rainwater can meet only 12%-30% of the total restricted water demand of these residential buildings of Dhaka city. The study therefore concludes that, in multifamily residential apartments of Dhaka city, artificial groundwater recharge might be the more suitable option for RWH than storing the rainwater on site.
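
For intuition, a sketch of the annual demand-vs-supply balance computed for each case study follows; every number in it (roof area, rainfall depth, runoff coefficient, occupancy, per-capita restricted demand) is an illustrative assumption, not a value from the paper.

```python
# Minimal sketch of the demand-vs-supply calculation described above.
# All numbers are illustrative assumptions, not values from the study.

ROOF_AREA_M2 = 200.0        # catchment (roof) area of one apartment plot
ANNUAL_RAINFALL_M = 2.0     # Dhaka receives roughly 2 m of rain per year (assumed)
RUNOFF_COEFF = 0.8          # fraction of rainfall actually captured from a hard roof

OCCUPANTS = 60
RESTRICTED_LPCD = 60.0      # assumed per-capita restricted demand, litres/day

annual_supply_l = ROOF_AREA_M2 * ANNUAL_RAINFALL_M * RUNOFF_COEFF * 1000  # m3 -> litres
annual_demand_l = OCCUPANTS * RESTRICTED_LPCD * 365

coverage = annual_supply_l / annual_demand_l
print(f"Rainwater covers {coverage:.0%} of restricted demand")  # ~24% here
```

With these assumed values the coverage lands around 24%, inside the 12%-30% band the study reports; larger occupancy pushes the ratio down, which is the trend the abstract describes.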

Keywords: Dhaka city, rainwater harvesting, residential plots, urban flood

Procedia PDF Downloads 195
976 Market Index Trend Prediction Using Deep Learning and Risk Analysis

Authors: Shervin Alaei, Reza Moradi

Abstract:

Trading in financial markets is subject to risk due to their high volatility. Here, using an LSTM neural network and some risk-based feature engineering, we developed a method that can accurately predict trends of the Tehran Stock Exchange market index a few days in advance. Our test results show that the proposed method, with an average prediction accuracy of more than 94%, is superior to other common machine learning algorithms. To the best of our knowledge, this is the first work incorporating deep learning and risk factors to accurately predict market trends.
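
The abstract does not disclose the exact architecture or features, but a minimal sketch of an LSTM trend classifier of the kind described might look as follows; the 30-day window, the four risk-based features, and the layer sizes are all assumptions.

```python
# Minimal sketch of an LSTM trend classifier, assuming a sliding window of
# daily returns plus hand-engineered risk features (e.g., rolling volatility).
import numpy as np
import tensorflow as tf

WINDOW, N_FEATURES = 30, 4   # 30 days x (return, volatility, drawdown, volume)

def make_windows(features, labels):
    """Stack overlapping 30-day windows; label = next-day trend (1 = up)."""
    X = np.stack([features[i:i + WINDOW] for i in range(len(features) - WINDOW)])
    y = labels[WINDOW:]
    return X, y

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(WINDOW, N_FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),   # P(index trends up)
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Dummy data stands in for the engineered TSE index features.
feats = np.random.randn(1000, N_FEATURES).astype("float32")
labs = np.random.randint(0, 2, size=1000)
X, y = make_windows(feats, labs)
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2)
```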

Keywords: deep learning, LSTM, trend prediction, risk management, artificial neural networks

Procedia PDF Downloads 156
975 ESRA: An End-to-End System for Re-identification and Anonymization of Swiss Court Decisions

Authors: Joel Niklaus, Matthias Sturmer

Abstract:

The publication of judicial proceedings is a cornerstone of many democracies. It makes the court system accountable by ensuring that justice is administered in accordance with the law. Equally important is privacy, as a fundamental human right (Article 12 of the Declaration of Human Rights). Therefore, it is important that the parties (especially minors, victims, or witnesses) involved in these court decisions be anonymized securely. Today, the anonymization of court decisions in Switzerland is performed either manually or semi-automatically using primitive software. While much research has been conducted on anonymization for tabular data, the literature on anonymization for unstructured text documents is thin and virtually non-existent for court decisions. In 2019, it was shown that manual anonymization is not secure enough: in 21 of 25 attempted Swiss federal court decisions related to pharmaceutical companies, the pharmaceuticals and legal parties involved could be manually re-identified. This was achieved by linking the decisions with external databases using regular expressions. An automated re-identification system serves as an automated test of the safety of existing anonymizations and thus promotes the right to privacy. Manual anonymization is very expensive (recurring annual costs of over CHF 20M in Switzerland alone, according to one estimate). Consequently, many Swiss courts publish only a fraction of their decisions. An automated anonymization system reduces these costs substantially, further creating the capacity to publish court decisions much more comprehensively. For the re-identification system, topic modeling with Latent Dirichlet Allocation is used to cluster over 500K Swiss court decisions into meaningful related categories. A comprehensive knowledge base of publicly available data (such as social media, newspapers, government documents, geographical information systems, business registers, online address books, obituary portals, web archives, etc.) is constructed to serve as an information hub for re-identifications. For the actual re-identification, a general-purpose language model is fine-tuned on the respective part of the knowledge base for each category of court decisions separately. The input to the model is the court decision to be re-identified, and the output is a probability distribution over named entities constituting possible re-identifications. For the anonymization system, named entity recognition (NER) is used to recognize the tokens that need to be anonymized. Since the focus lies on Swiss court decisions in German, a corpus of Swiss legal texts will be built for training the NER model. The recognized named entities are replaced by the category determined by the NER model and an identifier to preserve context. This work is part of an ongoing research project conducted by an interdisciplinary research consortium. Both a legal analysis and the implementation of the proposed system design, ESRA, will be performed within the next three years. This study introduces the system design of ESRA, an end-to-end system for re-identification and anonymization of Swiss court decisions. Firstly, the re-identification system tests the safety of existing anonymizations and thus promotes privacy. Secondly, the anonymization system substantially reduces the costs of manual anonymization of court decisions and thus enables a more comprehensive publication practice.
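
A minimal sketch of the entity-replacement step is given below, using spaCy's generic German model as a stand-in for the Swiss-legal NER model the project plans to train; the label set and placeholder format are assumptions.

```python
# Recognized entities are replaced by their category plus a stable identifier,
# so coreference and context are preserved across the decision text.
import spacy

nlp = spacy.load("de_core_news_sm")   # stand-in for a Swiss-legal NER model

def anonymize(text: str) -> str:
    doc = nlp(text)
    mapping, out, last = {}, [], 0
    for ent in doc.ents:
        if ent.label_ not in {"PER", "ORG", "LOC"}:
            continue
        # Same surface form -> same placeholder, preserving context.
        if ent.text not in mapping:
            mapping[ent.text] = f"<{ent.label_}_{len(mapping) + 1}>"
        out.append(text[last:ent.start_char])
        out.append(mapping[ent.text])
        last = ent.end_char
    out.append(text[last:])
    return "".join(out)

print(anonymize("Hans Muster klagte gegen die Beispiel AG in Zürich."))
```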

Keywords: artificial intelligence, courts, legal tech, named entity recognition, natural language processing, ·privacy, topic modeling

Procedia PDF Downloads 148
974 Deployment of Attack Helicopters in Conventional Warfare: The Gulf War

Authors: Mehmet Karabekir

Abstract:

Attack helicopters (AHs) are usually deployed in conventional warfare to destroy the armored and mechanized forces of the enemy. In addition, AHs are able to perform various tasks in deep and close operations: intelligence, surveillance, reconnaissance, air assault operations, and search and rescue operations. Apache helicopters were properly employed in the Gulf Wars and contributed to the success of the campaigns by destroying a large number of armored and mechanized vehicles of the Iraqi Army. The purpose of this article is to discuss the deployment of AHs in conventional warfare in light of the Gulf Wars. First, the employment of AHs in deep and close operations will be addressed with regard to doctrine. Second, the doctrinal and tactical usage of the US armed forces' AH-64 in the 1st and 2nd Gulf Wars will be discussed.

Keywords: attack helicopter, conventional warfare, gulf wars

Procedia PDF Downloads 473
973 The Extent of Virgin Olive-Oil Prices' Distribution Revealing the Behavior of Market Speculators

Authors: Fathi Abid, Bilel Kaffel

Abstract:

The olive tree, the olive harvest during the winter season, and the production of olive oil, better known to professionals as the crushing operation, have long interested institutional traders such as olive-oil offices, private companies such as food-industry refiners and extractors of pomace olive oil, and public and private export-import companies specializing in olive oil. The major problem facing producers of olive oil each winter campaign is, contrary to what might be expected, not whether the harvest will be good, but whether the sale price will allow them to cover production costs and achieve a reasonable profit margin. These questions are entirely legitimate, judging by the importance of the issue and the heavy complexity of the uncertainty and competition, made tougher by a high level of indebtedness and by the experience and expertise of speculators and producers whose objectives are sometimes conflicting. The aim of this paper is to study the formation mechanism of olive oil prices in order to learn about speculators' behavior and expectations in the market, how they contribute through their industry knowledge and financial alliances, and the scale of the financial challenge involved in building private information sources globally to gain an advantage. The methodology used in this paper consists of two stages. In the first stage, we study econometrically the formation mechanisms of the olive oil price in order to understand market participant behavior, implementing ARMA, SARMA, and GARCH models and stochastic diffusion processes. The second stage is devoted to prediction purposes, for which we use a combined wavelet-ANN approach. Our main findings indicate that olive oil market participants interact with each other in ways that promote the formation of stylized facts. Unstable participant behavior creates volatility clustering, nonlinear dependence, and cyclicity. By imitating each other in some periods of the campaign, different participants contribute to the fat tails observed in the olive oil price distribution. The best prediction model for the olive oil price is based on a backpropagation artificial neural network with inputs based on wavelet decomposition and recent past history.
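
As a sketch of the combined wavelet-ANN step, the snippet below decomposes a price window with a discrete wavelet transform and feeds the coefficients to a backpropagation network; the db4 wavelet, window length, and network size are assumed, and random-walk data stands in for the olive-oil series.

```python
# Wavelet features of a recent window -> backpropagation network forecast.
import numpy as np
import pywt
from sklearn.neural_network import MLPRegressor

def wavelet_features(window):
    """Concatenate multi-level DWT coefficients of one price window."""
    coeffs = pywt.wavedec(window, "db4", level=2)
    return np.concatenate(coeffs)

prices = np.cumsum(np.random.randn(600)) + 100.0   # stand-in for olive-oil prices
WINDOW = 32

X = np.array([wavelet_features(prices[i:i + WINDOW])
              for i in range(len(prices) - WINDOW)])
y = prices[WINDOW:]                                # next-day price for each window

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X[:-50], y[:-50])
print("held-out MAE:", np.abs(model.predict(X[-50:]) - y[-50:]).mean())
```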

Keywords: olive oil price, stylized facts, ARMA model, SARMA model, GARCH model, combined wavelet-artificial neural network, continuous-time stochastic volatility mode

Procedia PDF Downloads 339
972 Enhancing Human Security Through Comprehensive Counter-Terrorism Measures

Authors: Alhaji Khuzaima Mohammed Osman, Zaeem Sheikh Abdul Wadudi Haruna

Abstract:

This article aims to explore the crucial link between counter-terrorism efforts and the preservation of human security. As acts of terrorism continue to pose significant threats to societies worldwide, it is imperative to develop effective strategies that mitigate risks while safeguarding the rights and well-being of individuals. This paper discusses key aspects of counter-terrorism and human security, emphasizing the need for a comprehensive approach that integrates intelligence, prevention, response, and resilience-building measures. By highlighting successful case studies and lessons learned, this article provides valuable insights for policymakers, law enforcement agencies, and practitioners in their quest to address terrorism and foster human security.

Keywords: human security, risk mitigation, terrorist activities, civil liberties

Procedia PDF Downloads 88
971 Pattern Identification in Statistical Process Control Using Artificial Neural Networks

Authors: M. Pramila Devi, N. V. N. Indra Kiran

Abstract:

Control charts, predominantly in the form of the X-bar chart, are important tools in statistical process control (SPC). They are useful in determining whether a process is behaving as intended or whether there are some unnatural causes of variation. A process is out of control if a point falls outside the control limits or a series of points exhibits an unnatural pattern. In this paper, a study is carried out on four training algorithms for control chart pattern (CCP) recognition. For each algorithm, the optimal network structure is identified; the algorithms are then compared in terms of type I and type II errors and generalization, with and without early stopping, and the best one is proposed.
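
For concreteness, the sketch below synthesizes windows for a few canonical control chart patterns and trains a backpropagation network with early stopping; the pattern generators, window length, and network size are illustrative, not those of the paper.

```python
# Synthetic control chart patterns -> MLP classifier with early stopping.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
N, LEN = 500, 32   # windows per class, points per window

def make(pattern):
    t = np.arange(LEN)
    base = rng.normal(0, 1, (N, LEN))   # in-control noise around the mean
    if pattern == "trend":
        base += 0.1 * t                 # steady upward drift
    elif pattern == "shift":
        base[:, LEN // 2:] += 2.0       # sudden mean shift mid-window
    elif pattern == "cyclic":
        base += 1.5 * np.sin(2 * np.pi * t / 8)
    return base

classes = ["normal", "trend", "shift", "cyclic"]
X = np.vstack([make(c) for c in classes])
y = np.repeat(np.arange(len(classes)), N)

clf = MLPClassifier(hidden_layer_sizes=(20,), early_stopping=True,
                    validation_fraction=0.2, max_iter=1000, random_state=0)
clf.fit(X, y)
print("training-set accuracy:", clf.score(X, y))
```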

Keywords: control chart pattern recognition, neural network, backpropagation, generalization, early stopping

Procedia PDF Downloads 372
970 Effectiveness of Gamified Simulators in the Health Sector

Authors: Nuno Biga

Abstract:

The integration of serious games with gamification in management education and training has gained significant importance in recent years, as innovative strategies are sought to improve target-audience engagement and learning outcomes. This research builds on the author's previous work in this field and presents a case study that evaluates the ex-post impact of a sample of applications of the BIGAMES management simulator in the training of top managers from various hospital institutions. The methodology includes evaluating the reaction of participants after each edition of BIGAMES Accident & Emergency (A&E) carried out over the last 3 years, as well as monitoring the career path of a significant sample of participants and their feedback more than a year after their experience with this simulator. Control groups are set up according to the type of role their members held when they took part in the BIGAMES A&E simulator: Administrators, Clinical Directors, and Nursing Directors. Former participants are invited to answer a questionnaire structured for this purpose, where they are asked, among other questions, about the importance and impact that the BIGAMES A&E simulator has had on their professional activity. The research methodology also includes an exhaustive literature review, focusing on empirical studies in the field of education and training in management and business that investigate the effectiveness of gamification and serious games in improving learning, team collaboration, critical thinking, problem-solving skills, and overall performance, with a focus on training contexts in the health sector. The results of the research carried out show that gamification and serious games that simulate real scenarios, such as Business Interactive Games - BIGAMES©, can significantly increase the motivation and commitment of participants, stimulating the development of transversal skills, the mobilization of group synergies, and the acquisition and retention of knowledge through interactive user-centred scenarios. Individuals who participate in game-based learning sessions show a higher level of commitment to learning because they find these teaching methods more enjoyable and interactive. This research study aims to demonstrate that, as executive education and training programs develop to meet the current needs of managers, gamification and serious games stand out as effective means of bridging the gap between traditional teaching methods and modern educational and training requirements. To this end, this research evaluates the medium- and long-term effects of gamified learning on the professional performance of participants in the BIGAMES simulator applied to healthcare. Based on the conclusions of the evaluation of the effectiveness of training using gamification, and taking into account the results of the opinion poll of former A&E participants, this research study proposes an integrated approach for the transversal application of the A&E Serious Game in various educational contexts, covering top management (traditionally the target audience of BIGAMES A&E), middle and operational management in healthcare institutions (functional area heads and professionals with career development potential), as well as higher education in medicine and nursing courses. The integrated solution, called "BIGAMES A&E plus", developed as part of this research, includes the digitalization of key processes and the incorporation of AI.

Keywords: artificial intelligence (AI), executive training, gamification, higher education, management simulators, serious games (SG), training effectiveness

Procedia PDF Downloads 13
969 Application of the Pattern Method to Form Stable Neural Structures in the Learning Process as a Way of Solving Modern Problems in Education

Authors: Liudmyla Vesper

Abstract:

The problems of modern education are large-scale and diverse. The aspirations of parents, teachers, and experts converge: everyone is interested in raising a generation of well-rounded, well-educated persons. Both the family and society expect the future generation to be self-sufficient, desirable in the labor market, and capable of lifelong learning. Today's children have a powerful potential that is difficult to realize under traditional school approaches. Focusing on STEM education in practice often ends with the simple use of computers and gadgets during class. "Science", "technology", "engineering", and "mathematics" are difficult to combine within school and university curricula, which have not changed much during the last 10 years. Solving the problems of modern education largely depends on teacher-innovators and teacher-practitioners who develop and implement effective educational methods and programs, and who propose innovative pedagogical practices that allow students to master large bodies of knowledge and apply them in practice. Effective education involves the creation of stable neural structures during the learning process, which allow knowledge to be preserved and expanded throughout life. The author proposes a method of integrated case lessons based on mathematical patterns for forming a holistic perception of the world. This method and program are scientifically substantiated and have more than 15 years of practical application experience in school and university classrooms. The first results of the practical application of the author's methodology and curriculum were announced at the International Conference "Teaching and Learning Strategies to Promote Elementary School Success", April 22-23, 2006, Yerevan, Armenia, under the IREX-administered 2004-2006 Multiple Component Education Project. The program is based on the concept of interdisciplinary connections and its implementation in the process of continuous learning. This allows students to preserve and expand knowledge throughout life according to a single pattern. The pattern principle stores information on different subjects according to one scheme (pattern), using long-term memory. This is how neural structures are created. The author also suggests that a similar method could be successfully applied to the training of artificial neural networks; however, this assumption requires further research and verification. The educational method and program proposed by the author meet modern requirements for education, which involve mastering various areas of knowledge starting from an early age. This approach makes it possible to engage the child's cognitive potential as much as possible and direct it toward the preservation and development of individual talents. According to the methodology, at the early stages of learning, students understand the connections between school subjects (the so-called "sciences" and "humanities") and real life, and apply the knowledge gained in practice. This approach allows students to realize their natural creative abilities and talents, which makes it easier to navigate professional choices and find their place in life.

Keywords: science education, maths education, AI, neuroplasticity, innovative education problem, creativity development, modern education problem

Procedia PDF Downloads 62
968 A Study on Big Data Analytics, Applications and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data, which are hard to organise and analyse and can be dealt with using the frameworks and models from this field of study. An organization's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently produce better outcomes for society. This paper reviews the current state of the art in this field of study, as well as different application domains of big data analytics. It also elaborates on various frameworks in the analysis process using different machine learning techniques. Finally, the paper concludes by stating the challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 83
967 A Study on Big Data Analytics, Applications, and Challenges

Authors: Chhavi Rana

Abstract:

The aim of the paper is to highlight existing developments in the field of big data analytics. Applications like bioinformatics, smart infrastructure projects, healthcare, and business intelligence contain voluminous and incremental data which are hard to organise and analyse and can be dealt with using the frameworks and models from this field of study. An organisation's decision-making strategy can be enhanced by using big data analytics and applying different machine learning techniques and statistical tools to such complex data sets, which will consequently produce better outcomes for society. This paper reviews the current state of the art in this field of study as well as different application domains of big data analytics. It also elaborates on various frameworks in the analysis process using different machine learning techniques. Finally, the paper concludes by stating the challenges and issues raised in existing research.

Keywords: big data, big data analytics, machine learning, review

Procedia PDF Downloads 95
966 Big Data Applications for Transportation Planning

Authors: Antonella Falanga, Armando Cartenì

Abstract:

"Big data" refers to extremely vast and complex sets of data, encompassing extraordinarily large and intricate datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represent a transformative force reshaping the industry worldwide. Their pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data impacts across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, media and entertainment and also mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications encompass a wide variety, spanning across optimization in vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of the overall transportation systems, but also mitigation of pollutant emissions contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments. Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges regarding data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies to balance the benefits of big data with privacy, security, and efficient data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by conducting an overview of the primary and recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data better provide to enhance rational decision-making for mobility choices and is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning

Procedia PDF Downloads 60
965 Sample Preparation and Coring of Highly Friable and Heterogeneous Bonded Geomaterials

Authors: Mohammad Khoshini, Arman Khoshghalb, Meghdad Payan, Nasser Khalili

Abstract:

Most of the rocks at the surface of the Earth's crust are technically categorized as weak rocks or weakly bonded geomaterials. Deeply weathered, weakly cemented, friable, and easily erodible, they demonstrate complex material behaviour, and understanding the often-overlooked mechanical behaviour of such materials is of particular importance in geotechnical engineering practice. Weakly bonded geomaterials are so susceptible to surface shear and moisture that conventional methods of core drilling fail to extract high-quality undisturbed samples from them. Moreover, most of these geomaterials are highly heterogeneous, rendering material characterization less reliable and feasible. To compensate for the unpredictability of the material response, either numerous experiments need to be conducted or large factors of safety must be applied in the design process. However, neither of these approaches is sustainable. In this study, a method for dry core drilling of such materials is introduced to obtain high-quality undisturbed core samples. By freezing the material at a certain moisture content, a secondary structure is developed throughout the material, which helps the whole structure remain intact during the core drilling process. Moreover, to address the heterogeneity issue, the natural material was reconstructed artificially to obtain a homogeneous material with very high similarity to the natural one from both micro- and macro-mechanical perspectives. The method is verified at both the micro and macro scales. In terms of micro-scale analysis, pore spaces and inter-particle bonds were investigated using Scanning Electron Microscopy (SEM) and compared between the natural and artificial materials. X-ray diffraction (XRD) analyses were also performed to control the chemical composition. At the macro scale, several uniaxial compressive strength tests, as well as triaxial tests, were performed to verify the similar mechanical response of the materials. A high level of agreement is observed between the micro and macro results of the natural and artificially bonded geomaterials. The proposed methods can play an important role in cutting down the costs of experimental programs for material characterization and in improving the accuracy of numerical modelling based on experimental results.

Keywords: Artificial geomaterial, core drilling, macro-mechanical behavior, micro-scale, sample preparation, SEM photography, weakly bonded geomaterials

Procedia PDF Downloads 216
964 An Efficient Algorithm for Solving the Transmission Network Expansion Planning Problem Integrating Machine Learning with Mathematical Decomposition

Authors: Pablo Oteiza, Ricardo Alvarez, Mehrdad Pirnia, Fuat Can

Abstract:

To effectively combat climate change, many countries around the world have committed to decarbonising their electricity sectors, along with promoting a large-scale integration of renewable energy sources (RES). While this trend represents a unique opportunity, achieving a sound and cost-efficient energy transition towards low-carbon power systems poses significant challenges for the multi-year Transmission Network Expansion Planning (TNEP) problem. The objective of the multi-year TNEP is to determine the necessary network infrastructure to supply the projected demand in a cost-efficient way, considering the evolution of the new generation mix, including the integration of RES. The rapid integration of large-scale RES increases the variability and uncertainty of power system operation, which in turn increases short-term flexibility requirements. To meet these requirements, flexible generating technologies such as energy storage systems must be considered within the TNEP as well, along with proper models for capturing the operational challenges of future power systems. As a consequence, TNEP formulations are becoming more complex and difficult to solve, especially when applied to realistically sized power system models. To meet these challenges, there is an increasing need for efficient algorithms capable of solving the TNEP problem with reasonable computational time and resources. In this regard, a promising research area is the use of artificial intelligence (AI) techniques for solving large-scale mixed-integer optimization problems, such as the TNEP. In particular, the use of AI together with mathematical optimization strategies based on decomposition has shown great potential. In this context, this paper presents an efficient algorithm for solving the multi-year TNEP problem. The algorithm combines AI techniques with Column Generation, a traditional decomposition-based mathematical optimization method. One of the challenges of using Column Generation for the TNEP problem is that the subproblems are of mixed-integer nature, and solving them therefore requires significant time and resources. Hence, in this proposal, we solve a linearly relaxed version of the subproblems and train a binary classifier that determines the values of the binary variables based on the results obtained from the linearized version. A key feature of the proposal is that the binary classifier is integrated into the optimization algorithm in such a way that the optimality of the solution can be guaranteed. The results of a case study based on the HRP 38-bus test system show that the binary classifier has an accuracy above 97% in estimating the values of the binary variables. Since the linearly relaxed version of the subproblems can be solved in significantly less time than its integer programming counterpart, integrating the binary classifier into the Column Generation algorithm allowed us to reduce the computational time required to solve the problem by 50%. The final version of this paper will contain a detailed description of the proposed algorithm, the AI-based binary classifier technique, and its integration into the CG algorithm. To demonstrate the capabilities of the proposal, we evaluate the algorithm in case studies with different scenarios, as well as in other power system models.
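
A minimal sketch of the learning component is shown below: a classifier is trained to predict binary build decisions from features of the LP-relaxed subproblem solution. The features, model, and synthetic labels are assumptions; in the paper's scheme the predictions are embedded in Column Generation so that optimality can still be guaranteed.

```python
# Learn 0/1 build decisions from LP-relaxation features of each candidate line.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 5000
relaxed_value = rng.uniform(0, 1, n)   # fractional build variable from the LP relaxation
utilization = rng.uniform(0, 1, n)     # assumed congestion proxy for the same line
X = np.column_stack([relaxed_value, utilization])

# Stand-in labels: in practice these come from exactly solved MIP subproblems.
y = (relaxed_value + 0.3 * utilization + rng.normal(0, 0.1, n) > 0.75).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:4000], y[:4000])
print("hold-out accuracy:", clf.score(X[4000:], y[4000:]))

# Inside Column Generation, the prediction replaces the integer solve; any column
# whose predicted decision fails a feasibility/optimality check can fall back to
# the exact MIP, which is how guarantees of the kind the abstract describes are kept.
```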

Keywords: integer optimization, machine learning, mathematical decomposition, transmission planning

Procedia PDF Downloads 85
963 Optimization of Water Pipeline Routes Using a GIS-Based Multi-Criteria Decision Analysis and a Geometric Search Algorithm

Authors: Leon Mortari

Abstract:

The Metropolitan East region of Rio de Janeiro state, Brazil, faces historic water scarcity. Among the alternatives studied to solve this situation, the possibility of conveying the water available in the Lagoa de Juturnaíba reservoir to supply the region's municipalities stands out. The routing of a linear engineering project must be based on an evaluation of different aspects, such as altitude, slope, proximity to roads, distance from watercourses, land use and occupation, and the physical and chemical features of the soil. This work aims to apply a multi-criteria model that combines geoprocessing techniques, decision-making, and a geometric search algorithm to optimize a hypothetical adductor system in the scenario of expanding the water supply system that serves this region, known as Imunana-Laranjal, using the Lagoa de Juturnaíba as the source. This study proposes the construction of a spatial database for the evaluation criteria presented; the treatment and rasterization of these data; and the standardization and reclassification of this information in a Geographic Information System (GIS) platform. The methodology involves the integrated analysis of these criteria, with their relative importance defined by weighting them based on expert consultations and the Analytic Hierarchy Process (AHP). Three approaches are defined for weighting the criteria by AHP: the first treats all criteria as equally important, the second weights them using a pairwise comparison matrix, and the third establishes a hierarchy based on the priority of the criteria. For each approach, a distinct group of weightings is defined. In the next step, map algebra tools are used to overlay the layers and generate cost surfaces, which indicate the resistance to the passage of the adductor route, using the three groups of weightings. The Dijkstra algorithm, a geometric search algorithm, is then applied to these cost surfaces to find an optimized path within the geographical space, aiming to minimize resources, time, investment, maintenance, and environmental and social impacts.
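
A minimal sketch of the least-cost path step follows: Dijkstra's algorithm over a raster cost surface, 4-connected here for brevity (GIS implementations typically use 8 or more neighbours). The random cost grid stands in for the AHP-weighted overlay.

```python
# Dijkstra least-cost path over a raster cost surface.
import heapq
import numpy as np

def least_cost_path(cost, start, goal):
    rows, cols = cost.shape
    dist = np.full(cost.shape, np.inf)
    prev = {}
    dist[start] = cost[start]
    pq = [(cost[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            break
        if d > dist[r, c]:
            continue                      # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr, nc]     # accumulated traversal cost
                if nd < dist[nr, nc]:
                    dist[nr, nc] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk back from goal to start to recover the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

cost = np.random.default_rng(0).uniform(1, 10, (50, 50))  # stand-in cost surface
route = least_cost_path(cost, (0, 0), (49, 49))
print(len(route), "cells on the optimized route")
```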

Keywords: geometric search algorithm, GIS, pipeline, route optimization, spatial multi-criteria analysis model

Procedia PDF Downloads 31
962 Cross-Sectional Study Investigating the Prevalence of Uncorrected Refractive Error and Visual Acuity through Mobile Vision Screening in the Homeless in Wales

Authors: Pakinee Pooprasert, Wanxin Wang, Tina Parmar, Dana Ahnood, Tafadzwa Young-Zvandasara, James Morgan

Abstract:

Homelessness has been shown to be correlated with poor health outcomes, including increased visual health morbidity. Despite this, there are relatively few studies of visual health in the homeless population, especially in the UK. This research investigates the visual disability and access barriers prevalent in the homeless population of Cardiff, South Wales. Data were collected from 100 homeless participants in three different shelters. Visual outcomes included near and distance visual acuity as well as non-cycloplegic refraction. Qualitative data were collected via a questionnaire covering socio-demographic profile, ocular history, subjective visual acuity, and level of access to healthcare facilities. The total prevalence of myopia and hyperopia was 17.0% and 19.0%, respectively, based on the spherical equivalent of the eye with the greatest absolute value. The prevalence of astigmatism was 8.0%. The mean absolute spherical equivalent was 0.841 D and 0.853 D for the right and left eye, respectively. The proportion of participants with sight loss (defined as VA of 6/12-6/60 in the better-seeing eye) was 27.0%, in comparison to 0.89% and 1.1% in the general Cardiff and Wales populations, respectively (p < 0.05). Additionally, 1.0% of the homeless subjects were registered blind (VA less than 3/60), in comparison to 0.17% for the national census after age standardization. Most participants had good knowledge of how to access prescription glasses and eye examination services. Despite this, 85.0% had never had their eyes examined by a doctor, and 73.0% had had their last optometrist appointment more than 5 years earlier. These findings suggest a significant disparity in ocular health, including visual acuity and refractive error, between the homeless and the general population. Further, the homeless were less likely to receive the same level of support and continued care in the community due to access barriers, including socio-economic factors such as travel expenses and the regional availability of services, as well as administrative shortcomings. In conclusion, this research demonstrates unmet visual health needs within the homeless population, and inclusive policy changes may need to be implemented to achieve better healthcare outcomes within this marginalized community.
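
For reference, the sketch below implements the spherical-equivalent classification described above (SE = sphere + cylinder/2, taken from the eye with the greater absolute SE); the ±0.50 D cut-offs are common conventions assumed here, not values quoted from the study.

```python
# Classify refractive status from a prescription using spherical equivalents.
def spherical_equivalent(sphere: float, cylinder: float) -> float:
    return sphere + cylinder / 2.0

def classify(right: tuple[float, float], left: tuple[float, float]) -> str:
    se_r = spherical_equivalent(*right)
    se_l = spherical_equivalent(*left)
    se = se_r if abs(se_r) >= abs(se_l) else se_l   # eye with greatest |SE|
    if se <= -0.50:
        return "myopia"
    if se >= 0.50:
        return "hyperopia"
    return "emmetropia"

print(classify(right=(-1.25, -0.50), left=(-0.75, 0.00)))   # -> myopia
```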

Keywords: homelessness, refractive error, visual disability, Wales

Procedia PDF Downloads 172
961 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence

Authors: Gus Calderon, Richard McCreight, Tammy Schwartz

Abstract:

Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort has had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scales requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high-spatial-resolution (5" ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community are divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat-station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch's defensible space maps were combined with Black Swan's patented approach, which incorporates 39 other risk characteristics, into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
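
As an illustration of the vegetation-index step, the sketch below computes NDVI from red and near-infrared bands and buckets it into coarse fuel-condition classes; FireWatch's actual indices, band set, and proprietary thresholds are not public, so the band order and cut-offs here are assumptions.

```python
# NDVI from multispectral bands, thresholded into coarse fuel-condition classes.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir, red = nir.astype("float64"), red.astype("float64")
    return (nir - red) / np.maximum(nir + red, 1e-9)   # avoid divide-by-zero

def classify_fuel(ndvi_map: np.ndarray) -> np.ndarray:
    """0 = bare/built, 1 = dry or stressed vegetation, 2 = healthy green."""
    classes = np.zeros(ndvi_map.shape, dtype=np.uint8)
    classes[ndvi_map > 0.2] = 1
    classes[ndvi_map > 0.5] = 2
    return classes

# Stand-in 8-bit bands in place of the aircraft orthomosaic.
rng = np.random.default_rng(0)
red = rng.integers(0, 256, (512, 512))
nir = rng.integers(0, 256, (512, 512))
fuel = classify_fuel(ndvi(nir, red))
print(np.bincount(fuel.ravel(), minlength=3))   # pixel count per class
```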

Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk.

Procedia PDF Downloads 108
960 A Case Study of the Ground Collapse Due to Excavation Using Non-Destructive Testing

Authors: Ki-Cheong Yoo, Yushik Han, Heejeung Sohn, Jinwoo Kim

Abstract:

A ground collapse can be caused by natural or artificial factors. Ground collapses that have occurred frequently in Korea were observed and classified into different types by their main contributing factor. In this study, a ground collapse induced by groundwater level disturbance at an excavation site was analyzed. The loosened ground region around the excavation site was also detected and analyzed using non-destructive testing, such as a GPR (Ground Penetrating Radar) survey and electrical resistivity. The results of the surveys showed that the ground was loosened widely over the area surrounding the excavation due to groundwater discharge.

Keywords: electrical resistivity, ground collapse, groundwater level, GPR (ground penetrating radar)

Procedia PDF Downloads 194
959 Setting up a Prototype for the Artificial Interactive Reality Unified System to Transform Psychosocial Intervention in Occupational Therapy

Authors: Tsang K. L. V., Lewis L. A., Griffith S., Tucker P.

Abstract:

Background: Many children with high-incidence disabilities, such as autism spectrum disorder (ASD), struggle to participate in the community in a socially acceptable manner. Clinical settings are limited in their ability to provide natural, real-life scenarios in which these children can practice the life skills needed to meet their real-life challenges. Virtual reality (VR) offers potential solutions to the limitations clinicians face in creating simulated natural environments in which their clients can generalize the facilitated skills. Research design: The research aimed to develop a prototype of an interactive VR system providing realistic and immersive environments for clients to practice skills in. A descriptive qualitative methodology was employed to design and develop the Artificial Interactive Reality Unified System (AIRUS) prototype, which provided insights into how advanced VR technology can be used to create simulated real-life social scenarios and enable users to interact with the objects and people inside the virtual environment using natural eye gaze and hand and body movements. The eye-tracking (e.g., selective or joint attention), hand- and body-tracking (e.g., repetitive stimming or fidgeting), and facial-tracking (e.g., emotion recognition) functions allowed behavioral data to be captured and managed in the AIRUS architecture. Impact of project: Instead of using external controllers or sensors, hand-tracking software enabled the users to interact naturally with the simulated environment, using everyday behaviors such as handshaking and waving to control and interact with the virtual objects and people. The AIRUS prototype offers opportunities for breakthroughs in future VR-based psychosocial assessment and intervention in occupational therapy. Implications for future projects: AI technology can allow more efficient data capture and interpretation for object identification and human facial emotion recognition at any given moment. The data points captured can be used to pinpoint users' focus and where their interests lie. AI can further help advance the data interpretation system.

Keywords: occupational therapy, psychosocial assessment and intervention, simulated interactive environment, virtual reality

Procedia PDF Downloads 35
958 Retrospective Analysis of 142 Cases of Incision Infection Complicated with Sternal Osteomyelitis after Cardiac Surgery Treated by Activated PRP Gel Filling

Authors: Daifeng Hao, Guang Feng, Jingfeng Zhao, Tao Li, Xiaoye Tuo

Abstract:

Objective: To retrospectively analyze the clinical characteristics of incision infection with sternal osteomyelitis sinus tract after cardiac surgery, and the surgical method and therapeutic effect of filling and repair with activated platelet-rich plasma (PRP) gel. Methods: From March 2011 to October 2022, 142 cases of incision infection after cardiac surgery with sternal osteomyelitis sinus were retrospectively analyzed. The causes of poor wound healing after surgery, wound characteristics, and perioperative wound management were summarized, along with intraoperative treatment, the collection and storage of autologous PRP before debridement surgery, the PRP filling and activation method after debridement surgery, the effect of anticoagulant drugs on surgery, postoperative complications, and average wound healing time. Results: Among the cases in this group, 53.3% underwent coronary artery bypass grafting, 36.8% underwent artificial heart valve replacement, 8.2% underwent aortic artificial vessel replacement, and 1.7% underwent allogeneic heart transplantation. The main causes of poor incision healing were, in order, suture reaction, fat liquefaction, osteoporosis, diabetes, and metal allergy. The wound is characterized by an infected sinus tract. Before the operation, 100-150 ml of PRP at 4 times the physiological concentration was collected with a blood component separation device. After sinus debridement, PRP was perfused to fill the bony defect in the middle of the sternum and activated with thrombin freeze-dried powder and calcium gluconate injection to form a gel, and the outer skin and subcutaneous tissue were sutured freely. 62.9% of patients discontinued warfarin during the perioperative period, and 37.1% maintained warfarin treatment; there was no significant difference in the incidence of postoperative wound hematoma between the two groups. The average postoperative wound healing time was 12.9±4.7 days, and there were no obvious postoperative complications. Conclusions: Application of activated PRP gel to fill incision infections with sternal osteomyelitis sinus after cardiac surgery involves less surgical injury and achieves a satisfactory and stable curative effect. It can completely replace the previously used pectoralis major muscle flap transplantation scheme.

Keywords: platelet-rich plasma, negative-pressure wound therapy, sternal osteomyelitis, cardiac surgery

Procedia PDF Downloads 78
957 Internet of Things, Edge and Cloud Computing in Rock Mechanical Investigation for Underground Surveys

Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo

Abstract:

Rock mechanical investigation is one of the most crucial activities in underground operations, especially in surveys related to hydrocarbon exploration and production, geothermal reservoirs, energy storage, mining, and geotechnics. There is a wide range of traditional methods for deriving, collecting, and analyzing rock mechanics data. However, these approaches may not be suitable or work perfectly in some situations, such as fractured zones. Cutting-edge technologies are now available to solve and optimize the issues mentioned. The Internet of Things (IoT) and Edge and Cloud Computing technologies (ECt and CCt, respectively) are among the newest and most widely used artificial intelligence methods employed for geomechanical studies. IoT devices act as sensors and cameras for real-time monitoring and mechanical-geological data collection of rocks, capturing, for example, temperature, movement, pressure, or stress levels. Assessment of structural integrity, especially for cap rocks within hydrocarbon systems, and of rock mass behavior, in support of activities such as enhanced oil recovery (EOR) and underground gas storage (UGS) or to improve safety risk management (SRM) and potential hazard identification (PHI), are other benefits of IoT technologies. ECt can process, aggregate, and analyze the data collected by IoT immediately, on a real-time scale, providing detailed insights into the behavior of rocks in various situations (e.g., stress, temperature, and pressure), establishing patterns quickly, and detecting trends. This state-of-the-art and useful technology can therefore support autonomous systems in rock mechanical surveys, such as drilling and production (in hydrocarbon wells) or excavation (in the mining and geotechnics industries). Besides, ECt allows all rock-related operations to be controlled remotely and enables operators to apply changes or make adjustments. It must be mentioned that this feature is very important for environmental goals. More often than not, rock mechanical studies draw on different kinds of data, such as laboratory tests, field operations, and indirect information like seismic or well-logging data. CCt provides a useful platform for storing and managing large volumes of heterogeneous information, which can be very useful in fractured zones. Additionally, CCt supplies powerful tools for predicting, modeling, and simulating rock mechanical information, especially in fractured zones within vast areas. It is also a suitable medium for sharing extensive information on rock mechanics, such as the direction and size of fractures in a large oil field or mine. The findings of this comprehensive review demonstrate that digital transformation through integrated IoT, Edge, and Cloud solutions is revolutionizing traditional rock mechanical investigation. These advanced technologies have empowered real-time monitoring, predictive analysis, and data-driven decision-making, culminating in noteworthy enhancements in safety, efficiency, and sustainability. By employing IoT, CCt, and ECt, underground operations have therefore experienced a significant boost, allowing timely and informed actions based on real-time data insights. The successful implementation of IoT, CCt, and ECt has led to optimized and safer operations, streamlined processes, and environmentally conscious approaches in underground geological endeavors.

Keywords: rock mechanical studies, internet of things, edge computing, cloud computing, underground surveys, geological operations

Procedia PDF Downloads 62
956 Stimulus-Dependent Polyrhythms of Central Pattern Generator Hardware

Authors: Le Zhao, Alain Nogaret

Abstract:

We have built universal Central Pattern Generator (CPG) hardware by interconnecting Hodgkin-Huxley neurons with reciprocally inhibitory synapses. We investigate the dynamics of the neuron oscillations as a function of the time delay between current steps applied to the individual neurons. We demonstrate stimulus-dependent switching between spiking polyrhythms and map the phase portraits of the neuron oscillations to reveal the basins of attraction of the system. We also study experimentally how the attraction basins depend on the network parameters: the neuron response time and the strength of the inhibitory connections.
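The switching experiment can be mimicked in software. The sketch below couples two standard Hodgkin-Huxley neurons with reciprocally inhibitory synapses and offsets their current steps by a tunable delay; the membrane parameters are textbook squid-axon constants and the synaptic model is an assumption, not the hardware's actual circuit values.

```python
import numpy as np

# Standard Hodgkin-Huxley constants (assumed; not the hardware's values)
C, gNa, gK, gL = 1.0, 120.0, 36.0, 0.3          # uF/cm^2, mS/cm^2
ENa, EK, EL = 50.0, -77.0, -54.4                # mV
Esyn, gsyn = -80.0, 0.5                         # inhibitory synapse (assumed)

def rates(V):
    """Voltage-dependent gating rate functions for m, h, n."""
    am = 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    bm = 4.0 * np.exp(-(V + 65) / 18)
    ah = 0.07 * np.exp(-(V + 65) / 20)
    bh = 1.0 / (1 + np.exp(-(V + 35) / 10))
    an = 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    bn = 0.125 * np.exp(-(V + 65) / 80)
    return am, bm, ah, bh, an, bn

def simulate(T=500.0, dt=0.01, delay=20.0):
    """Two HH neurons, mutual inhibition; current steps offset by `delay` (ms)."""
    n_steps = int(T / dt)
    V = np.array([-65.0, -65.0]); m = np.array([0.05, 0.05])
    h = np.array([0.6, 0.6]); n = np.array([0.32, 0.32]); s = np.zeros(2)
    Vs = np.empty((n_steps, 2))
    for i in range(n_steps):
        t = i * dt
        # Current steps to each neuron, offset in time (the stimulus of interest)
        I = np.array([10.0 * (t > 50.0), 10.0 * (t > 50.0 + delay)])
        am, bm, ah, bh, an, bn = rates(V)
        # Each neuron is inhibited via the OTHER neuron's synaptic variable
        Isyn = gsyn * s[::-1] * (V - Esyn)
        dV = (I - gNa * m**3 * h * (V - ENa) - gK * n**4 * (V - EK)
              - gL * (V - EL) - Isyn) / C
        T_pre = 1.0 / (1 + np.exp(-(V + 20) / 2))   # transmitter release gate
        ds = 2.0 * T_pre * (1 - s) - 0.1 * s
        V += dt * dV; m += dt * (am * (1 - m) - bm * m)
        h += dt * (ah * (1 - h) - bh * h); n += dt * (an * (1 - n) - bn * n)
        s += dt * ds
        Vs[i] = V
    return Vs

Vs = simulate(delay=20.0)  # vary `delay` to probe switching between rhythms
print("neuron 1 spikes:", int(((Vs[1:, 0] > 0) & (Vs[:-1, 0] <= 0)).sum()))
```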

Keywords: central pattern generator, winnerless competition principle, artificial neural networks, synapses

Procedia PDF Downloads 474
955 Miniaturized PVC Sensors for Determination of Fe2+, Mn2+ and Zn2+ in Buffalo-Cows’ Cervical Mucus Samples

Authors: Ahmed S. Fayed, Umima M. Mansour

Abstract:

Three polyvinyl chloride membrane sensors were developed for the electrochemical evaluation of ferrous, manganese and zinc ions. The sensors were used for assaying metal ions in the cervical mucus (CM) of Egyptian river buffalo-cows (Bubalus bubalis), as their levels vary with cyclical hormone variation during the different phases of the estrus cycle. The presented sensors are based on the ionophores β-cyclodextrin (β-CD), hydroxypropyl β-cyclodextrin (HP-β-CD) and sulfocalix[4]arene (SCAL) for sensors 1, 2 and 3, responding to Fe2+, Mn2+ and Zn2+, respectively. Dioctyl phthalate (DOP) was used as the plasticizer in a polymeric matrix of polyvinyl chloride (PVC). To increase the selectivity and sensitivity of the sensors, each sensor was enriched with a suitable complexing agent, which enhanced the sensor's response. For sensor 1, β-CD was mixed with bathophenanthroline; for sensor 2, porphyrin was incorporated with HP-β-CD; while for sensor 3, oxine was the complexing agent used with SCAL. Linear responses over the range 10⁻⁷-10⁻² M, with cationic slopes of 53.46, 45.01 and 50.96 mV/decade over the pH range 4-8, were obtained using coated graphite sensors for ferrous, manganese and zinc ionic solutions, respectively. The three sensors were validated according to the IUPAC guidelines. The results obtained by the presented potentiometric procedures were statistically analyzed and compared with those obtained by an atomic absorption spectrophotometric method (AAS). No significant differences in either accuracy or precision were observed between the two techniques. The sensors were successfully applied to the determination of the three studied cations in CM in order to identify the proper time for artificial insemination (AI), and the results were compared with those obtained upon analyzing the samples by AAS. Proper detection of estrus and correct timing of AI are necessary to maximize buffalo production. In this experiment, 30 multiparous buffalo-cows, in their second to third lactation and weighing 415-530 kg, were synchronized with the OVSynch protocol. Samples were taken three times around ovulation: on day 8 of the OVSynch protocol, on day 9 (20 h before AI) and on day 10 (1 h before AI). Besides the analysis of trace elements (Fe2+, Mn2+ and Zn2+) in CM using the three sensors, the three cations and also Cu2+ were determined by AAS in both the CM and blood samples. The results were correlated with hormonal analysis of serum samples and with ultrasonography to determine the optimum time of AI. The results showed significant differences and a strong correlation between the Zn2+ composition of CM and the heat phase and ovulation time, indicating that this parameter could be used as a tool to decide the optimal time of AI in buffalo-cows.
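For context, a potentiometric sensor of this kind is calibrated by regressing the measured electrode potential against log10 of the standard concentrations and inverting the fitted line for unknowns. The sketch below uses hypothetical calibration points whose slope mirrors the reported Fe2+ value; it is not the authors' data.

```python
import numpy as np

# Hypothetical calibration data for one sensor: standard concentrations (M)
# and measured electrode potentials (mV); slope ~53 mV/decade is assumed
conc = np.array([1e-7, 1e-6, 1e-5, 1e-4, 1e-3, 1e-2])
emf  = np.array([12.0, 65.5, 118.9, 172.4, 225.8, 279.3])

# Linear regression of E against log10(C): E = E0 + S * log10(C)
slope, e0 = np.polyfit(np.log10(conc), emf, 1)
print(f"slope = {slope:.2f} mV/decade, E0 = {e0:.1f} mV")

def concentration_from_emf(e_mv):
    """Invert the calibration line to estimate an unknown sample's concentration."""
    return 10 ** ((e_mv - e0) / slope)

print(f"sample at 150 mV -> {concentration_from_emf(150.0):.2e} M")
```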

Keywords: PVC sensors, buffalo-cows, cyclodextrins, atomic absorption spectrophotometry, artificial insemination, OVSynch protocol

Procedia PDF Downloads 219
954 Municipal Asset Management Planning 2.0 – A New Framework for Policy and Program Design in Ontario

Authors: Scott R. Butler

Abstract:

Ontario, Canada’s largest province, is in the midst of an interesting experiment in mandated asset management planning for local governments. At the beginning of 2021, Ontario’s 444 municipalities were responsible for managing 302,864 lane kilometers of roads with a replacement cost of $97.545 billion CDN. Roadways are by far the most complex, expensive, and extensive assets a municipality is responsible for overseeing. Since adopting Ontario Regulation 588/17: Asset Management Planning for Municipal Infrastructure in 2017, the provincial government has established prescriptions for local road authorities regarding asset categories and levels of service. The regulation further stipulates that asset data such as extent, condition, and life cycle costing are to be captured in a manner compliant with qualitative descriptions and technical metrics. The Ontario Good Roads Association undertook an exercise to aggregate the road-related data contained within the 444 asset management plans that municipalities have filed with the provincial government. This analysis concluded that, collectively, Ontario municipal roadways carry $34.7 billion CDN in deferred maintenance. The poor state of repair of Ontario municipal roads has lasting implications for the province’s economic competitiveness and has garnered considerable political attention. Municipal efforts to address the maintenance backlog are stymied by the extremely limited fiscal parameters within which Ontario municipalities must operate. Further exacerbating the problem are provincially designed programs that are ineffective, administratively burdensome, and not necessarily aligned with local priorities or strategies. This paper addresses how municipal asset management plans, and more specifically the data contained in these plans, can be used to design innovative policy frameworks, flexible funding programs, and new levels of service that respond to these funding challenges, as well as to emerging issues such as local economic development and climate change. Fully unlocking the potential of Ontario Regulation 588/17 will require a resolute commitment to data standardization and horizontal collaboration between municipalities within regions.
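As a sketch of how plan-level data could feed the kind of funding formula argued for here, the following Python fragment aggregates hypothetical road-asset records and allocates a fixed envelope in proportion to each municipality's deferred-maintenance backlog. The field names and figures are invented for illustration and do not reflect the regulation's schema or any filed plan.

```python
import pandas as pd

# Hypothetical extract of plan-level road data (illustrative only)
plans = pd.DataFrame({
    "municipality":       ["Alpha", "Beta", "Gamma", "Delta"],
    "lane_km":            [1200.0, 850.0, 310.0, 95.0],
    "replacement_cost_m": [410.0, 295.0, 88.0, 31.0],   # $M CDN
    "deferred_maint_m":   [140.0, 60.0, 41.0, 9.0],     # $M CDN
})

# Backlog intensity: deferred maintenance as a share of replacement cost,
# a simple metric a funding formula could weight
plans["backlog_ratio"] = plans["deferred_maint_m"] / plans["replacement_cost_m"]

# Allocate a fixed funding envelope in proportion to each backlog
ENVELOPE_M = 50.0
plans["allocation_m"] = ENVELOPE_M * plans["deferred_maint_m"] / plans["deferred_maint_m"].sum()

print(plans.sort_values("backlog_ratio", ascending=False).to_string(index=False))
```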

Keywords: transportation, municipal asset management, subnational policy design, subnational funding program design

Procedia PDF Downloads 94
953 Accelerated Aging of Photopolymeric Material Used in Flexography

Authors: S. Mahovic Poljacek, T. Tomasegovic, T. Cigula, D. Donevski, R. Szentgyörgyvölgyi, S. Jakovljevic

Abstract:

In this paper, the degradation of a photopolymeric material (PhPM) used as a printing plate in the flexographic reproduction technique, caused by accelerated aging, has been observed. Since the basic process for producing printing plates from the PhPM is radical cross-linking induced by exposure to UV wavelengths, the assumption was that improper storage or irregular handling of the PhPM plate can change the surface and structural characteristics of the plate. Results have shown that the aging process causes degradation of the structure and changes in the surface of the PhPM printing plate.

Keywords: aging process, artificial treatment, flexography, photopolymeric material (PhPM)

Procedia PDF Downloads 349
952 Development of a Non-Intrusive Speech Evaluation Measure Using S-Transform and LightGBM

Authors: Tusar Kanti Dash, Ganapati Panda

Abstract:

The evaluation of speech quality and intelligibility is critical to the overall effectiveness of speech enhancement algorithms. Several intrusive and non-intrusive measures are employed to calculate these parameters. Non-intrusive evaluation is the most challenging, as the reference clean speech data is very often not available. In this paper, a novel non-intrusive speech evaluation measure is proposed using audio features derived from the Stockwell transform (S-transform). These features are used with the Light Gradient Boosting Machine (LightGBM) for the effective prediction of speech quality and intelligibility. The proposed model is analyzed using noisy and reverberant speech from four databases, and the results are compared with standard intrusive evaluation measures. The comparative analysis shows that the proposed model performs better than the standard non-intrusive models.
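A minimal sketch of the pipeline the abstract describes: a frequency-domain discrete S-transform, coarse band-energy features, and a LightGBM regressor trained on synthetic signals. The feature set, model settings, and data here are assumptions for illustration; the paper's actual features and its four databases are not reproduced.

```python
import numpy as np
import lightgbm as lgb

def stockwell(x):
    """Discrete S-transform via the frequency-domain algorithm:
    S[n, :] = IFFT( roll(FFT(x), -n) * exp(-2*pi^2*m^2/n^2) ) for voices n >= 1."""
    N = len(x)
    X = np.fft.fft(x)
    m = np.fft.fftfreq(N) * N                 # integer frequency indices
    voices = np.arange(1, N // 2)
    S = np.empty((len(voices), N), dtype=complex)
    for i, n in enumerate(voices):
        gauss = np.exp(-2 * np.pi**2 * m**2 / n**2)
        S[i] = np.fft.ifft(np.roll(X, -n) * gauss)
    return np.abs(S)

def features(x):
    """Band-wise log-energy statistics of the S-transform magnitude (assumed
    feature set; the paper's exact features are not given in the abstract)."""
    mag = stockwell(x)
    bands = np.array_split(mag, 8, axis=0)    # 8 coarse frequency bands
    feats = []
    for b in bands:
        e = np.log1p(b.mean(axis=0))
        feats += [e.mean(), e.std()]
    return np.array(feats)

# Synthetic stand-in data: noisy tones with a quality label that falls as noise grows
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 512, endpoint=False)
X_feat, y = [], []
for _ in range(200):
    snr = rng.uniform(0.05, 1.0)
    sig = np.sin(2 * np.pi * 40 * t) + (1 / snr) * 0.2 * rng.standard_normal(t.size)
    X_feat.append(features(sig))
    y.append(snr)                             # proxy for a quality score (e.g., MOS)

model = lgb.LGBMRegressor(n_estimators=200, learning_rate=0.05)
model.fit(np.array(X_feat[:150]), np.array(y[:150]))
print("predicted quality:", model.predict(np.array(X_feat[150:155])))
```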

Keywords: non-intrusive speech evaluation, S-transform, LightGBM, speech quality, intelligibility

Procedia PDF Downloads 259
951 A Congenital Case of Dandy-Walker Malformation

Authors: Neerja Meena, Paresh Sukhani

Abstract:

Dandy-Walker malformation is a generalised disorder of mesenchymal development that affects both the cerebellum and the overlying meninges. Classically, Dandy-Walker malformation consists of a triad of (1) vermian and hemispheric cerebellar hypoplasia, (2) cystic dilatation of the fourth ventricle, and (3) an enlarged posterior fossa with upward migration of the tentorium (torcular-lambdoid inversion). Clinical presentation: a four-month-old female child with hydrocephalus and neurological symptoms. Early death is common in classic Dandy-Walker malformation; however, if the malformation is relatively mild and uncomplicated by other CNS anomalies, intelligence can be normal and neurologic deficits minimal. Ventriculoperitoneal (VP) shunting is usually the treatment of choice for the associated hydrocephalus. Conclusion: MRI is the modality of choice for diagnosing posterior fossa malformations. The condition can also be detected during antenatal screening; since the prognosis of this malformation is poor, it is better to diagnose it in utero.

Keywords: Dandy-Walker malformation, MRI, early diagnosis, treatment

Procedia PDF Downloads 76