Search results for: minimum data set
24069 Copyright Clearance for Artificial Intelligence Training Data: Challenges and Solutions
Authors: Erva Akin
Abstract:
– The use of copyrighted material for machine learning purposes is a challenging issue in the field of artificial intelligence (AI). While machine learning algorithms require large amounts of data to train and to improve their accuracy and creativity, the use of copyrighted material without permission from the authors may infringe their intellectual property rights. To overcome the copyright hurdle to data sharing, access, and re-use, the use of copyrighted material for machine learning purposes may be considered permissible under certain circumstances. For example, if the copyright holder has granted permission to use the data through a licensing agreement, then use for machine learning purposes may be lawful. It is also argued that copying for non-expressive purposes that do not involve conveying expressive elements to the public, such as automated data extraction, should not be seen as infringing. The focus of such ‘copy-reliant technologies’ is on understanding language rules, styles, and syntax; no creative ideas are being used. However, the non-expressive use defence sits within the framework of the fair use doctrine, which allows the use of copyrighted material for research or educational purposes. Questions arise because the fair use doctrine is not available in EU law; instead, the InfoSoc Directive provides a rigid system of exclusive rights with a list of exceptions and limitations. One could only argue that non-expressive uses of copyrighted material for machine learning purposes do not constitute a ‘reproduction’ in the first place. Nevertheless, the use of machine learning with copyrighted material remains difficult because EU copyright law applies to the mere use of the works. Two solutions can be proposed to address the problem of copyright clearance for AI training data.
The first is to introduce a broad exception for text and data mining, either mandatory or limited to commercial and scientific purposes. The second is to permit the reproduction of works for non-expressive purposes, which opens the door to discussions regarding the transposition of the fair use principle from US into EU law. Both solutions aim to give AI developers more space to operate and encourage greater freedom, which could lead to more rapid innovation in the field. The Data Governance Act presents a significant opportunity to advance these debates. Finally, issues concerning the balance between general public interests and legitimate private interests in machine learning training data must be addressed. In my opinion, it is crucial that robot-created output should fall into the public domain. Machines depend on human creativity, innovation, and expression. To encourage technological advancement and innovation, freedom of expression and freedom to conduct business must be prioritised.
Keywords: artificial intelligence, copyright, data governance, machine learning
Procedia PDF Downloads 83
24068 Assessment of the Correlation of Rice Yield Traits by Simulation and Modelling Methods
Authors: Davood Barari Tari
Abstract:
To investigate the correlation of rice traits under different nitrogen management methods by modelling, an experiment was laid out in a rice paddy field in the Caspian coastal region from 2013 to 2014. The variety used was Shiroudi, a high-yielding variety. Nitrogen was managed in two ways: the amount of nitrogen at four levels (30, 60, 90, and 120 kg N ha-1, plus a control) and nitrogen splitting at four levels (T1: 50% basal + 50% at maximum tillering; T2: 33.33% basal + 33.33% at maximum tillering + 33.33% at panicle initiation; T3: 25% basal + 37.5% at maximum tillering + 37.5% at panicle initiation; T4: 25% basal + 25% at maximum tillering + 50% at panicle initiation). Results showed that nitrogen traits, total grain number, filled spikelets, and panicle number per m2 had a significant correlation with grain yield. Results from calibration and validation of the rice model indicated that the correlation between rice yield and yield components was reproduced accurately. The correlation between panicle length and grain yield was the weakest, and physiological indices were simulated with low accuracy. According to the results, investigating the correlation between rice physiological, morphological, and phenological traits and yield by modelling and simulation methods is very useful.
Keywords: rice, physiology, modelling, simulation, yield traits
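A minimal sketch of the kind of trait-yield correlation analysis described above. The plot-level values below are hypothetical illustrations, not data from the study:

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two trait vectors."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

# Hypothetical plot-level data: panicle number per m^2 and grain yield (t/ha)
panicles = [310, 335, 360, 390, 420]
grain_yield = [5.1, 5.6, 6.0, 6.6, 7.1]

r = pearson_r(panicles, grain_yield)  # close to 1 for a strongly correlated trait
```

The same function applied to each measured trait against yield would reproduce the ranking of correlations reported in the abstract.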
Procedia PDF Downloads 343
24067 Biosorption of Phenol onto Water Hyacinth Activated Carbon: Kinetics and Isotherm Study
Authors: Manoj Kumar Mahapatra, Arvind Kumar
Abstract:
Batch adsorption experiments were carried out for the removal of phenol from aqueous solution using water hyacinth activated carbon (WHAC) as an adsorbent. The sorption kinetics were analysed using pseudo-first-order and pseudo-second-order models, and the sorption data were observed to fit the pseudo-second-order model very well over the entire sorption time. The equilibrium data were analysed with the Langmuir and Freundlich isotherm models. Equilibrium data fitted well to the Freundlich model, while a maximum biosorption capacity of 31.45 mg/g was estimated using the Langmuir model. The adsorption intensity of 3.7975 represents a favorable adsorption condition.
Keywords: adsorption, isotherm, kinetics, phenol
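A sketch of how a Langmuir maximum capacity like the 31.45 mg/g above can be estimated from batch data via the linearized form Ce/qe = Ce/qmax + 1/(KL·qmax). The equilibrium points below are synthetic, generated from assumed parameters rather than taken from the study:

```python
import numpy as np

# Synthetic equilibrium data: residual concentration Ce (mg/L) and uptake qe (mg/g),
# generated from an assumed Langmuir isotherm (qmax and KL are illustrative values)
qmax_true, KL_true = 31.45, 0.1
Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = qmax_true * KL_true * Ce / (1 + KL_true * Ce)

# Linearized Langmuir: Ce/qe = (1/qmax)*Ce + 1/(KL*qmax)
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax_est = 1.0 / slope            # maximum biosorption capacity (mg/g)
KL_est = slope / intercept        # Langmuir affinity constant (L/mg)
```

With real data the points scatter around the line, and the quality of the linear fit (R²) indicates how well the Langmuir model describes the system.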
Procedia PDF Downloads 446
24066 Mixing Time: Influence on the Compressive Strength
Authors: J. Alvarez Muñoz, Dominguez Lepe J. A.
Abstract:
A suitable concrete mixing time allows a homogeneous mass to form, a quality that leads to greater compressive strength and durability. Although recommendations such as the ASTM C94 standard specify the time and the minimum and maximum number of revolutions for good-quality ready-mixed concrete, the specific behavior of concrete mixed on site under variable mixing times is unknown. This study evaluated the behavior of a structural mix design of f´c = 250 kg/cm2, prepared on site with limestone aggregate in a warm sub-humid climate and subjected to different mixing times. Based on the ASTM C94 recommendation for ready-mixed concrete, different totals of 70, 90, 100, 110, 120, and 140 revolutions were set. A field study covering 14 construction sites where structural concrete was made on site allowed the reference mixture to be set at 24 revolutions. Concrete was produced with a hand-fed concrete mixer with a drum speed of 28 RPM; the corrected w/c ratio was 0.36, with a slump of 5-6 cm for all mixtures. Compressive strength tests were performed at 3, 7, 14, and 28 days. The most outstanding results show strength increases of 8 to 17 percent from 24 to 70 revolutions and of 3 to 8 percent from 70 to 90 revolutions. Increasing the number of revolutions to 110, 120, and 140 reduced the compressive strength by 0.5 to 8 percent. Regarding consistency, the mixtures had a slump of 5 cm at 24, 70, and 90 revolutions and of less than 5 cm from 100 revolutions onward. Clearly, mixtures made with more than 100 revolutions lose not only compressive strength but also workability.
Keywords: compressive strength, concrete, mixing time, workability
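Since the study fixes the drum speed at 28 RPM and varies the total revolutions, the corresponding mixing times follow from simple division. A small sketch of that conversion (the function name is ours, not from the paper):

```python
DRUM_SPEED_RPM = 28  # hand-fed mixer drum speed reported in the study

def mixing_time_minutes(revolutions, rpm=DRUM_SPEED_RPM):
    """Time needed to accumulate a target number of drum revolutions."""
    return revolutions / rpm

# Mixing times for the revolution counts tested in the study
times = {rev: round(mixing_time_minutes(rev), 2)
         for rev in (24, 70, 90, 100, 110, 120, 140)}
# e.g. the 70-revolution mixture takes 2.5 minutes at 28 RPM
```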
Procedia PDF Downloads 400
24065 Enhancing the Rollability of Cu-Ge-Ni Alloy through Heat Treatment Methods
Authors: Morteza Hadi
Abstract:
This research investigates enhancing the rollability of Cu-Ge-Ni alloy by mitigating microstructural and compositional inhomogeneities via two distinct heat treatment methods: homogenization and solution treatment. To achieve this objective, the alloy with the desired composition was fabricated in a vacuum arc remelting (VAR) furnace, followed by sample preparation for microstructural, compositional, and heat treatment analyses at varying temperatures and durations. Characterization was conducted using optical and scanning electron microscopy (SEM), X-ray diffraction (XRD), and Vickers hardness testing. The results indicate that a minimum duration of 10 hours is necessary for adequate homogenization of the alloy at 750°C. This heat treatment effectively removes coarse dendrites from the as-cast microstructure and significantly reduces elemental segregation. However, despite these improvements, the presence of a second phase with markedly different hardness from the matrix results in poor rollability. The optimal time for solution treatment at various temperatures was determined, with the most effective cycle identified as 750°C for 2 hours followed by rapid quenching in water. This process produces a single-phase microstructure and complete elimination of the second phase, as confirmed by X-ray diffraction analysis. The results demonstrate a reduction in hardness of 30 Vickers, and the elimination of microstructural unevenness enables a successful thickness reduction of up to 50% by rolling without cracking.
Keywords: Cu-Ge-Ni alloy, homogenization, solution treatment, rollability
Procedia PDF Downloads 52
24064 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate derived metrics with in situ observations collected at five estuaries. Satellite images were processed in Python to calculate 7 spectral indices (SIs), and average SI values were computed per month over 23 years. Publicly available data from 6 sites at ELK were used to obtain 10 in situ parameters (OPs), likewise averaged per month over 23 years. Linear correlations between the 7 SIs and 10 OPs were computed and found to be inadequate (correlations of 1 to 64%). Fourier transform analysis was then performed on the 7 SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the 10 OPs. Better correlations were observed between SIs and OPs at certain time delays (0, 3, 4, and 6 months), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical in situ parameters that measure eutrophication, and it can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
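A minimal sketch of the Fourier step described above: extracting the dominant frequency and period from a monthly spectral-index series. The series below is a synthetic annual cycle, standing in for the 23-year satellite indices:

```python
import numpy as np

# Synthetic monthly spectral-index series over 23 years (276 samples):
# an assumed annual cycle with a 12-month period
n_months = 276
t = np.arange(n_months)
si = 0.4 + 0.2 * np.sin(2 * np.pi * t / 12)

# Real FFT of the series; frequencies are in cycles per month
spectrum = np.fft.rfft(si)
freqs = np.fft.rfftfreq(n_months, d=1.0)
amps = np.abs(spectrum)

# Skip the zero-frequency (mean) bin when ranking amplitudes
dominant_freq = freqs[1:][np.argmax(amps[1:])]
period_months = 1.0 / dominant_freq   # 12 for an annual cycle
```

The dominant frequencies and amplitudes found this way become the features fed to the downstream ML model in the approach the abstract outlines.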
Procedia PDF Downloads 104
24063 Agricultural Water Consumption Estimation in the Helmand Basin
Authors: Mahdi Akbari, Ali Torabi Haghighi
Abstract:
The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the largest (>8500 km2) freshwater bodies on the Iranian plateau but have almost entirely desiccated over the last 20 years. The desiccation of the lakes caused dust storms in the region, with huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased downstream from 4 to 1.9 km3 due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in-situ data in the basin, this research utilizes remote-sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. Cropland in this basin has doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam in the basin). Using PML V2 Actual Evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimate the water consumed (CW) for farming. We found that CW increased from 2.5 to over 7.5 km3 between 2002 and 2017 in this basin. Also, the annual average Potential Evapotranspiration (PET) of the basin has shown a negative trend in recent years, although AET over croplands shows an increasing trend. In this research, using remote sensing data, we addressed the lack of data in the studied area and highlighted anthropogenic activities upstream which led to the desiccation of the lakes downstream.
Keywords: Afghanistan-Iran transboundary basin, Iran-Afghanistan water treaty, water use, lake desiccation
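The NDVI-threshold cropland detection described above can be sketched as a simple mask over red and near-infrared reflectance. The 2×2 "tiles" below are made-up reflectance values, and 0.38 is one choice inside the 0.35-0.4 range the abstract suggests:

```python
import numpy as np

CROPLAND_NDVI_THRESHOLD = 0.38  # within the 0.35-0.4 range suggested for the basin

def cropland_mask(red, nir, threshold=CROPLAND_NDVI_THRESHOLD):
    """NDVI = (NIR - Red) / (NIR + Red); pixels at or above threshold flagged as cropland."""
    red = np.asarray(red, float)
    nir = np.asarray(nir, float)
    ndvi = (nir - red) / (nir + red + 1e-12)  # tiny epsilon guards against division by zero
    return ndvi >= threshold

# Hypothetical 2x2 Landsat surface-reflectance tiles
red = [[0.10, 0.20], [0.05, 0.30]]
nir = [[0.40, 0.25], [0.35, 0.31]]
mask = cropland_mask(red, nir)    # True where vegetation is dense enough to be cropland
```

Summing such a mask over a full scene, per year, gives the cropland-area time series from which the doubling since 1990 is inferred.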
Procedia PDF Downloads 131
24062 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks
Authors: Sulemana Ibrahim
Abstract:
Food security remains a paramount global challenge, with vulnerable regions grappling with issues of hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at ameliorating food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimizing supply chains, and enhancing food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models harnessing remote sensing data, historical crop yield records, and meteorological data to foresee crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization to address food security challenges, capitalizing on linear programming and network optimization techniques. These strategies intend to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable food resource allocation. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security. The proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, bestowing actionable insights to mitigate the ordeal of food insecurity. 
The holistic approach, converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks, aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks
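The supply-chain optimization step the abstract mentions amounts to a minimum-cost transportation problem. A toy instance, small enough to solve by brute force instead of a linear-programming solver (depots, regions, capacities, and costs are all invented for illustration):

```python
# Toy transportation problem: ship produce from 2 depots to 2 regions at minimum cost
supply = {"depot_a": 30, "depot_b": 40}          # tonnes available
demand = {"region_1": 35, "region_2": 35}        # tonnes required
cost = {("depot_a", "region_1"): 4, ("depot_a", "region_2"): 6,
        ("depot_b", "region_1"): 5, ("depot_b", "region_2"): 3}

best_plan, best_cost = None, float("inf")
# Supply equals demand, so one integer flow (depot_a -> region_1) fixes the rest
for x in range(supply["depot_a"] + 1):
    plan = {("depot_a", "region_1"): x,
            ("depot_a", "region_2"): supply["depot_a"] - x,
            ("depot_b", "region_1"): demand["region_1"] - x,
            ("depot_b", "region_2"): demand["region_2"] - (supply["depot_a"] - x)}
    if any(v < 0 for v in plan.values()):
        continue  # infeasible allocation
    total = sum(cost[route] * v for route, v in plan.items())
    if total < best_cost:
        best_plan, best_cost = plan, total
```

At realistic scale this becomes a linear program solved with standard LP or network-flow techniques, as the abstract indicates; the brute-force loop only illustrates the objective and constraints.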
Procedia PDF Downloads 62
24061 A Statistical Approach to Classification of Agricultural Regions
Authors: Hasan Vural
Abstract:
Turkey is a favorable country for producing a great variety of agricultural products because of its varied geographic and climatic conditions, which have been used to divide the country into four main and seven sub-regions. This seven-region classification has traditionally been used for data collection and publication, especially for agricultural production. Later, nine agricultural regions were considered. Recently, the governmental body responsible for data collection and dissemination (the Turkish Institute of Statistics, TIS) has used 12 classes comprising 11 sub-regions and Istanbul province. This study evaluates these classification efforts based on the acreage of ten main crops over a ten-year period (1996-2005). The panel data grouped into 11 sub-regions were evaluated by cluster and multivariate statistical methods. It was concluded that, from the agricultural production point of view, it would be more meaningful to consider three main and eight sub-agricultural regions throughout the country.
Keywords: agricultural region, factorial analysis, cluster analysis
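A toy version of the clustering step: grouping regions by their crop-acreage profiles with a tiny two-cluster k-means. The region names and (wheat, maize) acreage shares are invented, and the deterministic initialisation is a simplification of real practice:

```python
# Toy k-means (k=2) grouping regions by crop-acreage profile (shares of wheat, maize)
regions = {
    "R1": (0.80, 0.10), "R2": (0.75, 0.15), "R3": (0.70, 0.20),  # wheat-dominated
    "R4": (0.20, 0.70), "R5": (0.15, 0.80), "R6": (0.25, 0.65),  # maize-dominated
}

def kmeans2(points, iters=10):
    names = list(points)
    # Deterministic init: first and last region as starting centroids
    cents = [points[names[0]], points[names[-1]]]
    groups = {0: names, 1: []}
    for _ in range(iters):
        groups = {0: [], 1: []}
        for n in names:  # assign each region to its nearest centroid
            d = [sum((a - b) ** 2 for a, b in zip(points[n], c)) for c in cents]
            groups[d.index(min(d))].append(n)
        # Recompute centroids as the mean profile of each group
        cents = [tuple(sum(points[n][i] for n in g) / len(g) for i in range(2))
                 for g in groups.values()]
    return groups

clusters = kmeans2(regions)
```

The study's actual analysis uses the acreage of ten crops over ten years, so each region is a much higher-dimensional point, but the grouping logic is the same.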
Procedia PDF Downloads 416
24060 Evaluation of Mechanical Properties of Welds Fabricated at a Close Proximity on Offshore Structures
Authors: T. Nakkeran, C. Dhamodharan, Win Myint Soe, Ramasamy Deverajan, M. Ganesh Babu
Abstract:
This manuscript presents the results of an experimental investigation of the material and mechanical properties of two weld joints fabricated in close proximity. The experiment used welded S355 D Z35 with a distance of 8 mm between two parallel adjacent weld toes, less than the distance normally recommended in standards, codes, and specifications. The aim of the analysis was to determine whether there are any significant effects when joints are welded 8 mm apart, with one joint welded by the SAW process with high heat input and the other welded by the FCAW process, and to evaluate the joints by destructive and non-destructive testing. The joints were further evaluated by tensile, bend, hardness, and impact testing, together with macrostructure and microstructure examinations. From the final results, no significant changes were observed for joints welded 8 mm apart compared with the specified minimum distance of 50 mm between weldments in any design.
Keywords: S355 carbon steel, weld proximity, SAW process, FCAW process, heat input, bend test, tensile test, hardness test, impact test, macro and microscopic examinations
Procedia PDF Downloads 98
24059 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Various literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not been given adequate attention in academia. The proposed methodology comprises automatic thresholding for data gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to be regular time series, but in reality sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should a sensor be considered faulty? Time series analysis is required to detect abnormalities in these delays. The efficiency of the method is evaluated on measurements obtained from a real site: an office at Grenoble Institute of Technology equipped with 30 sensors.
Keywords: building system, time series, diagnosis, outliers, delay, data gap
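One simple way to automate the thresholding idea described above is to flag any inter-sample delay much longer than the sensor's typical sampling interval. A sketch under that assumption (the factor k = 3 and the timeline are illustrative, not the paper's method or data):

```python
import statistics

def detect_gaps(timestamps, k=3.0):
    """Flag intervals longer than k times the median sampling interval as data gaps."""
    deltas = [b - a for a, b in zip(timestamps, timestamps[1:])]
    threshold = k * statistics.median(deltas)  # per-sensor automatic threshold
    return [(timestamps[i], timestamps[i + 1])
            for i, d in enumerate(deltas) if d > threshold]

# Hypothetical sensor timeline (minutes since start); one 60-minute outage
ts = [0, 10, 20, 30, 90, 100, 110]
gaps = detect_gaps(ts)   # the 30 -> 90 interval is flagged as a gap
```

Because the threshold is derived from each sensor's own median interval, the same rule adapts to a heterogeneous set of sensors with different nominal sampling rates.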
Procedia PDF Downloads 245
24058 Anti-Microbial Activity of Senna garrettiana Extract
Authors: Pun Jankrajangjaeng
Abstract:
Senna garrettiana is a tropical plant of Southeast Asia. Senna garrettiana (Craib) is used as a medicinal plant in Thailand, and experiments have reported that the plant contains triterpenoids, lignans, phenolics, and fungal metabolites. It has also been reported to possess interesting biological activity, such as antioxidant activity. Senna garrettiana was therefore selected for examination of its antimicrobial activity. The purpose of this study was to examine the antimicrobial activity of Senna garrettiana (Craib) extract against Gram-positive Staphylococcus aureus, Gram-negative Salmonella typhi, and the fungus Candida albicans. This study performed the agar disk-diffusion method and broth microdilution, using five concentrations of plant extract to determine the minimum inhibitory concentration (MIC) of S. garrettiana extract. The results showed that S. garrettiana extract gave maximum inhibition zones of 11.7 mm, 13.7 mm, and 14.0 mm against S. aureus, S. typhi, and C. albicans, respectively. The MIC of S. garrettiana against S. aureus was 125 µg/mL, while the MICs for S. typhi and C. albicans were greater than 2000 µg/mL. In conclusion, S. garrettiana extract showed stronger antibacterial activity against Gram-positive than Gram-negative bacteria, and the extract also possessed antifungal activity. Further investigation should be performed to confirm the mechanism and target of the antimicrobial action of S. garrettiana extract.
Keywords: antimicrobial activity, Candida albicans, Salmonella typhi, Senna garrettiana, Staphylococcus aureus
Procedia PDF Downloads 196
24057 Unbalanced Distribution Optimal Power Flow to Minimize Losses with Distributed Photovoltaic Plants
Authors: Malinwo Estone Ayikpa
Abstract:
Electric power systems are expected to operate with minimum losses and with voltages meeting international standards. This is generally made possible by control actions provided by automatic voltage regulators, capacitors, and transformers with on-load tap changers (OLTC). With the development of photovoltaic (PV) system technology, PV integration in distribution networks has increased over recent years to the extent of replacing the above-mentioned techniques. The conventional analysis and simulation tools used for electrical networks are no longer able to take into account the control actions necessary for studying the impact of distributed PV generation. This paper presents an unbalanced optimal power flow (OPF) model that minimizes losses by combining active power generation with reactive power control of single-phase and three-phase PV systems. Reactive power can be generated or absorbed using the available capacity and the adjustable power factor of the inverter. The unbalanced OPF is formulated with current balance equations and solved by a primal-dual interior point method. Several simulation cases were carried out varying the size and location of the PV systems, and the results give a detailed view of the impact of distributed PV generation on distribution systems.
Keywords: distribution system, loss, photovoltaic generation, primal-dual interior point method
Procedia PDF Downloads 332
24056 Artificial Reproduction System and Imbalanced Dataset: A Mendelian Classification
Authors: Anita Kushwaha
Abstract:
We propose a new evolutionary computational model called the Artificial Reproduction System, based on the complex process of meiotic reproduction occurring between male and female cells of living organisms. The Artificial Reproduction System is an attempt at a new computational intelligence approach inspired by the theoretical reproduction mechanism and by observed reproduction functions, principles, and mechanisms. A reproductive organism is programmed by genes and can be viewed as an automaton, mapping and reducing so as to create copies of those genes in its offspring. In the Artificial Reproduction System, the binding mechanism between male and female cells is studied, parameters are chosen, a network is constructed, and a feedback system for self-regularization is established. The model then applies Mendel’s law of inheritance and allele-allele associations, and can be used to perform data analysis of imbalanced, multivariate, multiclass, and big data. In the experimental study, the Artificial Reproduction System is compared with state-of-the-art classifiers such as SVM, radial basis function networks, neural networks, and K-Nearest Neighbor on several benchmark datasets, and the comparison results indicate good performance.
Keywords: bio-inspired computation, nature-inspired computation, natural computing, data mining
Procedia PDF Downloads 272
24055 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packets Delivery and Delay for Different Applications
Authors: Omojokun Gabriel Aju
Abstract:
A communication network is a process of exchanging data between two or more devices via some form of transmission medium using communication protocols. The data can be text, images, audio, video, or numbers, which can be grouped into FTP, email, HTTP, VoIP, or video applications. Such data exchange is effective only if the data are accurately delivered within a specified time. Some senders will not really mind when the data are actually received, as long as receipt is acknowledged by the receiver; for another sender, however, the time the data take to reach the receiver can be critical, as any delay could cause serious problems or even, in some cases, render the data useless. Whether data remain valid after a delay therefore depends on the type of data (information). It is thus imperative for a network device (such as a router) to be able to differentiate between time-sensitive packets and others when they pass through the same network. This is where queuing disciplines come into play: they manage network resources when a network is designed to service widely varying types of traffic, allocating the available resources according to the configured policies. As part of the resource allocation mechanism, a router within the network must implement a queuing discipline that governs how packets (data) are buffered while waiting to be transmitted. Various queuing disciplines control the transmission of these packets by determining which packets get the highest priority, which get lower priority, and which packets are dropped. The queuing discipline therefore controls packet latency by determining how long a packet can wait to be transmitted or be dropped.
The common queuing disciplines are first-in-first-out queuing (FIFO), priority queuing (PQ), and weighted-fair queuing (WFQ). This paper critically evaluates and analyses, using the Optimized Network Evaluation Tool (OPNET) Modeller, version 14.5, the effects of these three queuing disciplines on the performance of 5 different applications (FTP, HTTP, email, voice, and video) within specified parameters, using packets sent, packets received, and transmission delay as performance metrics. The paper finally suggests ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
Keywords: applications, first-in-first-out queuing (FIFO), optimised network evaluation tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)
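The priority-queuing (PQ) discipline can be sketched with a heap-based scheduler: packets of a higher-priority class are always transmitted before lower-priority ones, with FIFO order preserved within a class. The class names and priority numbers below are illustrative choices, not OPNET's configuration:

```python
import heapq
from itertools import count

# Toy PQ scheduler: lower number = higher priority (e.g. voice before video before FTP)
PRIORITY = {"voice": 0, "video": 1, "ftp": 2}

def schedule(arrivals):
    """Drain a priority queue; ties broken by arrival order (FIFO within a class)."""
    q, order, sent = [], count(), []
    for app, pkt in arrivals:
        heapq.heappush(q, (PRIORITY[app], next(order), app, pkt))
    while q:
        _, _, app, pkt = heapq.heappop(q)
        sent.append((app, pkt))
    return sent

# An FTP packet arrives first but is transmitted last under strict PQ
arrivals = [("ftp", "f1"), ("voice", "v1"), ("video", "d1"), ("voice", "v2")]
tx_order = schedule(arrivals)
```

This also illustrates PQ's known drawback: under sustained high-priority load, low-priority traffic such as FTP can be starved, which is what WFQ's weighted sharing is designed to avoid.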
Procedia PDF Downloads 358
24054 Computer-Assisted Strategies Help Pharmacists
Authors: Komal Fizza
Abstract:
All around the world, professionals in every field take great support from their computers. Computer-assisted strategies not only increase professionals' efficiency but, in healthcare, also help in life-saving interventions. This research aims at two things: first, to find out whether computer-assisted strategies are useful for pharmacists or not, and secondly, how much they help a pharmacist make quality interventions. Shifa International Hospital is a 500-bed hospital running an antimicrobial stewardship programme; during stewardship rounds, pharmacists observed that many wrong antibiotic doses were being ordered, and at times these were overlooked even by other pharmacists. So, with the help of the MIS team, patients were categorized as adult or paediatric depending on age, and the minimum and maximum dose of every antibiotic in the pharmacy that could be dispensed to a patient was defined. These limits were linked to the order entry window, so whenever a pharmacist typed an order with a dose below or above the therapeutic limit, an alert was shown. Whenever this message popped up, it was recorded at the back end along with the antibiotic name, pharmacist ID, date, and time. From 14 January 2015 to 14 March 2015, the software stopped different users 350 times. Of these, 300 were major errors which, had they reached the patient, could have caused serious harm, while 50 were due to typing errors and minor deviations. The pilot study showed that computer-assisted strategies can be of great help to pharmacists and can improve the efficacy and quality of interventions.
Keywords: antibiotics, computer assisted strategies, pharmacist, stewardship
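The dose-limit alert described above boils down to a range check against a per-drug, per-patient-group table at order entry. A minimal sketch; the drug name and dose limits below are invented placeholders, not clinical values from the hospital's system:

```python
# Hypothetical dose-limit table (mg/day) keyed by (antibiotic, patient group)
DOSE_LIMITS = {
    ("antibiotic_x", "adult"): (750, 3000),
    ("antibiotic_x", "paeds"): (250, 1500),
}

def dose_alert(drug, group, dose_mg):
    """Return an alert string if the ordered dose is outside the therapeutic range, else None."""
    lo, hi = DOSE_LIMITS[(drug, group)]
    if dose_mg < lo:
        return f"ALERT: {drug} {dose_mg} mg is below the minimum {lo} mg for {group}"
    if dose_mg > hi:
        return f"ALERT: {drug} {dose_mg} mg is above the maximum {hi} mg for {group}"
    return None  # dose within range: no pop-up, nothing logged

msg = dose_alert("antibiotic_x", "adult", 4000)  # triggers an over-dose alert
```

In the deployed system each triggered alert is also logged (antibiotic, pharmacist ID, date, time), which is what produced the 350 recorded stops in the pilot.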
Procedia PDF Downloads 491
24053 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud
Authors: N. Nalini, Bhanu Prakash Gopularam
Abstract:
The term data security refers to the degree of resistance or protection given to information against unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data move from organization premises to an untrusted public network, and as a result the attack surface has increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on PKI infrastructure still works in the cloud scenario, the management of public-private keys and trust certificates is difficult. Identity-based Public Key Cryptography (ID-PKC) overcomes this problem by using publicly identifiable information to generate keys, and it works well with decentralized systems: users can exchange information securely without having to manage any trust information. Another advantage is that access control information (role-based access control policy) can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and has support for public key infrastructure. In this paper, we explain the OpenStack security architecture and evaluate the PKI infrastructure piece for data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and explain the key measures for safeguarding data against security attacks. The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard, and secure communication with other cloud services.
Keywords: data confidentiality, identity based cryptography, secure communication, OpenStack Keystone, token scoping
Procedia PDF Downloads 384
24052 Peculiarities of Snow Cover in Belarus
Authors: Aleh Meshyk, Anastasiya Vouchak
Abstract:
On average, snow covers Belarus for 75 days in the south-west and 125 days in the north-east. During the cold season the snowpack is often destroyed by thaws, especially at the beginning and end of winter. Over 50% of thawing days have a positive mean daily temperature, which results in complete snow melting. For instance, in December 10% of thaws occur at a mean daily temperature of 4 °C. A stable snowpack lying for over a month forms in the north-east in the first ten days of December but in the south-west only in the last ten days of December. The cover disappears in March: in the north-east in the last ten days of the month, in the south-west in the first. This research takes into account that precipitation falling during the cold season can be not only liquid or solid but also mixed (about 10-15% a year). Another important feature of snow cover is its density. In Belarus, the density of freshly fallen snow ranges from 0.08-0.12 g/cm³ in the north-east to 0.12-0.17 g/cm³ in the south-west. Over time, snow settles under its own weight and after melting and refreezing. The average annual density of snow at the end of January is 0.23-0.28 g/cm³, in February 0.25-0.30 g/cm³, and in March 0.29-0.36 g/cm³. It can exceed 0.50 g/cm³ if the snow melts too fast, and the density of melting snow saturated with water can reach 0.80 g/cm³. The average maximum snow depth is 15-33 cm: the minimum is in Brest, the maximum in Lyntupy. The maximum registered snow depth ranges within 40-72 cm. The water content of the snowpack, like its depth and density, reaches its maximum in the second half of February to the beginning of March. The spatial distribution of the amount of liquid in snow follows the trend described above, i.e. it increases from south-west to north-east and on the highlands. The average annual maximum water content in snow ranges from 35 mm in the south-west to 80-100 mm in the north-east, and exceeds 80 mm on the central Belarusian highland.
In certain years it exceeds the average annual values by a factor of 2-3. A moderate water content in snow (80-95 mm) is characteristic of the western highlands. The maximum water content in snow varies over the country from 107 mm (Brest) to 207 mm (Novogrudok). It also varies significantly from year to year, which is confirmed by a high coefficient of variation (Cv): maximums (0.62-0.69) occur in the south and south-west of Belarus, minimums (0.42-0.46) in central and north-eastern Belarus, where the snow cover is more stable. Since 1987, most gauge stations in Belarus have observed a decreasing trend in the water content of snow, which this research confirms. The deepest snow cover forms on the highlands in central and north-eastern Belarus. The Novogrudok, Minsk, Volkovysk, and Sventayny highlands are a natural orographic barrier that prevents snow-bearing air masses from penetrating into the interior of the country. The research is based on data from gauge stations in Belarus registered from 1944 to 2014.
Keywords: density, depth, snow, water content in snow
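The relationship between the depth, density, and water content figures quoted in this abstract follows from a simple conversion: each centimetre of snow at density d (g/cm³) holds 10·d mm of water. A minimal sketch (the function name and sample values are ours, drawn from the abstract's typical figures):

```python
def swe_mm(depth_cm, density_g_cm3):
    """Snow water equivalent in mm: each cm of snow at density d (g/cm3)
    holds 10*d mm of water."""
    return depth_cm * density_g_cm3 * 10.0

# Late-winter figures quoted in the abstract
print(swe_mm(33, 0.30))  # deep north-eastern snowpack, about 99 mm
print(swe_mm(15, 0.25))  # shallow south-western snowpack, about 37.5 mm
```

The two results are consistent with the reported 80-100 mm in the north-east and roughly 35 mm in the south-west.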
Procedia PDF Downloads 161
24051 Findings in Vascular Catheter Cultures at the Laboratory of Microbiology of General Hospital during One Year
Authors: P. Christodoulou, M. Gerasimou, S. Mantzoukis, N. Varsamis, G. Kolliopoulou, N. Zotos
Abstract:
Purpose: The Intensive Care Unit (ICU) environment is conducive to the growth of microorganisms. A variety of microorganisms gain access to the intravascular area and are transported throughout the circulatory system. Examination of the catheters used in ICU patients is therefore of paramount importance. Material and Method: The culture medium is a catheter tip, which is enriched with Tryptic soy broth (TSB). After one day of incubation, the broth is passaged onto the following selective media: blood, MacConkey No. 2, chocolate, Mueller-Hinton, Chapman, and Sabouraud agar. These selective media are incubated for 2 days. After this period, if any microbial colonies are detected, Gram staining is performed, and the microorganisms are then identified by biochemical techniques in the automated MicroScan (Siemens) system, followed by a sensitivity test in the same system using the minimum inhibitory concentration (MIC) technique. The sensitivity test is verified by a Kirby-Bauer test. Results: In 2017, the Microbiology Laboratory received 84 catheters from the ICU, of which 42 were found positive. Of these, S. epidermidis was identified in 8, A. baumannii in 10, K. pneumoniae in 6, P. aeruginosa in 6, P. mirabilis in 3, S. simulans in 1, S. haemolyticus in 4, S. aureus in 3, and S. hominis in 1. Conclusions: The results show that the placement and maintenance of catheters in ICU patients are relatively successful, despite the unfavorable environment of the unit.
Keywords: culture, intensive care unit, microorganisms, vascular catheters
Procedia PDF Downloads 283
24050 Difference between Planning Target Volume (PTV) Based Slow-CT and Internal Target Volume (ITV) Based 4DCT Imaging Techniques in Stereotactic Body Radiotherapy for Lung Cancer: A Comparative Study
Authors: Madhumita Sahu, S. S. Tiwary
Abstract:
Radiotherapy of lung carcinoma has always been difficult and a matter of great concern. The significant target movement caused by non-rhythmic respiratory motion poses a great challenge for the treatment of lung cancer with ionizing radiation. The present study compares the accuracy of target volume measurement using Slow-CT and 4DCT imaging in SBRT for lung tumors. The experimental samples were taken from patients with lung cancer who underwent SBRT. Slow-CT and 4DCT images were acquired under free breathing for each patient. PTVs were delineated on the Slow-CT images, and ITVs were delineated on each of the 4DCT volumes. Volumetric and statistical analyses were performed for each patient by measuring the corresponding PTV and ITV volumes. The study showed that (1) the maximum deviation between Slow-CT-based PTV and 4DCT-based ITV is 248.58 cc, (2) the minimum deviation is 5.22 cc, and (3) the mean deviation is 63.21 cc. The present study concludes that the irradiated ITV with 4DCT is smaller than the PTV with Slow-CT. More precise treatment could therefore be delivered with 4DCT imaging, sparing a mean volume of 63.21 cc.
Keywords: CT imaging, 4DCT imaging, lung cancer, statistical analysis
Procedia PDF Downloads 24
24049 The Relationship between the Two-Spatial World and the Decrease in the Area of Commercial Properties
Authors: Syedhossein Vakili
Abstract:
According to some experts, the two-spatialization of the world means the establishment of a new virtual space placed alongside physical space. This dualization of space has had various effects, one of which is reducing the need for buildings and making the floor area of business premises more economical by substituting virtual space for part of the physical space. Before virtual space was known, a commercial or educational institution had to block a large part of its capital to acquire the physical spaces and buildings needed for its daily activities; today, thanks to the addition of virtual space to physical space, it can carry out its activities more widely within a limited environment, with a minimum of physical space and drastically reduced costs. To understand the impact of virtual space on the reduction of physical space, the researcher used official national reports on the average floor area stated in construction permits for commercial and educational units from 2014 to 2023 and compared the average capital required in the purely physical era with that required in the two-spatial era over that ten-year period. Using an analytical and comparative method, the study shows that, by reducing the need for physical space, virtual space has greatly reduced the investment business owners need to provide premises for their activities and has thereby made commercial activities more profitable.
Keywords: two-spatialization, building area, cyberspace, physical space, virtual place
Procedia PDF Downloads 60
24048 Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter
Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball
Abstract:
The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. Limitations of single sensors such as cameras or radar in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection. Data analysis is performed using MATLAB. Qualitative (visualization of fused data versus ground truth) and quantitative (RMSE, MAE) metrics are employed for performance assessment. Initial results with simulated data demonstrate accurate distance estimation compared to individual sensors. The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS
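The fusion scheme described in this abstract can be illustrated with a minimal sketch (not the authors' code; all parameter values are invented): a constant-velocity filter fusing independent camera and radar range measurements. For this linear model the EKF update reduces to the standard Kalman filter equations.

```python
import numpy as np

def fuse(z_cam, z_radar, dt=0.1, q=0.5, r_cam=4.0, r_radar=1.0):
    """Fuse camera and radar range measurements with a constant-velocity
    Kalman filter. State is [distance, range-rate]; r_* are the assumed
    measurement noise variances, q scales the plant (process) noise."""
    F = np.array([[1.0, dt], [0.0, 1.0]])                     # motion model
    Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
    H = np.array([[1.0, 0.0], [1.0, 0.0]])                    # both sensors observe distance
    R = np.diag([r_cam, r_radar])
    x = np.array([z_cam[0], 0.0])
    P = np.eye(2) * 10.0
    estimates = []
    for zc, zr in zip(z_cam, z_radar):
        x = F @ x                                             # predict
        P = F @ P @ F.T + Q
        z = np.array([zc, zr])
        S = H @ P @ H.T + R                                   # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)                        # Kalman gain
        x = x + K @ (z - H @ x)                               # update
        P = (np.eye(2) - K @ H) @ P
        estimates.append(float(x[0]))
    return estimates

rng = np.random.default_rng(0)
truth = 50.0 - 2.0 * np.arange(50) * 0.1      # lead vehicle closing at 2 m/s
cam = truth + rng.normal(0, 2.0, 50)          # noisier camera range
radar = truth + rng.normal(0, 1.0, 50)        # cleaner radar range
est = fuse(cam, radar)                        # fused track near the true 40.2 m endpoint
```

The radar's lower noise variance gives it more weight in the gain, which is the mechanism behind the tuning of measurement and plant noise variances mentioned in the abstract.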
Procedia PDF Downloads 43
24047 A User Identification Technique to Access Big Data Using Cloud Services
Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy
Abstract:
Authentication is required in stored database systems so that only authorized users can access the data and the related cloud infrastructure. This paper proposes an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. The proposed technique is likely to be more robust, as the probability of breaking the password is extremely low. The framework uses a multi-modal biometric approach and SMS to enforce additional security measures on top of the conventional login/password system. The robustness of the technique is demonstrated mathematically using a statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
Keywords: design, implementation algorithms, performance, biometric approach
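The intuition behind the low break probability of a multi-factor scheme can be sketched as follows: if the factors are independent, the chance of defeating all of them at once is the product of the per-factor chances. This is an illustration only, not the paper's analysis, and all numbers are assumed:

```python
# Assumed per-factor compromise probabilities (illustrative, not from the paper)
p_password = 1e-6    # guessing the password
p_sms_otp = 1e-4     # intercepting or guessing the one-time SMS code
p_biometric = 1e-3   # spoofing the multi-modal biometric check

# Independence assumption: attacker must defeat every factor
p_break_all = p_password * p_sms_otp * p_biometric
print(f"{p_break_all:.0e}")  # 1e-13
```

The product shrinks rapidly as factors are added, which is why adding SMS and biometrics to a login/password system makes a successful break extremely unlikely under these assumptions.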
Procedia PDF Downloads 476
24046 Input Data Balancing in a Neural Network PM-10 Forecasting System
Authors: Suk-Hyun Yu, Heeyong Kwon
Abstract:
Recently, PM-10 has become a social and global issue. It is one of the major air pollutants affecting human health, so it needs to be forecast rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration depends largely on the meteorological and geographical factors of the local and global region, which makes forecasting PM-10 concentration very difficult. A neural network model can be used in this case, but cases of high PM-10 concentration are rare, which makes training the neural network model difficult. In this paper, we suggest a simple input balancing method for unevenly distributed data, based on the probability of appearance of the data. Experimental results show that input balancing makes the neural networks' learning easier and improves the forecasting rates.
Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10
Procedia PDF Downloads 232
24045 Metabolic Predictive Model for PMV Control Based on Deep Learning
Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon
Abstract:
In this study, a predictive model for estimating the metabolism (MET) of the human body was developed for the optimal control of the indoor thermal environment. Human body images for indoor activities and body joint coordinate values were collected as the data sets used in the predictive model. A deep learning algorithm was used in the initial model, and its numbers of hidden layers and hidden neurons were optimized. Lastly, the prediction performance was analyzed after the model was trained on the collected data. In conclusion, the feasibility of MET prediction was confirmed, and future work was proposed: developing more varied data and refining the predictive model.
Keywords: deep learning, indoor quality, metabolism, predictive model
Procedia PDF Downloads 257
24044 Analysis of Heat Exchanger Area of Two Stage Cascade Refrigeration System Using Taguchi
Authors: A. D. Parekh
Abstract:
The present work describes the relative contributions of operating parameters to the required heat transfer area of three heat exchangers, viz. the evaporator, condenser, and cascade condenser, of a two-stage R404A-R508B cascade refrigeration system using the Taguchi method. The operating parameters considered in the present study include (1) the condensing temperatures of the high-temperature and low-temperature cycles, (2) the evaporating temperature of the low-temperature cycle, (3) the degree of superheating in the low-temperature cycle, and (4) the refrigerating effect. The heat transfer areas of the three heat exchangers are studied under variation of the above operating parameters, and the optimum working level of each operating parameter is obtained for the minimum heat transfer area of each heat exchanger using the Taguchi method. The analysis reveals that the evaporating temperature of the low-temperature cycle and the refrigerating effect contribute most to the area of the evaporator. The condenser area is mainly influenced by both the condensing temperature of the high-temperature cycle and the refrigerating effect. The area of the cascade condenser is mainly affected by the refrigerating effect; the effects of the other operating parameters are minimal.
Keywords: cascade refrigeration system, Taguchi method, heat transfer area, ANOVA, optimal solution
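Since the goal here is a minimum heat transfer area, the relevant Taguchi criterion is the smaller-the-better signal-to-noise ratio, S/N = -10·log10(mean(y²)); the parameter level with the highest S/N is optimal. A brief sketch with invented trial values (the areas below are ours, for illustration only):

```python
import math

def sn_smaller_better(ys):
    """Taguchi smaller-the-better S/N ratio: -10 * log10(mean(y^2))."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

# Hypothetical evaporator areas (m^2) from repeated trials at two levels
# of the low-temperature-cycle evaporating temperature
trials = {"low Tevap": [2.4, 2.6, 2.5], "high Tevap": [1.8, 1.9, 2.0]}
sn = {level: sn_smaller_better(ys) for level, ys in trials.items()}
best = max(sn, key=sn.get)
print(best)  # prints "high Tevap": smaller areas give the larger S/N
```

Comparing mean S/N across levels of each parameter, as in the abstract, then ranks the parameters by their relative contribution (formally via ANOVA).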
Procedia PDF Downloads 338
24043 Analysis of Brownfield Soil Contamination Using Local Government Planning Data
Authors: Emma E. Hellawell, Susan J. Hughes
Abstract:
Brownfield sites are currently being redeveloped for residential use. Information on soil contamination at these former industrial sites is collected as part of the planning process by local government. This research project analyses this untapped resource of environmental data, using site investigation data submitted to a local Borough Council in Surrey, UK. Over 150 site investigation reports were collected and interrogated to extract relevant information. The study involved three phases. Phase 1 was the development of a database of soil contamination information from local government reports, containing information on the source, history, and quality of the data together with the chemical information on the soil that was sampled. Phase 2 involved obtaining site investigation reports for development within the study area and extracting the required information for the database. Phase 3 was the data analysis and interpretation of key contaminants to evaluate typical contaminant levels and their distribution within the study area, and to relate these results to current guideline levels of risk for future site users. Preliminary results have been obtained for a pilot study using a sample of the dataset. The pilot study showed some inconsistency in the quality of the reports and measured data, so careful interpretation of the data is required. Analysis of the information found high levels of lead in shallow soil samples, with mean and median levels exceeding the current guidance for residential use. The data also showed elevated (but below guidance) levels of potentially carcinogenic polyaromatic hydrocarbons. Of particular concern was the high detection rate for asbestos fibers, which were found at low concentrations in 25% of the soil samples tested (although the sample set was small). Contamination levels of the remaining chemicals tested were all below the guidance levels for residential site use.
These preliminary pilot study results will be expanded, and results for the whole local government area will be presented at the conference. The pilot study has demonstrated the potential for this extensive dataset to provide greater information on local contamination levels. This can help inform regulators and developers, and lead to more targeted site investigations, improved risk assessments, and better brownfield development.
Keywords: brownfield development, contaminated land, local government planning data, site investigation
Procedia PDF Downloads 140
24042 Improving the Performance of Road Salt on Anti-Icing
Authors: Mohsen Abotalebi Esfahani, Amin Rahimi
Abstract:
The maintenance and management of road infrastructure is one of a country's most fundamental responsibilities. Several methods have been investigated over many years as preventive measures for the maintenance of asphalt pavements. Using a mixture of salt, sand, and gravel is the most common method of de-icing, which can have numerous harmful consequences. Icy or snow-covered roads are a major cause of accidents in the cold season, resulting in substantial damage: loss of time and energy, environmental pollution, damage to structures, traffic congestion, and a higher likelihood of accidents. Every year, governments incur enormous costs to keep routes safe. In this study, asphalt samples were tested for compressive strength, tensile strength, and resilient modulus under the influence of magnesium chloride, calcium chloride, sodium chloride, urea, and pure water. The results show that de-icing with calcium chloride solution and urea has the least negative effect on the laboratory specimens, while de-icing with pure water has the most negative effect. Hence, simple techniques, new equipment, and reduced use of sand and salt can significantly lower the risks and harmful effects of excessive salt, sand, and gravel use while keeping roads safe.
Keywords: maintenance, sodium chloride, icy road, calcium chloride
Procedia PDF Downloads 285
24041 Carbon Footprint Assessment Initiative and Trees: Role in Reducing Emissions
Authors: Omar Alelweet
Abstract:
Carbon emissions are quantified in terms of carbon dioxide equivalents, generated through a specific activity or accumulated throughout the life stages of a product or service. Given the growing concern about climate change and the role of carbon dioxide emissions in global warming, this initiative aims to create awareness and understanding of the impact of human activities and to identify potential areas for improvement in the management of the carbon footprint on campus. Given that trees play a vital role in reducing carbon emissions by absorbing CO₂ during photosynthesis, this paper evaluated the contribution of each tree to reducing those emissions. Collecting data over an extended period is essential for monitoring carbon dioxide levels: it helps capture changes at different times and identify patterns or trends in the data. By linking the data to specific activities, events, or environmental factors, it is possible to identify sources of emissions and areas where carbon dioxide levels are rising. Analyzing the collected data can provide valuable insights into ways to reduce emissions and mitigate the impact of climate change.
Keywords: sustainability, green building, environmental impact, CO₂
Procedia PDF Downloads 70
24040 Detection of Change Points in Earthquakes Data: A Bayesian Approach
Authors: F. A. Al-Awadhi, D. Al-Hulail
Abstract:
In this study, we applied a Bayesian hierarchical model to detect single and multiple change points in daily earthquake body-wave magnitude. Change point analysis is used in both backward (off-line) and forward (on-line) statistical research; in this study, the backward approach is used. Different types of change parameters are considered (mean, variance, or both). The posterior model and the conditional distributions for single and multiple change points are derived and implemented using the BUGS software. The model is applicable to any data set. The sensitivity of the model is tested using different prior and likelihood functions. Using Mb data, we concluded that between January 2002 and December 2003, three changes occurred in the mean magnitude of Mb in Kuwait and its vicinity.
Keywords: multiple change points, Markov chain Monte Carlo, earthquake magnitude, hierarchical Bayesian model
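The single-change-point case can be illustrated without BUGS by a minimal sketch (ours, not the paper's model): a discrete posterior over the change location k for Gaussian data with known variance, a uniform prior on k, and the segment means plugged in as an empirical-Bayes shortcut.

```python
import math

def change_point_posterior(y, sigma=1.0):
    """Posterior P(k | y) over a single change point k = 1..n-1 in the mean,
    assuming Gaussian noise with known sigma and a uniform prior on k."""
    n = len(y)
    logpost = []
    for k in range(1, n):
        left, right = y[:k], y[k:]
        m1 = sum(left) / len(left)          # plug-in mean before the change
        m2 = sum(right) / len(right)        # plug-in mean after the change
        ll = (-sum((v - m1) ** 2 for v in left)
              - sum((v - m2) ** 2 for v in right)) / (2 * sigma ** 2)
        logpost.append(ll)
    mx = max(logpost)                       # normalize in log space for stability
    w = [math.exp(l - mx) for l in logpost]
    z = sum(w)
    return [v / z for v in w]

# Made-up magnitude series whose mean shifts after the 4th observation
y = [4.1, 4.3, 4.0, 4.2, 5.0, 5.2, 4.9, 5.1]
post = change_point_posterior(y)
print(1 + post.index(max(post)))  # most probable change point -> 4
```

The full hierarchical treatment in the paper additionally places priors on the segment parameters and handles variance changes and multiple change points via MCMC.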
Procedia PDF Downloads 456