Search results for: data security assurance
23513 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning
Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga
Abstract:
Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including noise and air pollution, which are currently hot topics (e.g., the Clean Air Act of London and the soundscape definition). It is usually taken for granted that these problems go hand in hand, because the noise pollution present in cities is often linked to traffic and industry, which produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution: NO₂ is mostly created from the oxidation of NO, and both are notoriously produced by high-temperature combustion processes (e.g., car engines or thermal power stations). The same process occurs in industrial plants. What has to be investigated, and this is the topic of this paper, is whether there really is a correlation between noise pollution and air pollution (taking NO₂ into account) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise app will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, so it can stay outdoors, with an external battery to allow it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone, which will be calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are plugged. After assembling the sensors, they will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time covering both weekdays and weekend days; in this way it will be possible to see how the situation changes during the week. The novelty is that the data will be compared to check whether there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw values obtained from the sensors. To do so, the data will be converted to fit a scale that goes up to 100% and will be shown through a GIS-based mapping of the measurements. Another relevant aspect is that this comparison can help to choose the right mitigation solutions for the analyzed area, because it will make it possible to address both the noise and the air pollution problem with only one intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter
Procedia PDF Downloads 212
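A minimal sketch of the comparison step described above, assuming min-max rescaling to the 0-100% scale and a Pearson correlation; the readings and column names are invented stand-ins for the OpeNoise and AirMonitor data:

```python
# Hypothetical sketch: rescale paired noise and NO2 readings to a common
# 0-100% scale and test their correlation. The normalisation choice and the
# sample values are assumptions, not the authors' exact procedure.
import numpy as np
import pandas as pd

readings = pd.DataFrame({
    "noise_dB": [55.2, 61.8, 70.3, 66.1, 58.4],   # OpeNoise smartphone data
    "no2_ppb":  [18.0, 24.5, 39.2, 31.7, 20.9],   # AirMonitor (Arduino) data
})

# Min-max rescaling so both pollutants share a common 0-100% axis
percent = 100 * (readings - readings.min()) / (readings.max() - readings.min())

# Pearson correlation between the two rescaled series
r = np.corrcoef(percent["noise_dB"], percent["no2_ppb"])[0, 1]
print(f"Pearson r between noise and NO2 (percent scale): {r:.2f}")
```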
23512 Tuning Cubic Equations of State for Supercritical Water Applications
Authors: Shyh Ming Chern
Abstract:
Cubic equations of state (EoS), popular due to their simple mathematical form, ease of use, semi-theoretical nature, and reasonable accuracy, are normally fitted to vapor-liquid equilibrium P-v-T data. As a result, they often show poor accuracy in the region near and above the critical point. In this study, the performance of the renowned Peng-Robinson (PR) and Patel-Teja (PT) EoS's near the critical point has been examined against the P-v-T data of water. Both display large deviations at the critical point. For instance, the PR EoS exhibits discrepancies as high as 47% for the specific volume, 28% for the enthalpy departure, and 43% for the entropy departure at the critical point. It is shown that incorporating P-v-T data from the supercritical region into the retuning of a cubic EoS can dramatically improve its performance above the critical point. Adopting a retuned acentric factor of 0.5491 instead of its genuine value of 0.344 for water in the PR EoS, and a new F of 0.8854 instead of its original value of 0.6898 for water in the PT EoS, reduces the discrepancies to about one third or less.
Keywords: equation of state, EoS, supercritical water, SCW
Procedia PDF Downloads 535
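For reference, a short sketch of the Peng-Robinson temperature function evaluated with the genuine versus the retuned acentric factor for water; the PR constants below are the standard published ones, but treating the retune as a straight substitution into kappa is an assumption of this sketch:

```python
# Minimal sketch of the PR EoS attractive parameter a(T) for water with the
# genuine (0.344) vs. retuned (0.5491) acentric factor reported in the abstract.
import math

R = 8.314462618                # J/(mol K)
Tc, Pc = 647.096, 22.064e6     # critical point of water (K, Pa)

def pr_alpha(T, omega):
    """Standard PR alpha(Tr) with kappa(omega)."""
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    return (1 + kappa * (1 - math.sqrt(T / Tc)))**2

def pr_a(T, omega):
    """Attractive parameter a(T) of the PR EoS, in Pa m^6/mol^2."""
    return 0.45724 * R**2 * Tc**2 / Pc * pr_alpha(T, omega)

T = 700.0  # a supercritical temperature, K
for label, omega in [("genuine", 0.344), ("retuned", 0.5491)]:
    print(f"{label}: omega={omega}, a(T)={pr_a(T, omega):.4f}")
```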
23511 A Safety Analysis Method for Multi-Agent Systems
Authors: Ching Louis Liu, Edmund Kazmierczak, Tim Miller
Abstract:
Safety analysis for multi-agent systems is complicated by the potentially nonlinear interactions between agents. This paper proposes a method for analyzing the safety of multi-agent systems by explicitly focusing on interactions and on the accident data of systems that are similar in structure and function to the system being analyzed. The method creates a Bayesian network using the accident data from similar systems. A feature of our method is that the events in the accident data are labeled with HAZOP guide words. Our method uses an ontology to abstract away from the details of a multi-agent implementation. Using the ontology, our method then constructs an "interaction map," a graphical representation of the patterns of interactions between agents and other artifacts. Interaction maps combined with statistical data from accidents and the HAZOP classifications of events can be converted into a Bayesian network. Bayesian networks allow designers to explore "what if" scenarios and make design trade-offs that maintain safety. We show how to use the Bayesian networks and the interaction maps to improve multi-agent system designs.
Keywords: multi-agent system, safety analysis, safety model, interaction map
Procedia PDF Downloads 417
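A hedged sketch of the final step, fitting a small Bayesian network to HAZOP-labelled accident records and running a "what if" query; the node names and data are invented, and the use of pgmpy (whose class names vary across versions) is an illustrative assumption, not the authors' tool:

```python
# Toy accident data: each column is a HAZOP-labelled event on an interaction.
import pandas as pd
from pgmpy.models import BayesianNetwork
from pgmpy.estimators import MaximumLikelihoodEstimator
from pgmpy.inference import VariableElimination

data = pd.DataFrame({
    "msg_no_or_late": [1, 0, 1, 1, 0, 0, 1, 0],   # guide words NO / LATE
    "sensor_more":    [0, 1, 1, 0, 0, 1, 1, 0],   # guide word MORE
    "accident":       [1, 0, 1, 1, 0, 0, 1, 0],
})

# Structure: both labelled interaction events influence the accident node
model = BayesianNetwork([("msg_no_or_late", "accident"),
                         ("sensor_more", "accident")])
model.fit(data, estimator=MaximumLikelihoodEstimator)

# "What if" query: accident probability given a late or missing message
posterior = VariableElimination(model).query(
    ["accident"], evidence={"msg_no_or_late": 1})
print(posterior)
```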
23510 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson sampling, variational inference
Procedia PDF Downloads 108
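An illustrative toy of the switching idea only: an expensive multi-pass refit while data are scarce, handing over to cheap one-pass assumed-density updates once the stream grows. A conjugate linear-Gaussian model stands in for the paper's logistic contextual bandit, so the updates below are not the published algorithm:

```python
# "EP-like" phase: refit on all data so far (accurate but O(n) per step).
# "ADF" phase: one-shot conjugate update with only the newest point (O(1)).
import numpy as np

rng = np.random.default_rng(0)
d, SWITCH_AT, noise = 3, 200, 0.5   # SWITCH_AT is an assumed switching point
true_w = rng.normal(size=d)

# Gaussian posterior over weights: mean m, covariance S (prior N(0, I))
m, S = np.zeros(d), np.eye(d)
seen_x, seen_y = [], []

for t in range(1000):
    x = rng.normal(size=d)
    y = x @ true_w + noise * rng.normal()
    seen_x.append(x); seen_y.append(y)
    if t < SWITCH_AT:
        X, Y = np.array(seen_x), np.array(seen_y)
        S = np.linalg.inv(np.eye(d) + X.T @ X / noise**2)
        m = S @ (X.T @ Y) / noise**2
    else:
        Sx = S @ x
        k = Sx / (noise**2 + x @ Sx)      # Kalman-style gain
        m = m + k * (y - x @ m)
        S = S - np.outer(k, Sx)

print("posterior mean:", np.round(m, 2), "true weights:", np.round(true_w, 2))
```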
23509 A Method of Drilling a Ground Using a Robotic Arm
Authors: Lotfi Beji, Laredj Benchikh
Abstract:
Underground tunnel face bolting and pipe umbrella reinforcement are among the most challenging tasks in construction, whether for industrial buildings or for infrastructure such as roads and pipelines, and construction is one of the leading sectors of economic activity in the world. Across a variety of soils and rocks, the cyclic Conventional Tunneling Method (CTM) remains the best choice for projects with highly variable ground conditions or shapes, and it is the only alternative for renovating existing tunnels and creating emergency exits. During the drilling process, a wide variety of undesired vibrations may arise, and a method using a robot arm is proposed. The main kind of drilling vibration considered here is the bit-bounce phenomenon (resonant axial vibration). Hence, assisting the task with a robot arm may play an important role in drilling performance and safety. We propose to control the axial-vibration phenomenon along the drillstring at a practical resonant frequency and to embed a Resonant Sonic Drilling Head (RSDH) as a robot end effector for drilling. Several open questions about industry drilling criteria and stability are discussed in this paper.
Keywords: drilling, resonant vibration, robot arm, control
Procedia PDF Downloads 290
23508 A Discrete Element Method Centrifuge Model of Monopile under Cyclic Lateral Loads
Authors: Nuo Duan, Yi Pik Cheng
Abstract:
This paper presents the data of a series of two-dimensional Discrete Element Method (DEM) simulations of a large-diameter rigid monopile subjected to cyclic loading under a high gravitational force. At present, monopile foundations are widely used to support tall and heavy wind turbines, which are also subjected to significant loads from wind and wave actions. A safe design must address issues such as rotations and changes in soil stiffness under these loading conditions. Design guidance on the issue is limited, as is the availability of laboratory and field test data. The interpretation of these results in sand, such as the relation between loading and displacement, relies mainly on empirical correlations to pile properties. Regarding numerical models, most available data come from the Finite Element Method (FEM); they are not comprehensive, and most FEM results are sensitive to input parameters. The micro-scale behaviour could change the mechanism of the soil-structure interaction. A DEM model was used in this paper to study the behaviour under cyclic lateral loads. A non-dimensional framework is presented and applied to interpret the simulation results. The DEM data compare well with various sets of published experimental centrifuge model test data in terms of lateral deflection. The accumulated permanent pile lateral displacements induced by the cyclic lateral loads were found to depend on the characteristics of the applied cyclic load, such as the extent of the loading magnitudes and directions.
Keywords: cyclic loading, DEM, numerical modelling, sands
Procedia PDF Downloads 321
23507 Estimation of Desktop E-Wastes in Delhi Using Multivariate Flow Analysis
Authors: Sumay Bhojwani, Ashutosh Chandra, Mamita Devaburman, Akriti Bhogal
Abstract:
This article uses material flow analysis to estimate desktop e-wastes in the Delhi/NCR region. The material flow analysis is based on sales data obtained from various sources. Much of the available sales data is unreliable because of the existence of a huge informal sector, which in India accounts for more than 90% of the market; therefore, the scope of this study is limited to the formal sector. Also, for the projection of sales data up to 2030, we have used linear regression to avoid complexity. The actual sales in the years following 2015 may vary non-linearly, but we have assumed a basic linear relation. The purpose of this study was to obtain an approximate quantity of the desktop e-wastes that we will have by the year 2030, so that we can start preparing for the ineluctable investment in the treatment of these ever-rising e-wastes. The results of this study can be used to install a treatment plant for e-wastes in Delhi.
Keywords: e-wastes, Delhi, desktops, estimation
Procedia PDF Downloads 259
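A minimal sketch of the projection step, fitting a linear trend to historical sales and extrapolating to 2030; the figures below are invented, as the study's actual sales data are not reproduced in the abstract:

```python
# Fit a linear trend to hypothetical yearly desktop sales and project to 2030.
import numpy as np
from sklearn.linear_model import LinearRegression

years = np.arange(2010, 2016).reshape(-1, 1)         # observed years
sales = np.array([410, 432, 455, 470, 492, 515])     # hypothetical units (thousands)

model = LinearRegression().fit(years, sales)
future = np.arange(2016, 2031).reshape(-1, 1)
projected = model.predict(future)
print(dict(zip(future.ravel().tolist(), np.round(projected).tolist())))
```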
23506 An Erudite Technique for Face Detection and Recognition Using Curvature Analysis
Authors: S. Jagadeesh Kumar
Abstract:
Face detection and recognition is an important technology for image database management, video surveillance, and human-computer interaction (HCI). Face recognition is a rapidly developing method that has been extensively applied in forensics, for tasks such as criminal identification, secure access control, and custodial security. This paper proposes a technique using curvature analysis (CA) that has a lower incidence of false positives, operates in different lighting environments, and removes the artifacts introduced during image acquisition by the ring correction in polar coordinates (RCP) method. The technique applies mean and median filtering to remove the artifacts, but works in polar coordinates during image acquisition. Experimental results for face detection and recognition confirm good performance even under diagonal orientation and pose variation.
Keywords: curvature analysis, ring correction in polar coordinate method, face detection, face recognition, human computer interaction
Procedia PDF Downloads 287
23505 Geospatial Network Analysis Using Particle Swarm Optimization
Authors: Varun Singh, Mainak Bandyopadhyay, Maharana Pratap Singh
Abstract:
The shortest path (SP) problem concerns finding the shortest path from a specific origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications, including vehicle routing in transportation systems, particularly in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem in transportation planning. Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization, and Particle Swarm Optimization (PSO) have been applied to such complex optimization problems to overcome the shortcomings of existing shortest-path analysis methods. Various researchers have reported that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Further, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO to solving the shortest path problem between multiple points of interest (POI), based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the analysis results is carried out in GIS.
Keywords: particle swarm optimization, GIS, traffic data, outliers
Procedia PDF Downloads 483
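A minimal particle swarm optimizer of the kind applied here, shown minimizing a toy continuous objective; encoding a network path as a particle (e.g., via node priorities) is an additional step not shown in this sketch:

```python
# Plain global-best PSO: particles track personal bests, the swarm tracks a
# global best, and velocities blend inertia with both attractions.
import numpy as np

rng = np.random.default_rng(1)

def cost(x):                      # toy objective (sphere function)
    return np.sum(x**2, axis=1)

n, d, iters = 30, 5, 200
w, c1, c2 = 0.7, 1.5, 1.5         # inertia and acceleration coefficients

pos = rng.uniform(-5, 5, (n, d))
vel = np.zeros((n, d))
pbest, pbest_cost = pos.copy(), cost(pos)
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(iters):
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    c = cost(pos)
    better = c < pbest_cost
    pbest[better], pbest_cost[better] = pos[better], c[better]
    gbest = pbest[np.argmin(pbest_cost)]

print("best cost found:", pbest_cost.min())
```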
23504 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework
Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi
Abstract:
There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task; therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework; hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection, to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology to the key-frames. The OCR output and the detected slide text line types are used for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
Keywords: video lectures, big video data, video retrieval, Hadoop
Procedia PDF Downloads 534
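A hedged sketch of how the keyword-counting stage could run as a Hadoop Streaming job, with mapper and reducer in one Python file; the segmentation and OCR stages are assumed to have produced the text lines already, and the invocation details are illustrative:

```python
#!/usr/bin/env python3
# Keyword counting over OCR'd key-frame text as a Hadoop Streaming job.
# Mappers emit (keyword, 1) pairs; reducers sum counts per keyword.
import sys

def mapper():
    for line in sys.stdin:                 # one OCR'd text line per key-frame
        for word in line.lower().split():
            print(f"{word}\t1")

def reducer():
    current, total = None, 0
    for line in sys.stdin:                 # Hadoop sorts mapper output by key
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

Such a script would be launched through the Hadoop Streaming jar, passing the `map` argument to the mapper command and no argument to the reducer command; the exact launch flags depend on the cluster setup.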
23503 A Critical Analysis of Environmental Investment in India
Authors: K. Y. Chen, H. Chua, C. W. Kan
Abstract:
Environmental investment is an important issue in many countries. In this study, we first review the environmental issues related to India and their effect on economic development. Secondly, economic data are collected from yearly government statistics, including information on India's environmental investment. Finally, we correlate the data in order to find the relationship between environmental investment and sustainable development in India. The aim of the paper is therefore to analyse the effect of environmental investment on sustainable development in India. Based on the economic data collected, India is a developing country with fast population and GDP growth, and it is facing environmental problems due to its high-speed development. However, environmental investment can have a positive impact on sustainable development in India, and it has been growing at the same rate as GDP. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.
Keywords: India, environmental investment, sustainable development, analysis
Procedia PDF Downloads 315
23502 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis
Authors: R. Periyasamy, Deepak Joshi, Sneh Anand
Abstract:
Foot posture assessment is important for evaluating foot type, since abnormal foot types cause gait and postural defects in all age groups. Although different methods are used for the classification of foot arch type in clinical and research examinations, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for the evaluation of foot type, as a clinical decision-making aid for the diagnosis of flat and normal arches, based on the arch index (AI) and a foot pressure distribution parameter, the power ratio (PR). The accuracy of the system was evaluated on 27 subjects with ages ranging from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from foot pressure intensity images using the portable PedoPowerGraph system, and the images were analyzed in the frequency domain to obtain the PR. From our results, we obtain 100% classification accuracy for normal and flat feet by using the linear discriminant analysis method. We observe that there is no misclassification of foot types because foot pressure distribution data are incorporated instead of only the arch index (AI). We found that the midfoot pressure distribution ratio and the AI value correlate well with foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system is accurate and makes it easy to determine foot arch type from the AI together with the midfoot pressure distribution ratio, rather than from the physical area of contact alone. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check their diagnosis of flat foot from the midfoot pressure distribution.
Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis
Procedia PDF Downloads 499
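A minimal sketch of the classification step: linear discriminant analysis on the two features, AI and midfoot PR, with invented numbers standing in for the 27-subject pressure data:

```python
# LDA on [arch index, midfoot power ratio]; 0 = normal arch, 1 = flat foot.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X = np.array([[0.21, 0.08], [0.23, 0.10], [0.19, 0.07],    # hypothetical normal
              [0.29, 0.22], [0.31, 0.25], [0.33, 0.27]])   # hypothetical flat
y = np.array([0, 0, 0, 1, 1, 1])

clf = LinearDiscriminantAnalysis().fit(X, y)
print(clf.predict([[0.30, 0.24]]))   # -> [1], classified as flat arch
print(clf.score(X, y))               # training accuracy
```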
23501 Students’ Awareness of the Use of Poster, Power Point and Animated Video Presentations: A Case Study of Third Year Students of the Department of English of Batna University
Authors: Bahloul Amel
Abstract:
The present study examines students’ perceptions of the use of technology in learning English as a foreign language. Its aim is to explore and understand students’ preparation and presentation of posters, PowerPoint slides, and animated videos, by drawing attention to visual and oral elements. The data were collected through observations and semi-structured interviews and analyzed through phenomenological data analysis steps. The themes that emerged from the data (satisfaction with visual learning through information and communication technology, providing structure to oral presentations, and learning from peers’ presentations) draw attention to the use of posters, PowerPoint, and animated videos, as each supports visual learning and the organization of thoughts in oral presentations.
Keywords: EFL, posters, PowerPoint presentations, animated videos, visual learning
Procedia PDF Downloads 445
23500 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation and its relationship with the unemployment rate. The unemployment rate and hourly compensation are two of a nation's most important indicators: they are not merely statistics but have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, since higher compensation can reduce unemployment and increase job prospects; increased hourly compensation in the manufacturing sector could therefore have a favorable effect on job mobility. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate, in order to determine the effect of hourly compensation on the unemployment rate. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or the ECM) is better by pooling all 60 observations. We then analyze the data by comparing three countries (the United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide results, analysis, and a summary of the extensive research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help governments and the academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate.
Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, the linear regression model
Procedia PDF Downloads 81
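A hedged sketch of an entity fixed-effects regression of unemployment on hourly compensation using the `linearmodels` package; the data frame is fabricated, and this specification is an assumption rather than the paper's exact estimation:

```python
# Country fixed effects absorb time-invariant differences between countries.
import pandas as pd
from linearmodels.panel import PanelOLS

df = pd.DataFrame({
    "country": ["US", "US", "CA", "CA", "UK", "UK"],
    "year":    [2019, 2020, 2019, 2020, 2019, 2020],
    "unemployment": [3.7, 8.1, 5.7, 9.6, 3.8, 4.5],    # % (illustrative)
    "hourly_comp":  [23.5, 24.7, 22.1, 23.0, 18.9, 19.6],
}).set_index(["country", "year"])

fe = PanelOLS.from_formula(
    "unemployment ~ hourly_comp + EntityEffects", data=df).fit()
print(fe.params)
```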
23499 Identifying the Goals of a Multicultural Curriculum for the Primary Education Course
Authors: Fatemeh Havas Beigi
Abstract:
The purpose of this study is to identify the objectives of a multicultural curriculum for the primary education period from the perspective of ethnic teachers, education experts, and cultural professionals. The research paradigm is interpretive, the research approach is qualitative, the research strategy is content analysis, and the sampling method is purposive snowball sampling; the sample of informants (Iranian ethnic teachers and experts) was estimated at 67 people at theoretical saturation. Semi-structured individual and focus-group interviews were used to collect information. The data were recorded in audio format, and first- and second-cycle coding were used to analyze them. Based on the data analysis, 11 objectives for a multicultural curriculum were identified: paying attention to ethnic equality, expanding educational opportunities and justice, peaceful coexistence, anti-ethnic and anti-racial discrimination education, paying attention to human value and dignity, accepting religious diversity, getting to know ethnicities and cultures, promoting teaching-learning, fostering self-confidence, building national unity, and developing cultural commonalities.
Keywords: objective, multicultural curriculum, connect, elementary education period
Procedia PDF Downloads 94
23498 The Carbon Trading Price and Trading Volume Forecast in Shanghai City by BP Neural Network
Authors: Liu Zhiyuan, Sun Zongdi
Abstract:
In this paper, a BP neural network model is established to predict the carbon trading price and carbon trading volume in Shanghai City. First, we collect the carbon trading price and volume data for Shanghai City from September 30, 2015 to December 23, 2016. The price and volume series are then processed into 5-, 10-, 20-, 30-, and 60-period averages of the carbon trading price and trading volume, and these averages are used as inputs to the BP neural network model. Finally, after training the BP neural network, the predicted values of the Shanghai carbon trading price and trading volume are obtained, and the model is tested.
Keywords: carbon trading price, carbon trading volume, BP neural network model, Shanghai City
Procedia PDF Downloads 352
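A hedged sketch of the forecasting step, using scikit-learn's back-propagation-trained MLPRegressor in place of a hand-built BP network; the synthetic series and the reading of the averages as moving-average windows are assumptions of this sketch:

```python
# Map 5/10/20/30/60-period moving averages to the next-day price.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
price = 40 + np.cumsum(rng.normal(0, 0.5, 300))     # synthetic daily prices

def moving_avg(x, w):
    return np.convolve(x, np.ones(w) / w, mode="valid")

windows = [5, 10, 20, 30, 60]
n = len(price) - 60                                  # rows with all windows defined
X = np.column_stack([moving_avg(price, w)[-n - 1:-1] for w in windows])
y = price[-n:]                                       # next-day target

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=3000, random_state=0)
model.fit(X[:-20], y[:-20])                          # hold out the last 20 days
print("test R^2:", round(model.score(X[-20:], y[-20:]), 3))
```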
23497 An Investigation of the Relationship Between Privacy Crisis, Public Discourse on Privacy, and Key Performance Indicators at Facebook (2004–2021)
Authors: Prajwal Eachempati, Laurent Muzellec, Ashish Kumar Jha
Abstract:
We use Facebook as a case study to investigate the complex relationship between the firm’s public discourse (and actions) surrounding data privacy and the performance of a business model based on monetizing users’ data. We do so by looking at the evolution of public discourse over time (2004–2021) and relating topics to revenue and stock market evolution. Drawing from archival sources, including statements by Zuckerberg, we use the LDA topic modelling algorithm to reveal 19 topics regrouped into 6 major themes. We first show how, by using persuasive and convincing language that promises better protection of consumer data usage but also emphasizes greater user control over their own data, the privacy issue is being reframed as one of greater user control and responsibility. Second, we aim to understand and put a value on the extent to which privacy disclosures have a potential impact on the financial performance of social media firms. We found a significant relationship between the topics pertaining to privacy and social media/technology, sentiment scores, and stock market prices. Revenue is found to be impacted by topics pertaining to politics and to new product and service innovations, while the number of active users is not impacted by the topics unless moderated by external control variables such as return on assets and brand equity.
Keywords: public discourses, data protection, social media, privacy, topic modeling, business models, financial performance
Procedia PDF Downloads 92
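A minimal sketch of the topic-modelling step, fitting LDA to a few toy "privacy discourse" snippets; the real study fits 19 topics over 2004–2021 archival text, so the corpus and topic count here are purely illustrative:

```python
# LDA over a bag-of-words matrix; print the top words per topic.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

docs = [
    "users control their own data and privacy settings",
    "new tools give people more control over data",
    "advertising revenue grew as user engagement increased",
    "regulators question the platform's data protection practices",
]

vec = CountVectorizer(stop_words="english")
X = vec.fit_transform(docs)
vocab = vec.get_feature_names_out()

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
for k, comp in enumerate(lda.components_):
    print(f"topic {k}:", ", ".join(vocab[comp.argsort()[-4:]]))
```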
23496 European Electromagnetic Compatibility Directive Applied to Astronomical Observatories
Authors: Oibar Martinez, Clara Oliver
Abstract:
The Cherenkov Telescope Array (CTA) project aims to build two observatories of Cherenkov telescopes, located at Cerro Paranal, Chile, and La Palma, Spain. These facilities are used in this paper as a case study to investigate how to apply the standard directive on electromagnetic compatibility to astronomical observatories. Cherenkov telescopes are able to provide valuable information on both galactic and extragalactic sources by measuring Cherenkov radiation, which is produced by particles that travel faster than light in the atmosphere. The construction requirements demand compliance with the European Electromagnetic Compatibility Directive. The largest telescopes of these observatories, the Large-Sized Telescopes (LSTs), are high-precision instruments with advanced photomultipliers able to detect the faint sub-nanosecond blue light pulses produced by Cherenkov radiation. They have a 23-meter parabolic reflective surface, which focuses the radiation onto a camera composed of an array of high-speed photosensors that are highly sensitive to radio-spectrum pollution. The camera has a field of view of about 4.5 degrees and has been designed for maximum compactness and the lowest weight, cost, and power consumption. Each pixel incorporates a photosensor able to discriminate single photons, together with the corresponding readout electronics. The first LST is already commissioned and is intended to be operated as a service to the scientific community. Because of this, it must comply with a series of reliability and functional requirements and must bear a Conformité Européenne (CE) marking. This demands compliance with Directive 2014/30/EU on electromagnetic compatibility. The main difficulty in accomplishing this goal resides in the fact that CE marking setups and procedures were designed for industrial products, whereas no clear protocols have been defined for scientific installations. In this paper, we aim to answer the question of how the directive should be applied to our installation to guarantee the fulfillment of all the requirements and the proper functioning of the telescope itself. Experts in optics and electromagnetism were both needed to make these kinds of decisions and to adapt tests that were designed for equipment of limited dimensions to large scientific plants. An analysis of the elements and configurations most likely to be affected by external interference, and of those most likely to cause the maximum disturbance, was also performed. Obtaining the CE mark requires knowing what the harmonized standards are and how the elaboration of the specific requirements is defined. For this type of large installation, one needs to adapt and develop the tests to be carried out. In addition, throughout this process, certification entities and notified bodies play a key role in preparing and agreeing on the required technical documentation. We have focused our attention mostly on the technical aspects of each point. We believe that this contribution will be of interest to other scientists involved in applying industrial quality assurance standards to large scientific plants.
Keywords: CE marking, electromagnetic compatibility, european directive, scientific installations
Procedia PDF Downloads 110
23495 Invention of Novel Technique of Process Scale Up by Using Solid Dosage Form
Authors: Shashank Tiwari, S. P. Mahapatra
Abstract:
The aim of this technique is to reduce the number of process scale-up steps and thereby save industry time and cost. The new steps are: Novel Lab Scale, Novel Lab Scale Trials, Novel Trial Batches, Novel Exhibit Batches, and Novel Validation Batches. In these steps, validation is not divided into three separate batches; instead, the data of the trial batches, exhibit batches, and validation batches are used and compiled for production and for validation. The technique also increases the batch size of the trial and exhibit batches: the new size of the trial batches is not less than fifty thousand units, the exhibit batches increase up to two lakh, and the validation batches up to five lakh. After preparing the batches, all their data and drug samples are used for stability studies, the validation record is maintained, and the data are compiled for technology transfer to the production department for preparing market-size batches.
Keywords: batches, technique, preparation, scale up, validation
Procedia PDF Downloads 357
23494 Enhancing Cultural Heritage Data Retrieval by Mapping COURAGE to CIDOC Conceptual Reference Model
Authors: Ghazal Faraj, Andras Micsik
Abstract:
The CIDOC Conceptual Reference Model (CRM) is an extensible ontology that provides integrated access to heterogeneous digital datasets. The CIDOC-CRM offers a “semantic glue” intended to promote accessibility to several diverse and dispersed sources of cultural heritage data. This is achieved by providing a formal structure for the implicit and explicit concepts and their relationships in the cultural heritage field. The COURAGE (“Cultural Opposition – Understanding the CultuRal HeritAGE of Dissent in the Former Socialist Countries”) project aimed to explore methods of socialist-era cultural resistance during 1950–1990 and was planned to serve as a basis for further narratives and digital humanities (DH) research. The project highlights the diversity of the alternative cultural scenes that flourished in Eastern Europe before 1989. The COURAGE dataset is an online RDF-based registry that consists of historical people, organizations, collections, and featured items. To increase the inter-links between different datasets and retrieve more relevant data from various data silos, a shared federated ontology for reconciled data is needed. As a first step towards these goals, a full understanding of the CIDOC CRM ontology (the target ontology) as well as of the COURAGE dataset was required. Subsequently, the queries toward the ontology were determined, and a table of equivalent properties from COURAGE and CIDOC CRM was created. Structural diagrams that clarify the mapping process and construct queries are in progress for mapping person, organization, and collection entities to the ontology. Through mapping the COURAGE dataset to the CIDOC-CRM ontology, the dataset will share a common ontological foundation with several other datasets. The expected results are therefore: 1) retrieving more detailed data about existing entities, 2) retrieving new entities’ data, 3) aligning the COURAGE dataset to a standard vocabulary, and 4) running distributed SPARQL queries over several CIDOC-CRM datasets and testing the potential of distributed query answering using SPARQL. The next step is to map CIDOC-CRM to other upper-level ontologies or large datasets (e.g., DBpedia, Wikidata) and address similar questions on a wide variety of knowledge bases.
Keywords: CIDOC CRM, cultural heritage data, COURAGE dataset, ontology alignment
Procedia PDF Downloads 147
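A hedged sketch of querying CIDOC-CRM-aligned triples with SPARQL via rdflib; the instances are invented COURAGE-like examples, and the exact CRM class names vary between releases of the model:

```python
# Build a tiny CRM-aligned graph and run a SPARQL query over it.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

CRM = Namespace("http://www.cidoc-crm.org/cidoc-crm/")
EX = Namespace("http://example.org/courage/")   # hypothetical COURAGE namespace

g = Graph()
g.add((EX.person1, RDF.type, CRM.E21_Person))
g.add((EX.person1, RDFS.label, Literal("Example dissident artist")))
g.add((EX.collection1, RDF.type, CRM.E78_Collection))
g.add((EX.collection1, RDFS.label, Literal("Example samizdat collection")))

q = """
PREFIX crm: <http://www.cidoc-crm.org/cidoc-crm/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
SELECT ?label WHERE { ?p a crm:E21_Person ; rdfs:label ?label . }
"""
for row in g.query(q):
    print(row.label)
```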
23493 Gendered Labelling and Its Effects on Vhavenda Women
Authors: Matodzi Rapalalani
Abstract:
In line with Spencer's (2018) classic labelling theory, labels influence the perceptions of both the individual and other members of society; that is, once labelled, individuals act in ways that confirm the stereotypes attached to the label. This study therefore investigates the understanding of gendered labelling and its effects on Vhavenda women. Gender socialization and patriarchy are viewed as the core causes of the problem. The literature review covers the development of gendered labelling, its forms, and other aspects. A qualitative method of data collection was used in this study, and semi-structured interviews were conducted. A total of six participants was used, as a small sample is easier to work with. Thematic analysis was used to interpret and analyze the data. Ethical issues such as confidentiality, informed consent, and voluntary participation were considered. Through the analysis and interpretation of the data, causes such as lack of Christian values, insecurities, and lust were mentioned, as well as effects such as frustration, increased divorce, and low self-esteem.
Keywords: gender, naming, Venda, women, African culture
Procedia PDF Downloads 91
23492 Violence against Children Surveys: Analysis of the Peer-Reviewed Literature from 2009-2019
Authors: Kathleen Cravero, Amanda Nace, Samantha Ski
Abstract:
The Violence Against Children Surveys (VACS) are nationally representative surveys of male and female youth ages 13-24, designed to measure the burden of sexual, physical, and emotional violence experienced in childhood and adolescence. As of 2019, 24 countries had implemented or were in the process of implementing a VACS, covering over ten percent of the world’s child population. Since the first article using VACS data from Swaziland was published in 2009, several peer-reviewed articles have been published on the VACS. However, no publication to date has analyzed the breadth of this work or how the data are represented in the peer-reviewed literature. In this study, we conducted a literature review of all peer-reviewed research that used VACS data or discussed the implementation and methodology of the VACS. The literature review revealed several important findings. Between 2009 and July 2019, thirty-five peer-reviewed articles using VACS data from 12 countries were published. Twenty of the studies focus on one country, while 15 focus on two or more countries. Some countries are featured in the literature more than others, for example Kenya (N=14), Malawi (N=12), and Tanzania (N=12). A review of the research by gender demonstrates that research on violence against boys is under-represented: only two studies specifically focused on boys/young men, while 11 focused only on violence against girls, despite research suggesting that boys and girls experience similar rates of violence. A review of the publications by type of violence revealed significant differences in the types of violence featured in the literature: thirteen publications specifically focused on sexual violence, while three studies focused on physical violence, and only one study focused on emotional violence. Almost 70% of the peer-reviewed articles (24 of the 35) were first-authored by someone at the U.S. Centers for Disease Control and Prevention. There were very few first authors from VACS countries, which raises questions about who is leveraging the data and the extent to which capacities for data liberation are being developed within VACS countries. The VACS provide an unprecedented amount of information on the prevalence and past-year incidence of violence against children. Through a review of the peer-reviewed literature on the VACS, we can begin to identify trends and gaps in how the data are being used, as well as areas for further research.
Keywords: data to action, global health, implementation science, violence against children surveys
Procedia PDF Downloads 133
23491 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features
Authors: Stylianos Kampakis
Abstract:
This paper presents a new algorithm for neural networks called “Input Dropout with Aggressive Re-weighting” (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features. The technique can be seen as a generalization of dropout. The algorithm is tested on two different benchmark data sets: a noisy version of the iris dataset and the MADELON data set. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling data sets with many useless features.
Keywords: neural networks, feature selection, regularization, aggressive reweighting
Procedia PDF Downloads 455
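A hedged PyTorch sketch of the input-dropout half of IDAR, with dropout applied to the input features so that noisy columns are stochastically silenced; the abstract does not specify the aggressive re-weighting rule, so a per-feature L1 penalty stands in for it here purely as a placeholder:

```python
# Input-layer dropout: features, not hidden units, are randomly zeroed.
import torch
import torch.nn as nn

class InputDropoutNet(nn.Module):
    def __init__(self, n_features, p_input=0.5):
        super().__init__()
        self.input_drop = nn.Dropout(p_input)   # dropout on the inputs
        self.net = nn.Sequential(
            nn.Linear(n_features, 16), nn.ReLU(), nn.Linear(16, 2))

    def forward(self, x):
        return self.net(self.input_drop(x))

torch.manual_seed(0)
X = torch.randn(64, 20)          # 20 features, most useless by construction
y = (X[:, 0] > 0).long()         # the label depends on feature 0 only

model = InputDropoutNet(20)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(model(X), y)
    # placeholder for "aggressive re-weighting": L1 penalty on input weights
    loss = loss + 1e-3 * model.net[0].weight.abs().sum()
    loss.backward()
    opt.step()

# per-feature weight mass; feature 0 should dominate after training
print(torch.round(model.net[0].weight.abs().sum(0), decimals=2))
```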
23490 Digitalization of Functional Safety - Increasing Productivity while Reducing Risks
Authors: Michael Scott, Phil Jarrell
Abstract:
Digitalization seems to be everywhere these days. So if one were to digitalize Functional Safety, what would that require?
• Ability to directly use data from intelligent P&IDs / process design in a PHA / LOPA
• Ability to directly use data from intelligent P&IDs in the SIS design to support SIL verification calculations, the SRS, C&Es, and functional test plans
• Ability to create unit operation / SIF libraries to radically reduce engineering man-hours while ensuring consistency and improving the quality of SIS designs
• Ability to link data directly from a PHA / LOPA to SIS designs
• Ability to leverage reliability models and SRS details from SIS designs to automatically program the safety PLC
• Ability to leverage SIS test plans to automatically create safety PLC application logic test plans for a virtual FAT
• Ability to tie real-time data from process historians / CMMS to assumptions in the PHA / LOPA and SIS designs to generate leading indicators of protection layer health
• Ability to flag SIS bad actors for proactive corrective actions prior to a near miss or loss-of-containment event
What if I told you all of this is available today? This paper will highlight how the digital revolution has transformed the way Safety Instrumented Systems are designed, configured, operated, and maintained.
Keywords: IEC 61511, safety instrumented systems, functional safety, digitalization, IIoT
Procedia PDF Downloads 181
23489 Walmart Sales Forecasting using Machine Learning in Python
Authors: Niyati Sharma, Om Anand, Sanjeev Kumar Prasad
Abstract:
Estimating future sales is one of the essential elements of tactical development for any organization. Walmart sales forecasting is one of the best problems for a beginner to work with, since Walmart provides a major retail data set; Walmart also uses this sales estimation problem for hiring purposes. We analyze how internal and external factors affecting one of the largest companies in the US can influence its weekly sales in the future. Demand forecasting is the estimation of the future demand for products or services on the basis of present and past data and the different stages of the market. Since every organization faces an unknown future, future demand cannot be known directly; hence, by exploring historical and recent market statistics, we forecast the forthcoming demand and production of individual goods, which is more challenging in the near future. As a result, the required products can be produced in advance according to market demand. We use several machine learning models and test their accuracy on the whole data set: linear regression fitted to the training data achieves an accuracy of 8.88%, while the extra trees regression model gives the best accuracy of 97.15%.
Keywords: random forest algorithm, linear regression algorithm, extra trees classifier, mean absolute error
Procedia PDF Downloads 149
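A hedged sketch of the best-performing step reported, an extra-trees regressor on weekly sales; the features mirror those of the public Walmart dataset, but the rows below are synthetic:

```python
# Extra-trees regression on synthetic Walmart-style weekly sales features.
import numpy as np
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = np.column_stack([
    rng.integers(1, 46, n),        # store id
    rng.integers(0, 2, n),         # holiday week flag
    rng.uniform(2.5, 4.5, n),      # fuel price
    rng.uniform(126, 227, n),      # CPI
    rng.uniform(4, 14, n),         # unemployment rate
])
# synthetic weekly sales driven by holidays and unemployment
y = 1e6 + 2e5 * X[:, 1] - 3e4 * X[:, 4] + rng.normal(0, 5e4, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = ExtraTreesRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out weeks:", round(model.score(X_te, y_te), 3))
```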
23488 Modular Probe for Basic Monitoring of Water and Air Quality
Authors: Andrés Calvillo Téllez, Marianne Martínez Zanzarric, José Cruz Núñez Pérez
Abstract:
A modular system that performs basic monitoring of both water and air quality is presented. Monitoring is essential in the environmental, aquaculture, and agricultural disciplines, where this type of instrumentation is necessary for data collection. The system uses low-cost components, which allows readings close to those of high-cost probes. The probe records the coordinates of the geographical position as well as the time at which each target parameter is measured. The modules or subsystems that make up the probe are: a global positioning (GPS) module, which provides the altitude, latitude, and longitude of the point where the reading is recorded; a real-time clock stage, which stamps the date and time; an SD memory module, which continuously stores the data; a data acquisition system; a central processing unit; and a power supply. For water quality, the system acquires conductivity, pressure, and temperature; for air, three gases were sensed: ammonia, carbon dioxide, and carbon monoxide. The information obtained allowed us to identify when the monitored parameters change and the ideal conditions for the growth of microorganisms in the water.
Keywords: calibration, conductivity, datalogger, monitoring, real time clock, water quality
Procedia PDF Downloads 103
23487 Assessment of the Knowledge and Practices of Healthcare Workers and Patients Regarding Prevention of Tuberculosis at a Tertiary Care Hospital of Southern Punjab
Authors: Muhammad Shahbaz Akhtar
Abstract:
Background: Tuberculosis remains a significant public health challenge in Pakistan, with high incidence and prevalence rates, particularly among vulnerable populations. Addressing the TB burden requires comprehensive efforts to improve healthcare infrastructure, increase access to quality diagnosis and treatment services, raise public awareness, and address the socioeconomic determinants of health. Objective: To assess the knowledge and practices of healthcare workers and patients regarding the prevention of tuberculosis at a tertiary care hospital of Southern Punjab. Materials and methods: Data will be collected from 135 healthcare workers and 135 TB patients visiting Nishtar Hospital, Multan in this descriptive cross-sectional study, using a non-probability consecutive sampling technique. Approval to conduct the study will be obtained from the hospital authorities. Study participants will be recruited after informed written consent is taken and the objectives of the study are described to them. Participants will be assured of the confidentiality of their data and interviewed to assess their knowledge and practices regarding the prevention of tuberculosis. Data analysis procedure: Data will be entered and analyzed using SPSS version 25 to calculate means and standard deviations for numerical data such as age, disease duration, and duration of experience. Frequencies and percentages will be calculated for gender, age group, level of knowledge, qualification, designation, and practices. The impact of confounders such as gender, age group, duration of experience, disease duration, and designation will be assessed by stratification. Post-stratification, the chi-square test will be applied at a 0.05 level of significance with a 95% CI.
Keywords: tuberculosis, data analysis, HIV/AIDS, preventable
Procedia PDF Downloads 21
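A minimal sketch of the planned post-stratification test: a chi-square test of independence between knowledge level and participant group, on a made-up 2x2 table (the study's real counts are not in the abstract):

```python
# Chi-square test of independence on a hypothetical knowledge-by-group table.
from scipy.stats import chi2_contingency

#                 good knowledge, poor knowledge
table = [[80, 55],   # healthcare workers (n=135)
         [45, 90]]   # TB patients (n=135)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
```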
23486 Estimation of Natural Convection Heat Transfer from Plate-Fin Heat Sinks in a Closed Enclosure
Authors: Han-Taw Chen, Chung-Hou Lai, Tzu-Hsiang Lin, Ge-Jang He
Abstract:
This study applies the inverse method and three-dimensional commercial CFD software, in conjunction with experimental temperature data, to investigate the heat transfer and fluid flow characteristics of a plate-fin heat sink in a closed rectangular enclosure for various values of fin height. The inverse method, together with the finite difference method and the experimental temperature data, is applied to determine the heat transfer coefficient. The k-ε turbulence model is used to obtain the heat transfer and fluid flow characteristics within the fins. To validate the accuracy of the results obtained, a comparison of the average heat transfer coefficients is made. The calculated temperatures at selected measurement locations on the plate-fin are also compared with the experimental data.
Keywords: inverse method, FLUENT, k-ε model, heat transfer characteristics, plate-fin heat sink
Procedia PDF Downloads 460
23485 Medical and Surgical Nursing Care
Authors: Nassim Salmi
Abstract:
Postoperative mobilization is an important part of fundamental care. Increased mobilization has a positive effect on recovery, but immobilization remains a challenge in postoperative care. Aims: To report how the establishment of a national nursing database was used to measure postoperative mobilization in patients undergoing surgery for ovarian cancer. Mobilization was defined as at least 3 hours out of bed on postoperative day 1, with the goal of achieving this in 60% of patients. Clinical nurses performed the data entry for 4,400 patients with ovarian cancer. Findings: 46.7% of patients met the goal for mobilization on the first postoperative day, but variations in the duration and type of mobilization were observed; of those mobilized, 51.8% had been walking in the hallway. A national nursing database creates opportunities to optimize fundamental care: by comparing nursing data with oncological, surgical, and pathology data, it became possible to study mobilization in relation to cancer stage, comorbidity, treatment, and extent of surgery.
Keywords: postoperative care, gynecology, nursing documentation, database
Procedia PDF Downloads 116
23484 A South African Perspective on Palestine and the Motivation for a One-State Solution
Authors: Farhin Delawala
Abstract:
In the context of Palestine and the broader Middle East, this study delves into the apartheid regime in Palestine, the country under occupation, and the intricate ties between the United States of America (USA) and the settler colony of ‘Israel’. The paper provides an explanation of the colonisation of Palestine as well as of the forms of apartheid. Moreover, it explains the provisions of United Nations (UN) international law and how they have been broken by the settler colony of ‘Israel’. The paper contends that the US, motivated by its security interests in the region, has strategically influenced the political instability in the Middle East and the illegal occupation of Palestine. Furthermore, this paper proposes the alternative path of a one-state solution to foster a more peaceful and stable society, and advocates for the integration of the Palestinian population, from Gaza and the West Bank, into the region under equal citizenship rights, thereby dismantling the nature of the settler colony as an ethno-theocratic state.
Keywords: apartheid, one-state solution, Palestine, political instability, settler colony
Procedia PDF Downloads 65