Search results for: well data integration
23632 Media Literacy: Information and Communication Technology Impact on Teaching and Learning Methods in Albanian Education System
Authors: Loreta Axhami
Abstract:
Media literacy in the digital age emerges not only as a set of skills for generating true knowledge and information but also as a pedagogical methodology, a kind of educational philosophy. In addition to innovations such as the integration of information and communication technologies, media infrastructures, and web usage in the educational system, media literacy enables change in learning methods, pedagogy, teaching programs, and the school curriculum itself. In this framework, this study focuses on ICT's impact on teaching and learning methods and the degree to which they are reflected in the Albanian education system. The study is based on a combination of quantitative and qualitative methods of scientific research. The findings indicate that students' limited access to the internet in school, the focus on hardcopy textbooks, and the role of the teacher as the only or main source of knowledge and information are some of the main factors contributing to the implementation of authoritarian pedagogical methods in the Albanian education system. In these circumstances, the implementation of media literacy is recommended as an apt educational process for the 21st century, which requires a reconceptualization of textbooks as well as the application of modern teaching and learning methods that integrate information and communication technologies.
Keywords: authoritarian pedagogic model, education system, ICT, media literacy
Procedia PDF Downloads 140
23631 Artificial Neurons Based on Memristors for Spiking Neural Networks
Authors: Yan Yu, Wang Yu, Chen Xintong, Liu Yi, Zhang Yanzhong, Wang Yanji, Chen Xingyu, Zhang Miaocheng, Tong Yi
Abstract:
Neuromorphic computing based on spiking neural networks (SNNs) has emerged as a promising avenue for building the next generation of intelligent computing systems. Owing to their high-density integration, low power consumption, and outstanding nonlinearity, memristors have attracted growing attention for realizing SNNs. However, fabricating a low-power and robust memristor-based spiking neuron without extra electrical components remains a challenge for brain-inspired systems. In this work, we demonstrate a TiO₂-based threshold switching (TS) memristor that emulates a leaky integrate-and-fire (LIF) neuron without auxiliary circuits and is used to realize single-layer fully connected (FC) SNNs. Moreover, our TiO₂-based resistive switching (RS) memristors realize spike-timing-dependent plasticity (STDP), originating from the Ag diffusion-based filamentary mechanism. This work demonstrates that TiO₂-based memristors may provide an efficient route to constructing hardware neuromorphic computing systems.
Keywords: leaky integrate-and-fire, memristor, spiking neural networks, spike-timing-dependent plasticity
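The LIF dynamics that the TS memristor emulates can be sketched in software in a few lines. The parameter values below (resting potential, threshold, leak rate) are illustrative assumptions, not measurements from the device described in the abstract:

```python
# Minimal leaky integrate-and-fire (LIF) neuron sketch.
# All parameter values are illustrative, not device measurements.
def simulate_lif(input_current, v_rest=0.0, v_th=1.0, leak=0.1, dt=1.0):
    """Return a spike train (list of 0/1) for a sequence of input currents."""
    v = v_rest
    spikes = []
    for i in input_current:
        v += dt * (i - leak * (v - v_rest))  # leaky integration of the input
        if v >= v_th:                        # threshold crossing: fire
            spikes.append(1)
            v = v_rest                       # reset after the spike
        else:
            spikes.append(0)
    return spikes
```

With a constant sub-threshold input, the membrane potential integrates over several steps before each spike, reproducing the characteristic fire-and-reset pattern.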
Procedia PDF Downloads 134
23630 Continuous Dyeing of Graphene and Polyaniline on Textiles for Electromagnetic Interference Shielding: An Application of Intelligent Fabrics
Authors: Mourad Makhlouf, Sabrina Bouriche, Zoubir Benmaamar, Didier Villemin
Abstract:
Background: The increasing presence of electromagnetic interference (EMI) requires the development of effective protection solutions. Intelligent textiles offer a promising approach due to their wearability and the possibility of integration into everyday clothing. In this study, the use of graphene and polyaniline for EMI shielding on cotton fabrics was examined. Methods: The continuous dyeing of recycled graphite-derived graphene and polyaniline was examined. Bottom-reforming technology was adopted to improve adhesion and achieve a uniform distribution of the conductive material on the fiber surface. The effect of the material weight ratio on fabric performance and X-band EMI shielding effectiveness (SE) was evaluated. Significant Findings: The dyed cotton fabrics incorporating graphene, polyaniline, and their combination exhibited improved conductivity. Notably, these fabrics achieved EMI SE values ranging from 9 to 16 dB within the X-band frequency range (8-9 GHz). These findings demonstrate the potential of this approach for developing intelligent textiles with effective EMI shielding capabilities. Additionally, the utilization of recycled materials contributes to a more sustainable shielding solution.
Keywords: intelligent textiles, graphene, polyaniline, electromagnetic shielding, conductivity, recycling
Procedia PDF Downloads 43
23629 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)
Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton
Abstract:
Cold-start is a notoriously difficult problem that can occur in recommendation systems, arising when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm – the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST) – is proposed, designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment-projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, FAB-COST is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates, at one point, an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.
Keywords: cold-start learning, expectation propagation, multi-armed bandits, Thompson sampling, variational inference
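FAB-COST's EP/ADF moment matching for the logistic model is too involved for a short sketch, but the ADF property it exploits, a one-pass posterior update with constant cost per observation, can be illustrated with a conjugate Gaussian stand-in inside a Thompson Sampling loop. Everything below is a toy illustration, not the paper's algorithm:

```python
import math, random

# Toy stand-in for the ADF side of FAB-COST: a one-pass Gaussian posterior
# update used inside Thompson Sampling. The real algorithm moment-matches a
# logistic likelihood (EP early on, ADF once data are plentiful); the
# conjugate Gaussian case below is chosen only because its update is exact.
class OnlineBayesArm:
    def __init__(self, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
        self.mean, self.var, self.noise_var = prior_mean, prior_var, noise_var

    def update(self, reward):
        # Each observation is absorbed once and then discarded, so the cost
        # per update is O(1) regardless of dataset size -- the ADF property.
        k = self.var / (self.var + self.noise_var)
        self.mean += k * (reward - self.mean)
        self.var *= 1.0 - k

    def sample(self):
        # Thompson Sampling: draw from the posterior, act greedily on draws.
        return random.gauss(self.mean, math.sqrt(self.var))
```

At each round, choosing the arm with the largest `sample()` implements Thompson Sampling; FAB-COST's contribution is making the posterior behind `sample()` accurate at the cold start (EP) while keeping updates cheap at scale (ADF).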
Procedia PDF Downloads 108
23628 Design and Characterization of a Smart Composite Fabric for Knee Brace
Authors: Rohith J. K., Amir Nazemi, Abbas S. Milani
Abstract:
In Paralympic sports, athletes often depend on some form of equipment to enable competitive sport, where most of this equipment allows only passive physiological support and discrete physiological measurements. Active-feedback physiological support and continuous detection of performance indicators, without time or space constraints, would enable more effective training and performance measures for Paralympic athletes. Moreover, athletes occasionally suffer from fatigue and muscular strains due to improper monitoring systems. The latter challenges can be overcome by using smart composites technology when manufacturing, e.g., knee braces and other sports wearables, where the sensors are fused into the fabric and an assistive system actively supports the athlete. This paper shows how different sensing functionality may be created by intrinsic and extrinsic modifications of different types of composite fabrics, depending on the level of integration and the employed functional elements. Results demonstrate that fabric sensors can be well tailored to measure muscular strain and be used in the fabrication of a smart knee brace as a sample potential application. Materials, connectors, fabric circuits, interconnects, encapsulation, and fabrication methods associated with such smart fabric technologies prove to be customizable and versatile.
Keywords: smart composites, sensors, smart fabrics, knee brace
Procedia PDF Downloads 178
23627 A Discrete Element Method Centrifuge Model of Monopile under Cyclic Lateral Loads
Authors: Nuo Duan, Yi Pik Cheng
Abstract:
This paper presents the data of a series of two-dimensional Discrete Element Method (DEM) simulations of a large-diameter rigid monopile subjected to cyclic loading under a high gravitational force. At present, monopile foundations are widely used to support tall and heavy wind turbines, which are also subjected to significant loads from wind and wave actions. A safe design must address issues such as rotations and changes in soil stiffness under these loading conditions. Design guidance on the issue is limited, as is the availability of laboratory and field test data. The interpretation of these results in sand, such as the relation between loading and displacement, relies mainly on empirical correlations to pile properties. Regarding numerical models, most available data come from the Finite Element Method (FEM); these are not comprehensive, and most FEM results are sensitive to input parameters. The micro-scale behaviour could change the mechanism of the soil-structure interaction. A DEM model was used in this paper to study behaviour under cyclic lateral loads. A non-dimensional framework is presented and applied to interpret the simulation results. The DEM data compare well with various sets of published experimental centrifuge model test data in terms of lateral deflection. The accumulated permanent pile lateral displacements induced by the cyclic lateral loads were found to depend on the characteristics of the applied cyclic load, such as the extent of the loading magnitudes and directions.
Keywords: cyclic loading, DEM, numerical modelling, sands
Procedia PDF Downloads 321
23626 Estimation of Desktop E-Wastes in Delhi Using Multivariate Flow Analysis
Authors: Sumay Bhojwani, Ashutosh Chandra, Mamita Devaburman, Akriti Bhogal
Abstract:
This article uses material flow analysis to estimate desktop e-waste in the Delhi/NCR region. The material flow analysis is based on sales data obtained from various sources. Much of the available sales data is unreliable because of the existence of a huge informal sector; the informal sector in India accounts for more than 90%. Therefore, the scope of this study is limited to the formal sector only. Also, for projecting the sales data till 2030, we have used linear regression to avoid complexity. The actual sales in the years following 2015 may vary non-linearly, but we have assumed a basic linear relation. The purpose of this study was to obtain an approximate quantity of desktop e-waste that we will have by the year 2030 so that we can start preparing for the ineluctable investment in the treatment of these ever-rising e-wastes. The results of this study can be used to plan a treatment plant for e-wastes in Delhi.
Keywords: e-wastes, Delhi, desktops, estimation
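The linear projection of sales to 2030 described above amounts to a least-squares trend line extrapolated forward. A minimal sketch follows; the sales figures are made-up placeholders, not the study's data:

```python
# Least-squares linear trend fitted to yearly sales, then extrapolated to
# 2030, mirroring the linear projection used in the study. The sales
# figures below are hypothetical placeholders, not the study's data.
def fit_line(years, sales):
    n = len(years)
    mx, my = sum(years) / n, sum(sales) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, sales))
             / sum((x - mx) ** 2 for x in years))
    return slope, my - slope * mx   # (slope, intercept)

years = [2011, 2012, 2013, 2014, 2015]
sales = [100, 120, 138, 161, 180]   # hypothetical desktop sales (thousands)
m, b = fit_line(years, sales)
projected_2030 = m * 2030 + b       # sales level implied by the linear trend
```

As the abstract notes, the real post-2015 sales may vary non-linearly; this trend line is the deliberate simplification.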
Procedia PDF Downloads 259
23625 Geospatial Network Analysis Using Particle Swarm Optimization
Authors: Varun Singh, Mainak Bandyopadhyay, Maharana Pratap Singh
Abstract:
The shortest path (SP) problem concerns finding the shortest path from a specific origin to a specified destination in a given network while minimizing the total cost associated with the path. This problem has widespread applications. Important applications of the SP problem include vehicle routing in transportation systems, particularly in the field of in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem in transportation planning. Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization, and Particle Swarm Optimization (PSO) have emerged to solve complex optimization problems and to overcome the shortcomings of existing shortest path analysis methods. It has been reported by various researchers that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Further, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO for solving the shortest path problem between multiple points of interest (POI) based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the results of the analysis is carried out in GIS.
Keywords: particle swarm optimization, GIS, traffic data, outliers
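The core global-best PSO update that such work builds on can be sketched compactly. Note the caveats: a shortest-path application would additionally need a scheme for encoding candidate paths as particle positions, which is omitted here, and the parameter values (w, c1, c2) are conventional defaults rather than the paper's settings. The sketch minimizes a toy sphere cost:

```python
import random

# Core global-best PSO loop, shown minimizing a toy sphere function.
# A shortest-path application would encode candidate paths as particle
# positions and use path cost as the objective; that encoding is omitted.
random.seed(0)

def pso(cost, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    gbest = min(pbest, key=cost)[:]             # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(pos[i]) < cost(pbest[i]):
                pbest[i] = pos[i][:]
                if cost(pbest[i]) < cost(gbest):
                    gbest = pbest[i][:]
    return gbest

best = pso(lambda p: sum(x * x for x in p))  # minimum at the origin
```

Each particle is pulled toward its own best position and the swarm's best, with inertia `w` damping the velocity so the swarm converges.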
Procedia PDF Downloads 483
23624 Distributed Processing for Content Based Lecture Video Retrieval on Hadoop Framework
Authors: U. S. N. Raju, Kothuri Sai Kiran, Meena G. Kamal, Vinay Nikhil Pabba, Suresh Kanaparthi
Abstract:
There is a huge amount of lecture video data available for public use, and many more lecture videos are being created and uploaded every day. Searching for videos on required topics in this huge database is a challenging task; therefore, an efficient method for video retrieval is needed. An approach for automated video indexing and video search in large lecture video archives is presented. As the amount of video lecture data is huge, it is very inefficient to do the processing in a centralized computation framework; hence, the Hadoop framework for distributed computing on big video data is used. The first step in the process is automatic video segmentation and key-frame detection to offer a visual guideline for video content navigation. In the next step, we extract textual metadata by applying video Optical Character Recognition (OCR) technology on key-frames. The OCR output and detected slide text line types are adopted for keyword extraction, by which both video- and segment-level keywords are extracted for content-based video browsing and search. The performance of the indexing process can be improved for a large database by using distributed computing on the Hadoop framework.
Keywords: video lectures, big video data, video retrieval, Hadoop
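The keyword-extraction step over OCR'd key-frame text maps naturally onto the MapReduce model that Hadoop provides. The sketch below shows the map/reduce logic as plain functions; in deployment this would run as a Hadoop job (e.g. via Hadoop Streaming), and the stopword list and sample frames are minimal placeholders:

```python
from collections import Counter

# MapReduce-style word count over OCR text from key-frames, a sketch of
# the keyword-extraction stage. The stopword list is a tiny placeholder.
STOPWORDS = {"the", "a", "of", "and", "in", "to", "is"}

def map_phase(ocr_text):
    """Emit (word, 1) pairs for each non-stopword token on one key-frame."""
    for word in ocr_text.lower().split():
        if word.isalpha() and word not in STOPWORDS:
            yield word, 1

def reduce_phase(pairs):
    """Sum the counts per word across all key-frames."""
    counts = Counter()
    for word, n in pairs:
        counts[word] += n
    return counts

frames = ["Gradient Descent in Deep Learning", "Stochastic Gradient Descent"]
pairs = [p for text in frames for p in map_phase(text)]
keywords = reduce_phase(pairs).most_common(2)
```

Running the same map function on every node's shard of key-frames and shuffling the pairs to reducers is exactly what the Hadoop framework automates at scale.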
Procedia PDF Downloads 534
23623 A Critical Analysis of Environmental Investment in India
Authors: K. Y. Chen, H. Chua, C. W. Kan
Abstract:
Environmental investment is an important issue in many countries. In this study, we first review the environmental issues related to India and their effect on economic development. Secondly, economic data are collected from government yearly statistics, including information on India's environmental investment. Finally, we correlate the data in order to find out the relationship between environmental investment and sustainable development in India. Therefore, in this paper, we aim to analyse the effect of environmental investment on sustainable development in India. Based on the economic data collected, India is a developing country with rapid population and GDP growth. India is facing environmental problems due to its high-speed development; however, environmental investment can have a positive impact on sustainable development in India. Environmental investment has been growing at the same rate as GDP. Acknowledgment: The authors would like to thank the Hong Kong Polytechnic University for its financial support of this work.
Keywords: India, environmental investment, sustainable development, analysis
Procedia PDF Downloads 315
23622 A Robust System for Foot Arch Type Classification from Static Foot Pressure Distribution Data Using Linear Discriminant Analysis
Authors: R. Periyasamy, Deepak Joshi, Sneh Anand
Abstract:
Foot posture assessment is important for evaluating foot types that cause gait and postural defects in all age groups. Although different methods are used for the classification of foot arch type in clinical/research examination, there is no clear approach for selecting the most appropriate measurement system. Therefore, the aim of this study was to develop a system for evaluation of foot type, as a clinical decision-making aid for the diagnosis of flat and normal arches, based on the Arch Index (AI) and the foot pressure distribution parameter Power Ratio (PR). The accuracy of the system was evaluated for 27 subjects with ages ranging from 24 to 65 years. Foot area measurements (hindfoot, midfoot, and forefoot) were acquired simultaneously from the foot pressure intensity image using the portable PedoPowerGraph system, and the image was analysed in the frequency domain to obtain the foot pressure distribution parameter PR. From our results, we obtain 100% classification accuracy of normal and flat feet by using the linear discriminant analysis method. We observe that there is no misclassification of foot types because foot pressure distribution data are incorporated instead of only the arch index (AI). We found that the mid-foot pressure distribution ratio data and the arch index (AI) value are well correlated to foot arch type based on visual analysis. Therefore, this paper suggests that the proposed system is accurate, and foot arch type is easy to determine from the arch index (AI) together with the mid-foot pressure distribution ratio data instead of the physical area of contact. Hence, such a computational-tool-based system can help clinicians assess foot structure and cross-check their diagnosis of flat foot from the mid-foot pressure distribution.
Keywords: arch index, computational tool, static foot pressure intensity image, foot pressure distribution, linear discriminant analysis
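Two-class LDA on two features (AI and PR) reduces to projecting onto the direction given by the inverse pooled covariance times the difference of class means, then thresholding at the projected midpoint. A stdlib-only sketch follows; the AI/PR values used in testing are synthetic, not the study's measurements:

```python
# Two-class linear discriminant analysis sketch on two features
# (arch index AI and power ratio PR). Data values are synthetic.
def mean(vs):
    return [sum(col) / len(vs) for col in zip(*vs)]

def lda_fit(class0, class1):
    m0, m1 = mean(class0), mean(class1)
    # Pooled 2x2 covariance of both classes.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for data, m in ((class0, m0), (class1, m1)):
        for x in data:
            d = [x[0] - m[0], x[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    n = len(class0) + len(class1) - 2
    s = [[v / n for v in row] for row in s]
    det = s[0][0] * s[1][1] - s[0][1] * s[1][0]
    inv = [[s[1][1] / det, -s[0][1] / det],
           [-s[1][0] / det, s[0][0] / det]]
    dm = [m1[0] - m0[0], m1[1] - m0[1]]
    w = [inv[0][0] * dm[0] + inv[0][1] * dm[1],     # w = S^-1 (m1 - m0)
         inv[1][0] * dm[0] + inv[1][1] * dm[1]]
    # Decision threshold at the projected midpoint of the class means.
    c = sum(wi * (a + b) / 2 for wi, a, b in zip(w, m0, m1))
    return w, c

def lda_predict(w, c, x):
    return 1 if w[0] * x[0] + w[1] * x[1] > c else 0   # 1 = flat arch
```

With well-separated (AI, PR) clusters for normal and flat feet, this discriminant recovers the classes exactly, consistent with the 100% accuracy the study reports on its own data.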
Procedia PDF Downloads 499
23621 Randomly Casted Single-Wall Carbon Nanotubes Films for High Performance Hybrid Photovoltaic Devices
Authors: My Ali El Khakani
Abstract:
Single-wall carbon nanotubes (SWCNTs) possess an unprecedented combination of unique properties that make them highly promising for a new generation of photovoltaic (PV) devices. Prior to discussing the integration of SWCNT films into effective PV devices, we will briefly highlight our work on the synthesis of SWCNTs by means of the KrF pulsed laser deposition technique, their purification, and their transfer onto n-silicon substrates to form p-n junctions. Some of the structural and optoelectronic properties of SWCNTs relevant to PV applications will be emphasized. By varying the SWCNT film density (µg/cm²), we were able to point out the existence of an optimum value that yields the highest photoconversion efficiency (PCE) of ~10%. Further control of the doping of the p-SWCNT films, through their exposure to nitric acid vapors, along with the insertion of an optimized hole-extraction layer in the p-SWCNT/n-Si hybrid devices, permitted a PCE value as high as 14.2% to be achieved. Such a high PCE value demonstrates the full potential of these p-SWCNT/n-Si devices for sunlight photoconversion. On the other hand, by examining both the optical transmission and electrical conductance of the SWCNT films, we established a figure of merit (FOM) that was shown to correlate well with the PCE performance. Such a direct relationship between the FOM and the PCE can be used as a guide for further PCE enhancement of these novel p-SWCNT/n-Si PV devices.
Keywords: carbon nanotubes (CNTs), CNTs-silicon hybrid devices, photoconversion, photovoltaic devices, pulsed laser deposition
Procedia PDF Downloads 118
23620 Students’ Awareness of the Use of Poster, Power Point and Animated Video Presentations: A Case Study of Third Year Students of the Department of English of Batna University
Authors: Bahloul Amel
Abstract:
The present study examines students’ perceptions of the use of technology in learning English as a Foreign Language. Its aim is to explore and understand students’ preparation and presentation of Posters, PowerPoint, and Animated Videos by drawing attention to visual and oral elements. The data were collected through observations and semi-structured interviews and analyzed through phenomenological data analysis steps. The themes that emerged from the data (satisfaction with visual learning in using information and communication technology, providing structure to oral presentations, and learning from peers’ presentations) draw attention to the use of Posters, PowerPoint, and Animated Videos, as each supports visual learning and the organization of thoughts in oral presentations.
Keywords: EFL, posters, PowerPoint presentations, animated videos, visual learning
Procedia PDF Downloads 445
23619 The Relationship Between Hourly Compensation and Unemployment Rate Using the Panel Data Regression Analysis
Authors: S. K. Ashiquer Rahman
Abstract:
The paper concentrates on the importance of hourly compensation, emphasizing the significance of the unemployment rate. The unemployment rate and hourly compensation are two of the most important indicators for a nation: they are not merely statistics but have profound effects on individuals, families, and the economy, and they are inversely related to one another. The unemployment rate will probably decline as hourly compensation in manufacturing rises, since higher compensation can reduce unemployment and increase job prospects; increased hourly compensation in the manufacturing sector could therefore have a favorable effect on job-changing behaviour. Moreover, the relationship between hourly compensation and unemployment is complex and influenced by broader economic factors. In this paper, we use panel data regression models to evaluate the expected link between hourly compensation and the unemployment rate in order to determine the effect of hourly compensation on the unemployment rate. We estimate the fixed effects model, evaluate the error components, and determine which model (the FEM or ECM) is better by pooling all 60 observations. We then analyze the data by comparing three countries (the United States, Canada, and the United Kingdom) using panel data regression models. Finally, we provide results, analysis, and a summary of the extensive research on how hourly compensation affects the unemployment rate. Additionally, this paper offers relevant and useful information to help the government and the academic community use an econometric and social approach to lessen the effect of hourly compensation on the unemployment rate.
Keywords: hourly compensation, unemployment rate, panel data regression models, dummy variables, random effects model, fixed effects model, linear regression model
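The fixed-effects (within) estimator used in such panel regressions demeans each variable within each country before running pooled OLS, which removes time-invariant country effects. A minimal sketch for one regressor, with a synthetic balanced panel rather than the study's data:

```python
# Within (fixed-effects) estimator sketch for a balanced panel: demean y
# and x within each country, then run pooled OLS on the demeaned data.
# The panel below is synthetic, not the study's 60-observation dataset.
def within_transform(values, groups):
    means = {}
    for g, v in zip(groups, values):
        means.setdefault(g, []).append(v)
    means = {g: sum(vs) / len(vs) for g, vs in means.items()}
    return [v - means[g] for g, v in zip(groups, values)]

def fixed_effects_slope(y, x, groups):
    yd = within_transform(y, groups)
    xd = within_transform(x, groups)
    # Pooled OLS through the origin on the demeaned data.
    return sum(a * b for a, b in zip(xd, yd)) / sum(a * a for a in xd)
```

Because the country-specific intercepts are differenced away, the slope is identified purely from within-country variation over time.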
Procedia PDF Downloads 81
23618 Comparison of Security Challenges and Issues of Mobile Computing and Internet of Things
Authors: Aabiah Nayeem, Fariha Shafiq, Mustabshra Aftab, Rabia Saman Pirzada, Samia Ghazala
Abstract:
In this modern era of technology, the concept of the Internet of Things is very popular in every domain. It is a widely distributed system of things in which the data collected from sensory devices are transmitted, analyzed locally/collectively, and then broadcast to a network where action can be taken remotely via mobile/web apps. Today's mobile computing is also gaining importance, as services are provided during mobility. Through mobile computing, data are transmitted via computer without being physically connected to a fixed point. The challenge is to provide services with high speed and security. Also, the data gathered from mobiles must be processed in a secure way. Mobile computing is strongly influenced by the internet of things. In this paper, we discuss the security issues and challenges of the internet of things and mobile computing, and we compare the two on the basis of their similarities and dissimilarities.
Keywords: embedded computing, internet of things, mobile computing, wireless technologies
Procedia PDF Downloads 316
23617 Identifying the Goals of a Multicultural Curriculum for the Primary Education Course
Authors: Fatemeh Havas Beigi
Abstract:
The purpose of this study is to identify the objectives of a multicultural curriculum for the primary education period from the perspective of ethnic teachers, education experts, and cultural professionals. The research paradigm is interpretive, the research approach is qualitative, the research strategy is content analysis, and the sampling method is purposive snowball sampling; the sample of informants (Iranian ethnic teachers and experts) reached theoretical saturation at 67 people. The data collection tools were semi-structured individual and focus-group interviews. The data were recorded in audio format, and first- and second-cycle coding were used to analyze the data. Based on the data analysis, eleven objectives for a multicultural curriculum were identified: attention to ethnic equality, expanding educational opportunities and justice, peaceful coexistence, education against ethnic and racial discrimination, attention to human value and dignity, acceptance of religious diversity, getting to know ethnicities and cultures, promoting teaching-learning, fostering self-confidence, building national unity, and developing cultural commonalities.
Keywords: objective, multicultural curriculum, connect, elementary education period
Procedia PDF Downloads 94
23616 Serviceability of Fabric-Formed Concrete Structures
Authors: Yadgar Tayfur, Antony Darby, Tim Ibell, Mark Evernden, John Orr
Abstract:
Fabric formwork is a technique for casting concrete structures with the great advantage of saving up to 40% of concrete material. This technique is particularly associated with optimized concrete structures, which usually have smaller cross-section dimensions than equivalent prismatic members. However, this can leave the structural system produced from these members with smaller serviceability safety margins. Therefore, it is very important to understand the serviceability of non-prismatic concrete structures. In this paper, an analytical computer-based model to optimize concrete beams and to predict the load-deflection behaviour of both prismatic and non-prismatic concrete beams is presented. The model was developed based on the method of sectional analysis and integration of curvatures. Results from the analytical model were compared to the load-deflection behaviour of a number of beams with different geometric and material properties reported by other researchers. The comparison shows that the analytical program can accurately predict the load-deflection response of concrete beams with medium reinforcement ratios, although it over-estimates deflection values for lightly reinforced specimens. Finally, the analytical program acceptably predicted the load-deflection behaviour of non-prismatic concrete beams.
Keywords: fabric-formed concrete, continuous beams, optimisation, serviceability
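The "integration of curvatures" step can be sketched numerically: integrate the curvature field once to get rotations, a second time to get deflections, and fix the constant of integration from the support conditions. The sketch below uses the trapezoidal rule for a simply supported beam (zero deflection at both ends) and glosses over sign conventions; the curvature field in the test is a toy constant value, not output of the sectional analysis:

```python
# Deflection from curvature by double numerical integration (trapezoidal
# rule), the core of the sectional-analysis approach. Simply supported
# beam: v = 0 at both ends. Sign conventions are simplified.
def deflection_from_curvature(kappa, xs):
    """Integrate the curvature field twice along positions xs."""
    n = len(xs)
    # First integration: rotation theta (up to an integration constant).
    theta = [0.0]
    for i in range(1, n):
        h = xs[i] - xs[i - 1]
        theta.append(theta[-1] + 0.5 * (kappa[i] + kappa[i - 1]) * h)
    # Second integration: deflection v (up to a linear term C*x).
    v = [0.0]
    for i in range(1, n):
        h = xs[i] - xs[i - 1]
        v.append(v[-1] + 0.5 * (theta[i] + theta[i - 1]) * h)
    # Choose the constant so that the deflection vanishes at the far end.
    c = v[-1] / (xs[-1] - xs[0])
    return [vi - c * (x - xs[0]) for vi, x in zip(v, xs)]
```

For a non-prismatic beam, the curvature at each station would come from a moment-curvature analysis of that station's cross-section, which is exactly where the varying geometry enters.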
Procedia PDF Downloads 372
23615 The Carbon Trading Price and Trading Volume Forecast in Shanghai City by BP Neural Network
Authors: Liu Zhiyuan, Sun Zongdi
Abstract:
In this paper, a BP neural network model is established to predict the carbon trading price and carbon trading volume in Shanghai City. First, we collect carbon trading price and carbon trading volume data for Shanghai City from September 30, 2015 to December 23, 2016. The price and volume data were processed to obtain moving averages over 5, 10, 20, 30, and 60 observations of the carbon trading price and trading volume. These data are then used as inputs to the BP neural network model. Finally, after training the BP neural network, predicted values of the Shanghai carbon trading price and trading volume are obtained, and the model is tested.
Keywords: carbon trading price, carbon trading volume, BP neural network model, Shanghai City
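A BP (backpropagation) network of the kind described is a feed-forward network trained by gradient descent. The minimal sketch below uses one hidden sigmoid layer and a linear output; the architecture, learning rate, and toy data are illustrative assumptions, not the paper's configuration:

```python
import math, random

# Minimal BP (backpropagation) network: one hidden sigmoid layer and a
# linear output, trained by gradient descent on squared error.
random.seed(1)

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

class BPNet:
    def __init__(self, n_in, n_hidden):
        self.w1 = [[random.uniform(-1, 1) for _ in range(n_in)]
                   for _ in range(n_hidden)]
        self.w2 = [random.uniform(-1, 1) for _ in range(n_hidden)]

    def forward(self, x):
        self.h = [sigmoid(sum(w * xi for w, xi in zip(row, x)))
                  for row in self.w1]
        return sum(w * hi for w, hi in zip(self.w2, self.h))

    def train(self, x, target, lr=0.1):
        err = self.forward(x) - target
        for j, hj in enumerate(self.h):
            grad_h = err * self.w2[j] * hj * (1 - hj)  # backprop through sigmoid
            self.w2[j] -= lr * err * hj
            for i in range(len(x)):
                self.w1[j][i] -= lr * grad_h * x[i]
```

In the study's setting, each input vector would hold the 5-, 10-, 20-, 30-, and 60-period moving averages, and the target would be the next carbon trading price or volume.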
Procedia PDF Downloads 352
23614 Modeling Anisotropic Damage Algorithms of Metallic Structures
Authors: Bahar Ayhan
Abstract:
The present paper is concerned with the numerical modeling of the inelastic behavior of anisotropically damaged ductile materials, based on a generalized macroscopic theory within the framework of continuum damage mechanics. Kinematic decomposition of the strain rates into elastic, plastic, and damage parts is the basis for the structure of the continuum theory. The evolution of the damage strain rate tensor is detailed with consideration of anisotropic effects. Helmholtz free energy functions are constructed separately for the elastic and inelastic behaviors in order to address the plastic and damage processes. Additionally, the constitutive structure, which is based on the standard dissipative material approach, is elaborated with a stress tensor, a yield criterion for plasticity, and a fracture criterion for damage, besides the potential functions of each inelastic phenomenon. The finite element method is used to approximate the linearized variational problem. Stress and strain outcomes are solved by using a numerical integration algorithm based on an operator split methodology, with separate plastic and damage multiplier variables. Numerical simulations are presented in order to demonstrate the efficiency of the formulation by comparison with examples in the literature.
Keywords: anisotropic damage, finite element method, plasticity, coupling
Procedia PDF Downloads 206
23613 An Investigation of the Relationship Between Privacy Crisis, Public Discourse on Privacy, and Key Performance Indicators at Facebook (2004–2021)
Authors: Prajwal Eachempati, Laurent Muzellec, Ashish Kumar Jha
Abstract:
We use Facebook as a case study to investigate the complex relationship between the firm’s public discourse (and actions) surrounding data privacy and the performance of a business model based on monetizing users’ data. We do so by looking at the evolution of public discourse over time (2004–2021) and relating topics to revenue and stock market evolution, drawing from archival sources such as statements by Zuckerberg. We use the LDA topic modelling algorithm to reveal 19 topics regrouped into 6 major themes. We first show how, by using persuasive and convincing language that promises better protection of consumer data usage but also emphasizes greater user control over their own data, the privacy issue is being reframed as one of greater user control and responsibility. Second, we aim to understand and put a value on the extent to which privacy disclosures have a potential impact on the financial performance of social media firms. We found a significant relationship between the topics pertaining to privacy and social media/technology, sentiment score, and stock market prices. Revenue is found to be impacted by topics pertaining to politics and new product and service innovations, while the number of active users is not impacted by the topics unless moderated by external control variables such as Return on Assets and Brand Equity.
Keywords: public discourses, data protection, social media, privacy, topic modeling, business models, financial performance
Procedia PDF Downloads 92
23612 Invention of Novel Technique of Process Scale Up by Using Solid Dosage Form
Authors: Shashank Tiwari, S. P. Mahapatra
Abstract:
The aim of this technique is to reduce the number of process scale-up steps, saving time and cost for industry. The new steps are: Novel Lab Scale, Novel Lab Scale Trials, Novel Trial Batches, Novel Exhibit Batches, and Novel Validation Batches. In these steps, the validation batches are not divided into three parts; instead, the data from the trial batches, exhibit batches, and validation batches are compiled, used for production, and used for validation. The technique also increases the batch sizes of the trial and exhibit batches: the new trial batch size is not less than fifty thousand, the exhibit batches increase up to two lakh, and the validation batches up to five lakh. After preparing the batches, all of their data and drug samples are used for stability studies, the validation record is maintained, and the compiled data are used for technology transfer to the production department for preparing marketed-size batches.
Keywords: batches, technique, preparation, scale up, validation
Procedia PDF Downloads 357
23611 Jordan Water District Interactive Billing and Accounting Information System
Authors: Adrian J. Forca, Simeon J. Cainday III
Abstract:
The Jordan Water District Interactive Billing and Accounting Information System is designed for Jordan Water District to uplift the efficiency and effectiveness of its services to its customers. It is designed to compute water bills accurately and quickly by automating the manual process and ensuring that correct rates and fees are applied. In addition to the billing process, a mobile app will be integrated to support rapid and accurate water bill generation. An interactive feature will be incorporated to support electronic billing for customers who wish to receive water bills by electronic mail. The system will also improve, organize, and avoid data inaccuracy in accounting processes, because data will be stored in a database designed through normalization to be logically correct. Furthermore, strict programming constraints will be enforced to validate account access privileges based on job function and on the data being stored and retrieved, ensuring data security, reliability, and accuracy. The system will be able to cater to the billing and accounting services of Jordan Water District, setting aside the manual process and adapting to modern technological innovations. Keywords: accounting, bill, information system, interactive
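A minimal sketch of the automated rate computation described above, assuming a hypothetical tiered rate table (the district's actual rates and fees are not given in the abstract):

```python
# Hypothetical tiered rate table: (upper bound of tier in cubic meters, rate per m3).
TIERS = [(10, 5.00), (20, 7.50), (float("inf"), 10.00)]

def compute_water_bill(consumption_m3, tiers=TIERS):
    """Apply each rate only to the consumption that falls within its tier."""
    bill, lower = 0.0, 0
    for upper, rate in tiers:
        if consumption_m3 <= lower:
            break
        billable = min(consumption_m3, upper) - lower
        bill += billable * rate
        lower = upper
    return round(bill, 2)

# 25 m3 = 10 m3 at 5.00 + 10 m3 at 7.50 + 5 m3 at 10.00
print(compute_water_bill(25))  # 175.0
```

Centralizing the rate table like this is what lets the system guarantee that "correct rates and fees are applied" instead of relying on per-clerk manual arithmetic.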
Procedia PDF Downloads 251
23610 Enhancing Cultural Heritage Data Retrieval by Mapping COURAGE to CIDOC Conceptual Reference Model
Authors: Ghazal Faraj, Andras Micsik
Abstract:
The CIDOC Conceptual Reference Model (CRM) is an extensible ontology that provides integrated access to heterogeneous digital datasets. The CIDOC-CRM offers a “semantic glue” intended to promote accessibility to several diverse and dispersed sources of cultural heritage data. It achieves this by providing a formal structure for the implicit and explicit concepts and their relationships in the cultural heritage field. The COURAGE (“Cultural Opposition – Understanding the CultuRal HeritAGE of Dissent in the Former Socialist Countries”) project aimed to explore the methods of socialist-era cultural resistance during 1950-1990 and was planned to serve as a basis for further narratives and digital humanities (DH) research. The project highlights the diversity of the alternative cultural scenes that flourished in Eastern Europe before 1989. The COURAGE dataset is an online RDF-based registry of historical people, organizations, collections, and featured items. To increase the inter-links between different datasets and retrieve more relevant data from various data silos, a shared federated ontology for reconciled data is needed. As a first step towards these goals, a full understanding of the CIDOC CRM ontology (the target ontology), as well as of the COURAGE dataset, was required. Subsequently, the queries toward the ontology were determined, and a table of equivalent properties from COURAGE and CIDOC CRM was created. The structural diagrams that clarify the mapping process and construct the queries are in progress, mapping person, organization, and collection entities to the ontology. Through mapping the COURAGE dataset to the CIDOC-CRM ontology, the dataset will share a common ontological foundation with several other datasets.
Therefore, the expected results are: 1) retrieving more detailed data about existing entities, 2) retrieving new entities’ data, 3) aligning the COURAGE dataset to a standard vocabulary, and 4) running distributed SPARQL queries over several CIDOC-CRM datasets and testing the potential of distributed query answering using SPARQL. The next step is to map CIDOC-CRM to other upper-level ontologies or large datasets (e.g., DBpedia, Wikidata) and to address similar questions on a wide variety of knowledge bases. Keywords: CIDOC CRM, cultural heritage data, COURAGE dataset, ontology alignment
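The table of equivalent properties can be thought of as a predicate-rewriting step over the RDF triples. A minimal sketch, with illustrative CIDOC CRM property names standing in for the paper's actual mapping table:

```python
# Illustrative equivalence table from COURAGE predicates to CIDOC CRM ones;
# the specific property pairings here are assumptions, not the paper's table.
PROPERTY_MAP = {
    "courage:name": "crm:P1_is_identified_by",
    "courage:memberOf": "crm:P107i_is_current_or_former_member_of",
}

def map_triples(triples, prop_map):
    """Rewrite each predicate via the equivalence table; keep unmapped ones as-is."""
    return [(s, prop_map.get(p, p), o) for s, p, o in triples]

# Hypothetical COURAGE triples for a person entity.
courage_triples = [
    ("courage:person/42", "courage:name", "Jane Doe"),
    ("courage:person/42", "courage:memberOf", "courage:org/7"),
    ("courage:person/42", "courage:note", "left unmapped"),
]

mapped = map_triples(courage_triples, PROPERTY_MAP)
for triple in mapped:
    print(triple)
```

Once predicates share the CRM vocabulary, the same SPARQL query pattern can be answered over COURAGE and any other CRM-aligned dataset, which is what enables the distributed querying in expected result 4.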
Procedia PDF Downloads 146
23609 Gendered Labelling and Its Effects on Vhavenda Women
Authors: Matodzi Rapalalani
Abstract:
According to Spencer's (2018) classic labelling theory, labels influence the perceptions of both the individual and other members of society. That is, once labelled, individuals act in ways that confirm the stereotypes attached to the label. This study therefore investigates the understanding of gendered labelling and its effects on Vhavenda women. Gender socialization and patriarchy have been viewed as the core causes of the problem. The literature review covers the development of gendered labelling, its forms, and other aspects. A qualitative method of data collection was used in this study, and semi-structured interviews were conducted. A total of 6 participants took part, as a small sample is easier to manage. Thematic analysis was used to interpret and analyze the data. Ethical issues such as confidentiality, informed consent, and voluntary participation were considered. Through the analysis and data interpretation, causes such as a lack of Christian values, insecurities, and lust were mentioned, as well as effects such as frustration, increased divorce, and low self-esteem. Keywords: gender, naming, Venda, women, African culture
Procedia PDF Downloads 91
23608 Violence against Children Surveys: Analysis of the Peer-Reviewed Literature from 2009-2019
Authors: Kathleen Cravero, Amanda Nace, Samantha Ski
Abstract:
The Violence Against Children Surveys (VACS) are nationally representative surveys of male and female youth ages 13-24, designed to measure the burden of sexual, physical, and emotional violence experienced in childhood and adolescence. As of 2019, 24 countries have implemented or are in the process of implementing a VACS, covering over ten percent of the world’s child population. Since the first article using VACS data from Swaziland was published in 2009, several peer-reviewed articles have been published on the VACS. However, no publication to date has analyzed the breadth of this work or how the data are represented in the peer-reviewed literature. In this study, we conducted a literature review of all peer-reviewed research that used VACS data or discussed the implementation and methodology of the VACS. The literature review revealed several important findings. Between 2009 and July 2019, thirty-five peer-reviewed articles using VACS data from 12 countries were published. Twenty of the studies focus on one country, while 15 focus on two or more countries. Some countries are featured in the literature more than others, for example, Kenya (N=14), Malawi (N=12), and Tanzania (N=12). A review of the research by gender demonstrates that research on violence against boys is under-represented: only two studies specifically focused on boys/young men, while 11 focused only on violence against girls, despite research suggesting that boys and girls experience similar rates of violence. A review of the publications by type of violence revealed significant differences in the types of violence featured in the literature: thirteen publications focused specifically on sexual violence, three on physical violence, and only one on emotional violence. Almost 70% of the peer-reviewed articles (24 of the 35) were first-authored by someone at the U.S. Centers for Disease Control and Prevention.
There were very few first authors from VACS countries, which raises questions about who is leveraging the data and the extent to which capacities for data liberation are being developed within VACS countries. The VACS provide an unprecedented amount of information on the prevalence and past-year incidence of violence against children. Through a review of the peer-reviewed literature on the VACS, we can begin to identify trends and gaps in how the data are being used, as well as areas for further research. Keywords: data to action, global health, implementation science, violence against children surveys
Procedia PDF Downloads 133
23607 Training a Neural Network Using Input Dropout with Aggressive Reweighting (IDAR) on Datasets with Many Useless Features
Authors: Stylianos Kampakis
Abstract:
This paper presents a new algorithm for neural networks called “Input Dropout with Aggressive Re-weighting” (IDAR), aimed specifically at datasets with many useless features. IDAR combines two techniques (dropout of input neurons and aggressive re-weighting) in order to eliminate the influence of noisy features; the technique can be seen as a generalization of dropout. The algorithm is tested on two different benchmark datasets: a noisy version of the iris dataset and the MADELON dataset. Its performance is compared against three other popular techniques for dealing with useless features: L2 regularization, LASSO, and random forests. The results demonstrate that IDAR can be an effective technique for handling datasets with many useless features. Keywords: neural networks, feature selection, regularization, aggressive reweighting
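The abstract does not give IDAR's exact update rule, but its core idea (dropping input features with a probability tied to a per-feature weight, so that features judged useless are silenced more often) can be sketched as follows; the weighting scheme here is an assumption for illustration only:

```python
import numpy as np

rng = np.random.default_rng(0)

def weighted_input_dropout(X, drop_prob, feature_weights):
    """Zero out input features, keeping each with a probability proportional
    to its weight; low-weight (suspected useless) features are dropped more
    often. A sketch of the idea, not the paper's exact re-weighting rule."""
    keep_prob = (1.0 - drop_prob) * feature_weights / feature_weights.max()
    mask = rng.random(X.shape[1]) < keep_prob
    return X * mask

X = rng.normal(size=(5, 10))  # 5 samples, 10 input features
w = np.ones(10)
w[5:] = 0.0                   # pretend the last 5 features were down-weighted to zero

X_dropped = weighted_input_dropout(X, drop_prob=0.2, feature_weights=w)
print(X_dropped.shape)        # zero-weight columns come out all zero
```

Applied per mini-batch during training, this both regularizes the network (as ordinary dropout does) and starves connections from useless inputs, which is the feature-selection effect the paper measures against L2, LASSO, and random forests.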
Procedia PDF Downloads 455
23606 Digitalization of Functional Safety - Increasing Productivity while Reducing Risks
Authors: Michael Scott, Phil Jarrell
Abstract:
Digitalization seems to be everywhere these days. So if one were to digitalize Functional Safety, what would that require?
• The ability to directly use data from intelligent P&IDs / process design in a PHA / LOPA
• The ability to directly use data from intelligent P&IDs in the SIS design to support SIL verification calculations, the SRS, C&Es, and functional test plans
• The ability to create unit operation / SIF libraries to radically reduce engineering man-hours while ensuring consistency and improving the quality of SIS designs
• The ability to link data directly from a PHA / LOPA to SIS designs
• The ability to leverage reliability models and SRS details from SIS designs to automatically program the safety PLC
• The ability to leverage SIS test plans to automatically create safety PLC application logic test plans for a virtual FAT
• The ability to tie real-time data from process historians / CMMS to assumptions in the PHA / LOPA and SIS designs to generate leading indicators of protection layer health
• The ability to flag SIS bad actors for proactive corrective action before a near miss or loss-of-containment event
What if I told you all of this is available today? This paper will highlight how the digital revolution has transformed the way Safety Instrumented Systems are designed, configured, operated, and maintained. Keywords: IEC 61511, safety instrumented systems, functional safety, digitalization, IIoT
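As a flavor of the SIL verification calculations mentioned above, here is the textbook single-channel (1oo1) approximation PFDavg ≈ λ_DU · TI / 2. This is a deliberate simplification, with hypothetical input values; real SIL verification tools add proof-test coverage, common-cause, and repair-time terms:

```python
def pfd_avg_1oo1(lambda_du_per_hour, proof_test_interval_hours):
    """Average probability of failure on demand for a 1oo1 architecture,
    using the simplified textbook formula PFDavg = lambda_DU * TI / 2."""
    return lambda_du_per_hour * proof_test_interval_hours / 2

# Hypothetical dangerous-undetected failure rate and an annual proof test.
pfd = pfd_avg_1oo1(lambda_du_per_hour=2e-7, proof_test_interval_hours=8760)
print(f"PFDavg = {pfd:.2e}")  # 8.76e-04
```

Automating this calculation from SIS design data, rather than re-keying failure rates into spreadsheets, is exactly the kind of step the bullet list above envisions.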
Procedia PDF Downloads 181
23605 Walmart Sales Forecasting using Machine Learning in Python
Authors: Niyati Sharma, Om Anand, Sanjeev Kumar Prasad
Abstract:
Estimating future sales is one of the essential elements of tactical development for any organization. Walmart sales forecasting is a fine illustration to work with as a beginner, since it offers a major retail dataset; Walmart also uses this sales-estimation problem for hiring purposes. We analyze how internal and external effects on one of the largest companies in the US can influence its weekly sales in the future. Demand forecasting is the planned prediction of demand for products or services on the basis of present and previous data and the different stages of the market. Since every organization faces an unknown future, demand cannot be known in advance; hence, by exploring former statistics and recent market statistics, we forecast the forthcoming demand for, and production of, individual goods, which is more challenging in the near future. As a result, we can produce the required products in line with market demand in advance. We use several machine learning models to test accuracy and then train on the whole dataset. Linear regression, fitted to the training data, gives an accuracy of 8.88%, while the extra trees regression model gives the best accuracy of 97.15%. Keywords: random forest algorithm, linear regression algorithm, extra trees classifier, mean absolute error
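The model comparison in the abstract can be sketched on synthetic data (a stand-in for the actual Walmart weekly-sales dataset, which is not reproduced here); tree ensembles typically beat a linear fit when the sales signal is nonlinear:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import ExtraTreesRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for weekly sales: a nonlinear function of store features.
X = rng.uniform(size=(500, 4))
y = 100 * np.sin(3 * X[:, 0]) + 50 * X[:, 1] ** 2 + rng.normal(scale=5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
trees = ExtraTreesRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# score() returns R-squared on held-out data; the tree ensemble captures
# the nonlinearity that the linear model misses.
print(f"linear R2:      {lin.score(X_te, y_te):.3f}")
print(f"extra trees R2: {trees.score(X_te, y_te):.3f}")
```

The same fit/score pattern, with `X` replaced by the Walmart store, department, and week features, reproduces the comparison the abstract reports.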
Procedia PDF Downloads 149
23604 Anomaly Detection of Log Analysis using Data Visualization Techniques for Digital Forensics Audit and Investigation
Authors: Mohamed Fadzlee Sulaiman, Zainurrasyid Abdullah, Mohd Zabri Adil Talib, Aswami Fadillah Mohd Ariffin
Abstract:
In common digital forensics cases, an investigation may rely on analysis of the specific, relevant exhibits involved. Usually the investigating officer defines the goals and objectives to be achieved in reconstructing the trail of evidence and advises the digital forensic analyst accordingly, while maintaining the specific scope of the investigation. With technology growth, people are starting to realize the importance of cyber security to their organizations, and this new perspective creates awareness that digital forensics auditing must come into place in order to measure possible threats or attacks on their cyber-infrastructure. Instead of performing investigation on an incident basis, auditing may broaden the scope of investigation to the level of anomaly detection in the daily operation of an organization’s cyber space. When handling a huge amount of data such as log files, performing a digital forensics audit for a large organization proves to be an onerous task for the analyst, both in analyzing the huge files and in translating the findings in a way that stakeholders can clearly understand. Data visualization can be emphasized in conducting digital forensic audits and investigations to address both needs. This study identifies the important factors that should be considered in applying data visualization techniques so that anomaly detection meets digital forensic audit and investigation objectives. Keywords: digital forensic, data visualization, anomaly detection, log analysis, forensic audit, visualization techniques
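As one concrete example of the kind of anomaly flagging a visualization layer would surface to the analyst, the sketch below scores hypothetical hourly event counts from a log file and flags hours whose z-score exceeds 3 (the data and threshold are illustrative, not from the study):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical hourly counts of failed-login events over two weeks of logs.
counts = rng.poisson(lam=20, size=24 * 14).astype(float)
counts[100] = 180  # an injected burst, e.g. a brute-force attempt

# Flag hours whose z-score exceeds 3; on a timeline chart these are the
# points a forensic-audit dashboard would highlight for the analyst.
z = (counts - counts.mean()) / counts.std()
anomalies = np.flatnonzero(z > 3)
print(anomalies.tolist())
```

Plotting `counts` with the flagged hours marked turns a two-week wall of log lines into a single picture, which is the translation-for-stakeholders problem the abstract describes.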
Procedia PDF Downloads 287
23603 Classification of Traffic Complex Acoustic Space
Abstract:
After years of development, the study of soundscape has been refined to the level of urban space and building types. A traffic complex takes the traffic function as its core, with obvious design features in its combination of architectural spaces and traffic streamlines. Its acoustic environment is strongly characterized by function, space, material, users, and other factors. A traffic complex integrates various functions of business, accommodation, entertainment, and so on. It has various forms and complex, varied experiences, and its acoustic environment is made rich and interesting by the distribution and coordination of the various functions, the division and unification of the mass, the separation and organization of different spaces, and the crossing and integration of multiple traffic flows. In this study, we made field recordings of each space of various traffic complexes and extracted and analyzed different acoustic elements, including changes in sound pressure, frequency distribution, steady sound sources, sound source information, and other aspects, to perform a cluster analysis of each independent traffic complex building. We divide the complicated space of traffic complex buildings into several typical sound spaces from an acoustic environment perspective, mainly including stable sound space, high-pressure sound space, rhythm sound space, and upheaval sound space. This classification can further deepen the study of subjective evaluation and control of the acoustic environment of traffic complexes. Keywords: soundscape, traffic complex, cluster analysis, classification
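The cluster analysis step can be illustrated with k-means on made-up acoustic feature vectors; the feature choices and values below are hypothetical stand-ins for the study's measured sound-pressure and frequency data:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical feature vectors per recording: [mean SPL (dB), SPL variance,
# spectral centroid (kHz)], stand-ins for the acoustic elements in the study.
stable = rng.normal([55, 2, 1.0], 0.5, size=(20, 3))    # quiet, steady spaces
upheaval = rng.normal([80, 25, 3.0], 0.5, size=(20, 3))  # loud, fluctuating spaces
X = np.vstack([stable, upheaval])

# Cluster the recordings; with well-separated features, each sound-space
# type should land in its own cluster.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(sorted(np.bincount(km.labels_).tolist()))  # [20, 20]
```

With real recordings, the cluster count would be raised and the resulting groups interpreted against the four sound-space types the study names.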
Procedia PDF Downloads 252