Search results for: continuous data
25903 IoT Device Cost-Effective Storage Architecture and Real-Time Data Analysis/Data Privacy Framework
Authors: Femi Elegbeleye, Omobayo Esan, Muienge Mbodila, Patrick Bowe
Abstract:
This paper focuses on a cost-effective storage architecture that uses a fog and cloud data storage gateway, and presents the design of a framework for the data privacy model and a data analytics framework for real-time analysis using machine learning methods. The paper begins with the system analysis, the system architecture and its component design, and the overall system operations. The results obtained on the data privacy model show that combining two or more data privacy models yields stronger privacy for the data, and that a fog storage gateway has several advantages over traditional cloud storage: our results show that fog offers reduced latency/delay, lower bandwidth consumption, and lower energy usage compared with cloud storage, so fog storage will help to lessen excessive cost. The paper concentrates on the system descriptions; the researchers focused on the research design and the framework design for the data privacy model, data storage, and real-time analytics. It also presents the major system components and their framework specifications. Lastly, the overall research system architecture is shown, along with its structure and interrelationships.Keywords: IoT, fog, cloud, data analysis, data privacy
Procedia PDF Downloads 99
25902 Reclamation of Saline and Alkaline Soils through Aquaculture: A Review and Prospects for Future Research
Authors: M. Shivakumar, S. R. Somashekhar, C. V. Raju
Abstract:
Secondary salinization of agricultural land in command areas around the world has become a major issue in the recent past. It is currently estimated that 954 million hectares (mh) of saline and alkaline soil exist in the world, with thousands of hectares added every year; Argentina, Bangladesh, and Australia are the most affected countries. In India, out of a cropped area of 142.80 mh, 56 mh is irrigated, of which more than 9 mh (about 16%) is alkaline or saline. Due to continuous use of the same land for the same agricultural activities and excessive application of fertilizers and water, many soils have become alkaline, saline, or waterlogged. These lands are of low productivity and at times totally unfit for agricultural activities. Such soils may or may not possess good physical condition, but plants may suffer from an inability to absorb water from the salty solution: they dehydrate, lose water to the soil, and shrink, resulting in the death of the plant. This process is called plasmolysis. Soil is an independent, organic body of nature that acquires properties in accordance with the forces which act upon it. Aquaculture is one solution for utilizing such problematic soils for food production. When impoundments are constructed on 10-15% of the affected area, the excess water, along with the salts, collects in the impoundments, and managing salt is easier in water than in soil. Due to the high organic input in aquaculture, such as feed and manure, and the continuous deposition of fecal matter, the pH of the soil is reduced, and over a period of time such soils can be returned to their original use. Under the National Agricultural Development Program (NADP), a project was implemented in 258 villages of Mandya District, Karnataka State, India, and found that these lands can be effectively utilized for fish culture, increasing proteinaceous food production many-fold while conserving the soils.
The findings of the research can be adopted and scaled up in any country.Keywords: saline and alkaline soils, aquaculture, problematic soils, reclamation
Procedia PDF Downloads 141
25901 Sludge Densification: Emerging and Efficient Way to Look at Biological Nutrient Removal Treatment
Authors: Raj Chavan
Abstract:
Currently, there are over 14,500 Water Resource Recovery Facilities (WRRFs) in the United States, with ~35% of them having some type of nutrient limits in place. These WRRFs account for about 1% of overall power demand and 2% of total greenhouse gas (GHG) emissions in the United States, and contribute 10 to 15% of the overall nutrient load to surface waters in the United States. The evolution of densification technologies toward more compact and energy-efficient nutrient removal processes has been shaped by a number of factors. Existing facilities that require capacity expansion, or biomass densification for higher treatability within the same footprint, are being subjected to more stringent requirements for nutrient removal prior to surface water discharge. Densification of activated sludge has received recent widespread interest as a means of achieving process intensification and nutrient removal at WRRFs. At the core of the technology are the aerobic sludge granules where the biological processes occur. There is considerable interest in the prospect of producing granular sludge in continuous (or traditional) activated sludge processes (CAS), or of densifying biomass by moving activated sludge flocs toward denser aggregates, as a highly effective technique of intensification. This presentation will provide a fundamental understanding of densification by presenting insights and practical issues. The topics that will be discussed include methods used to generate and retain densified granules; the mechanisms that allow biological flocs to densify; the role that physical selectors play in the densification of biological flocs; some viable ways of managing densified biological flocs; the effects of physical selector design parameters on the retention of densified biological flocs; and finally some operational solutions for customizing the flocs and granules required to meet performance and capacity targets.
In addition, it will present some case studies where biological and physical parameters were used to generate aerobic granular sludge in the continuous flow system.Keywords: densification, aerobic granular sludge, nutrient removal, intensification
Procedia PDF Downloads 186
25900 The Effect of Air Injection in Irrigation Water on Sugar Beet Yield
Authors: Yusuf Ersoy Yildirim, Ismail Tas, Ceren Gorgusen, Tugba Yeter, Aysegul Boyacioglu, K. Mehmet Tugrul, Murat Tugrul, Ayten Namli, H. Sabri Ozturk, M. Onur Akca
Abstract:
In recent years, much research has been devoted to the sustainable use of scarce resources, and the effective, sustainable use of water resources in particular has been studied for many years. Sub-surface drip irrigation (SDI) is one of the most effective irrigation methods for achieving efficient and sustainable use of irrigation water. The literature often emphasizes that, besides its numerous advantages, SDI also allows irrigation water to be applied to the plant root zone along with air, and different studies state that air applied to the root zone with irrigation water has a positive effect there. Plants need sufficient oxygen for root respiration as well as for the metabolic functions of the roots. Decreased root respiration due to low oxygen content reduces transpiration, disrupts the flow of ions, and increases the ingress of salt to toxic levels, seriously affecting plant growth; a lack of oxygen (hypoxia) can threaten the survival of plants. Oxygen deficiency in the soil is governed by the exchange of gases between the soil and the atmosphere. Soil aeration is an important physical parameter of soil: it is highly dynamic and closely related to the soil's water content and bulk density. Subsurface drip irrigation has higher water use efficiency than irrigation methods such as furrow and sprinkler irrigation. In heavy clay soils, however, subsurface drip irrigation creates continuous wetting fronts that predispose the rhizosphere to hypoxia or anoxia: as water spreads continuously through a certain region of the root zone, the oxygen available for root and microbial respiration and for root development is limited.
In this study, the change in sugar beet yield caused by air application in the SDI system will be explained.Keywords: sugar beet, subsurface drip irrigation, air application, irrigation efficiency
Procedia PDF Downloads 81
25899 Comparison of Selected Pier-Scour Equations for Wide Piers Using Field Data
Authors: Nordila Ahmad, Thamer Mohammad, Bruce W. Melville, Zuliziana Suif
Abstract:
Current methods for predicting local scour at wide bridge piers were developed on the basis of laboratory studies, and very few scour predictions have been tested against field data. A laboratory wide-pier scour equation from previous findings is evaluated here with field data. A wide range of field data, covering both live-bed and clear-water scour, was used. A method for assessing the quality of the data was developed and applied to the data set. Three other wide-pier scour equations from the literature were used to compare the performance of each predictive method, and the best-performing scour equation was identified using statistical analysis. Comparisons of computed and observed scour depths indicate that the equation from the previous publication produced the smallest discrepancy ratio and RMSE value when compared with the large body of laboratory and field data.Keywords: field data, local scour, scour equation, wide piers
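As a minimal sketch of the two agreement measures the abstract reports, discrepancy ratio and RMSE, the following assumes hypothetical scour depths (the values below are illustrative, not data from the paper):

```python
import math

def discrepancy_ratios(computed, observed):
    """Per-site ratio of computed to observed scour depth (1.0 is a perfect prediction)."""
    return [c / o for c, o in zip(computed, observed)]

def rmse(computed, observed):
    """Root-mean-square error between computed and observed depths."""
    return math.sqrt(sum((c - o) ** 2 for c, o in zip(computed, observed)) / len(observed))

# Hypothetical depths in metres, not values from the paper:
observed = [1.2, 0.8, 2.1]
computed = [1.0, 0.9, 2.4]
```

A discrepancy ratio above 1 flags over-prediction at a site, below 1 under-prediction; RMSE summarizes overall error in depth units.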
Procedia PDF Downloads 413
25898 BIM Model and Virtual Prototyping in Construction Management
Authors: Samar Alkindy
Abstract:
Purpose: The BIM model has been used to support the planning of different construction projects in the industry by showing the different stages of the construction process, and it has been instrumental in identifying some of the common errors in the construction process through the spatial arrangement. The continuous use of the BIM model in the construction industry has led to radical changes such as virtual prototyping. Construction virtual prototyping is a highly advanced technology that combines a BIM model with realistic graphical simulations and facilitates the simulation of the project before the product is built. The paper examines virtual prototyping in the construction industry through its application, challenges, and benefits to a construction project. Methodology: A case study was conducted in four major construction projects that incorporated virtual construction prototyping in several stages, and interviews were administered with the project managers, engineers, and planning managers. Findings: The data collected show a positive response to virtual construction prototyping in construction, especially concerning communication and visualization. Furthermore, the use of virtual prototyping has increased collaboration and efficiency between the construction experts handling a project. During the planning stage, virtual prototyping has increased accuracy, reduced planning time, and reduced the amount of rework during the implementation stage. Although virtual prototyping is a new concept in the construction industry, the findings indicate that the approach will benefit the management of construction projects.Keywords: construction operations, construction planning, process simulation, virtual prototyping
Procedia PDF Downloads 231
25897 The Maximum Throughput Analysis of UAV Datalink 802.11b Protocol
Authors: Inkyu Kim, SangMan Moon
Abstract:
The IEEE 802.11b protocol provides a data rate of up to 11 Mbps, whereas the aerospace industry seeks higher-data-rate COTS data link systems for UAVs. The Total Maximum Throughput (TMT) and delay time have been studied by many researchers in past years. This paper provides the theoretical data throughput performance of a UAV formation-flight data link using existing 802.11b performance theory. We operate the UAV formation flight with more than 30 quadcopters using the 802.11b protocol. We predict that the number of UAVs in formation flight is bounded by the performance limitations of the data link protocol.Keywords: UAV datalink, UAV formation flight datalink, UAV WLAN datalink application, UAV IEEE 802.11b datalink application
Procedia PDF Downloads 392
25896 Methods for Distinction of Cattle Using Supervised Learning
Authors: Radoslav Židek, Veronika Šidlová, Radovan Kasarda, Birgit Fuerst-Waltl
Abstract:
Machine learning comprises a set of topics dealing with the creation and evaluation of algorithms that facilitate pattern recognition, classification, and prediction based on models derived from existing data. The data can present identification patterns, which are used to classify it into groups. The result of the analysis is a pattern that can be used to identify new data without needing the input data from which the pattern was created. Important requirements in this process are careful data preparation, validation of the model used, and its suitable interpretation. For breeders, it is important to know the origin of animals from the standpoint of genetic diversity. In case of missing pedigree information, other methods can be used to trace an animal's origin. The genetic diversity written in genetic data holds relatively useful information for identifying animals originating from individual countries. We conclude that the application of data mining to molecular genetic data using supervised learning is an appropriate tool for hypothesis testing and for identifying an individual.Keywords: genetic data, Pinzgau cattle, supervised learning, machine learning
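As a toy illustration of the supervised-learning step described above, an animal of unknown origin can be assigned to the group whose centroid of genetic feature vectors lies nearest. The group names and feature values below are hypothetical, not the study's data, and a nearest-centroid rule is only one of many classifiers that fit this setting:

```python
def centroid(vectors):
    """Component-wise mean of a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(samples):
    """samples: dict mapping group (e.g. country of origin) -> list of feature vectors."""
    return {group: centroid(vs) for group, vs in samples.items()}

def classify(model, x):
    """Assign x to the group with the nearest centroid (squared Euclidean distance)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda group: dist2(model[group], x))
```

Training reduces each group to a single prototype, so classification of a new animal is a handful of distance computations, which suits small reference panels.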
Procedia PDF Downloads 550
25895 Router 1X3 - RTL Design and Verification
Authors: Nidhi Gopal
Abstract:
Routing is the process of moving a packet of data from source to destination; it enables messages to pass from one computer to another and eventually reach the target machine. A router is a networking device that forwards data packets between computer networks. It is connected to two or more data lines from different networks (as opposed to a network switch, which connects data lines from one single network). This paper mainly emphasizes the study of the router device and its top-level architecture, and how the various sub-modules of the router, i.e., the register, FIFO, FSM, and synchronizer, are synthesized, simulated, and finally connected to the top module.Keywords: data packets, networking, router, routing
Procedia PDF Downloads 812
25894 Storm-Runoff Simulation Approaches for External Natural Catchments of Urban Sewer Systems
Authors: Joachim F. Sartor
Abstract:
According to German guidelines, external natural catchments are larger sub-catchments without significant portions of impervious area which possess a surface drainage system and empty into a sewer network. Basically, such catchments should be disconnected from sewer networks, particularly from combined systems. If this is not possible due to local conditions, their flow hydrographs have to be considered in the design of sewer systems, because the impact may be significant. Since there is a lack of sufficient measurements of storm-runoff events for such catchments, and hence of verified simulation methods to analyze their design flows, German standards give only general advice and demand special considerations in such cases. Compared to urban sub-catchments, external natural catchments exhibit greatly different flow characteristics. With increasing area their hydrological behavior approximates that of rural catchments; e.g., sub-surface flow may prevail and lag times are comparably long. The literature offers only a few observed peak flow values and simple (mostly empirical) approaches for Central Europe; most of them are at least helpful to cross-check results achieved by simulation lacking calibration. Using storm-runoff data from five monitored rural watersheds in the west of Germany, with catchment areas between 0.33 and 1.07 km², the author investigated by multiple-event simulation three different approaches for determining the rainfall excess: the modified SCS variable runoff-coefficient methods by Lutz and by Zaiß, and the soil moisture model by Ostrowski. Selection criteria for storm events from continuous precipitation data were taken from the recommendations of M 165, and the runoff concentration method (parallel cascades of linear reservoirs) from a DWA working report to which the author had contributed. In general, the two runoff-coefficient methods showed results that are of sufficient accuracy for most practical purposes.
The soil moisture model showed no significantly better results, at least not to a degree that would justify the additional data collection its parameter determination requires. In particular, the typical convective summer events after long dry periods, which are often decisive for sewer networks (less so for rivers), showed discrepancies between simulated and measured flow hydrographs.Keywords: external natural catchments, sewer network design, storm-runoff modelling, urban drainage
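The modified runoff-coefficient methods investigated above build on the classical SCS curve-number relation for rainfall excess; a minimal sketch of that baseline (metric form, with the standard initial abstraction Ia = 0.2·S; the parameter values in the comments are assumptions, not from the study) is:

```python
def scs_runoff_mm(p_mm, cn):
    """Direct runoff depth (mm) for a storm of depth p_mm and curve number cn (metric form)."""
    s = 25400.0 / cn - 254.0      # potential maximum retention, mm
    ia = 0.2 * s                  # standard initial abstraction, mm
    if p_mm <= ia:
        return 0.0                # the whole storm is absorbed before runoff begins
    return (p_mm - ia) ** 2 / (p_mm - ia + s)
```

The variable-coefficient methods of Lutz and Zaiß modify this fixed-CN picture, e.g. by letting the runoff coefficient grow during the event, which the sketch does not attempt to reproduce.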
Procedia PDF Downloads 151
25893 Noise Reduction in Web Data: A Learning Approach Based on Dynamic User Interests
Authors: Julius Onyancha, Valentina Plekhanova
Abstract:
One of the significant issues facing web users is the amount of noise in web data, which hinders finding useful information related to their dynamic interests. Current research works consider noise to be any data that does not form part of the main web page and propose noise web data reduction tools that mainly focus on eliminating noise from the content and layout of web data. This paper argues that not all data forming part of the main web page is of user interest, and not all noise data is actually noise to a given user. Therefore, learning the noise web data allocated to user requests ensures not only a reduction of the noisiness level in a web user profile but also a decrease in the loss of useful information, and hence improves the quality of the web user profile. A Noise Web Data Learning (NWDL) tool/algorithm capable of learning noise web data in the web user profile is proposed. The proposed work considers the elimination of noise data in relation to dynamic user interest. In order to validate the performance of the proposed work, an experimental design setup is presented, and the results obtained are compared with current algorithms applied in the noise web data reduction process. The experimental results show that the proposed work considers the dynamic change of user interest prior to the elimination of noise data. The proposed work contributes towards improving the quality of a web user profile by reducing the amount of useful information eliminated as noise.Keywords: web log data, web user profile, user interest, noise web data learning, machine learning
Procedia PDF Downloads 265
25892 A Virtual Set-Up to Evaluate Augmented Reality Effect on Simulated Driving
Authors: Alicia Yanadira Nava Fuentes, Ilse Cervantes Camacho, Amadeo José Argüelles Cruz, Ana María Balboa Verduzco
Abstract:
Augmented reality promises to be present in future driving: its immersive technology can show directions and maps, indicating important places with graphic elements when the car driver requires the information. On the other hand, driving is considered a multitasking activity and, for some people, a complex activity in which situations commonly occur that require the driver's immediate attention in order to make decisions that help avoid accidents. The main aim of the project is therefore the instrumentation of a platform with biometric sensors that allows evaluating driving performance under the influence of augmented reality devices and detecting the drivers' level of attention, since it is important to know the effect these devices produce. In this study, the physiological sensors EPOC X (EEG), ECG06 PRO, and EMG Myoware are combined in a driving test platform with a Logitech G29 steering wheel and the simulation software City Car Driving, in which the level of traffic and the number of pedestrians within the simulation can be controlled, providing driver interaction in real mode; data acquisition for storage is achieved through an MSP430 microcontroller. The sensors deliver a continuous analog signal that requires signal conditioning: a signal amplifier is incorporated because the acquired signals have a sensitive range of 1.25 mm/mV, and filtering eliminates frequency bands so that the signal is interpretable and noise-free before it is converted from analog to digital for analysis of the drivers' physiological signals; these values are stored in a database.
Based on this compilation, we work on the extraction of signal features and implement k-NN (k-nearest neighbor) classification methods and decision trees (supervised learning) that enable the study of the data, the identification of patterns, and the determination, by classification methods, of the different effects of augmented reality on drivers. The expected results of this project include a test platform instrumented with biometric sensors for data acquisition during driving and a database with the required variables to determine the effect caused by augmented reality on people in simulated driving.Keywords: augmented reality, driving, physiological signals, test platform
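The k-NN step mentioned above can be sketched in a few lines; the feature vectors and attention labels below are hypothetical stand-ins for the extracted physiological features, not the project's data:

```python
import math
from collections import Counter

def knn_classify(train_set, query, k=3):
    """Majority vote among the k nearest labelled feature vectors.

    train_set: list of (feature_vector, label) pairs; query: a feature vector.
    """
    nearest = sorted(train_set, key=lambda pair: math.dist(pair[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

With real signals, each feature vector would come from one conditioning window (e.g. band powers or heart-rate features), and k would be tuned on held-out drives.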
Procedia PDF Downloads 141
25891 On Kantorovich-Stancu Type Operators with the Variation Detracting Property
Authors: Özlem Öksüzer
Abstract:
In this paper, we introduce the variation detracting property of Kantorovich-Stancu type operators in the space of functions of bounded variation. These problems are studied with respect to the variation seminorm.Keywords: Kantorovich-Stancu type operators, variation seminorm, variation detracting property, absolutely continuous function
Procedia PDF Downloads 407
25890 Data Mining and Knowledge Management Application to Enhance Business Operations: An Exploratory Study
Authors: Zeba Mahmood
Abstract:
Modern business organizations are adopting technological advancements to achieve a competitive edge and satisfy their consumers. Developments in the field of information technology systems have changed the way of conducting business today. Business operations now rely more on the data they obtain, and this data is continuously increasing in volume. Data stored in different locations is difficult to find and use without the effective implementation of data mining and knowledge management techniques. Organizations that smartly identify, obtain, and then convert data into useful formats for decision making and operational improvements create additional value for their customers and enhance their operational capabilities. Marketing and customer relationship departments of firms use data mining techniques to make relevant decisions; this paper emphasizes the identification of the different data mining and knowledge management techniques that are applied in different business industries. The challenges and issues of executing these techniques are also discussed and critically analyzed.Keywords: knowledge, knowledge management, knowledge discovery in databases, business, operational, information, data mining
Procedia PDF Downloads 538
25889 Box Counting Dimension of the Union L of Trinomial Curves When α ≥ 1
Authors: Kaoutar Lamrini Uahabi, Mohamed Atounti
Abstract:
In the present work, we consider one category of curves denoted by L(p, k, r, n). These curves are continuous arcs which are trajectories of roots of the trinomial equation z^n = αz^k + (1 − α), where z is a complex number, n and k are two integers such that 1 ≤ k ≤ n − 1, and α is a real parameter greater than 1. Denoting by L the union of all trinomial curves L(p, k, r, n) and using the box counting dimension as the fractal dimension, we will prove that the dimension of L is equal to 3/2.Keywords: feasible angles, fractal dimension, Minkowski sausage, trinomial curves, trinomial equation
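The box counting dimension used above can be estimated numerically for any sampled point set: count the occupied boxes N(ε) of a square grid at several scales and fit the slope of log N(ε) against log(1/ε). The sketch below is a generic estimator (not the paper's proof technique), sanity-checked on a smooth arc, whose dimension is 1:

```python
import math

def box_count(points, eps):
    """Number of eps-sided grid boxes that contain at least one point."""
    return len({(math.floor(x / eps), math.floor(y / eps)) for x, y in points})

def box_dimension(points, scales):
    """Least-squares slope of log N(eps) versus log(1/eps)."""
    xs = [math.log(1.0 / e) for e in scales]
    ys = [math.log(box_count(points, e)) for e in scales]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Sanity check on a quarter circle, a smooth arc of box counting dimension 1:
arc = [(math.cos(t * math.pi / 2 / 20000), math.sin(t * math.pi / 2 / 20000))
       for t in range(20001)]
dim = box_dimension(arc, [2.0 ** -j for j in range(3, 9)])
```

Applied to dense samples of the union L of trinomial curves, the same estimator should approach the proven value 3/2, provided the sample spacing is finer than the smallest box size.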
Procedia PDF Downloads 189
25888 Indexing and Incremental Approach Using Map Reduce Bipartite Graph (MRBG) for Mining Evolving Big Data
Authors: Adarsh Shroff
Abstract:
Big data is a collection of data sets so large and complex that they become difficult to process using database management tools. Operations like search, analysis, and visualization are performed on big data by using data mining, which is the process of extracting patterns or knowledge from large data sets. As data evolves, mining results become stale and obsolete over time; incremental processing is a promising approach to refreshing them, as it utilizes previously saved states to avoid the expense of re-computation from scratch. This project uses i2MapReduce, an incremental processing extension to MapReduce, the most widely used framework for mining big data. i2MapReduce performs key-value-pair-level incremental processing rather than task-level re-computation, supports not only one-step computation but also the more sophisticated iterative computation widely used in data mining applications, and incorporates a set of novel techniques to reduce the I/O overhead of accessing preserved fine-grain computation states. To assess the mining results, we evaluate i2MapReduce using a one-step algorithm and three iterative algorithms with diverse computation characteristics for efficient mining.Keywords: big data, map reduce, incremental processing, iterative computation
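The key-value-pair-level idea can be illustrated on the simplest MapReduce job, a word count. This is a hypothetical single-machine sketch, not i2MapReduce's API: preserved per-key counts are adjusted for the delta of one changed document instead of recomputing every document from scratch:

```python
from collections import Counter

def full_count(docs):
    """Batch word count over all documents: the 'from scratch' computation."""
    counts = Counter()
    for text in docs.values():
        counts.update(text.split())
    return counts

def incremental_update(counts, docs, doc_id, new_text):
    """Adjust preserved per-key counts for one changed document only."""
    old = Counter(docs.get(doc_id, "").split())
    new = Counter(new_text.split())
    counts.update(new)        # add the new version's contributions
    counts.subtract(old)      # retract the old version's contributions
    docs[doc_id] = new_text
    return +counts            # unary '+' drops keys whose count fell to zero
```

The update touches only the keys of the changed document, which is why key-value-pair-level incremental processing scales with the size of the change rather than the size of the corpus.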
Procedia PDF Downloads 350
25887 Designing a Model for Measuring the Components of Good Governance in the Iranian Higher Education System
Authors: Maria Ghorbanian, Mohammad Ghahramani, Mahmood Abolghasemi
Abstract:
Universities and institutions of higher education in Iran, like other higher education institutions in the world, carry a weighty mission: to educate students based on the needs of the country. Taking on such a serious responsibility requires a good governance system for planning, formulating executive plans, evaluating them, and finally modifying them in accordance with current conditions and the challenges ahead. In this regard, the present study identified the components of good governance in the Iranian higher education system by the survey method with a quantitative approach. To collect data, a researcher-made questionnaire was used, comprising two parts: personal and professional characteristics (5 questions) and the three components of good governance in the Iranian higher education system, namely good management and leadership (8 items), continuous and effective evaluation (of university performance, finance, and university appointments) (8 items), and civic responsibility and sustainable development (7 items). These variables were measured and coded on a five-level Likert scale from "Very Low = 1" to "Very High = 5". First, the validity and reliability of the research model were examined. To calculate the reliability of the questionnaire, two methods were used: Cronbach's alpha and composite reliability. The Fornell-Larcker criterion was also used to determine the degree of discriminant validity. The statistical population of this study included all faculty members of public universities in Tehran (N = 4429). The sample size was estimated at 340 using Cochran's formula, drawn by random sampling with proportional allocation. The data were analyzed by the structural equation method with a least-squares approach.
The results showed that the component of civic responsibility and sustainable development, with a factor loading of 0.827, is the most important element of good governance.Keywords: good governance, higher education, sustainable development
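Of the two reliability measures the abstract mentions, Cronbach's alpha has a simple closed form, α = k/(k−1) · (1 − Σσᵢ²/σₜ²). A minimal sketch follows; the Likert scores used in the check are made up, not the study's data:

```python
def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

def cronbach_alpha(items):
    """items: k lists, one per questionnaire item, each holding every respondent's score."""
    k = len(items)
    n_resp = len(items[0])
    total_scores = [sum(item[i] for item in items) for i in range(n_resp)]
    item_var_sum = sum(variance(it) for it in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(total_scores))
```

Perfectly consistent items give α = 1, while uncorrelated items drive α toward 0, which is why values around 0.7 or above are conventionally taken as acceptable reliability.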
Procedia PDF Downloads 171
25886 An Analysis on Gravel of Sand-Gravel Bar at Gneiss or Granite Area of the Upper Hongcheon River in South Korea
Authors: Man Kyu Kim, Hansu Shin
Abstract:
This study analyzes the gravel of the sand-gravel bars that stretch through the Duchon and Naechon stream basins, two tributary basins of the Hongcheon River (a river with well-developed sand-gravel bars in its upper course) in Korea. The Naechon stream flows mostly through a granite zone, whereas the Duchon stream flows mostly through a gneiss zone. The characteristics of the gravel in the sand-gravel bars of these two branches of the upper Hongcheon River were analyzed in order to understand the geomorphic development of streams on different bedrock. Through the analysis of the roundness and flatness of the gravel, we found an irregular trend in the supply of granite gravel and gneiss gravel downstream. The results show that the two basins exhibit the conditions of uppermost small basins, reflecting the mountain-valley environment, although an equivalent comparison with other roundness studies in Korea or in Europe may be difficult. Unlike previous studies, which mostly examined large rivers, this study analyzed gravels found in small-scale streams, and it provides basic data for continued comparative research on various small basins.Keywords: flatness, geology, roundness, sand-gravel bar
Procedia PDF Downloads 366
25885 Continuous-Time Convertible Lease Pricing and Firm Value
Authors: Ons Triki, Fathi Abid
Abstract:
Along with the increase in the use of leasing contracts in corporate finance, multiple studies model the credit risk of the lease in order to cover the losses of the lessor if the lessee goes bankrupt. In the current research paper, a convertible lease contract is elaborated in a continuous-time stochastic setting, aiming to ensure the financial stability of the firm and to quickly recover the losses of the counterparties to the lease in case of default. This work examines the term structure of lease rates, taking into account credit default risk and the capital structure of the firm. The interaction between the lessee's capital structure and the equilibrium lease rate is assessed by applying the competitive lease market argument developed by Grenadier (1996) and the endogenous structural default model set forward by Leland and Toft (1996). The cumulative probability of default is calculated by reference to Leland and Toft (1996) and Yildirim and Huan (2006). Additionally, the link between lessee credit risk and the lease rate is addressed so as to explore the impact of convertible lease financing on the term structure of the lease rate, the optimal leverage ratio, the cumulative default probability, and the optimal firm value, by applying an endogenous conversion threshold. The numerical analysis suggests that the duration structure of lease rates increases with the market price of risk, that the maximal value of the firm decreases through the effect of the optimal leverage ratio, and that the cumulative probability of default increases with the maturity of the lease contract if the volatility of the asset service flows is significant.
Introducing the convertible lease contract will increase the optimal value of the firm as a function of asset volatility for a high initial service flow level and a conversion ratio close to 1.Keywords: convertible lease contract, lease rate, credit-risk, capital structure, default probability
Procedia PDF Downloads 98
25884 Analyzing Large Scale Recurrent Event Data with a Divide-And-Conquer Approach
Authors: Jerry Q. Cheng
Abstract:
Currently, in analyzing large-scale recurrent event data, there are many challenges, such as memory limitations and unscalable computing time. In this research, a divide-and-conquer method using parametric frailty models is proposed. Specifically, the data are randomly divided into many subsets, and the maximum likelihood estimator from each individual data set is obtained; a weighted method is then proposed to combine these individual estimators into the final estimator. It is shown that this divide-and-conquer estimator is asymptotically equivalent to the estimator based on the full data. Simulation studies are conducted to demonstrate the performance of the proposed method, and the approach is applied to a large real dataset of repeated heart failure hospitalizations.Keywords: big data analytics, divide-and-conquer, recurrent event data, statistical computing
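The split / estimate / weighted-combine scheme can be illustrated on a toy parametric model, a normal mean rather than the paper's frailty models, with weights proportional to subset size (for the mean this combination recovers the full-data MLE exactly):

```python
import random

def subset_mle(xs):
    """MLE of the mean under a normal model for one subset."""
    return sum(xs) / len(xs)

def combine(estimates_and_sizes):
    """Weighted combination of subset estimators, weights proportional to subset size."""
    total = sum(n for _, n in estimates_and_sizes)
    return sum(est * n for est, n in estimates_and_sizes) / total

random.seed(0)
data = [random.gauss(5.0, 2.0) for _ in range(10000)]
subsets = [data[i::10] for i in range(10)]    # partition into ten subsets
dc_est = combine([(subset_mle(s), len(s)) for s in subsets])
full_est = subset_mle(data)
```

For realistic frailty models the subset MLEs require iterative optimization and the weights are typically inverse-variance rather than size-based, but each subset fits in memory, which is the point of the scheme.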
Procedia PDF Downloads 165
25883 Authorship Attribution Using Sociolinguistic Profiling When Considering Civil and Criminal Cases
Authors: Diana A. Sokolova
Abstract:
This article is devoted to one of the possibilities for identifying the author of an oral or written text: sociolinguistic profiling. Sociolinguistic profiling is utilized as a forensic linguistics technique to identify individuals through language patterns, particularly in criminal cases. It examines how social factors influence language use. This study aims to showcase the significance of linguistic profiling for attributing authorship in texts and emphasizes the necessity of its continuous enhancement, considering its strengths and weaknesses. The study employs semantic-syntactic, lexical-semantic, linguopragmatic, logical, presupposition, authorization, and content analysis methods to investigate linguistic profiling. The research underscores the practical application of linguistic profiling in legal settings and the impact of social factors on language use, contributing to the field of forensic linguistics. Data collection involves gathering oral and written texts from criminal and civil court cases to analyze language patterns for authorship attribution. The collected data are analyzed using various linguistic analysis methods to identify individual characteristics and patterns that can aid in authorship attribution. The study addresses the effectiveness of sociolinguistic profiling in identifying the authors of texts and explores the impact of social factors on language use in legal contexts. In spite of its advantages, challenges in linguistic profiling have spurred debates and controversies in academic circles, legal environments, and the public sphere.
Thus, this research highlights the significance of sociolinguistic profiling in authorship attribution and emphasizes the need for further development of this method, considering its strengths and weaknesses.
Keywords: authorship attribution, detection of identifying features, dialect, forensic linguistics, social influence, sociolinguistics, unique speech characteristics
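As a purely illustrative sketch of one ingredient of such profiling — frequency-based stylometric comparison — the Python below builds a function-word frequency profile and compares two texts with cosine similarity. The word list and function names are hypothetical examples, not the feature set or method used in this study.

```python
from collections import Counter
import math

# Frequencies of common function words are classic stylometric features;
# this short list is illustrative only.
FUNCTION_WORDS = ["the", "of", "and", "to", "a", "in", "that", "it", "is", "was"]

def profile(text):
    # Relative frequency of each function word in the text.
    words = text.lower().split()
    counts = Counter(words)
    total = max(len(words), 1)
    return [counts[w] / total for w in FUNCTION_WORDS]

def cosine_similarity(p, q):
    # Cosine of the angle between two frequency profiles (1.0 = identical).
    dot = sum(a * b for a, b in zip(p, q))
    norm = math.sqrt(sum(a * a for a in p)) * math.sqrt(sum(b * b for b in q))
    return dot / norm if norm else 0.0
```

In practice, profiling combines many more linguistic levels (syntax, pragmatics, dialect markers) than this single frequency-based measure.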
Procedia PDF Downloads 36
25882 Enablers of Total Quality Management for Social Enterprises: A Study of UAE Social Organizations
Authors: Farhat Sultana
Abstract:
Originality: TQM principles are considered the tools to enhance organizational performance for most organizations. The paper contributes to the literature on the social enterprise because social organizations are still far behind in implementing TQM as compared to other private, public, and nonprofit organizations. Study design: The study is based on the data and information provided by two case studies and one focus group of social enterprises. Purpose: The purpose of the study is to get a deep understanding of TQM implementation and to recognize the enablers of TQM that enhance the organizational performance of social enterprises located in the UAE. Findings: As per the findings of the study, the key enablers of total quality management in the case enterprises are leadership support, a strategic approach to quality, continuous improvement, process improvement, employee empowerment, and customer focus practices, though some inhibitors of TQM implementation, such as the managerial structure for quality assurance and the performance appraisal mechanism, are also pointed out by the study. Research limitations: The study findings are based on only two case studies and one focus group, which is not enough to generalize them to all social organizations. Practical implications: The identified TQM enablers can help management implement TQM successfully in social enterprises. Social implications: The study provides an enabling path for social enterprises to implement TQM, seeking quality output to build a better society.
Keywords: TQM, social enterprise, enablers of TQM, UAE
Procedia PDF Downloads 110
25881 A Web and Cloud-Based Measurement System Analysis Tool for the Automotive Industry
Authors: C. A. Barros, Ana P. Barroso
Abstract:
Any industrial company needs to determine the amount of variation that exists within its measurement process and guarantee the reliability of its data by studying the performance of its measurement system in terms of linearity, bias, repeatability, reproducibility, and stability. This issue is critical for automotive industry suppliers, who are required to be certified to the IATF 16949:2016 standard (which replaced ISO/TS 16949) of the International Automotive Task Force, defining the requirements of a quality management system for companies in the automotive industry. Measurement System Analysis (MSA) is one of the mandatory tools. Frequently, the measurement system in companies is not connected to the equipment and does not incorporate the methods proposed by the Automotive Industry Action Group (AIAG). To address these constraints, an R&D project is in progress whose objective is to develop a web and cloud-based MSA tool. This MSA tool incorporates Industry 4.0 concepts, such as Internet of Things (IoT) protocols to assure the connection with the measuring equipment, cloud computing, artificial intelligence, statistical tools, and advanced mathematical algorithms. This paper presents the preliminary findings of the project. The web and cloud-based MSA tool is innovative because it implements all statistical tests proposed in the MSA-4 reference manual from AIAG, as well as other emerging methods and techniques. As it is integrated with the measuring devices, it reduces the manual input of data and therefore the errors. The tool ensures traceability of all performed tests and can be used in quality laboratories and on production lines. Besides, it monitors MSAs over time, allowing both the analysis of deviations from the variation of the measurements performed and the management of measurement equipment and calibrations. To develop the MSA tool, a ten-step approach was implemented.
Firstly, a benchmarking analysis of the current competitors and commercial solutions linked to MSA was performed with respect to the Industry 4.0 paradigm. Next, an analysis of the size of the target market for the MSA tool was carried out. Afterwards, data flow and traceability requirements were analysed in order to implement an IoT data network that interconnects with the equipment, preferably wirelessly. The MSA web solution was designed under UI/UX principles, and an API in Python was developed to perform the algorithms and the statistical analysis. Continuous validation of the tool by companies is being performed to assure real-time management of the 'big data'. The main results of this R&D project are: the web and cloud-based MSA tool; the Python API; new algorithms for the market; and the UI/UX style guide of the tool. The proposed MSA tool adds value to the state of the art as it ensures an effective response to the new challenges of measurement systems, which are increasingly critical in production processes. Although the automotive industry has triggered the development of this innovative MSA tool, other industries would also benefit from it. Currently, companies from the molds and plastics, chemical, and food industries are already validating it.
Keywords: automotive industry, Industry 4.0, Internet of Things, IATF 16949:2016, measurement system analysis
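As a hedged illustration of the kind of statistics such a tool computes, the Python sketch below estimates two of the basic MSA quantities named above — bias and repeatability — from repeated measurements of a single reference part. It is a deliberate simplification of the MSA-4 procedures (which also cover reproducibility, linearity, and stability), and the function name and data are hypothetical.

```python
import statistics

def bias_and_repeatability(measurements, reference):
    """Two basic MSA statistics for repeated measurements of one
    reference part: bias (mean measurement minus the reference value)
    and repeatability (sample standard deviation of the repeats)."""
    mean = statistics.fmean(measurements)
    bias = mean - reference
    repeatability = statistics.stdev(measurements)
    return bias, repeatability
```

In a full gauge study these statistics would be compared against the process tolerance to judge whether the measurement system is acceptable.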
Procedia PDF Downloads 214
25880 Adoption of Big Data by Global Chemical Industries
Authors: Ashiff Khan, A. Seetharaman, Abhijit Dasgupta
Abstract:
The new era of big data (BD) is influencing chemical industries tremendously, providing several opportunities to reshape the way they operate and helping them shift towards intelligent manufacturing. Despite the availability of free software and the large amount of real-time data generated and stored in process plants, chemical industries are still in the early stages of big data adoption. The industry is just starting to realize the importance of the large amount of data it owns for making the right decisions and supporting its strategies. This article explores the professional competencies and data science factors that influence BD in chemical industries, to help the industry move towards intelligent manufacturing quickly and reliably. The article utilizes a literature review and identifies potential applications in the chemical industry for moving from conventional methods to a data-driven approach. The scope of this document is limited to the adoption of BD in chemical industries and the variables identified in this article. To achieve this objective, government, academia, and industry must work together to overcome all present and future challenges.
Keywords: chemical engineering, big data analytics, industrial revolution, professional competence, data science
Procedia PDF Downloads 85
25879 Decision-Making in Higher Education: Case Studies Demonstrating the Value of Institutional Effectiveness Tools
Authors: Carolinda Douglass
Abstract:
Institutional Effectiveness (IE) is the purposeful integration of functions that foster student success and support institutional performance. IE is growing rapidly within higher education as it is increasingly viewed by higher education administrators as a beneficial approach for promoting data-informed decision-making in campus-wide strategic planning and execution of strategic initiatives. Specific IE tools, including, but not limited to, project management; impactful collaboration and communication; commitment to continuous quality improvement; and accountability through rigorous evaluation, are gaining momentum under the auspices of IE. This research utilizes a case study approach to examine the use of these IE tools, highlight successes of this use, and identify areas for improvement in the implementation of IE tools within higher education. The research includes three case studies: (1) improving upon academic program review processes, including the assessment of student learning outcomes as a core component of program quality; (2) revising an institutional vision, mission, and core values; and (3) successfully navigating an institution-wide re-accreditation process. Several methods of data collection are embedded within the case studies, including surveys, focus groups, interviews, and document analyses. Subjects of these methods include higher education administrators, faculty, and staff. Key findings from the research include areas of success and areas for improvement in the use of IE tools associated with specific case studies, as well as aggregated results across case studies. For example, the use of project management proved useful in all of the case studies, while rigorous evaluation did not uniformly provide the value-added that was expected by higher education decision-makers.
The use of multiple IE tools was shown to be consistently useful in decision-making when applied with appropriate awareness of and sensitivity to core institutional culture (for example, institutional mission, local environments and communities, disciplinary distinctions, and labor relations). As IE gains a stronger foothold in higher education, its leaders can make judicious use of IE tools to promote better decision-making and secure improved outcomes of strategic planning and the execution of strategic initiatives.
Keywords: accreditation, data-informed decision-making, higher education management, institutional effectiveness tools, institutional mission, program review, strategic planning
Procedia PDF Downloads 116
25878 Secure Multiparty Computations for Privacy Preserving Classifiers
Authors: M. Sumana, K. S. Hareesha
Abstract:
Secure computations are essential while performing privacy-preserving data mining. Distributed privacy-preserving data mining involves two or more sites that cannot pool their data at a third party without violating laws protecting the individual. Hence, in order to model the private data without compromising privacy or incurring information loss, secure multiparty computations are used. Secure computations of the product, mean, variance, dot product, and sigmoid function using the additive and multiplicative homomorphic properties are discussed. The computations are performed on vertically partitioned data, with a single site holding the class value.
Keywords: homomorphic property, secure product, secure mean and variance, secure dot product, vertically partitioned data
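As an illustrative building block for such protocols, the Python sketch below computes a secure sum with additive secret sharing: each party splits its private value into random shares so that no single party learns another's input. This is a standard textbook primitive, not necessarily the homomorphic construction used by the authors, and the names are hypothetical.

```python
import random

PRIME = 2 ** 61 - 1  # shares are taken modulo a large prime

def make_shares(value, n_parties):
    # Split a private integer into n additive shares that sum to the
    # value mod PRIME; any n-1 of the shares reveal nothing about it.
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((value - sum(shares)) % PRIME)
    return shares

def secure_sum(private_values):
    n = len(private_values)
    all_shares = [make_shares(v, n) for v in private_values]
    # Party j sums the j-th share of every input; the partial sums are
    # then combined, so no party ever sees another's raw value.
    partial = [sum(s[j] for s in all_shares) % PRIME for j in range(n)]
    return sum(partial) % PRIME
```

A secure dot product on vertically partitioned data can be assembled from the same idea by sharing the per-record products.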
Procedia PDF Downloads 412
25877 Curative Role of Bromoenol Lactone, an Inhibitor of Phospholipase A2 Enzyme, during Cigarette Smoke Condensate Induced Anomalies in Lung Epithelium
Authors: Subodh Kumar, Sanjeev Kumar Sharma, Gaurav Kaushik, Pramod Avti, Phulen Sarma, Bikash Medhi, Krishan Lal Khanduja
Abstract:
Background: It is well known that cigarette smoke is a causative factor in various lung diseases, especially cancer. Carcinogens and oxidant molecules present in cigarette smoke not only damage cellular constituents (lipids, proteins, DNA) but may also regulate the molecular pathways involved in inflammation and cancer. Continuous oxidative stress caused by the constituents of cigarette smoke leads to higher phospholipase A₂ (PLA₂) activity, resulting in elevated levels of secondary metabolites whose role in cancer is well defined. To reduce the burden of chronic inflammation, oxidative stress, and elevated levels of secondary metabolites, we checked the curative potential of the PLA₂ inhibitor bromoenol lactone (BEL) during continuous exposure to cigarette smoke condensate (CSC). Aim: To check the therapeutic potential of bromoenol lactone (BEL), an inhibitor of phospholipase A₂s, in pathways of CSC-induced changes in type I and type II alveolar epithelial cells. Methods: The effects of BEL on CSC-induced PLA₂ activity were checked using a colorimetric assay, cellular toxicity using a cell viability assay, membrane integrity using a fluorescein diacetate (FDA) uptake assay, reactive oxygen species (ROS) levels and apoptosis markers through flow cytometry, and cellular regulation using MAP kinase levels, in lung epithelium. Results: BEL significantly attenuated CSC-induced PLA₂ activity, ROS levels, apoptosis, and kinase levels, while improving cellular viability and membrane integrity. Conclusions: The current observations revealed that BEL may be a potential therapeutic agent for cigarette smoke-induced anomalies in lung epithelium.
Keywords: cigarette smoke condensate, phospholipase A₂, oxidative stress, alveolar epithelium, bromoenol lactone
Procedia PDF Downloads 189
25876 Online Monitoring and Control of Continuous Mechanosynthesis by UV-Vis Spectrophotometry
Authors: Darren A. Whitaker, Dan Palmer, Jens Wesholowski, James Flaherty, John Mack, Ahmad B. Albadarin, Gavin Walker
Abstract:
Traditional mechanosynthesis has been performed by either ball milling or manual grinding. However, neither of these techniques allows the easy application of process control. The temperature may change unpredictably due to friction in the process; hence the amount of energy transferred to the reactants is intrinsically non-uniform. Recently, it has been shown that the use of twin-screw extrusion (TSE) can overcome these limitations. Additionally, TSE enables a platform for continuous synthesis or manufacturing, as it is an open-ended process with feedstocks at one end and product at the other. Several materials, including metal-organic frameworks (MOFs), co-crystals, and small organic molecules, have been produced mechanochemically using TSE. The described advantages of TSE are offset by drawbacks such as increased process complexity (a large number of process parameters) and variation in feedstock flow impacting product quality. To handle the above-mentioned drawbacks, this study utilizes UV-Vis spectrophotometry (InSpectroX, ColVisTec) as an online tool to gain real-time information about the quality of the product. Additionally, this is combined with real-time process information in an advanced process control system (PharmaMV, Perceptive Engineering), allowing full supervision and control of the TSE process. Further, by characterizing the dynamic behavior of the TSE, a model predictive controller (MPC) can be employed to ensure the process remains under control when perturbed by external disturbances. Two reactions were studied: a Knoevenagel condensation of barbituric acid and vanillin, and the direct amidation of hydroquinone by ammonium acetate to form N-acetyl-para-aminophenol (APAP), commonly known as paracetamol. Both reactions could be carried out continuously using TSE; nuclear magnetic resonance (NMR) spectroscopy was used to confirm the percentage conversion of starting materials to product.
This information was used to construct partial least squares (PLS) calibration models within the PharmaMV development system, which relate the percentage conversion to the acquired UV-Vis spectrum. Once this was complete, the model was deployed within the PharmaMV real-time system to carry out automated optimization experiments maximizing the percentage conversion over a set of process parameters, in a design of experiments (DoE) style methodology. With the optimum set of process parameters established, a series of PRBS (pseudo-random binary sequence) process response tests around the optimum was conducted. The resultant dataset was used to build a statistical model and an associated MPC. The controller maximizes product quality whilst ensuring the process remains at the optimum even as disturbances, such as raw material variability, are introduced into the system. To summarize, a combination of online spectral monitoring and advanced process control was used to develop a robust system for the optimization and control of two TSE-based mechanosynthetic processes.
Keywords: continuous synthesis, pharmaceutical, spectroscopy, advanced process control
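A PRBS excitation signal of the kind used in such process response tests is typically generated with a maximal-length linear-feedback shift register. The Python sketch below generates a PRBS-7 sequence; the register length and taps are illustrative, not necessarily those used with PharmaMV.

```python
def prbs7(n_bits, seed=0b1111111):
    """Generate a PRBS-7 sequence from the primitive polynomial
    x^7 + x^6 + 1: a maximal-length pseudo-random binary sequence with
    period 2**7 - 1 = 127, commonly used to excite a process for
    system identification."""
    state = seed & 0x7F
    out = []
    for _ in range(n_bits):
        bit = ((state >> 6) ^ (state >> 5)) & 1  # feedback from stages 7 and 6
        out.append(state & 1)                    # output the low bit
        state = ((state << 1) | bit) & 0x7F
    return out
```

Because the polynomial is primitive, the sequence repeats only every 127 samples and contains 64 ones per period, giving a nearly flat excitation spectrum for identifying the process dynamics.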
Procedia PDF Downloads 177
25875 Changing Behaviour in the Digital Era: A Concrete Use Case from the Domain of Health
Authors: Francesca Spagnoli, Shenja van der Graaf, Pieter Ballon
Abstract:
Humans do not behave rationally. We are emotional and easily influenced by others, as well as by our context. The study of human behaviour has become a major endeavour within many academic disciplines, including economics, sociology, and clinical and social psychology. Understanding what motivates humans and triggers them to perform certain activities, and what it takes to change their behaviour, is central for researchers and companies, as well as for policy makers seeking to implement efficient public policies. While numerous theoretical approaches have been developed for diverse domains such as health, retail, and the environment, the methodological models guiding the evaluation of such research have long since reached their limits. Within this context, digitisation, information and communication technologies (ICT), wearables, the Internet of Things (IoT) connecting networks of devices, and new possibilities to collect and analyse massive amounts of data have made it possible to study behaviour from a realistic perspective, as never before. Digital technologies make it possible to (1) capture data in real-life settings, (2) regain control over data by capturing the context of behaviour, and (3) analyse huge sets of information through continuous measurement. Within this complex context, this paper describes a new framework for initiating behavioural change, capitalising on digital developments in applied research projects and applicable to academia, enterprises, and policy makers alike. By applying this model, behavioural research can be conducted to address the issues of different domains, such as mobility, environment, health, or media. The Modular Behavioural Analysis Approach (MBAA) is here described and firstly validated through a concrete use case within the domain of health.
The results gathered have proven that disclosing health information in connection with the use of digital health apps can be a lever for changing behaviour, but it is only a first component requiring further follow-up actions. To this end, a clear definition of different 'behavioural profiles', each addressed by different typologies of interventions, is essential to effectively enable behavioural change. In the refined version of the MBAA, a strong focus will be placed on defining a methodology for shaping 'behavioural profiles' and related interventions, as well as on evaluating side effects on the creation of new business models and sustainability plans.
Keywords: behavioural change, framework, health, nudging, sustainability
Procedia PDF Downloads 221
25874 Satellite Interferometric Investigations of Subsidence Events Associated with Groundwater Extraction in Sao Paulo, Brazil
Authors: B. Mendonça, D. Sandwell
Abstract:
The Metropolitan Region of Sao Paulo (MRSP) has suffered from serious water scarcity. Consequently, the most convenient solution has been drilling wells to extract groundwater from local aquifers. However, this requires constant vigilance to prevent over-extraction and future events that can pose a serious threat to the population, such as subsidence. Satellite radar interferometry (InSAR) allows continuous investigation of such phenomena. The data analysed in the present study consist of 23 SAR images dated from October 2007 to March 2011, obtained by the ALOS-1 spacecraft. Data processing was performed with the GMTSAR software, using the InSAR technique to create pairs of interferograms capturing ground displacement over different time spans. First results show a correlation between the locations of 102 wells registered in 2009 and signals of ground displacement equal to or lower than -90 millimeters (mm) in the region. The longest time-span interferogram obtained dates from October 2007 to March 2010. From that interferogram, it was possible to estimate the average displacement velocity in millimeters per year (mm/y) and to identify the areas of the MRSP where strong signals have persisted. Four specific areas with subsidence signals of 28 mm/y to 40 mm/y were chosen to investigate the phenomenon: Guarulhos (Sao Paulo International Airport), Greater Sao Paulo, Itaquera, and Sao Caetano do Sul. The signals covered areas between 0.6 km and 1.65 km in length. All areas are located above aquifers of the sedimentary type. Itaquera and Sao Caetano do Sul showed signals varying from 28 mm/y to 32 mm/y. On the other hand, the places most likely to be suffering from stronger subsidence are Greater Sao Paulo and Guarulhos, right beside the International Airport of Sao Paulo. The rate of displacement observed in both regions goes from 35 mm/y to 40 mm/y.
Previous investigations of water use at the International Airport highlighted the risks of the excessive water extraction being carried out through nine deep wells. Therefore, it is affirmed that subsidence events are likely to occur and to cause serious damage in the area. This study exposes a situation that, given its social and economic consequences, has not been explored with proper importance in the city. Since the data were only available until 2011, the question that remains is whether the situation still persists. A scenario of risk at the International Airport of Sao Paulo can be reaffirmed, however, and it needs further investigation.
Keywords: ground subsidence, Interferometric Synthetic Aperture Radar (InSAR), metropolitan region of Sao Paulo, water extraction
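For readers unfamiliar with the conversion behind such velocity maps, the Python sketch below turns unwrapped interferometric phase into line-of-sight displacement and a mean annual velocity. The L-band wavelength of roughly 23.6 cm for ALOS-1 PALSAR is an assumed constant, and the function names are illustrative.

```python
import math

ALOS_WAVELENGTH_M = 0.236  # approximate L-band PALSAR wavelength (assumed)

def phase_to_los_displacement(unwrapped_phase_rad, wavelength_m=ALOS_WAVELENGTH_M):
    # One full 2*pi phase cycle corresponds to half a wavelength of
    # line-of-sight displacement, because the radar path is two-way.
    # Sign conventions vary between processors.
    return unwrapped_phase_rad * wavelength_m / (4.0 * math.pi)

def mean_velocity_mm_per_year(displacement_m, span_days):
    # Average displacement rate over the interferogram's time span.
    return displacement_m * 1000.0 * 365.25 / span_days
```

Displacements away from the satellite would appear as negative values like the -90 mm signals reported, and dividing by the interferogram span yields rates comparable to the 28-40 mm/y figures above.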
Procedia PDF Downloads 354