Search results for: data quality
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 9635


8975 Efficacy of Garlic and Chili Combination Solution on Cabbage Insect Pests and Crop Growth in Vietnam

Authors: Nguyen Minh Tuan, Bui Lan Anh, Bui Nu Hoang Anh

Abstract:

The study was conducted to evaluate the efficiency of a garlic and chili combination solution in controlling insect pests in a cabbage crop. The solution was sprayed at different intervals after transplanting, and its efficiency against cabbage insect pests was measured. Results revealed that the garlic and chili combination solution effectively reduced cabbage insect pests. Moreover, the spray solution not only reduced the number of days required for cabbage growth but also greatly enhanced the leaf number, head diameter, head weight, and quality of the cabbage. The garlic and chili combination solution thus has positive effects on pest reduction and improves the growth, yield, and quality of the cabbage crop.

Keywords: Cabbage, Garlic, Chili, Diamondback moth, Cutworm, Flea Beetle, Quality.

8974 SVM Based Model as an Optimal Classifier for the Classification of Sonar Signals

Authors: Suresh S. Salankar, Balasaheb M. Patre

Abstract:

The classification of sonar signals has been taken up as a challenging task for neural networks. This paper investigates the design of an optimal classifier using a Multilayer Perceptron Neural Network (MLP NN) and Support Vector Machines (SVM). Results obtained using sonar data sets suggest that the SVM classifier performs well in comparison with the well-known MLP NN classifier. An average classification accuracy of 91.974% is achieved with the SVM classifier and 90.3609% with the MLP NN classifier on the test instances. The area under the Receiver Operating Characteristics (ROC) curve for the proposed SVM classifier on the test data set is found to be 0.981183, which is very close to unity and clearly confirms the excellent quality of the proposed classifier. The SVM classifier employed in this paper, implemented using the kernel Adatron algorithm, is seen to be robust and relatively insensitive to parameter initialization in comparison to the MLP NN.
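To make the comparison concrete, the following is a minimal sketch (not the paper's implementation, which uses the kernel Adatron algorithm) of training an SVM and an MLP classifier on sonar-style data and reporting accuracy and ROC AUC; the random data stand in for the real 60-feature sonar patterns.

```python
# Illustrative SVM vs. MLP comparison in the spirit of the abstract above.
# The paper's SVM is a kernel Adatron; scikit-learn's SVC is only a stand-in.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, roc_auc_score

# Hypothetical data: 60 features per pattern, binary labels (rock vs. mine),
# mirroring the layout of the UCI sonar data set. Replace with the real data.
rng = np.random.default_rng(0)
X = rng.normal(size=(208, 60))
y = rng.integers(0, 2, size=208)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

svm = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(24,), max_iter=2000).fit(X_tr, y_tr)

for name, model in [("SVM", svm), ("MLP NN", mlp)]:
    acc = accuracy_score(y_te, model.predict(X_te))
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: accuracy={acc:.3f}, ROC AUC={auc:.3f}")
```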

Keywords: Classification, MLP NN, backpropagation algorithm, SVM, Receiver Operating Characteristics.

8973 Numerical Investigation of Displacement Ventilation Effectiveness

Authors: Ramy H. Mohammed

Abstract:

Displacement ventilation of a room with an occupant is modeled using CFD. The geometry of the manikin is accurately represented in the CFD model to minimize potential errors. The indoor zero-equation turbulence model is used to simulate all cases, and the effect of thermal radiation from the manikin is taken into account. After validation of the code, the predicted mean vote, mean age of air, and ventilation effectiveness are used to predict the thermal comfort zones and indoor air quality. The effect of the inlet velocity and temperature on thermal comfort and indoor air quality is investigated. The results show that the inlet velocity has a great effect on thermal comfort and indoor air quality and that a low inlet velocity is sufficient to establish comfortable conditions inside the room. In addition, the displacement ventilation system achieves not only thermal comfort in ventilated rooms but also savings in fan power.

Keywords: Displacement ventilation, Energy saving, Thermal comfort, Turbulence model.

8972 Probability Distribution of Rainfall Depth at Hourly Time-Scale

Authors: S. Dan'azumi, S. Shamsudin, A. A. Rahman

Abstract:

Rainfall data at fine resolution, and knowledge of its characteristics, play a major role in the efficient design and operation of agricultural, telecommunication, runoff and erosion control, and water quality control systems. This paper studies the statistical distribution of hourly rainfall depth for 12 representative stations spread across Peninsular Malaysia. Hourly rainfall data covering periods of 10 to 22 years were collected and their statistical characteristics estimated. Three probability distributions, namely the Generalized Pareto, Exponential and Gamma distributions, were proposed to model the hourly rainfall depth, and three goodness-of-fit tests, namely the Kolmogorov-Smirnov, Anderson-Darling and Chi-Squared tests, were used to evaluate their fitness. Results indicate that the east coast of the Peninsula receives greater rainfall depths than the west coast; however, the rainfall frequency is found to be irregular. Results from the goodness-of-fit tests also show that all three models fit the rainfall data at the 1% level of significance. However, the Generalized Pareto distribution fits better than the Exponential and Gamma distributions and is therefore recommended as the best fit.
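As an illustration of the fitting and goodness-of-fit step, the sketch below fits the three candidate distributions to a placeholder rainfall sample with SciPy and reports Kolmogorov-Smirnov statistics; the data values and the restriction to the KS test alone are assumptions for brevity.

```python
# Minimal sketch of the distribution-fitting step described above: fit the
# Generalized Pareto, Exponential and Gamma distributions to hourly rainfall
# depths and compare Kolmogorov-Smirnov statistics. The rainfall array is a
# placeholder; in the study it would come from one of the 12 stations.
import numpy as np
from scipy import stats

rainfall = np.array([0.2, 1.5, 0.8, 3.2, 0.1, 7.4, 2.2, 0.5, 4.1, 1.1])  # mm

candidates = {
    "Generalized Pareto": stats.genpareto,
    "Exponential": stats.expon,
    "Gamma": stats.gamma,
}

for name, dist in candidates.items():
    params = dist.fit(rainfall)                       # maximum-likelihood fit
    d, p = stats.kstest(rainfall, dist.cdf, args=params)
    print(f"{name}: KS statistic={d:.3f}, p-value={p:.3f}")
```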

Keywords: Goodness-of-fit test, Hourly rainfall, Malaysia, Probability distribution.

8971 Performance Evaluation of Data Mining Techniques for Predicting Software Reliability

Authors: Pradeep Kumar, Abdul Wahid

Abstract:

Accurate software reliability prediction not only enables developers to improve the quality of software but also provides useful information to help them plan valuable resources. This paper examines the performance of three well-known data mining techniques (CART, TreeNet and Random Forest) for predicting software reliability. We evaluate and compare the performance of the proposed models with a Cascade Correlation Neural Network (CCNN) using sixteen empirical databases from the Data and Analysis Center for Software. The goal of our study is to help project managers concentrate their testing efforts to minimize software failures and thereby improve the reliability of software systems. Two performance measures, Normalized Root Mean Squared Error (NRMSE) and Mean Absolute Error (MAE), illustrate that the CART model is more accurate than the models built using Random Forest, TreeNet and CCNN on all datasets used in our study. Finally, we conclude that such methods can help in reliability prediction using real-life failure datasets.
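For reference, a minimal sketch of the two error measures follows; note that NRMSE admits several normalisations, and normalising by the observed range here is an assumption rather than the paper's stated definition.

```python
# Sketch of the two error measures used above.
import numpy as np

def mae(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Mean absolute error."""
    return float(np.mean(np.abs(actual - predicted)))

def nrmse(actual: np.ndarray, predicted: np.ndarray) -> float:
    """Root mean squared error normalised by the observed range (assumed)."""
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    return float(rmse / (actual.max() - actual.min()))

# Placeholder reliability values (e.g. observed vs. predicted failure counts).
actual = np.array([12.0, 7.0, 30.0, 22.0, 15.0])
predicted = np.array([10.5, 8.2, 27.9, 24.1, 14.3])
print(f"MAE = {mae(actual, predicted):.3f}, NRMSE = {nrmse(actual, predicted):.3f}")
```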

Keywords: Classification, Cascade Correlation Neural Network, Random Forest, Software reliability, TreeNet.

8970 On Measuring the Reusability Proneness of Mobile Applications

Authors: Fathi Taibi

Abstract:

The abnormal increase in the number of applications available for download in Android markets is a good indication that they are being reused. However, little is known about their real reusability potential. A considerable number of these applications are reported as having poor quality or being malicious. Hence, in this paper, an approach to measuring the reusability potential of classes in Android applications is proposed. The approach is not meant specifically for this particular type of application; rather, it is intended for Object-Oriented (OO) software systems in general and also aims to provide a means to prevent the classes of low-quality and defect-prone applications from being reused directly through inheritance and instantiation. An empirical investigation is conducted to measure and rank the reusability potential of the classes of randomly selected Android applications. The results obtained are thoroughly analyzed in order to understand the extent of this potential and the factors influencing it.

Keywords: Reusability, Software Quality Factors, Software Metrics, Empirical Investigation, Object-Oriented Software, Android Applications.

8969 A Novel Approach to Optimal Cutting Tool Replacement

Authors: Cem Karacal, Sohyung Cho, William Yu

Abstract:

In metal cutting industries, mathematical and statistical models are typically used to predict tool replacement time. These off-line methods usually result in less than optimal replacement times, thereby either wasting resources or causing quality problems. The few online, real-time methods proposed use indirect measurement techniques and are prone to similar errors. Our idea is based on identifying the optimal replacement time using an electronic nose to detect the airborne compounds released when tool wear reaches a chemical substrate doped into the tool material during fabrication. The study investigates the feasibility of the idea and possible doping materials and methods, along with data stream mining techniques for detecting and monitoring the different phases of tool wear.

Keywords: Tool condition monitoring, cutting tool replacement, data stream mining, e-Nose.

8968 Improvement of Realization Quality of Aerospace Products Using Augmented Reality Technology

Authors: Nuran Bahar, Mehmet A. Akcayol

Abstract:

In the aviation industry, many faults frequently occur during the maintenance processes and assembly operations of complex structured aircraft because of the high interdependence of their components. These faults adversely affect the quality of aircraft parts or developed modules. Technical employees require a long time and considerable labor to check the correctness of each component. In addition, personnel must be trained regularly because of ever-growing and changing technology, and the cost of this training is generally very high. Augmented Reality (AR) technology reduces the cost of training radically and improves the effectiveness of the training. In this study, the usage of AR technology in the aviation industry has been investigated and the effectiveness of AR with heads-up display glasses has been examined. An application has been developed to compare the production process carried out with AR against the manual one.

Keywords: Aerospace, assembly quality, augmented reality, heads-up display.

8967 Missing Link Data Estimation with Recurrent Neural Network: An Application Using Speed Data of Daegu Metropolitan Area

Authors: JaeHwan Yang, Da-Woon Jeong, Seung-Young Kho, Dong-Kyu Kim

Abstract:

In ITS, information on link characteristics is an essential factor for planning and operation. In practice, however, not every link has sensors installed on it; a link that has no data is called a "missing link". The purpose of this study is to impute the data of these missing links. To obtain these data, this study applies machine learning, and in particular deep learning, so that missing link data can be estimated from the data of links with sensors. For the deep learning process, this study uses a Recurrent Neural Network to handle time-series road data. As input data, Dedicated Short-Range Communications (DSRC) data from Dalgubul-daero in the Daegu Metropolitan Area were fed into the learning process. The neural network takes 17 links with available data as input and uses 2 hidden layers to estimate the data of 1 missing link. As a result, the forecasted data for the target link show about 94% accuracy compared with the actual data.
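A minimal Keras sketch of such a network is shown below; the window length, layer sizes and optimiser are illustrative assumptions, with only the 17-input, two-hidden-layer, one-output shape taken from the abstract.

```python
# Sketch of the network described above: time-series speed data from 17
# observed links as input, two hidden recurrent layers, and one output for
# the missing link. Hyperparameters here are assumptions, not the paper's.
import numpy as np
import tensorflow as tf

TIMESTEPS, N_OBSERVED = 12, 17   # e.g. 12 past intervals of 17 observed links

model = tf.keras.Sequential([
    tf.keras.Input(shape=(TIMESTEPS, N_OBSERVED)),
    tf.keras.layers.SimpleRNN(32, return_sequences=True),  # hidden layer 1
    tf.keras.layers.SimpleRNN(16),                          # hidden layer 2
    tf.keras.layers.Dense(1),                               # missing-link speed
])
model.compile(optimizer="adam", loss="mse")

# Placeholder DSRC-style speed data: (samples, timesteps, observed links).
X = np.random.rand(256, TIMESTEPS, N_OBSERVED).astype("float32")
y = np.random.rand(256, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
```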

Keywords: Data Estimation, link data, machine learning, road network.

8966 A Goal-Driven Crime Scripting Framework

Authors: Hashem Dehghanniri

Abstract:

Crime scripting is a simple and effective crime modeling technique that aims to improve security analysts' understanding of security and crime incidents. Low-quality scripts provide a wrong, incomplete, or overly complicated understanding of the crime commission process, which defeats the purpose of their application, e.g., identifying effective and cost-efficient situational crime prevention (SCP) measures. One important and overlooked factor in generating quality scripts is the crime scripting method. This study investigates the problems within existing crime scripting practices and proposes a crime scripting approach that contributes to generating quality crime scripts. It was validated by experienced crime scripters. This framework helps analysts develop better crime scripts and contributes to their effective application, e.g., in SCP measure identification or policy-making.

Keywords: Attack modeling, crime commission process, crime script, situational crime prevention.

8965 Investigating Daylight Quality in Malaysian Government Office Buildings Through Daylight Factor and Surface Luminance

Authors: Mohd Zin Kandar, Mohd Sabere Sulaiman, Yong Razidah Rashid, Dilshan Remaz Ossen, Aminatuzuhariah MAbdullah, Lim Yaik Wah, Mansour Nikpour

Abstract:

In recent years, there has been increasing interest in using daylight to save energy in buildings. In tropical regions, daylighting is always an energy saver; in addition, daylight provides visual comfort. Standards show that many criteria should be taken into consideration in order to achieve daylight utilization and visual comfort. The current standard in Malaysia, MS 1525, does not provide sufficient guidance; hence, more research is needed on daylight performance. If architects do not consider daylight in their design, it not only causes inconvenience in working spaces but also leads to more energy consumption as well as environmental pollution. This research surveyed daylight performance in five selected office buildings in different areas of Malaysia through field measurements. Several parameters of daylight quality, such as daylight factor, surface luminance and surface luminance ratio, were measured in different rooms in each building. The results demonstrated that most of the buildings were not designed for daylight utilization. It is therefore very important that architects follow daylight design recommendations to reduce the consumption of electric power for artificial lighting whenever sufficient daylight is available.
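For clarity, the sketch below shows the standard definitions behind two of the measured quantities, the daylight factor and a surface luminance ratio; the sample readings are placeholders, not measurements from the surveyed buildings.

```python
# Daylight factor: indoor illuminance as a percentage of the simultaneous
# outdoor illuminance under an overcast sky. Luminance ratio: ratio between
# two measured surface luminances.
def daylight_factor(e_indoor_lux: float, e_outdoor_lux: float) -> float:
    """DF = (E_indoor / E_outdoor) * 100 [%]."""
    return 100.0 * e_indoor_lux / e_outdoor_lux

def luminance_ratio(l_task: float, l_surround: float) -> float:
    """Ratio between task-surface and surrounding-surface luminance (cd/m^2)."""
    return l_task / l_surround

print(f"DF = {daylight_factor(250.0, 12000.0):.1f} %")       # placeholder lux readings
print(f"Luminance ratio = {luminance_ratio(120.0, 40.0):.1f}")
```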

Keywords: Daylight factor, Field measurement, Daylighting quality, Tropical

8964 CNet Module Design of IMCS

Authors: Youkyung Park, SeungYup Kang, SungHo Kim, SimKyun Yook

Abstract:

IMCS is an Integrated Monitoring and Control System for thermal power plants. This system consists of two main parts, controllers and the OIS (Operator Interface System), which are connected by Ethernet-based communication. The controller side of the communication is managed by the CNet module and the OIS side by the data server of the OIS. The CNet module sends controller data to the data server and receives command data from it. To minimize or balance the load on the data server, the module buffers the data created by the controller in every cycle and sends the buffered data to the data server on request. For multiple data servers, the module manages the connection to each data server and responds to each server's requests. The CNet module is included in each controller of the redundant system, so when a controller fail-over happens, the module can provide controller data to the data server without loss. This paper presents the three main features of the CNet module that carry out these functions: separation of the get task, usage of a ring buffer, and monitoring of the communication status.
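The ring-buffer behaviour can be illustrated with a generic sketch like the one below (not the IMCS code): the controller pushes one record per cycle and the buffer is drained on a data-server request, with the oldest records overwritten once capacity is reached.

```python
# Illustrative ring buffer in the spirit of the CNet module described above.
from collections import deque

class CycleRingBuffer:
    def __init__(self, capacity: int = 1024):
        # When the buffer is full, the oldest cycle's record is overwritten.
        self._buf = deque(maxlen=capacity)

    def push(self, record: dict) -> None:
        """Called once per controller cycle."""
        self._buf.append(record)

    def drain(self) -> list:
        """Called on a data-server request; returns and clears buffered data."""
        records = list(self._buf)
        self._buf.clear()
        return records

buf = CycleRingBuffer(capacity=4)
for cycle in range(6):
    buf.push({"cycle": cycle, "value": cycle * 0.1})
print(buf.drain())   # only the 4 most recent cycles are kept
```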

Keywords: Ethernet communication, DCS, power plant, ring buffer, data integrity

8963 Integrating Competences into Work Planning – The Influence of Competence-Based Parameters on Strategic Business Objectives

Authors: G. Meyer, M. Klewer, P. Nyhuis

Abstract:

Constantly changing economic conditions require companies to design their production to be more economical, innovative, and flexible. Since workers have a decisive influence on cost, time, and quality, e.g. by monitoring indicators that determine quality, by developing processes more resistant to disturbances, or by monitoring environmental standards, a focus on personnel as a production factor is needed. This presupposes the efficient use and systematic enhancement of employees’ existing competences since greater consideration of these aspects in work planning will help to enhance competitiveness. The aim of the research project ‘Integrated Technology- and Competence-based Work Planning in Socio-Technical Systems’ is to develop a new work planning method that combines technology with work science by incorporating employees’ skills as a quality indicator. For employee competences to increase competitiveness, it is first of all necessary to assess how competences affect cost, time, and quality. A model for deriving predictions about the effects of competence-based parameters on these strategic business objectives is developed in this paper.

Keywords: Competence management, education and training, employee competences, one-factor-at-a-time method, work planning.

8962 A Large Ion Collider Experiment (ALICE) Diffractive Detector Control System for RUN-II at the Large Hadron Collider

Authors: J. C. Cabanillas-Noris, M. I. Martínez-Hernández, I. León-Monzón

Abstract:

The selection of diffractive events in the ALICE experiment during the first data taking period (RUN-I) of the Large Hadron Collider (LHC) was limited by the range over which rapidity gaps occur. It would be possible to achieve better measurements by expanding the range in which the production of particles can be detected. For this purpose, the ALICE Diffractive (AD0) detector has been installed and commissioned for the second phase (RUN-II). Any new detector should be able to take data synchronously with all other detectors and be operated through the ALICE central systems. One of the key elements that must be developed for the AD0 detector is the Detector Control System (DCS). The DCS must be designed to operate this detector safely and correctly. Furthermore, the DCS must provide optimum operating conditions for the acquisition and storage of physics data and ensure these are of the highest quality. The operation of AD0 implies the configuration of about 200 parameters, from electronics settings and power supply levels to the archiving of operating conditions data and the generation of safety alerts. It also includes the automation of procedures to get the AD0 detector ready for taking data in the appropriate conditions for the different run types in ALICE. The performance of the AD0 detector depends on a number of parameters such as the nominal voltages for each photomultiplier tube (PMT), their threshold levels to accept or reject the incoming pulses, the definition of triggers, etc. All these parameters define the efficiency of AD0 and have to be monitored and controlled through the AD0 DCS. Finally, the AD0 DCS provides the operator with multiple interfaces to execute these tasks, realized as operating panels and scripts running in the background. These features are implemented on a SCADA software platform as a distributed control system which is integrated into the global control system of the ALICE experiment.

Keywords: AD0, ALICE, DCS, LHC.

8961 A Model Predictive Control Based Virtual Active Power Filter Using V2G Technology

Authors: Mahdi Zolfaghari, Seyed Hossein Hosseinian, Hossein Askarian Abyaneh, Mehrdad Abedi

Abstract:

This paper presents a virtual active power filter (VAPF) using vehicle-to-grid (V2G) technology to meet power quality requirements. The optimal discrete operation of the power converter of the electric vehicle (EV) is based on recognizing the desired switching states using a model predictive control (MPC) algorithm. A fast dynamic response, lower total harmonic distortion (THD) and good reference tracking performance are realized through the presented control strategy. Simulation results using MATLAB/Simulink validate the effectiveness of the scheme in improving power quality as well as providing a good dynamic response in power transfer capability.
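As a side note on the THD figure, the sketch below computes THD from the FFT of a sampled current waveform; the synthetic 50 Hz signal with 5th and 7th harmonics is a placeholder, not output from the paper's MATLAB/Simulink model.

```python
# THD = sqrt(sum of harmonic magnitudes squared) / fundamental magnitude.
import numpy as np

fs, f1 = 10_000, 50                       # sampling rate and fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)
i = np.sin(2*np.pi*f1*t) + 0.10*np.sin(2*np.pi*5*f1*t) + 0.05*np.sin(2*np.pi*7*f1*t)

spectrum = np.abs(np.fft.rfft(i))
freqs = np.fft.rfftfreq(len(i), 1 / fs)

fund = spectrum[np.argmin(np.abs(freqs - f1))]
harmonics = [spectrum[np.argmin(np.abs(freqs - k*f1))] for k in range(2, 40)]
thd = np.sqrt(np.sum(np.square(harmonics))) / fund
print(f"THD = {100*thd:.2f} %")   # expected near sqrt(0.10**2 + 0.05**2) = 11.2 %
```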

Keywords: Virtual active power filter, V2G technology, model predictive control, electric vehicle, power quality.

8960 The Importance of Analysis of Internal Quality Management Systems and Self-Examination Processes in Engineering Accreditation Processes

Authors: Wilfred Fritz

Abstract:

The accreditation process of engineering degree programmes is based on various reports evaluated by the relevant governing bodies of the institution of higher education. One of the aforementioned reports for the accreditation process is a self-assessment report which is to be completed by the applying institution. This paper seeks to emphasise the importance of analysis of internal quality management systems and self-examination processes in the engineering accreditation processes. A description of how the programme fulfils the criteria should be given. Relevant stakeholders all need to contribute in the writing and structuring of the self-assessment report. The last step is to gather evidence in the form of supporting documentation. In conclusion, the paper also identifies learning outcomes in a case study in seeking accreditation from an international relevant professional body.

Keywords: Accreditation, governing bodies, self-assessment report, quality management.

8959 Operating Equipment Effectiveness with a Reliability Indicator

Authors: Carl D. Hays III

Abstract:

The purpose of this theory paper is to add a reliability indicator to Operating Equipment Effectiveness (OpEE) which is used to evaluate the productivity of machines and equipment with wheels and tracks. OpEE is a derivative of Overall Equipment Effectiveness (OEE) which has been widely used for many decades in factories that manufacture products. OEE has three variables, Availability Rate, Work Rate, and Quality Rate. When OpEE was converted from OEE, the Quality Rate variable was replaced with Travel Rate. Travel Rate is essentially utilization which is a common performance indicator in machines and equipment. OpEE was designed for machines operated in remote locations such as forests, roads, fields, and farms. This theory paper intends to add the Quality Rate variable back to OpEE by including a reliability indicator in the dashboard view. This paper will suggest that the OEE quality variable can be used with a reliability metric and combined with the OpEE score. With this dashboard view of both performance metrics and reliability, fleet managers will have a more complete understanding of equipment productivity and reliability. This view will provide both leading and lagging indicators of performance in machines and equipment. The lagging indicators will indicate the trends and the leading indicators will provide an overall performance score to manage.
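A toy sketch of such a dashboard calculation is given below; by analogy with OEE, the three OpEE rates are multiplied, and an MTBF-based availability is used as the reliability indicator, both of which are assumptions for illustration rather than the paper's prescribed formulas.

```python
# Toy dashboard: OpEE score alongside a reliability indicator.
def opee(availability: float, work_rate: float, travel_rate: float) -> float:
    """OpEE as the product of its three rates (assumed, by analogy with OEE)."""
    return availability * work_rate * travel_rate

def reliability(mtbf_hours: float, mttr_hours: float) -> float:
    """A simple reliability indicator: inherent availability MTBF/(MTBF+MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

score = opee(availability=0.90, work_rate=0.85, travel_rate=0.80)
rel = reliability(mtbf_hours=400.0, mttr_hours=20.0)
print(f"OpEE = {score:.1%}, reliability indicator = {rel:.1%}")
```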

Keywords: Operating Equipment Effectiveness, IoT, Contamination Monitoring.

8958 Laboratory Investigations on Mechanical Properties of High Volume Fly Ash Concrete and Composite Sections

Authors: Aravindkumar B. Harwalkar, S. S. Awanti

Abstract:

The use of fly ash as a supplementary cementing material in large volumes can bring both technological and economic benefits to the concrete industry. In this investigation, mix proportions for high volume fly ash concrete were determined at cement replacement levels of 50%, 55%, 60% and 65% with low-calcium fly ash. Flexural and compressive strengths of the different mixes were measured at ages of 7, 28 and 90 days. The flexural strength of a composite section prepared from pavement quality and lean high volume fly ash concrete was determined at the age of 28 days. High volume fly ash concrete mixes exhibited higher rates of strength gain and higher age factors than the corresponding reference concrete mixes. The optimum cement replacement level for pavement quality concrete was found to be 60%. Consideration of the bond between pavement quality and lean high volume fly ash concrete will be beneficial in the design of rigid pavements.

Keywords: Composite section, Compressive strength, Flexural strength, Fly ash.

8957 Effect of Dietary Chromium Yeast on Thigh Meat Quality of Broiler Chicks in Heat Stress Condition

Authors: Majid Toghyani, Abbas Ali Gheisari, Ali Khodami, Mehdi Toghyani, Mohammad Mohammadrezaei, Ramin Bahadoran

Abstract:

This experiment was conducted to investigate the effect of different levels of dietary chromium yeast (Cr-yeast) on the thigh meat quality of broiler chicks reared under heat stress conditions. Two hundred and forty male Ross chickens kept under heat stress (33±3°C) were allocated to five treatments in a completely randomized design. Treatments were supplemented with 0 (control), 200, 400, 800 and 1200 μg kg-1 Cr in the form of Cr-yeast. Twelve chicks from each treatment were slaughtered at 42 d to evaluate the moisture, protein, lipid, pH and lipid oxidation of thigh meat. The protein, moisture, lipid content and pH of thigh meat were not affected by supplemental Cr, although thigh meat lipid tended to decrease in broilers receiving 1200 μg kg-1. Storage time increased the lipid oxidation of the meat (P<0.01), while lipid oxidation of thigh muscle after two days of storage was affected by supplemental Cr and decreased (P<0.05). The results of this study showed that dietary Cr-yeast supplementation improved the thigh meat quality of broiler chicks under heat stress conditions.

Keywords: Broiler, Heat stress, Chromium yeast, Thigh meat quality.

8956 Big Data: Concepts, Technologies and Applications in the Public Sector

Authors: A. Alexandru, C. A. Alexandru, D. Coardos, E. Tudora

Abstract:

Big Data (BD) is associated with a new generation of technologies and architectures which can harness the value of extremely large volumes of very varied data through real-time processing and analysis. It involves changes in (1) data types, (2) accumulation speed, and (3) data volume. This paper presents the main concepts related to the BD paradigm and introduces architectures and technologies for BD and BD sets. The integration of BD with the Hadoop framework is also underlined. BD has attracted a lot of attention in the public sector due to newly emerging technologies that make network access widely available. The volume of different types of data has increased exponentially. Some applications of BD in the public sector in Romania are briefly presented.

Keywords: Big data, big data Analytics, Hadoop framework, cloud computing.

8955 A Frugal Bidding Procedure for Replicating WWW Content

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions and genetic algorithms.
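As a generic illustration of the auction idea (not the paper's exact mechanism), the sketch below runs a second-price sealed-bid auction in which each site's agent bids its estimated benefit from hosting a replica of an object and the winner is charged the second-highest bid, which discourages overstated bids.

```python
# Toy second-price (Vickrey) auction for placing one replica of a data object.
from typing import Dict, Tuple

def run_auction(bids: Dict[str, float]) -> Tuple[str, float]:
    """Return (winning site, price paid) for one data object."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0   # second-highest bid
    return winner, price

# Hypothetical benefit estimates (e.g. saved access latency) per candidate site.
bids = {"site_A": 12.4, "site_B": 9.1, "site_C": 15.8}
winner, price = run_auction(bids)
print(f"Replica placed at {winner}, charged {price}")
```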

Keywords: Internet, data content replication, static allocation, mechanism design, equilibrium.

8954 Investigating Technical and Pedagogical Considerations in Producing Screen Recorded Videos

Authors: M. Nikafrooz, J. Darsareh

Abstract:

Due to the COVID-19 pandemic, its impact on education all over the world, and the problems arising from the use of traditional teaching methods during the pandemic, it was necessary to apply alternative solutions to achieve educational goals. In this regard, electronic content production through screen recording became popular among many teachers. However, the production of screen-recorded videos requires special technical and pedagogical considerations. The purpose of this study was to extract and present the technical and pedagogical considerations for producing screen-recorded videos, in order to provide a useful and comprehensive guideline for e-content producers. This was applied research with a descriptive design, and data were collected using a qualitative method: 524 previously produced screen-recorded videos were evaluated using an open-ended questionnaire. After the data were collected and categorized, 83 items of technical and pedagogical considerations, grouped into 5 domains, were determined. Applying these considerations is expected to decrease production and editing time, increase technical and pedagogical quality, and ultimately facilitate and enhance the processes of teaching and learning.

Keywords: E-learning, e-content, screen recorded-videos, screen recording software, technical and pedagogical considerations.

8953 Strengthening Legal Protection of Personal Data through Technical Protection Regulation in Line with Human Rights

Authors: Tomy Prihananto, Damar Apri Sudarmadi

Abstract:

Indonesia recognizes the right to privacy as a human right and provides legal protection for data management activities because the protection of personal data is part of human rights. This paper aims to describe the arrangements for data management and data protection in Indonesia. It is descriptive research with a qualitative approach, collecting data through a literature study. The result is a comprehensive overview of the arrangements for data protection that have been set up as technical requirements based on encryption methods. Arrangements on encryption and on the protection of personal data are mutually reinforcing in the protection of personal data. Indonesia has two important and immediately enacted laws that provide protection for the privacy of information as part of human rights.

Keywords: Indonesia, protection, personal data, privacy, human rights, encryption.

8952 Methane and Other Hydrocarbon Gas Emissions Resulting from Flaring in Kuwait Oilfields

Authors: Khaireyah Kh. Al-Hamad, V. Nassehi, A. R. Khan

Abstract:

Air pollution is a major environmental health problem affecting developed and developing countries around the world. Increasing amounts of potentially harmful gases and particulate matter are being emitted into the atmosphere on a global scale, resulting in damage to human health and the environment. Petroleum-related air pollutants can have a wide variety of adverse environmental impacts. In the crude oil production sector, there is a strong need for thorough knowledge of the gaseous emissions resulting from the flaring of associated gas of known composition, on a daily basis, through combustion activities under several operating conditions. This can help in controlling gaseous emissions from flares and thus in protecting their immediate and distant surroundings against environmental degradation. The impacts of methane and non-methane hydrocarbon emissions from flaring activities at oil production facilities in the Kuwait Oilfields have been assessed through a screening study using records of flaring operations taken at the gas and oil production sites, and by analyzing available meteorological and air quality data measured at stations located near anthropogenic sources. In the present study, the Industrial Source Complex (ISCST3) dispersion model is used to calculate the ground-level concentrations of methane and non-methane hydrocarbons emitted due to flaring across the Kuwait Oilfields. The simulation of real hourly air quality in and around oil production facilities in the State of Kuwait for the year 2006, obtained by inserting the respective source emission data into the ISCST3 software, indicates that the levels of non-methane hydrocarbons from the flaring activities exceed the allowable ambient air standard set by the Kuwait EPA. There is therefore a strong need to address this acute problem and minimize the impact of methane and non-methane hydrocarbons released from flaring activities over the urban area of Kuwait.

Keywords: Kuwait Oilfields, ISCST3 model, flaring, Air pollution, Methane and non-methane hydrocarbons.

8951 Performance Analysis of Polycrystalline and Monocrystalline Solar Module in Dhaka, Bangladesh

Authors: N. J. Imu, N. Rabbani, Md E. Hossain

Abstract:

Achieving national climate goals requires transforming the energy system and increasing the use of renewable energy in Bangladesh, as renewable energy offers an environmentally friendly energy supply. In view of this, Bangladesh has set a goal of 100% renewable power generation by 2050. Among all renewable energy sources, solar is the most effective and popular in Bangladesh. To build up on-grid and off-grid solar systems and advance this energy transformation, both monocrystalline (highly efficient) and polycrystalline (less efficient) solar modules are commonly used. Due to their low price and availability, polycrystalline solar modules dominated the local market in past years; however, in recent times the use of monocrystalline modules has increased considerably owing to the significant decrease in the price difference between the two types. Despite the deployment of both mono- and poly-crystalline modules, the proliferation of low-quality solar panels dominates the market, resulting in lower generation of solar electricity than expected. This situation is further aggravated by insufficient information regarding the effect of solar irradiation on module performance in relation to the quality of the materials used to produce the module. This research aims to evaluate the efficiency of monocrystalline and polycrystalline solar modules available in Bangladesh while considering seasonal variations. Both types of solar modules were tested at three different capacities (45 W, 60 W and 100 W) in the Dhaka region to evaluate their power generation capability under Standard Test Conditions (STC). Module testing data were recorded over twelve months, from January to December. Solar irradiation data were collected using the HT304N, while the HT I-V400 multifunction instrument was used to test the voltage and current of the photovoltaic (PV) systems and as a complete power quality analyzer. The results indicate differences between the efficiencies of polycrystalline and monocrystalline solar modules under the country's solar irradiation. The average efficiencies of the 45 W, 60 W and 100 W monocrystalline solar panels were 11.73%, 13.41% and 15.37%, respectively, while those of the polycrystalline panels were 8.66%, 9.37% and 12.34%. The Pearson correlation values also indicate that monocrystalline solar panels offer greater working output than polycrystalline ones. The output of polycrystalline solar panels fluctuated strongly with changes in irradiation and temperature, whereas that of monocrystalline panels was much more stable.
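The efficiency figures above follow from the standard definition of module efficiency as output power divided by the irradiance incident on the module area, as in the short sketch below with placeholder numbers (module area and measured power are assumptions, not values from the study).

```python
# Module efficiency = P_out / (irradiance * module area), as a percentage.
def module_efficiency(p_out_w: float, irradiance_w_m2: float, area_m2: float) -> float:
    """Efficiency = P_out / (G * A), expressed in percent."""
    return 100.0 * p_out_w / (irradiance_w_m2 * area_m2)

# e.g. a nominal 100 W module of about 0.65 m^2 producing 78 W at 800 W/m^2
print(f"Efficiency = {module_efficiency(78.0, 800.0, 0.65):.1f} %")
```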

Keywords: Solar energy, solar irradiation, efficiency, polycrystalline solar module, monocrystalline solar module, SPSS analysis.

8950 A Practical Approach for Testing the Process Quality

Authors: Mou-Yuan Liao, Chien-Wei Wu, Chien-Hua Lin

Abstract:

The process capability index Cpk is the most widely used index in making managerial decisions, since it provides bounds on the process yield for normally distributed processes. However, existing methods for assessing process performance that are constructed by statistical inference may unfortunately lead to misleading results, because uncertainties exist in most real-world applications. Thus, this study adopts fuzzy inference to deal with the testing of Cpk. A concise score is obtained for assessing a supplier's process instead of a strict evaluation.
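For reference, the crisp index that the fuzzy assessment builds on can be computed as in the sketch below, using the usual definition of Cpk; the sample measurements and specification limits are placeholders.

```python
# Cpk compares the distance from the process mean to the nearer specification
# limit against three standard deviations.
import numpy as np

def cpk(samples: np.ndarray, lsl: float, usl: float) -> float:
    """Cpk = min((USL - mean)/(3*sigma), (mean - LSL)/(3*sigma))."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    return min((usl - mu) / (3 * sigma), (mu - lsl) / (3 * sigma))

samples = np.array([10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7])
print(f"Cpk = {cpk(samples, lsl=9.0, usl=11.0):.2f}")
```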

Keywords: Process capability analysis, quality control.

8949 Big Brain: A Single Database System for a Federated Data Warehouse Architecture

Authors: X. Gumara Rigol, I. Martínez de Apellaniz Anzuola, A. Garcia Serrano, A. Franzi Cros, O. Vidal Calbet, A. Al Maruf

Abstract:

Traditional federated architectures for data warehousing work well when corporations have existing regional data warehouses and there is a need to aggregate data at a global level. Schibsted Media Group has been maturing from a decentralised organisation into a more globalised one and needed to build some of the regional data warehouses for certain brands at the same time as the global one. In this paper, we present the architectural alternatives studied and why a custom federated approach was the recommendation taken forward to implementation. Although the data warehouses are logically federated, the implementation uses a single database system, which presented many advantages: cost reduction and improved data access for global users, allowing consumers of the data to have a common data model for detailed analysis across different geographies together with a flexible layer for local, specific needs in the same place.

Keywords: Data integration, data warehousing, federated architecture, online analytical processing.

8948 A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Authors: Samee Ullah Khan, C. Ardil

Abstract:

Fine-grained data replication over the Internet allows duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism is the study of what happens when independent agents act selfishly and how to control them to maximize the overall performance. A bidding mechanism asks how one can design systems so that the agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. The comparisons are recorded against some well-known techniques such as greedy, branch and bound, game-theoretical auctions and genetic algorithms.

Keywords: Data replication, auctions, static allocation, pricing.

8947 Amelioration of Cardiac Arrhythmia Classification Performance Using Artificial Neural Network, Adaptive Neuro-Fuzzy and Fuzzy Inference Systems Classifiers

Authors: Alexandre Boum, Salomon Madinatou

Abstract:

This paper aims to bring a scientific contribution to cardiac arrhythmia biomedical diagnosis systems, more precisely to the study of the amelioration of cardiac arrhythmia classification performance using artificial neural network, adaptive neuro-fuzzy and fuzzy inference system classifiers. The purpose of this amelioration is to enable cardiologists to make reliable diagnoses through automatic cardiac arrhythmia analyses and classifications based on high-confidence classifiers. In this study, six classes of the most commonly encountered arrhythmias are considered: the Right Bundle Branch Block, the Left Bundle Branch Block, the Ventricular Extrasystole, the Auricular Extrasystole, Atrial Fibrillation and the normal cardiac beat. From the parameters extracted from the electrocardiogram (ECG), we constructed a matrix (360x360) serving as an input data sample for the classifiers based on neural networks and a matrix (1x6) for the classifier based on fuzzy logic. By varying three parameters (the quality of the neural network learning, the data size and the quality of the input parameters), the automatic classification permitted us to obtain the following performances: in terms of correct classification rate, 83.6% was obtained using the fuzzy logic based classifier, 99.7% using the neural network based classifier and 99.8% with the adaptive neuro-fuzzy based classifier. These results are based on signals containing at least 360 cardiac cycles. Based on the comparative analysis of the aforementioned three arrhythmia classifiers, the classifiers based on neural networks exhibit better performance.

Keywords: Adaptive neuro-fuzzy, artificial neural network, cardiac arrhythmias, fuzzy inference systems.

8946 Supplier Selection in a Scenario Based Stochastic Model with Uncertain Defectiveness and Delivery Lateness Rates

Authors: Abeer Amayri, Akif A. Bulgak

Abstract:

Due to today's globalization and companies' outsourcing practices, supply chain (SC) performance has become more dependent on the efficient movement of material among geographically dispersed locations, where there is a greater chance of disruptions. One source of such disruptions is the quality and delivery uncertainty of outsourcing. These uncertainties could render products unsafe and, as a number of recent examples show, companies may end up recalling their products. As a result of these problems, there is a need to develop a methodology for selecting suppliers globally in view of the risks associated with low quality and late delivery. Accordingly, we developed a two-stage stochastic model that captures the risks associated with uncertainty in quality and delivery, as well as a solution procedure for the model. The stochastic model simultaneously optimizes supplier selection and purchase quantities under price discounts over a time horizon. In particular, our target is the study of global organizations with multiple sites and multiple overseas suppliers, where pricing is offered in the suppliers' local currencies. The proposed methodology is applied to a case study of a US automotive company with two assembly plants and four potential global suppliers to illustrate how the proposed model works in practice.

Keywords: Global supply chains, quality, stochastic programming, supplier selection.
