Search results for: Statistical quality control
3421 Minimizing Grid Reliance: A Power Model Approach for Peak Hour Demand Based on Hybrid Solar Systems
Authors: Almutasim Billa A. Alanazi, Hal S. Tharp
Abstract:
Electrical energy demand has increased due to population growth and the variety of new electrical load technologies, and this increased demand nearly doubles during peak hours. Consequently, new power plant infrastructure must be constructed, a costly approach given the expense of construction, future maintenance, and environmental impact. As an alternative, most electrical utilities raise the price of electricity during peak hours, encouraging consumers to use less electricity during peak periods under Time-Of-Use programs, which may not be suitable for all consumers. Furthermore, in some areas, excessive demand and a lack of supply cause electrical outages, posing considerable stress and challenges for electrical utilities and consumers. However, control systems, artificial intelligence (AI), and renewable energy (RE), when effectively integrated, provide new solutions to mitigate excessive demand during peak hours. This paper presents a power model that reduces reliance on the power grid during peak hours by connecting a hybrid solar system to a residential house through a power management controller that prioritizes power flow between photovoltaic (PV) production, battery backup, and the utility electrical grid. As a result, dependence on the utility grid during peak hours was reduced to between 3% and 18%, improving energy stability safely and efficiently for electrical utilities, consumers, and communities, and providing a viable alternative to conventional approaches such as Time-Of-Use programs.
Keywords: Artificial intelligence, AI, control system, photovoltaic, PV, renewable energy.
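The priority logic the abstract describes can be made concrete with a short sketch. The function below is a hypothetical illustration (names, thresholds and ratings are assumptions, not the paper's controller): it serves the load from PV first, then from the battery, and draws from the grid only for the remainder.

```python
def dispatch_power(load_kw, pv_kw, battery_soc,
                   soc_min=0.2, battery_max_kw=1.0):
    """Toy priority dispatch: serve the load from PV first, then from
    the battery (down to a reserve state of charge), and only then
    from the utility grid. All names and ratings are illustrative."""
    from_pv = min(load_kw, pv_kw)
    residual = load_kw - from_pv
    from_batt = 0.0
    if residual > 0 and battery_soc > soc_min:
        from_batt = min(residual, battery_max_kw)
        residual -= from_batt
    return {"pv": from_pv, "battery": from_batt, "grid": max(residual, 0.0)}

# Peak-hour example: 5 kW load, 3.5 kW of PV, half-charged battery.
print(dispatch_power(5.0, 3.5, battery_soc=0.5))
# -> PV 3.5 kW, battery 1.0 kW, grid 0.5 kW (10% of the load)
```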
3420 Dissecting Big Trajectory Data to Analyse Road Network Travel Efficiency
Authors: Rania Alshikhe, Vinita Jindal
Abstract:
Digital innovation has played a crucial role in managing smart transportation. For this, big trajectory data collected from traveling vehicles, such as taxis with installed global positioning system (GPS)-enabled devices, can be utilized. Such data offer an unprecedented opportunity to trace the movements of vehicles at fine spatiotemporal granularity. This paper aims to explore big trajectory data to measure the travel efficiency of road networks using the proposed statistical travel efficiency measure (STEM) across an entire city. Further, it identifies the causes of low travel efficiency with the proposed least square approximation network-based causality exploration (LANCE). Finally, the resulting data analysis reveals the causes of low travel efficiency, along with the road segments that need to be optimized to improve traffic conditions and thus minimize the average travel time from a given point A to point B in the road network. The obtained results show that the proposed approach outperforms the baseline algorithms for measuring the travel efficiency of a road network.
Keywords: GPS trajectory, road network, taxi trips, digital map, big data, STEM, LANCE
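The abstract does not give the STEM formula, so the snippet below is only a plausible stand-in, stated as an assumption: a segment-level efficiency defined as mean observed speed over free-flow speed, which captures the same idea of scoring how far a road operates below its unimpeded travel time.

```python
import numpy as np

def travel_efficiency(observed_speeds, free_flow_speed):
    """Hypothetical stand-in for the paper's STEM: mean observed speed
    on a road segment divided by its free-flow speed, so 1.0 means
    unimpeded travel and values near 0 mean heavy congestion."""
    return float(np.mean(observed_speeds)) / free_flow_speed

# GPS-derived speeds (km/h) for one segment over a day (invented):
speeds = [18.0, 22.5, 30.1, 12.4, 25.0]
print(f"efficiency = {travel_efficiency(speeds, free_flow_speed=50.0):.2f}")
```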
3419 A Study of Students’ Perceptions Regarding the Effectiveness of Semester and Annual Examination System at Institute of Education and Research
Authors: Ayesha Batool, Saghir Ahmad, Abid Hussain Ch.
Abstract:
The art of examination is probably the most difficult one in the whole range of educational practices. The semester system is a system of examination set within an institute by its own teachers. The annual system is a system of examination constructed and administered by an agency outside the institute; it enables the teacher to estimate the effectiveness of instruction, and students to estimate the progress they have made. The semester system of examinations, on the other hand, requires following the curriculum strictly, while teaching methods are left to the choice of teachers. The main purpose of the study was to investigate university students’ perceptions regarding the effectiveness of the semester system and the annual system. The study was quantitative in nature. The sample consisted of 200 students. A five-point Likert-type scale was used to collect the data. Statistical measures such as frequencies, means, standard deviations, and a one-way ANOVA test were applied to analyze the data. The major findings of the study indicated that in the semester system, students do not spend much time on political activities and develop their study habits. The study also revealed that the annual system of examination does not satisfy the educational aspirations of the students.
Keywords: Effectiveness, semester system, annual system.
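As an illustration of the analysis named above, the sketch below runs descriptive statistics and a one-way ANOVA on made-up five-point Likert responses (the groups and values are invented; the study's own data are not reproduced here).

```python
import numpy as np
from scipy import stats

# Illustrative 5-point Likert responses from three student groups
# (the real study used 200 respondents; these values are made up).
group_a = np.array([4, 5, 3, 4, 4, 5, 2, 4])
group_b = np.array([3, 3, 2, 4, 3, 2, 3, 3])
group_c = np.array([4, 4, 5, 5, 4, 3, 4, 5])

for name, g in [("A", group_a), ("B", group_b), ("C", group_c)]:
    print(f"group {name}: mean={g.mean():.2f}, sd={g.std(ddof=1):.2f}")

# One-way ANOVA tests whether the group means differ significantly.
f_stat, p_value = stats.f_oneway(group_a, group_b, group_c)
print(f"one-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")
```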
3418 Dynamic Variation in Nano-Scale CMOS SRAM Cells Due to LF/RTS Noise and Threshold Voltage
Authors: M. Fadlallah, G. Ghibaudo, C. G. Theodorou
Abstract:
Dynamic variation in memory devices such as Static Random Access Memory (SRAM) can give errors in read or write operations. In this paper, the effect of low-frequency and random telegraph noise on the dynamic variation of one SRAM cell is detailed. The effect on circuit noise, speed, and processing time is examined using the Supply Read Retention Voltage (SRRV) and the Read Static Noise Margin. New test run methods are also developed. The obtained simulation results show the importance of noise caused by dynamic variation, and the impact of random telegraph noise on SRAM variability is examined by evaluating the statistical distributions of random telegraph noise amplitude in the pull-up and pull-down transistors. The threshold voltage mismatch between neighboring cell transistors due to intrinsic fluctuations typically contributes to larger reductions in static noise margin. The contribution of each SRAM transistor to the total dynamic variation has also been identified.
Keywords: Low-frequency noise, Random Telegraph Noise, Dynamic Variation, SRRV.
3417 Burstiness Reduction of a Doubly Stochastic AR-Modeled Uniform Activity VBR Video
Authors: J. P. Dubois
Abstract:
Stochastic modeling of network traffic is an area of significant research activity for current and future broadband communication networks. Multimedia traffic is statistically characterized by a bursty variable bit rate (VBR) profile. In this paper, we develop an improved model for uniform activity level video sources in ATM, using a doubly stochastic autoregressive model driven by an underlying spatial point process. We then examine a number of burstiness metrics, such as the peak-to-average ratio (PAR), the temporal autocovariance function (ACF) and the traffic measurements histogram. We found that the first measure is most suitable for capturing the burstiness of single-scene video traffic. In the last phase of this work, we analyse statistical multiplexing of several constant scene video sources. This proved, as expected, to be advantageous with respect to reducing the burstiness of the traffic, as long as the sources are statistically independent. We observed that the burstiness diminished rapidly, with the largest gain occurring when only around 5 sources are multiplexed. The novel model used in this paper for characterizing uniform activity video was thus found to be an accurate model.
Keywords: AR, ATM, burstiness, doubly stochastic, statistical multiplexing.
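A minimal sketch of the burstiness measurement and multiplexing experiment follows. It substitutes a plain AR(1) process for the paper's doubly stochastic model (a stated simplification; all parameters are invented) and shows the PAR of the aggregate falling as independent sources are multiplexed, with most of the gain by about five sources.

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_bitrate(n, mean=2.0, phi=0.9, sigma=0.3):
    """Plain AR(1) bit-rate trace standing in for the paper's doubly
    stochastic AR model (parameters are invented)."""
    x = np.empty(n)
    x[0] = mean
    for t in range(1, n):
        x[t] = mean + phi * (x[t - 1] - mean) + rng.normal(0.0, sigma)
    return np.clip(x, 0.0, None)

def par(trace):
    """Peak-to-average ratio, the burstiness metric favored above."""
    return trace.max() / trace.mean()

print(f"PAR, 1 source  : {par(ar1_bitrate(10_000)):.2f}")
# Statistical multiplexing: the PAR of the aggregate falls quickly,
# with most of the gain reached by about 5 independent sources.
for k in (2, 5, 20):
    aggregate = sum(ar1_bitrate(10_000) for _ in range(k))
    print(f"PAR, {k:2d} sources: {par(aggregate):.2f}")
```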
3416 Public Economic Efficiency and Case-Based Reasoning: A Theoretical Framework to Police Performance
Authors: Javier Parra-Domínguez, Juan Manuel Corchado
Abstract:
At present, public efficiency is a concept that intends to maximize the return on public investment, focusing on minimizing the use of resources and maximizing the outputs. The concept takes into account statistical criteria drawn up according to techniques such as DEA (Data Envelopment Analysis). The purpose of the current work is to consider, more precisely, the theoretical application of CBR (Case-Based Reasoning), from economics and computer science, as a preliminary step to improving the efficiency of law enforcement agencies (the public sector). With the aim of increasing the efficiency of the public sector, we have entered a phase whose main objective is the implementation of new technologies. Our main conclusion is that the application of computer techniques, such as CBR, has become key to the efficiency of the public sector, which continues to require economic valuation based on methodologies such as DEA. As a theoretical result and conclusion, the incorporation of CBR systems will reduce the number of inputs and increase, theoretically, the number of outputs generated based on previous computer knowledge.
Keywords: Case-based reasoning, knowledge, police, public efficiency.
3415 Analysis of Web User Identification Methods
Authors: Renáta Iváncsy, Sándor Juhász
Abstract:
Web usage mining has become a popular research area, as a huge amount of data is available online. These data can be used for several purposes, such as web personalization, web structure enhancement, web navigation prediction, etc. However, the raw log files are not directly usable; they have to be preprocessed in order to transform them into a suitable format for different data mining tasks. One of the key issues in the preprocessing phase is to identify web users. Identifying users based on web log files is not a straightforward problem, thus various methods have been developed. There are several difficulties that have to be overcome, such as client-side caching, changing and shared IP addresses, and so on. This paper presents three different methods for identifying web users. Two of them are the most commonly used methods in web log mining systems, whereas the third one is our novel approach that uses a complex cookie-based method to identify web users. Furthermore, we also take steps towards identifying the individuals behind the impersonal web users. To demonstrate the efficiency of the new method, we developed an implementation called the Web Activity Tracking (WAT) system that aims at a more precise distinction of web users based on log data. We present some statistical analysis created by the WAT on real data about the behavior of Hungarian web users, and a comprehensive analysis and comparison of the three methods.
Keywords: Data preparation, tracking individuals, web user identification, web usage mining.
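The core idea of a cookie-first identification pass can be sketched in a few lines. The records and field names below are invented for illustration, and the WAT system itself works on raw server logs and is more elaborate: prefer the cookie as the user key and fall back to the (IP, user-agent) pair when no cookie is present.

```python
from collections import defaultdict

# Toy parsed log records; field names are invented for this sketch.
log = [
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "u123", "url": "/a"},
    {"ip": "10.0.0.1", "agent": "IE6",     "cookie": "u456", "url": "/b"},
    {"ip": "10.0.0.2", "agent": "Firefox", "cookie": None,   "url": "/a"},
    {"ip": "10.0.0.1", "agent": "Firefox", "cookie": "u123", "url": "/c"},
]

def identify_users(records):
    """Prefer the cookie as the user key; fall back to the
    (IP, user-agent) pair when no cookie is present. This mirrors the
    idea, not the exact algorithm, of the cookie-based method."""
    sessions = defaultdict(list)
    for r in records:
        key = r["cookie"] if r["cookie"] else (r["ip"], r["agent"])
        sessions[key].append(r["url"])
    return dict(sessions)

for user, pages in identify_users(log).items():
    print(user, pages)
```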
3414 Protective Effect of Saponin Extract from the Root of Garcinia kola (Bitter kola) against Paracetamol-Induced Hepatotoxicity in Albino Rats
Authors: Yemisi Rufina Alli Smith, Isaac Gbadura Adanlawo
Abstract:
Liver disorders are one of the major health problems of the world. Despite their frequent occurrence, high morbidity and high mortality, their medical management is currently inadequate. This study was designed to evaluate the hepatoprotective effect of saponin extract of the root of Garcinia kola on the integrity of the liver of paracetamol-induced Wistar albino rats. Twenty-five (25) male adult Wistar albino rats were divided into five (5) groups. Group I was the control group that received distilled water only, group II was the negative control that received 2 g/kg of paracetamol on the 13th day, and groups III, IV and V were pre-treated with 100, 200 and 400 mg/kg of the saponin extract before liver damage was induced on the 13th day with 2 g/kg of paracetamol. Twenty-four (24) h after administration, the rats were sacrificed and blood samples were collected. The serum alanine transaminase (ALT), aspartate transaminase (AST) and alkaline phosphatase (ALP) activities, bilirubin and conjugated bilirubin, glucose and protein concentrations were evaluated. The liver was fixed immediately in formalin and was processed and stained with haematoxylin and eosin (H&E). Administration of saponin extract from the root of Garcinia kola significantly decreased the paracetamol-induced elevated enzymes in the test groups. Histological observations also showed that the saponin extract exhibited significant liver protection against the toxicant, as evidenced by the cells tending to return to normal. The saponin extract from the root of Garcinia kola thus indicated protection of the structural integrity of the hepatocyte cell membrane and regeneration of the damaged liver.
Keywords: Garcinia kola, Hepatoprotective, paracetamol, Saponin.
3413 Further the Effectiveness of Software Testability Measure
Authors: Liang Zhao, Feng Wang, Bo Deng, Bo Yang
Abstract:
Software testability has been proposed to address the problems of the increasing cost of testing and of software quality. A testability measure provides a quantified way to denote the testability of software. Since the 1990s, many testability measure models have been proposed to address this problem. By discussing the contradiction between domain testability and the domain range ratio (DRR), a new testability measure, semantic fault distance, is proposed, and its validity is discussed.
Keywords: Software testability, DRR, Domain testability.
3412 Active Power Filter Dimensioning Using a Hysteresis Current Controller
Authors: Tarek A. Kasmieh, Hassan S. Omran
Abstract:
This paper aims to give a full study of the dynamic behavior of a single-phase active power filter. First, the principle of the parallel active power filter is introduced. Then, a dimensioning procedure for all its components is explained in detail, including the input filter and the current and voltage controllers. The active power filter is simulated using the OrCAD program, showing the validity of the theoretical study.
Keywords: Active power filter, power quality, hysteresis current controller.
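To make the hysteresis current controller concrete, here is a minimal bang-bang tracking loop for an R-L branch fed from a DC link. All component values are invented for illustration and are unrelated to the paper's OrCAD design: the switch toggles whenever the current leaves a band around the sinusoidal reference.

```python
import numpy as np

# Minimal hysteresis (bang-bang) current controller sketch; all
# values are illustrative, not the paper's design.
L, R, Vdc = 5e-3, 1.0, 400.0   # inductance (H), resistance (ohm), DC link (V)
band = 0.5                     # hysteresis band (A)
dt = 1e-6                      # 1 us simulation step
t = np.arange(0.0, 0.02, dt)   # one 50 Hz period
i_ref = 10.0 * np.sin(2 * np.pi * 50 * t)   # sinusoidal current reference

i, u = 0.0, 1                  # branch current, switch state
i_log = np.empty_like(t)
for k in range(t.size):
    if i > i_ref[k] + band:
        u = 0                  # upper bound hit: apply -Vdc
    elif i < i_ref[k] - band:
        u = 1                  # lower bound hit: apply +Vdc
    v = Vdc if u else -Vdc
    i += dt * (v - R * i) / L  # Euler step of L di/dt = v - R i
    i_log[k] = i

print(f"max tracking error: {np.abs(i_log - i_ref).max():.2f} A")
```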
3411 Analysis of Cost Estimation and Payment Systems for Consultant Contracts in the US, Japan, China and the UK
Authors: Shih-Hsu Wang, Yuan-Yuan Cheng, Ming-Tsung Lee, Wei-Chih Wang
Abstract:
Determining reasonable fees is the main objective in designing cost estimation and payment systems for consultant contracts. However, project clients utilize different cost estimation and payment systems because of their varying views on the reasonableness of consultant fees. This study reviews the cost estimation and payment systems of consultant contracts in four countries: the US (Washington State Department of Transportation), Japan (Ministry of Land, Infrastructure, Transport and Tourism), China (Engineering Design Charging Standard) and the UK (Her Majesty's Treasury). Specifically, this work investigates the budgeting process, contractor selection method, contractual price negotiation process, cost review, and cost-control concept of the systems used in these countries. The main finding indicates that the project client's view on whether the fee is high affects the way the client controls it. In the US, the fee is commonly considered to be high; as a result, a stringent auditing system (low flexibility given to the consultant) is applied. In the UK, the fee is viewed as low when compared to the total life-cycle project cost, so a system with high flexibility in budgeting and cost review is given to the consultant. In terms of the flexibility allowed to the consultant, the systems applied in Japan and China fall between those of the US and the UK. Both the US and UK systems are helpful in determining a reasonable fee. However, in the US system, rigid auditing standards must be established and additional cost-audit manpower is required; in the UK system, sufficient historical cost data are needed to evaluate the reasonableness of the consultant's proposed fee.
Keywords: Consultant Services, Cost Estimation and Payment System, Payment Flexibility, Cost-control Concept
3410 3D Scaffolds Fabricated by Microfluidic Device for Rat Cardiomyocytes Observation
Authors: Chih-Wei Chao, Jiashing Yu
Abstract:
To mimic the natural circumstances of cell growth in an organism, we present three-dimensional (3D) scaffolds fabricated by microfluidics for cultivation. This work investigates the cellular behaviors of rat cardiomyocytes in gelatin 3D scaffolds, such as proliferation, viability and morphology, compared to those on a 2D control. We found that the scaffolds may induce skeletal differentiation of H9c2 cells.
Keywords: Microfluidic device, H9c2, tissue engineering, 3D scaffolds.
3409 Simulating Flow Transients in Conveying Pipeline Systems by Rigid Column and Full Elastic Methods: Pump Combined with Air Chamber
Authors: I. Abuiziah, A. Oulhaj, K. Sebari, D. Ouazar, A. A. Saber
Abstract:
In water pipeline systems, flow control is an integral part of operation, for instance opening and closing valves or starting and stopping pumps; when these operations are performed very quickly, they cause hydraulic transient phenomena, which may lead to pump and valve failures and catastrophic pipe ruptures. Fluid transient analysis is one of the most challenging and complicated flow problems in the design and operation of water pipeline systems. Transient control has become an essential requirement for ensuring the safe operation of water pipeline systems, and accurate analysis and suitable protection devices should be used to protect them. The fourth-order Runge-Kutta method has been used to solve the dynamic and continuity equations in the rigid column method, while the method of characteristics is used to solve these equations in the full elastic method. This paper presents the modeling and simulation of transient phenomena in conveying pipeline systems based on the rigid column and full elastic methods, and examines the influence of protection devices in protecting pipeline systems from damage due to the pressure rise that occurs in the transient state. The results obtained show that the model is an efficient tool for flow transient analysis and that the two methods give approximately identical results. Moreover, using the closed surge tank reduces the unfavorable effects of transients.
Keywords: Flow transient, Pipeline, Air chamber, Numerical model, Protection devices, Elastic method, Rigid column method.
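The rigid-column side of the comparison reduces to a single ordinary differential equation, which makes the fourth-order Runge-Kutta step easy to show. The sketch below integrates a reservoir-fed pipe discharging to atmosphere with Darcy-Weisbach friction; the pipe data are invented, and the model is deliberately simpler than the paper's pump-and-air-chamber system.

```python
import numpy as np

# Rigid-column model of a pipe discharging from a reservoir (head H0)
# to atmosphere, with Darcy-Weisbach friction. RK4 integrates
# dV/dt = g*H0/L - f*V|V|/(2D). All values are illustrative.
g, L, D, f, H0 = 9.81, 1000.0, 0.5, 0.02, 50.0

def dvdt(v):
    return g * H0 / L - f * v * abs(v) / (2 * D)

def rk4_step(v, dt):
    k1 = dvdt(v)
    k2 = dvdt(v + 0.5 * dt * k1)
    k3 = dvdt(v + 0.5 * dt * k2)
    k4 = dvdt(v + dt * k3)
    return v + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

v, dt = 0.0, 0.5
for step in range(1, 201):
    v = rk4_step(v, dt)
    if step % 50 == 0:
        print(f"t={step * dt:5.1f} s  V={v:.3f} m/s")
# V approaches sqrt(2*g*D*H0/(f*L)) ~ 4.95 m/s, the steady state.
```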
3408 Design and Analysis of a Piezoelectric Linear Motor Based on Rigid Clamping
Authors: Chao Yi, Cunyue Lu, Lingwei Quan
Abstract:
Piezoelectric linear motors have the characteristics of great electromagnetic compatibility, high positioning accuracy, compact structure and no deceleration mechanism, which make them promising for application in micro-miniature precision drive systems. However, most piezoelectric motors employ flexible clamping, which has insufficient rigidity and is difficult to use in rapid positioning. Another problem is that this clamping method seriously affects the vibration efficiency of the vibrating unit. In order to solve these problems, this paper proposes a piezoelectric stack linear motor based on double-end rigid clamping. First, a piezoelectric linear motor with a length of only 35.5 mm is designed. This motor is mainly composed of a motor stator, a driving foot, a ceramic friction strip, a linear guide, a pre-tightening mechanism and a base. This structure is much simpler and smaller than most similar motors, and it is easy to assemble as well as to realize precise control. In addition, the properties of the piezoelectric stack are reviewed and, in order to obtain an elliptic motion trajectory of the driving head, a driving scheme for the longitudinal-shear composite stack is innovatively proposed. Finally, impedance analysis and speed performance testing were performed on the piezoelectric linear motor prototype. The motor reaches a speed of up to 25.5 mm/s under the excitation of a signal voltage of 120 V at a frequency of 390 Hz. The results show that the proposed piezoelectric stacked linear motor achieves great performance. It can run smoothly over a large speed range, which makes it suitable for precision control in medical imaging, aerospace, precision machinery and many other fields.
Keywords: Elliptical trajectory, linear motor, piezoelectric stack, rigid clamping.
3407 Techniques of Construction Management in Civil Engineering
Authors: Mamoon M. Atout
Abstract:
The Middle East Gulf region has witnessed rapid growth and development in many areas over the last two decades. The development of the real-estate sector, the construction industry and infrastructure projects represents a major share of the development that has contributed to the advancement of the Gulf countries. Construction industry projects were planned and managed by different types of experts, who came from all over the world with different kinds of experience in construction management and industry. Some of these projects were completed on time, while many were not, due to many accumulating factors. These accumulated factors are considered the principal reason for the problems experienced at the project construction stage, which reflected negatively on project success. Specific causes of delay have been identified by construction managers to avoid any unexpected delays through proper analysis and consideration of implications such as risk assessment and analysis of potential problems, to ensure that projects are delivered on time. Construction management implications were adopted and considered by project managers who have experience and knowledge in applying the techniques of engineering construction management. The aim of this research is to determine the benefits of these implications for the construction team, and the level of consideration of the techniques and processes during the project development and construction phases, in order to avoid delays in projects. It also aims to determine the factors that contribute to project completion delays when project managers are not well committed to their roles and responsibilities. The results of the analysis will determine the applications required by the project team to avoid the causes of delays and help them deliver projects on time, e.g. verifying tender documents and quantities, and preparing the construction method of the project.
Keywords: Construction management, control process, cost control, planning and scheduling, roles and responsibilities.
3406 The Guideline of Overall Competitive Advantage Promotion with Key Success Paths
Authors: M. F. Wu, F. T. Cheng, C. S. Wu, M. C. Tan
Abstract:
It is a critical time to upgrade technology and increase value added through manufacturing-skill development and management strategies that can highly satisfy customers' needs in the global precision machinery market. In recent years, on the supply side, precision machinery manufacturers in every country have faced price-reduction pressure from the demand side, which pushes high-end precision machinery manufacturers to adopt a low-cost, high-quality strategy to recapture the market. Because of this global market trend, manufacturers must adopt price-reduction strategies and upgrade the technology of low-end machinery for differentiation in order to consolidate the market. Using six key success factors (KSFs), namely customer perceived value, customer satisfaction, customer service, product design, product effectiveness and machine structure quality, as causal conditions, this study explores their impact on the competitive advantage of the enterprise, such as overall profitability and product pricing power. This research uses the key success paths (KSPs) approach and fsQCA software to explore various combinations of causal relationships, so as to fully understand how the performance levels of the KSFs relate to the business objectives in order to achieve competitive advantage. In this study, a combination of causal relationships is called a key success path (KSP); the KSPs guide the enterprise to achieve specific business outcomes. The findings of this study indicate that there are thirteen KSPs that achieve overall profitability, sixteen KSPs that achieve product pricing power, and seventeen KSPs that achieve both overall profitability and pricing power of the enterprise. The KSPs provide directions for resource integration and allocation, improving the utilization efficiency of limited resources to realize the continuing vision of the enterprise.
Keywords: Precision Machinery Industry, Key Success Factors (KSFs), Key Success Paths (KSPs), Overall Profitability, Product Pricing Power, Competitive Advantages.
3405 Improvement in Power Transformer Intelligent Dissolved Gas Analysis Method
Authors: S. Qaedi, S. Seyedtabaii
Abstract:
Non-destructive evaluation of in-service power transformer condition is necessary for avoiding catastrophic failures, and Dissolved Gas Analysis (DGA) is one of the important methods. Traditional, statistical and intelligent DGA approaches have been adopted for accurate classification of incipient fault sources. Unfortunately, there are often not enough faulty patterns for sufficient training of intelligent systems. By bootstrapping, this shortcoming is expected to be alleviated, and algorithms with better classification success rates to be obtained. In this paper, the performance of artificial neural network, K-Nearest Neighbor and support vector machine methods using bootstrapped data is detailed, and it is shown that while the success rate of the ANN algorithms improves remarkably, the outcomes of the others do not benefit as much from the enlarged data space. For assessment, two databases are employed: IEC TC10 and a dataset collected from data reported in papers. The high average test success rate clearly exhibits the remarkable outcome.
Keywords: Dissolved gas analysis, transformer incipient fault, artificial neural network, support vector machine (SVM), K-Nearest Neighbor (KNN).
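A compact sketch of the bootstrapping step follows: the scarce training patterns are resampled with replacement to enlarge the training set before fitting the classifiers. The gas data below are synthetic placeholders, not the IEC TC10 database, and scikit-learn defaults stand in for the paper's tuned models.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Stand-in DGA data: 5 gas concentrations per sample, 3 fault classes
# derived from the features so the task is learnable (synthetic).
X = rng.lognormal(mean=1.0, sigma=0.8, size=(60, 5))
y = X[:, :3].argmax(axis=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Bootstrapping: resample the scarce faulty patterns with replacement
# to enlarge the training space before fitting each classifier.
idx = rng.integers(0, len(X_tr), size=4 * len(X_tr))
X_boot, y_boot = X_tr[idx], y_tr[idx]

for name, clf in [("KNN", KNeighborsClassifier(n_neighbors=3)),
                  ("SVM", SVC(kernel="rbf"))]:
    clf.fit(X_boot, y_boot)
    print(f"{name} test success rate: {clf.score(X_te, y_te):.2f}")
```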
3404 Effects of Video Games and Online Chat on Mathematics Performance in High School: An Approach of Multivariate Data Analysis
Authors: Lina Wu, Wenyi Lu, Ye Li
Abstract:
Taking heavy video game playing among boys and avid online chatting among girls as emblematic of current adolescent culture, this data analysis project verifies the displacement effect of these activities on deteriorating mathematics performance. To evaluate the correlation and regression coefficients between a factor of playing video games or chatting online and mathematics performance, compared with other factors, we use multivariate analysis techniques and take gender difference into account. We find that the most important reason for the negative sign of the displacement effect on mathematics performance is students’ poor academic background. The statistical analysis methods in this project could be applied to study internet users’ academic performance from high school education through college education.
Keywords: Correlation coefficients, displacement effect, gender difference, multivariate analysis technique, regression coefficients.
3403 Tool Failure Detection Based on Statistical Analysis of Metal Cutting Acoustic Emission Signals
Authors: Othman Belgassim, Krzysztof Jemielniak
Abstract:
The analysis of the Acoustic Emission (AE) signal generated by metal cutting processes has often been approached statistically. This is due to the stochastic nature of the emission signal, a result of factors affecting the signal from its generation through transmission and sensing. Different techniques are applied in this manner, each of which is suitable for certain processes. In metal cutting, where the emission generated by the deformation process is rather continuous, an appropriate method for analysing the AE signal based on the root mean square (RMS) of the signal is often used and is suitable for use with conventional signal processing systems. The aim of this paper is to set out a strategy for tool failure detection in turning processes via statistical analysis of the AE generated from the cutting zone. The strategy is based on investigating the distribution moments of the AE signal at predetermined sampling intervals; the skewness and kurtosis of these distributions are the key elements in the detection. A normal (Gaussian) distribution was first suggested but was then eliminated due to its insufficiency. The so-called beta distribution was then considered; it has been used with an assumed beta density function and has given promising results with regard to chipping and tool breakage detection.
Keywords: AE signal, skew, kurtosis, tool failure.
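The moment-based detection idea is easy to reproduce on synthetic data. The sketch below computes windowed RMS, skewness and kurtosis of a fabricated AE record with an injected late burst (none of it is measured data) and fits a beta density to the rescaled amplitudes, the distribution the strategy above prefers over the Gaussian.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Fabricated AE record: continuous cutting noise with a late "tool
# failure" burst grafted on (illustrative only, not measured data).
signal = rng.normal(0.0, 1.0, 50_000)
signal[40_000:] += rng.normal(0.0, 4.0, 10_000) * (rng.random(10_000) < 0.05)

window = 5_000
for start in range(0, signal.size, window):
    seg = signal[start:start + window]
    rms = np.sqrt(np.mean(seg ** 2))
    print(f"t={start:6d}: RMS={rms:.2f} "
          f"skew={stats.skew(seg):+.2f} kurt={stats.kurtosis(seg):+.2f}")

# Beta fit to the amplitudes rescaled into the open interval (0, 1):
scaled = (signal - signal.min() + 1e-6) / (np.ptp(signal) + 2e-6)
a, b, _, _ = stats.beta.fit(scaled, floc=0, fscale=1)
print(f"beta fit: a={a:.2f}, b={b:.2f}")
```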
3402 Development of a Real-Time Energy Model for a Photovoltaic Water Pumping System
Authors: Ammar Mahjoubi, Ridha Fethi Mechlouch, Belgacem Mahdhaoui, Ammar Ben Brahim
Abstract:
The purpose of this paper is to develop and validate a model to accurately predict the cell temperature of a PV module, a model that adapts to various mounting configurations, mounting locations, and climates while only requiring data readily available from the module manufacturer. Results from this model are also compared to results from published cell temperature models. The models were used to predict the real-time performance of a PV water pumping system in the desert of Medenine, southern Tunisia, using 60-min intervals of measured performance data over one complete year. Statistical analysis of the predicted results and measured data highlights possible sources of error, the limitations and/or adequacy of existing models for describing the temperature and efficiency of PV cells and, consequently, the accuracy of PV water pumping system performance prediction models.
Keywords: Temperature of a photovoltaic module, predicted models, PV water pumping systems efficiency, simulation, desert of southern Tunisia.
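For orientation, one widely published cell temperature model (not necessarily the one this paper built) is the NOCT estimate Tc = Ta + (NOCT - 20) * G / 800. The sketch below applies it, together with an assumed linear temperature derating, to illustrative desert conditions.

```python
def cell_temperature(t_ambient, irradiance, noct=45.0):
    """Classic NOCT estimate of PV cell temperature (deg C):
    Tc = Ta + (NOCT - 20) * G / 800, with G in W/m^2. A widely
    published model, not necessarily the paper's own."""
    return t_ambient + (noct - 20.0) * irradiance / 800.0

def relative_efficiency(t_cell, gamma=0.004, t_ref=25.0):
    """Linear power derating with cell temperature; gamma is a
    typical crystalline-Si coefficient (assumed, 0.4 %/K)."""
    return 1.0 - gamma * (t_cell - t_ref)

# Midday in Medenine-like conditions (illustrative numbers):
tc = cell_temperature(t_ambient=35.0, irradiance=950.0)
print(f"cell temperature ~ {tc:.1f} C, "
      f"relative efficiency ~ {relative_efficiency(tc):.2%}")
```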
3401 PM10 Prediction and Forecasting Using CART: A Case Study for Pleven, Bulgaria
Authors: Snezhana G. Gocheva-Ilieva, Maya P. Stoimenova
Abstract:
Ambient air pollution with fine particulate matter (PM10) is a persistent, systematic problem in many countries around the world. The accumulation of a large number of measurements of both the PM10 concentrations and the accompanying atmospheric factors allows for their statistical modeling to detect dependencies and forecast future pollution. This study applies the classification and regression trees (CART) method for building and analyzing PM10 models. In the empirical study, average daily air data for the city of Pleven, Bulgaria, over a period of 5 years are used. Predictors in the models are seven meteorological variables and time variables, as well as lagged PM10 variables and some lagged meteorological variables, delayed by 1 or 2 days with respect to the initial time series, respectively. The degree of influence of the predictors in the models is determined. The selected best CART models are used to forecast future PM10 concentrations two days beyond the last date in the modeling procedure and show very accurate results.
Keywords: Cross-validation, decision tree, lagged variables, short-term forecasting.
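A toy version of the CART setup with lagged predictors is sketched below using scikit-learn's decision tree regressor. The daily series is synthetic (the Pleven measurements are not reproduced), and the lag construction mirrors the 1- and 2-day delays described above.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)

# Synthetic daily series standing in for the Pleven data: PM10 driven
# by temperature and wind plus noise (all values invented).
n = 365 * 5
temp = 10 + 10 * np.sin(np.arange(n) * 2 * np.pi / 365) + rng.normal(0, 2, n)
wind = np.abs(rng.normal(3, 1.5, n))
pm10 = 40 - 0.8 * temp - 3.0 * wind + rng.normal(0, 5, n)

df = pd.DataFrame({"pm10": pm10, "temp": temp, "wind": wind})
for lag in (1, 2):                      # lagged predictors, as in the paper
    df[f"pm10_lag{lag}"] = df["pm10"].shift(lag)
    df[f"temp_lag{lag}"] = df["temp"].shift(lag)
df = df.dropna()

X, y = df.drop(columns="pm10"), df["pm10"]
tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=20)
scores = cross_val_score(tree, X, y, cv=5, scoring="r2")
print(f"cross-validated R^2: {scores.mean():.2f}")
tree.fit(X, y)
print(dict(zip(X.columns, tree.feature_importances_.round(2))))
```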
3400 Tribological Aspects of Advanced Roll Material in Cold Rolling of Stainless Steel
Authors: Mohammed Tahir, Jonas Lagergren
Abstract:
Vancron 40, a nitrided powder-metallurgical tool steel, is used in cold work applications where the predominant failure mechanisms are adhesive wear or galling. Typical applications of Vancron 40 include fine blanking, cold extrusion, deep drawing and cold work rolls for cluster mills. Its positive results as cold work rolls for cluster mills and as a tool for some severe metal forming processes make it competitive with other types of work rolls that require higher precision, among others in cold rolling of thin stainless steel, which requires high surface finish quality. In this project, three roll materials for cold rolling of stainless steel strip were examined: Vancron 40, Narva 12B (a high-carbon, high-chromium tool steel alloyed with tungsten) and Supra 3 (a chromium-molybdenum-tungsten-vanadium alloyed high speed steel). The purpose of this project was to study the depth profiles of the ironed stainless steel strips, the emergence of galling, and the performance of lubricants used by the steel industry. Laboratory experiments were conducted to examine scratching of the strip, galling, and the surface roughness of the roll materials under severe tribological conditions. The critical sliding length for the onset of galling was estimated for stainless steel with four different lubricants, and the resistance of the rolls to adhesive wear was evaluated under severe conditions at low and high reductions. Vancron 40 in combination with a cold rolling lubricant gave good surface quality, prevented galling of the metal surfaces, and showed good bearing capacity.
Keywords: Adhesive wear, Cold rolling, Lubricant, Stainless steel, Surface finish, Vancron 40.
3399 Study of Compaction in Hot-Mix Asphalt Using Computer Simulations
Authors: Kasthurirangan Gopalakrishnan, Naga Shashidhar, Xiaoxiong Zhong
Abstract:
During the process of compaction in Hot-Mix Asphalt (HMA) mixtures, the distance between aggregate particles decreases as they come together and eliminate air voids. By measuring the inter-particle distances in a cut section of an HMA sample, the degree of compaction can be estimated. For this, a calibration curve is generated by computer simulation when the gradation and asphalt content of the HMA mixture are known. A two-dimensional cross-section of an HMA specimen was simulated using the mixture design information (gradation, asphalt content and air-void content). Nearest-neighbor distance methods such as Delaunay triangulation were used to study the changes in inter-particle distance and area distribution during the process of compaction in HMA. Such computer simulations enable several hundred repetitions in a short period of time, without the need to compact and analyze laboratory specimens, in order to obtain good statistics on the parameters defined. The distributions of the statistical parameters based on computer simulations showed trends similar to those of laboratory specimens.
Keywords: Computer simulations, Hot-Mix Asphalt (HMA), inter-particle distance, image analysis, nearest neighbor.
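The Delaunay-based spacing measurement can be sketched with SciPy. The particle centroids below are random placeholders for a simulated cross-section; the sketch extracts the unique triangulation edges and summarizes the inter-particle distance distribution.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(4)

# Random stand-ins for aggregate centroids (mm) in a simulated 2D
# cross-section; the study generates these from the mixture design.
pts = rng.uniform(0.0, 100.0, size=(300, 2))

tri = Delaunay(pts)
edges = set()
for simplex in tri.simplices:           # collect unique triangle edges
    for i in range(3):
        a, b = sorted((simplex[i], simplex[(i + 1) % 3]))
        edges.add((a, b))

dists = np.array([np.linalg.norm(pts[a] - pts[b]) for a, b in edges])
print(f"mean inter-particle distance: {dists.mean():.2f} mm")
print(f"spread of spacings (std)    : {dists.std():.2f} mm")
# Repeating this at increasing simulated densities traces how the
# distance distribution tightens as compaction removes air voids.
```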
3398 Data Mining Classification Methods Applied in Drug Design
Authors: Mária Stachová, Lukáš Sobíšek
Abstract:
Data mining incorporates a group of statistical methods used to analyze a set of information, or a data set. It operates with models and algorithms, which are powerful tools with great potential. They can help people to understand the patterns in certain chunks of information, so it is obvious that data mining tools have a wide area of application. For example, in theoretical chemistry, data mining tools can be used to predict molecule properties or improve computer-assisted drug design. Classification analysis is one of the major data mining methodologies. The aim of this contribution is to create a classification model that is able to deal with a huge data set with high accuracy. For this purpose, logistic regression, Bayesian logistic regression and random forest models were built using the R software; a Bayesian logistic regression model was created in the Latent GOLD software as well. These classification methods belong to the supervised learning methods. It was necessary to reduce the dimension of the data matrix before constructing the models, and thus factor analysis (FA) was used. The models were applied to predict the biological activity of molecules, potential new drug candidates.
Keywords: data mining, classification, drug design, QSAR.
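Although the study used R and Latent GOLD, the same pipeline (factor-analysis dimension reduction followed by logistic regression, alongside a random forest) can be sketched in Python. The descriptor matrix below is synthetic, and the settings are illustrative defaults rather than the paper's.

```python
from sklearn.datasets import make_classification
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic stand-in for a molecular descriptor matrix: 500 compounds,
# 200 descriptors, binary activity label (the paper's data differ).
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=20, random_state=0)

models = {
    "logistic regression": make_pipeline(
        FactorAnalysis(n_components=20),   # dimension reduction first
        LogisticRegression(max_iter=1000)),
    "random forest": RandomForestClassifier(n_estimators=200,
                                            random_state=0),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: CV accuracy = {acc:.2f}")
```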
3397 Performance Analysis of HSDPA Systems Using Low-Density Parity-Check (LDPC) Coding as Compared to Turbo Coding
Authors: K. Anitha Sheela, J. Tarun Kumar
Abstract:
HSDPA is a new feature introduced in the Release 5 specifications of the 3GPP WCDMA/UTRA standard to realize higher data rates together with lower round-trip times. Moreover, the HSDPA concept offers an outstanding improvement in packet throughput and also significantly reduces the packet call transfer delay as compared to the Release 99 DSCH. Until now, the HSDPA system has used turbo coding, which is among the best coding techniques for approaching the Shannon limit. However, the main drawbacks of turbo coding are high decoding complexity and high latency, which make it unsuitable for some applications like satellite communications, since the transmission distance itself introduces latency due to the limited speed of light. Hence, in this paper it is proposed to use LDPC coding in place of turbo coding for the HSDPA system, which decreases the latency and decoding complexity at the cost of increased encoding complexity. Though the complexity of the transmitter increases at the NodeB, the end user benefits in terms of receiver complexity and bit-error rate. In this paper, the LDPC encoder is implemented using a sparse parity-check matrix H to generate a codeword, and the belief propagation algorithm is used for LDPC decoding. Simulation results show that with LDPC coding the BER drops sharply as the number of iterations increases, with only a small increase in Eb/No, which is not possible in turbo coding. The same BER was also achieved using fewer iterations, and hence the latency and receiver complexity decreased with LDPC coding. HSDPA increases the downlink data rate within a cell to a theoretical maximum of 14 Mbps, with 2 Mbps on the uplink. The changes that HSDPA enables include better quality and more reliable, more robust data services. In other words, while realistic data rates are only a few Mbps, the actual quality and number of users achieved will improve significantly.
Keywords: AMC, HSDPA, LDPC, WCDMA, 3GPP.
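To make the H-matrix decoding loop concrete, here is a hard-decision bit-flipping decoder, a much simpler cousin of the belief propagation algorithm named above, run on a tiny illustrative (8, 4) parity-check matrix. It is a sketch of the decoding principle, not the paper's decoder or a standardized code.

```python
import numpy as np

# Tiny illustrative sparse parity-check matrix H (4 checks, 8 bits).
H = np.array([[1, 1, 0, 1, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 1, 0, 0],
              [0, 0, 1, 1, 0, 1, 1, 0],
              [1, 0, 0, 0, 0, 1, 1, 1]])

def bit_flip_decode(r, max_iters=20):
    """Hard-decision bit-flipping: repeatedly flip the bit that
    participates in the most unsatisfied parity checks."""
    c = r.copy()
    for _ in range(max_iters):
        syndrome = H @ c % 2
        if not syndrome.any():
            return c, True              # all parity checks satisfied
        votes = H.T @ syndrome          # failed-check count per bit
        c[np.argmax(votes)] ^= 1
    return c, False

codeword = np.zeros(8, dtype=int)       # the all-zero word is valid
received = codeword.copy()
received[2] ^= 1                        # inject a single bit error
decoded, ok = bit_flip_decode(received)
print("corrected:", ok, decoded)
```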
3396 Application of the Least Squares Method in the Adjustment of Chlorodifluoromethane (HCFC-142b) Regression Models
Authors: L. J. de Bessa Neto, V. S. Filho, J. V. Ferreira Nunes, G. C. Bergamo
Abstract:
There are many situations in which human activities have significant effects on the environment, and damage to the ozone layer is one of them. The objective of this work is to use the least squares method, considering linear, exponential, logarithmic, power and second-degree polynomial models, to analyze, through the coefficient of determination (R²), which model best fits the behavior of chlorodifluoromethane (HCFC-142b) concentrations, in parts per trillion, between 1992 and 2018, as well as to estimate future concentrations 5 and 10 periods ahead, i.e. the concentration of this pollutant in the years 2023 and 2028 under each of the fitted models. A total of 809 observations of the concentration of HCFC-142b from one of the monitoring stations for gases implicated in the deterioration of the ozone layer were selected for the period studied, and, using these data, the statistical software Excel was used to generate the scatter plots for each of the fitted models. The study found that the logarithmic fit was the model that best fit the data set, since besides having a significant R², its fitted curve was compatible with the natural trend curve of the phenomenon.
Keywords: Chlorodifluoromethane (HCFC-142b), ozone (O3), least squares method, regression models.
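The model comparison can be reproduced in a few lines: each candidate is fitted by least squares (the exponential and power models via a log transform) and ranked by R². The series below is synthetic, generated with a logarithmic trend so that the logarithmic model scores highly by construction, echoing the paper's finding.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented HCFC-142b-like series (the study used 809 observations,
# 1992-2018); generated with a logarithmic trend plus noise.
years = np.arange(1992, 2019)
t = years - 1991.0                      # shift so log/power are defined
ppt = 6.0 + 8.5 * np.log(t) + rng.normal(0.0, 0.4, t.size)

def r_squared(y, y_hat):
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

fits = {
    "linear":      np.polyval(np.polyfit(t, ppt, 1), t),
    "quadratic":   np.polyval(np.polyfit(t, ppt, 2), t),
    "logarithmic": np.polyval(np.polyfit(np.log(t), ppt, 1), np.log(t)),
    "exponential": np.exp(np.polyval(np.polyfit(t, np.log(ppt), 1), t)),
    "power":       np.exp(np.polyval(
        np.polyfit(np.log(t), np.log(ppt), 1), np.log(t))),
}
for name, y_hat in fits.items():
    print(f"{name:11s} R^2 = {r_squared(ppt, y_hat):.4f}")
```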
3395 Optimal Image Compression Based on Sign and Magnitude Coding of Wavelet Coefficients
Authors: Mbainaibeye Jérôme, Noureddine Ellouze
Abstract:
The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding the sign. It is generally assumed that there is no compression gain to be obtained from coding the sign, and only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of wavelet image coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information about whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are entropy encoded separately: the sign map and the magnitude map. The refinement information about whether the wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grayscale images: Lena, Barbara and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature, and is shown to be very successful in terms of PSNR.
Keywords: Image compression, wavelet transform, sign coding, magnitude coding.
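The premise of separate sign coding, that the sign bit need not cost a full bit, can be checked empirically. The PyWavelets sketch below decomposes an image with the 9/7 (bior4.4) filter bank over five scales and measures the empirical probability of a positive sign per detail subband; a random image is used as a placeholder for the three test images.

```python
import numpy as np
import pywt

rng = np.random.default_rng(6)

# Split wavelet coefficients into sign and magnitude statistics.
# A random image stands in for Lena, Barbara and Cameraman here.
image = rng.random((256, 256))
coeffs = pywt.wavedec2(image, "bior4.4", level=5)   # CDF 9/7 filter bank

for depth, details in enumerate(coeffs[1:], start=1):
    for name, band in zip(("H", "V", "D"), details):
        nz = band[band != 0]
        p_pos = (nz > 0).mean()         # sign map statistic
        mean_mag = np.abs(nz).mean()    # magnitude map statistic
        print(f"scale {depth} {name}: P(sign=+)={p_pos:.3f}, "
              f"mean |c|={mean_mag:.4f}")
# A P(+) far from 0.5 means the sign map is compressible, which is the
# premise of coding signs with their measured probabilities rather
# than assuming 0.5.
```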
3394 Investigation of Tbilisi City Atmospheric Air Pollution with PM in Usual and Emergency Situations Using the Observational and Numerical Modeling Data
Authors: N. Gigauri, V. Kukhalashvili, V. Sesadze, A. Surmava, L. Intskirveli
Abstract:
Pollution of the Tbilisi atmospheric air with PM2.5 and PM10 in usual and pandemic situations is investigated using data from 5 stationary observation points. The values of the statistical characteristic parameters of PM in the atmosphere of Tbilisi are analyzed and trend graphs are constructed. By analyzing pollution levels in the quarantine and usual periods, the contribution of vehicle traffic to city pollution is estimated. Experimental measurements of PM2.5 and PM10 in the atmosphere were carried out in different districts of the city, and a map of the distribution of their concentrations was constructed. Maximum pollution values are recorded in the city center and along major motorways, and average monthly concentrations vary in the range of 0.6-1.6 of the Maximum Permissible Concentration (MPC). Average daily concentration values vary at 2-4 day intervals. The distribution of PM10 generated by traffic is numerically modeled, and the modeling results are compared with the observational data.
Keywords: Air pollution, numerical modeling, PM2.5, PM10.
3393 The Role and Importance of Genome Sequencing in Prediction of Cancer Risk
Authors: M. Sadeghi, H. Pezeshk, R. Tusserkani, A. Sharifi Zarchi, A. Malekpour, M. Foroughmand, S. Goliaei, M. Totonchi, N. Ansari–Pour
Abstract:
The role and relative importance of intrinsic and extrinsic factors in the development of complex diseases such as cancer remains a controversial issue. Determining the amount of variation explained by these factors requires experimental data and statistical models. These models are nevertheless based on the occurrence and accumulation of random mutational events during stem cell division, thus rendering cancer development a stochastic outcome. We demonstrate not only that individual genome sequencing is uninformative in determining cancer risk, but also that assigning a unique genome sequence to any given individual (healthy or affected) is not meaningful. Current whole-genome sequencing approaches are therefore unlikely to realize the promise of personalized medicine. In conclusion, since the genome sequence differs from cell to cell and changes over time, determining the risk factors of complex diseases based on genome sequence is somewhat unrealistic, and the resulting data are likely to be inherently uninformative.
Keywords: Cancer risk, extrinsic factors, genome sequencing, intrinsic factors.
3392 Attacks Classification in Adaptive Intrusion Detection Using Decision Tree
Authors: Dewan Md. Farid, Nouria Harbi, Emna Bahri, Mohammad Zahidur Rahman, Chowdhury Mofizur Rahman
Abstract:
Recently, information security has become a key issue in information technology, as computers and networks are exposed to an increasing number of security threats. A variety of intrusion detection systems (IDS) have been employed to protect computers and networks from malicious network-based or host-based attacks, ranging from traditional statistical methods to new data mining approaches over recent decades. However, today's commercially available intrusion detection systems are signature-based and are not capable of detecting unknown attacks. In this paper, we present a new learning algorithm for anomaly-based network intrusion detection using a decision tree algorithm that distinguishes attacks from normal behaviors and identifies different types of intrusions. Experimental results on the KDD99 benchmark network intrusion detection dataset demonstrate that the proposed learning algorithm achieved a 98% detection rate (DR), in comparison with other existing methods.
Keywords: Detection rate, decision tree, intrusion detection system, network security.
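A minimal decision-tree IDS loop is sketched below with scikit-learn. The connection records are synthetic stand-ins for KDD99 features, and the detection rate is computed as recall on the attack class, matching the DR metric named above.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import recall_score

rng = np.random.default_rng(7)

# Synthetic stand-in for KDD99-style connection records: a few
# numeric features and a binary label (0 = normal, 1 = attack).
X = rng.normal(size=(2000, 6))
y = (X[:, 0] + 0.8 * X[:, 3] + rng.normal(0, 0.5, 2000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)
tree = DecisionTreeClassifier(max_depth=8, random_state=0).fit(X_tr, y_tr)

# Detection rate (DR): the fraction of attacks the tree flags,
# i.e. recall on the attack class.
dr = recall_score(y_te, tree.predict(X_te))
print(f"detection rate: {dr:.2%}")
```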