Search results for: probabilistic classification vector machines
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3869

1949 Vertical and Horizontal Distribution Patterns of Major and Trace Elements: Surface and Subsurface Sediments of Endorheic Lake Acigol Basin, Denizli, Turkey

Authors: M. Budakoglu, M. Karaman

Abstract:

Although Lake Acıgöl is located in an area with limited influence from urban and industrial pollution sources, there is nevertheless a need to understand all potential lithological and anthropogenic sources of priority contaminants in this closed basin. This study discusses the vertical and horizontal distribution patterns of major and trace elements in recent lake sediments to better understand their current geochemical analogy with the lithological units of the Lake Acıgöl basin. It also provides reliable background levels for the region through detailed data on the surface lithological units. The detailed results for surface, subsurface, and shallow core sediments from this relatively unperturbed ecosystem highlight its importance as a conservation area, despite the large-scale industrial salt production activity. While the P2O5/TiO2 versus MgO/CaO classification diagram indicates magmatic and sedimentary origins for the lake sediments, the log(SiO2/Al2O3) versus log(Na2O/K2O) classification diagram expresses lithological assemblages of shale, iron-shale, wacke, and arkose. The plots of TiO2 vs. SiO2 and P2O5/TiO2 vs. MgO/CaO also support the origin of the primary magma source. The average compositions of the 20 different lithological units were used as a proxy for the geochemical background of the study area. As expected from weathered rock materials, there is a large variation in the major element content of all analyzed lake samples. The A-CN-K and A-CNK-FM ternary diagrams were used to deduce weathering trends; according to these diagrams, the surface and subsurface sediments display an intense weathering history. Most of the sediment samples plot around UCC and TTG, suggesting a low to moderate weathering history for the provenance. The sediments plot in a region clearly suggesting Al2O3, CaO, Na2O, and K2O contents similar to those of the lithological samples.

Keywords: Lake Acıgöl, recent lake sediment, geochemical speciation of major and trace elements, heavy metals, Denizli, Turkey

Procedia PDF Downloads 411
1948 Productivity Improvement of Faffa Food Share Company Using a Computerized Maintenance Management System

Authors: Gadisa Alemayehu, Muralidhar Avvari, Atkilt Mulu G.

Abstract:

Since 1962 EC, the Faffa Food Share Company has been producing and supplying flour (famix) and value-added flour (baby food) in Ethiopia. It meets nearly all of the country's total flour demand, in both the relief and commercial markets. However, it is uncompetitive in the international market due to a poor maintenance management system. Recorded documents and stopwatch studies revealed that frequent machine failures, together with the poor maintenance management system, increase production downtime, resulting in a 29.19 percent shortfall from planned production. The goal of the current study is therefore to recommend newly developed software for use as a Computerized Maintenance Management System (CMMS). The system increases machine reliability and decreases the frequency of equipment failure, reducing breakdown time and maintenance costs. The company's overall manufacturing performance improved by 4.45 percent after the implementation of the CMMS.

Keywords: CMMS, manufacturing performance, delivery, availability, flexibility, Faffa Food Share Company

Procedia PDF Downloads 137
1947 A General Variable Neighborhood Search Algorithm to Minimize Makespan of the Distributed Permutation Flowshop Scheduling Problem

Authors: G. M. Komaki, S. Mobin, E. Teymourian, S. Sheikh

Abstract:

This paper addresses minimizing the makespan of the distributed permutation flow shop scheduling problem. In this problem, there are several identical parallel factories or flowshops, each with a series of similar machines. Each job must be allocated to one of the factories, and all of the operations of a job must be performed in the allocated factory. This problem has recently gained attention, and due to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require large computational times, which is their main drawback. In this study, a general variable neighborhood search algorithm (GVNS) is proposed into which several time-saving schemes have been incorporated. The GVNS also adapts the shaking (perturbation) procedure depending on the progress of the incumbent solution, to prevent stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
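The core objects in this problem can be sketched in a few lines: a permutation flowshop makespan evaluator, the distributed objective (the worst factory's makespan), and a shaking move that reassigns jobs between factories. This is a minimal illustration of the problem structure, not the authors' GVNS; the data layout (`proc[j][m]` as the processing time of job `j` on machine `m`) is an assumption for the sketch.

```python
import random

def flowshop_makespan(seq, proc):
    # Completion-time recurrence for a permutation flowshop:
    # a job starts on machine k when both the machine is free and
    # the job has finished on machine k-1.
    if not seq:
        return 0.0
    m = len(proc[0])
    comp = [0.0] * m
    for j in seq:
        comp[0] += proc[j][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + proc[j][k]
    return comp[-1]

def distributed_makespan(assignment, proc):
    # assignment: one job sequence per factory; the objective is the
    # makespan of the most loaded factory.
    return max(flowshop_makespan(seq, proc) for seq in assignment)

def shake(assignment, strength):
    # Perturbation: move `strength` randomly chosen jobs to random
    # positions, possibly in a different factory.
    new = [list(s) for s in assignment]
    for _ in range(strength):
        src = random.choice([i for i, s in enumerate(new) if s])
        dst = random.randrange(len(new))
        job = new[src].pop(random.randrange(len(new[src])))
        new[dst].insert(random.randrange(len(new[dst]) + 1), job)
    return new
```

A GVNS would alternate `shake` with systematic local search over several neighborhoods, accepting a shaken solution only when it improves `distributed_makespan`.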

Keywords: distributed permutation flow shop, scheduling, makespan, general variable neighborhood search algorithm

Procedia PDF Downloads 354
1946 Effect of Various Tillage Systems on Soil Compaction

Authors: Sushil Kumar, Mukesh Jain, Vijaya Rani, Vinod Kumar

Abstract:

The prime importance of tillage is that it prepares the land so that seeds can easily germinate and plants can establish well. Using different types of equipment, driven manually or by power, tillage makes the soil suitable for placing seeds at the desired depth. Moreover, tillage loosens compacted layers, although heavy equipment and tillage implements can also damage the soil structure. The effect of various tillage methods on soil compaction was studied in the Rabi season of 2013-14 at village Ladwa, Hisar, Haryana (India). The experiments studied the effect of six tillage treatments, i.e., no tillage or zero tillage (T1), tillage with rotavator (T2), disc harrow (T3), rotavator + subsoiler (T4), disc harrow + subsoiler (T5), and power harrow (T6), on soil compaction. Soil compaction was measured before tillage and at 0, 30, 60, and 90 days after sowing. No change in soil resistance was recorded before and after the no-tillage treatment. Maximum soil resistance was found under zero tillage, followed by disc harrow, up to 150 mm soil depth. Minimum soil resistance was found with the rotavator immediately after the tillage treatment. However, below the soil strata that the implements cannot reach, soil resistance approached its pre-tillage level.

Keywords: tillage, no tillage, rotavator, subsoiler, compaction

Procedia PDF Downloads 318
1945 A Comprehensive Framework for Fraud Prevention and Customer Feedback Classification in E-Commerce

Authors: Samhita Mummadi, Sree Divya Nagalli, Harshini Vemuri, Saketh Charan Nakka, Sumesh K. J.

Abstract:

One of the most significant challenges of today's digital era is the alarming increase in fraudulent activity on online platforms. The appeal of online shopping, which avoids long queues in shopping malls, offers a wide variety of products, and delivers goods to the home, has driven the rapid growth of online shopping platforms, and with it a major increase in fraudulent activity. For instance, consider a store that orders thousands of products all at once, where the massive number of items purchased and their transactions turn out to be fraudulent, leading to a huge loss for the seller. Scenarios like these underscore the urgent need for machine learning approaches to combat fraud in online shopping. By leveraging robust algorithms, namely KNN, Decision Trees, and Random Forest, which are highly effective in generating accurate results, this research endeavors to discern patterns indicative of fraudulent behavior within transactional data. The primary motive and main focus is a comprehensive solution that empowers e-commerce administrators to detect and prevent fraud in a timely manner. In addition, sentiment analysis is harnessed in the model so that the e-commerce admin can respond to customers' concerns, feedback, and comments, improving the user experience. The ultimate objective of this study is to harden online shopping platforms against fraud and ensure a safer shopping experience. This paper reports a model accuracy of 84%. The findings and observations noted during our work lay the groundwork for future advancements in more resilient and adaptive fraud detection systems, which will become crucial as technologies continue to evolve.
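As a minimal sketch of one of the named algorithms, a KNN classifier labels a transaction by majority vote among its nearest training transactions. This is not the paper's model; the two-dimensional features (e.g., order value and item count) and the labels are hypothetical, chosen only to make the idea concrete.

```python
import math
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    # Sort training points by Euclidean distance to x, then take a
    # majority vote among the labels of the k nearest ones.
    dists = sorted((math.dist(x, p), label) for p, label in zip(train_X, train_y))
    top = [label for _, label in dists[:k]]
    return Counter(top).most_common(1)[0][0]

# Hypothetical transactions: (normalized order value, normalized item count)
X = [(0, 0), (0, 1), (5, 5), (6, 5)]
y = ["legit", "legit", "fraud", "fraud"]
```

`knn_predict(X, y, (5, 6), k=3)` would flag the new transaction as "fraud" because two of its three nearest neighbors are fraudulent; Decision Trees and Random Forest replace the distance vote with learned splitting rules and an ensemble of such trees.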

Keywords: behavior analysis, feature selection, fraudulent pattern recognition, imbalanced classification, transactional anomalies

Procedia PDF Downloads 28
1944 The Fiscal-Monetary Policy and Economic Growth in Algeria: VECM Approach

Authors: K. Bokreta, D. Benanaya

Abstract:

The objective of this study is to examine the relative effectiveness of monetary and fiscal policy in Algeria, using the econometric techniques of cointegration and vector error correction modelling to analyse the data and draw policy inferences. The chosen fiscal policy variables are government expenditure and net taxes on products, while monetary policy is represented by the inflation rate and the official exchange rate. From the results, we find that in the long run the impact of government expenditure on growth is positive, while the effect of taxes is negative. Additionally, the inflation rate is found to have little effect on GDP per capita, and the impact of the exchange rate is insignificant. We conclude that fiscal policy is more powerful than monetary policy in promoting economic growth in Algeria.

Keywords: economic growth, monetary policy, fiscal policy, VECM

Procedia PDF Downloads 310
1943 Quantum Statistical Machine Learning and Quantum Time Series

Authors: Omar Alzeley, Sergey Utev

Abstract:

Minimizing a constrained multivariate function is fundamental to machine learning, and such algorithms are at the core of data mining and data visualization techniques. The decision function that maps input points to output points is based on the result of this optimization, which is central to learning theory. Time series analysis is one approach to complex systems in which the dynamics of the system are inferred from a statistical analysis of the fluctuations over time of some associated observable. The purpose of this paper is a mathematical transition from the autoregressive model of classical time series to the matrix formalism of quantum theory. Firstly, we propose a quantum time series (QTS) model. Although the Hamiltonian technique has become an established tool for detecting deterministic chaos, other approaches are emerging, and the quantum probabilistic technique is used to motivate the construction of our QTS model. The QTS model resembles the quantum dynamic model that has been applied to financial data. Secondly, various statistical methods, including machine learning algorithms such as the Kalman filter, are applied to estimate and analyze the unknown parameters of the model. Finally, simulation techniques such as Markov chain Monte Carlo are used to support our investigation. The proposed model is examined using both real and simulated data, and we establish the relation between quantum statistical machines and quantum time series via random matrix theory. It is interesting to note that the primary application of QTS in the field of quantum chaos has been to find a model that explains chaotic behaviour; this model may reveal further insight into quantum chaos.
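The classical starting point the abstract refers to, the autoregressive model, can be illustrated with a simulated AR(1) series x_t = φ·x_{t-1} + ε_t and a least-squares estimate of φ. This is a toy illustration of the classical side only, not the quantum model; the true coefficient 0.6 is an arbitrary choice.

```python
import random

def ar1_fit(series):
    # Least-squares estimate of phi in x_t = phi * x_{t-1} + noise:
    # phi_hat = sum(x_t * x_{t-1}) / sum(x_{t-1}^2)
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(x * x for x in series[:-1])
    return num / den

# Simulate an AR(1) series with phi = 0.6 and Gaussian noise, then refit it.
random.seed(0)
x = [1.0]
for _ in range(5000):
    x.append(0.6 * x[-1] + random.gauss(0, 1))
phi_hat = ar1_fit(x)  # close to 0.6 for a long series
```

In the paper's setting, this scalar recursion is replaced by a matrix formalism, and estimators such as the Kalman filter play the role of the least-squares fit above.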

Keywords: machine learning, simulation techniques, quantum probability, tensor product, time series

Procedia PDF Downloads 469
1942 Spatial Patterns of Urban Expansion in Kuwait City between 1989 and 2001

Authors: Saad Algharib, Jay Lee

Abstract:

Urbanization is a complex phenomenon that occurs as a city develops from one form to another; in other words, it is the process by which land use/land cover changes from rural to urban. Since oil exploration began, Kuwait City has been growing rapidly due to urbanization and population growth, both natural and through inward immigration. The main objective of this study is to detect changes in urban land use/land cover and to examine the changing spatial patterns of urban growth in and around Kuwait City between 1989 and 2001. This study also evaluates the spatial patterns of the detected changes and how they relate to the spatial configuration of the city. Remote sensing and geographic information systems have recently become very useful tools in urban studies, because their integration allows analysts and planners to detect, monitor, and analyze urban growth in a region effectively. Moreover, planners and users can predict future trends of urban growth with remotely sensed and GIS data, because these data can be updated at the required precision levels. In order to identify the new urban areas between 1989 and 2001, the study uses satellite images of the study area and remote sensing technology to classify them. An unsupervised classification method was applied to produce land use and land cover data layers. After the unsupervised classification, a GIS overlay function was applied to the classified images to detect the locations and patterns of the new urban areas developed during the study period. GIS was also utilized to evaluate the distribution of the spatial patterns; for example, Moran's index was applied to all data inputs to examine the distribution of urban growth.
Furthermore, this study assesses whether the spatial patterns and processes of these changes took place in a random fashion or followed identifiable trends. The results indicate that, over the study period, the urban area expanded by 10 percentage points, from 32.4% in 1989 to 42.4% in 2001. The results also revealed that the largest increase in urban area occurred between the major highways beyond the fourth ring road from the center of Kuwait City. Moreover, the spatial distribution of urban growth was clustered.
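The clustering statistic mentioned above, Moran's I, measures whether similar values (here, urban vs. non-urban) sit next to each other more often than chance would predict: I = (N/W)·Σᵢⱼ wᵢⱼ(xᵢ−x̄)(xⱼ−x̄) / Σᵢ(xᵢ−x̄)². A minimal sketch, with a hypothetical binary weight matrix encoding zone adjacency:

```python
def morans_i(values, weights):
    # values[i]: observation for zone i (e.g., 1 = urban, 0 = non-urban)
    # weights[i][j]: spatial weight between zones i and j (0 on the diagonal)
    n = len(values)
    mean = sum(values) / n
    dev = [v - mean for v in values]
    w_sum = sum(sum(row) for row in weights)
    num = sum(weights[i][j] * dev[i] * dev[j]
              for i in range(n) for j in range(n))
    den = sum(d * d for d in dev)
    return (n / w_sum) * (num / den)
```

Positive values indicate clustering (urban zones neighboring urban zones), values near zero a random pattern, and negative values a dispersed, checkerboard-like pattern.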

Keywords: geographic information systems, remote sensing, urbanization, urban growth

Procedia PDF Downloads 171
1941 Design and Performance Analysis of a Hydro-Power Rim-Driven Superconducting Synchronous Generator

Authors: A. Hassannia, S. Ramezani

Abstract:

The technology of superconductivity has been applied in many power system devices, such as transmission cables, transformers, current limiters, motors, and generators. Superconducting wires can carry high-density current without loss, a capability that is used to design compact, lightweight, and more efficient electrical machines. Superconducting motors have found applications in marine and air propulsion systems, while superconducting generators are considered for low-power hydraulic and wind generation. This paper presents a rim-driven superconducting synchronous generator for a hydraulic power plant. The rim-driven concept improves the performance of the hydro turbine. Furthermore, the high magnetic field produced by the superconducting windings allows the rotor core to be removed; as a consequence, the volume and weight of the machine decrease significantly. In this paper, a 1 MW coreless rim-driven superconducting synchronous generator is designed. The main performance characteristics of the proposed machine are then evaluated using the finite element method and compared to an ordinary synchronous generator of similar size.

Keywords: coreless machine, electrical machine design, hydraulic generator, rim-driven machine, superconducting generator

Procedia PDF Downloads 175
1940 Normalized Compression Distance Based Scene Alteration Analysis of a Video

Authors: Lakshay Kharbanda, Aabhas Chauhan

Abstract:

In this paper, an application of Normalized Compression Distance (NCD) to detect notable scene alterations occurring in videos is presented. Several research groups have been developing methods to perform image classification using NCD, a computable approximation to the Normalized Information Distance (NID), by studying the degree of similarity between images. The timeframes where significant aberrations between the frames of a video occur are identified by obtaining a threshold NCD value, using two compressors, LZMA and BZIP2, and defining scene alterations using Pixel Difference Percentage metrics.
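NCD has a compact closed form, NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the compressed size; a minimal sketch with Python's stdlib compressors (BZIP2 here; `lzma.compress` can be swapped in as the second compressor). The byte strings standing in for video frames are hypothetical placeholders.

```python
import bz2

def ncd(x: bytes, y: bytes, compress=bz2.compress) -> float:
    # NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
    # Similar inputs compress well together, giving a value near 0;
    # unrelated inputs give a value near (or slightly above) 1.
    cx, cy, cxy = len(compress(x)), len(compress(y)), len(compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Stand-ins for two serialized video frames
a = b"frame data " * 100
c = b"other scene " * 100
```

Scene-change detection then reduces to computing `ncd` between consecutive frames and flagging timeframes where it exceeds the chosen threshold.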

Keywords: image compression, Kolmogorov complexity, normalized compression distance, root mean square error

Procedia PDF Downloads 340
1939 Affective Adaptation Design for Better Gaming Experiences

Authors: Ollie Hall, Salma ElSayed

Abstract:

Affective adaptation is a novel way for game designers to add an extra layer of engagement to their productions. When players' emotions factor into game design, endless possibilities for creative gameplay emerge. Whilst gaining popularity, existing affective game research mostly runs controlled experiments carried out in restrictive settings and relies on one or more specialist devices for measuring a player's emotional state. These conditions, albeit effective, are not necessarily realistic. Moreover, the simplified narratives and intrusive wearables may not be suitable for the average player. This exploratory study investigates delivering an immersive affective experience in the wild with minimal requirements, in an attempt to let the average developer reach the average player. A puzzle game was created with a rich narrative and creative mechanics. It employs both explicit and implicit adaptation and requires only a web camera. Participants played the game on their own machines in various settings. Whilst it was rated feasible, very engaging, and enjoyable, it remains questionable whether a fully immersive experience was delivered, due to the limited sample size.

Keywords: affective games, dynamic adaptation, emotion recognition, game design

Procedia PDF Downloads 151
1938 Recognition of Tifinagh Characters with Missing Parts Using Neural Network

Authors: El Mahdi Barrah, Said Safi, Abdessamad Malaoui

Abstract:

In this paper, we present an algorithm for reconstructing Tifinagh characters from incomplete 2D scans. The algorithm is based on the correlation between a lost block and its neighbors. The proposed system contains three main parts: pre-processing, feature extraction, and recognition. In the first step, we construct a database of Tifinagh characters. In the second step, we apply a shape analysis algorithm. In the classification part, we use a neural network. The simulation results demonstrate that the proposed method gives good results.

Keywords: Tifinagh character recognition, neural networks, local cost computation, ANN

Procedia PDF Downloads 334
1937 An Application of a Machine Monitoring by Using the Internet of Things to Improve a Preventive Maintenance: Case Study of an Automated Plastic Granule-Packing Machine

Authors: Anek Apipatkul, Paphakorn Pitayachaval

Abstract:

Preventive maintenance is a standardized procedure to control and prevent risky problems affecting production in order to increase work efficiency. Machine monitoring also routinely collects data for scheduling maintenance periods. This paper presents an application of machine monitoring using the Internet of Things (IoT) and a lean technique to manage the complex maintenance tasks of an automated plastic granule packing machine. To organize the preventive maintenance, machine monitoring was applied across several processes: defining a clear scope for the machine, establishing standards for maintenance work, applying a just-in-time (JIT) technique for timely delivery of maintenance work, solving problems on the floor, and improving the inspection process. The results show that wasted time was reduced and machines were operated as scheduled. Furthermore, the efficiency of the scheduled maintenance period increased by 95%.

Keywords: internet of things, preventive maintenance, machine monitoring, lean technique

Procedia PDF Downloads 102
1936 Automated Machine Learning Algorithm Using Recurrent Neural Network to Perform Long-Term Time Series Forecasting

Authors: Ying Su, Morgan C. Wang

Abstract:

Long-term time series forecasting is an important research area for automated machine learning (AutoML). Currently, forecasting based on either machine learning or statistical learning is usually built by experts and requires significant manual effort, from model construction, feature engineering, and hyper-parameter tuning to the construction of the time series model itself. Automation is not possible with so many human interventions. To overcome these limitations, this article proposes using recurrent neural networks (RNN), through the memory state of the RNN, to perform long-term time series prediction. We show that this approach is better than the traditional Autoregressive Integrated Moving Average (ARIMA) model. In addition, we found it is better than other network architectures, including Fully Connected Neural Networks (FNN), Convolutional Neural Networks (CNN), and Non-pooling Convolutional Neural Networks (NPCNN).

Keywords: automated machine learning, autoregressive integrated moving average, neural networks, time series analysis

Procedia PDF Downloads 105
1935 Development of the Academic Model to Predict Student Success at VUT-FSASEC Using Decision Trees

Authors: Langa Hendrick Musawenkosi, Twala Bhekisipho

Abstract:

The success or failure of students is a concern for every academic institution, college, university, government, and for students themselves. Several approaches have been researched to address this concern. This paper holds the view that when students enter a university, college, or academic institution, they enter an academic environment. The academic environment is a unique concept used to develop an effective predictive solution. This paper presents a model to determine the propensity of a student to succeed or fail at the French South African Schneider Electric Education Center (FSASEC) at the Vaal University of Technology (VUT). The decision tree algorithm is used to implement the model at FSASEC.

Keywords: FSASEC, academic environment model, decision trees, k-nearest neighbor, machine learning, popularity index, support vector machine

Procedia PDF Downloads 200
1934 A Next-Generation Pin-On-Plate Tribometer for Use in Arthroplasty Material Performance Research

Authors: Lewis J. Woollin, Robert I. Davidson, Paul Watson, Philip J. Hyde

Abstract:

Introduction: In-vitro testing of arthroplasty materials is of paramount importance when ensuring that they can withstand the performance requirements encountered in-vivo. One common machine used for in-vitro testing is a pin-on-plate tribometer, an early stage screening device that generates data on the wear characteristics of arthroplasty bearing materials. These devices test vertically loaded rotating cylindrical pins acting against reciprocating plates, representing the bearing surfaces. In this study, a pin-on-plate machine has been developed that provides several improvements over current technology, thereby progressing arthroplasty bearing research. Historically, pin-on-plate tribometers have been used to investigate the performance of arthroplasty bearing materials under conditions commonly encountered during a standard gait cycle; nominal operating pressures of 2-6 MPa and an operating frequency of 1 Hz are typical. There has been increased interest in using pin-on-plate machines to test more representative in-vivo conditions, due to the drive to test 'beyond compliance', as well as their testing speed and economic advantages over hip simulators. Current pin-on-plate machines do not accommodate the increased performance requirements associated with more extreme kinematic conditions, therefore a next-generation pin-on-plate tribometer has been developed to bridge the gap between current technology and future research requirements. Methodology: The design was driven by several physiologically relevant requirements. Firstly, an increased loading capacity was essential to replicate the peak pressures that occur in the natural hip joint during running and chair-rising, as well as increasing the understanding of wear rates in obese patients. Secondly, the introduction of mid-cycle load variation was of paramount importance, as this allows for an approximation of the loads present in a gait cycle to be applied and to test the fatigue properties of materials. 
Finally, the rig must be validated against previous-generation pin-on-plate and arthroplasty wear data. Results: The resulting machine is a twelve-station device split into three sets of four stations, providing an increased testing capacity compared to most current pin-on-plate tribometers. The loading of the pins is generated using a pneumatic system, which can produce contact pressures of up to 201 MPa on a 3.2 mm² round pin face. This greatly exceeds the contact pressures currently achievable in the literature and opens new research avenues, such as testing rim wear of mal-positioned hip implants. Additionally, the contact pressure of each set can be changed independently of the others, allowing multiple loading conditions to be tested simultaneously. Using pneumatics also allows the applied pressure to be switched ON/OFF mid-cycle, another feature not currently reported elsewhere, which allows for investigation into intermittent loading and material fatigue. The device is currently undergoing a series of validation tests using Ultra-High-Molecular-Weight-Polyethylene pins and 316L stainless steel plates (polished to Ra < 0.05 µm). The operating pressures will be between 2 and 6 MPa at 1 Hz, allowing validation of the machine against results previously reported in the literature. The successful production of this next-generation pin-on-plate tribometer will, following its validation, unlock multiple previously unavailable research avenues.

Keywords: arthroplasty, mechanical design, pin-on-plate, total joint replacement, wear testing

Procedia PDF Downloads 95
1933 Fuzzy-Sliding Controller Design for Induction Motor Control

Authors: M. Bouferhane, A. Boukhebza, L. Hatab

Abstract:

In this paper, position control of a linear induction motor (LIM) using a fuzzy sliding mode controller is proposed. First, the indirect field-oriented control of the LIM is derived. Then, a sliding mode control system with an integral-operation switching surface is designed, in which a simple adaptive algorithm is utilized for the generalised soft-switching parameter. Finally, a fuzzy sliding mode controller is derived to compensate for the uncertainties that occur in the control, in which the fuzzy logic system dynamically adjusts the parameter settings of the SMC control law. The effectiveness of the proposed control scheme is verified by numerical simulation. The results of the proposed scheme show good performance compared to the conventional sliding mode controller.

Keywords: linear induction motor, vector control, backstepping, fuzzy-sliding mode control

Procedia PDF Downloads 489
1932 Effect of Al Contents on Magnetic Domains of {100} Grains in Electrical Steels

Authors: Hyunseo Choi, Jaewan Hong, Seil Lee, Yang Mo Koo

Abstract:

Non-oriented (NO) electrical steel is one of the most important soft magnetic materials for rotating machines. Si is usually added to electrical steels to reduce eddy current loss by increasing the electrical resistivity, but Si contents above 3.5 wt% cause cracks during cold rolling due to increased brittleness. Al increases the electrical resistivity of the material as much as Si does, and the cold workability of Fe-Al is better than that of Fe-Si, so Al can be added up to 6.0 wt%. However, the effect of Al content on the magnetic properties of electrical steels has not been studied in detail. Magnetic domains of {100} grains in electrical steels ranging from 1.85 to 6.54 wt% Al were observed by magneto-optic Kerr microscopy, and the correlation of the magnetic domains with magnetic properties was investigated. As the Al content increased, the magnetic domain size of {100} grains decreased due to lowered domain wall energy, and the reorganization of the domain structure became more complex as the domain size decreased. Therefore, the addition of Al to electrical steel increased the hysteresis loss. The anomalous loss decreased and saturated above 4.68% Al.

Keywords: electrical steel, magnetic domain structure, Al addition, core loss, rearrangement of domains

Procedia PDF Downloads 243
1931 Features for Measuring Credibility on Facebook Information

Authors: Kanda Runapongsa Saikaew, Chaluemwut Noyunsan

Abstract:

Nowadays, social media information, such as news, links, images, or videos, is shared extensively. However, information disseminated through social media often lacks quality: less fact checking, more bias, and many rumors. Many researchers have investigated credibility on Twitter, but there are no research reports about information credibility on Facebook. This paper proposes features for measuring the credibility of Facebook information, and we developed a system for this purpose. First, we developed an FB credibility evaluator for measuring the credibility of each post by manual human labelling, and then collected the training data to create a model using a Support Vector Machine (SVM). Second, we developed a Chrome extension of FB credibility that lets Facebook users evaluate the credibility of each post. Based on the usage analysis of our FB credibility Chrome extension, about 81% of users' responses agree with the credibility automatically suggested by the proposed system.

Keywords: Facebook, social media, credibility measurement, internet

Procedia PDF Downloads 356
1930 Classification of Sturm-Liouville Problems at Infinity

Authors: Kishor J. Shinde

Abstract:

We determine the values of k and p for which the Sturm-Liouville differential operator τu = -d²u/dx² + kxᵖu is in the limit point case or the limit circle case at infinity. In particular, it is shown that τ is in the limit point case (i) for p = 2 and all k, (ii) for all p and k = 0, (iii) for all p and k > 0, (iv) for 0 ≤ p ≤ 2 and k < 0, and (v) for p < 0 and k < 0. τ is in the limit circle case (i) for p > 2 and k < 0.

Keywords: limit point case, limit circle case, Sturm-Liouville, infinity

Procedia PDF Downloads 367
1929 Rice Area Determination Using Landsat-Based Indices and Land Surface Temperature Values

Authors: Burçin Saltık, Levent Genç

Abstract:

In this study, the aim was to define a procedure for identifying rice cultivation areas within the Thrace and Marmara regions of Turkey using remote sensing and GIS. Landsat 8 (OLI-TIRS) imagery with Path/Row 181/32, acquired during the 2013 production season, was used. Four different seasonal images were generated from the original bands using different transformation techniques. All images were classified individually using supervised classification techniques, and Land Use Land Cover (LULC) maps were generated with 8 classes. The area (ha, %) of each class was calculated. In addition, district-based rice distribution maps were developed, and their results were compared with the actual rice cultivation area records of the Turkish Statistical Institute (TurkSTAT; TSI). Accuracy assessments were conducted, and the most accurate map was selected based on the accuracy assessment and coherence with the TSI results. Additionally, rice areas on slopes over 4° were considered mis-classified pixels and were eliminated using a slope map and GIS tools. Finally, randomized rice zones were selected to obtain the maximum-minimum value ranges of the NDVI, LSWI, and LST images for each date (May, June, July, August, and September separately), in order to test whether these ranges could be used for rice area determination via the raster calculator tool of ArcGIS. The most accurate classification for rice determination was obtained from the seasonal LSWI LULC map, and mis-classified pixels were eliminated from this map in light of the TSI data and accuracy assessment results. According to the results, 83151.5 ha of rice areas exist within the study area; however, this exceeds the TSI records by 12702.3 ha. The use of the maximum-minimum ranges of rice-area NDVI, LSWI, and LST was tested in the Meric district. The value ranges obtained from the July imagery gave the closest results to the TSI records, with a difference of only 206.4 ha. This difference is to be expected given the relatively low resolution of the images; thus, images with higher spectral, spatial, temporal, and radiometric resolutions may provide more reliable results.
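The spectral indices named above are computed per pixel from surface reflectance bands. A minimal sketch, assuming reflectance bands are already loaded as NumPy arrays; the threshold values in the usage are illustrative placeholders, not the ranges derived in the study:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def lswi(nir, swir):
    """Land Surface Water Index: (NIR - SWIR) / (NIR + SWIR)."""
    nir, swir = np.asarray(nir, float), np.asarray(swir, float)
    return (nir - swir) / (nir + swir)

def rice_mask(index, vmin, vmax):
    """Flag pixels whose index value falls inside the [vmin, vmax] range,
    mirroring the raster-calculator test applied in ArcGIS."""
    index = np.asarray(index, float)
    return (index >= vmin) & (index <= vmax)
```

For Landsat 8 OLI, the red, NIR, and SWIR1 reflectances correspond to bands 4, 5, and 6, respectively; applying `rice_mask` with the July-derived min-max ranges to every pixel reproduces the thresholding step described above.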

Keywords: landsat 8 (OLI-TIRS), LST, LSWI, LULC, NDVI, rice

Procedia PDF Downloads 228
1928 Improvement of GVPI Insulation System Characteristics by Curing Process Modification

Authors: M. Shadmand

Abstract:

The curing process of the insulation system of an electrical machine plays a decisive role in its durability and reliability. The polar structure of the insulating resin molecules and of the filler used in the insulation system can be exploited to enhance the overall mechanical and electrical characteristics of the system. The curing regime influences these characteristics by governing the polymerization of the resin chain structure. In this research, the effect of applying an electric field during curing of a Global Vacuum Pressurized Impregnation (GVPI) insulation system for a traction motor was investigated by performing dissipation factor, polarization and depolarization current (PDC), and voltage endurance (aging) measurements on sample test objects. The results showed a clear improvement in the mechanical strength of the insulation system, as well as better electrical characteristics in routine and long-term (aging) electrical tests. Taken together, polarizing the insulation system during the curing process would enhance the machine lifetime.

Keywords: insulation system, GVPI, PDC, aging

Procedia PDF Downloads 268
1927 Benders Decomposition Approach to Solve the Hybrid Flow Shop Scheduling Problem

Authors: Ebrahim Asadi-Gangraj

Abstract:

The hybrid flow shop scheduling problem (HFS) involves sequencing jobs in a flow shop in which, at any stage, one or more related or unrelated parallel machines may exist. This production system is a common manufacturing environment in many real industries, such as steel manufacturing, ceramic tile manufacturing, and car assembly. In this research, a mixed integer linear programming (MILP) model is presented for the hybrid flow shop scheduling problem in which the objective is to minimize the maximum completion time (makespan). For this purpose, a Benders Decomposition (BD) method is developed to solve the problem. The proposed approach is tested on a set of small- to moderate-scale test problems. The experimental results show that the Benders decomposition approach can solve the hybrid flow shop scheduling problem in reasonable time, especially for small and moderate-size instances.
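To make the problem setting concrete, the sketch below evaluates the makespan of a hybrid flow shop under a simple greedy dispatching rule. This is an illustrative heuristic giving a feasible upper bound, not the authors' Benders decomposition, which instead splits the MILP into a master problem and subproblems linked by optimality cuts:

```python
def hfs_makespan(proc, machines):
    """Push jobs through a hybrid flow shop with a greedy
    earliest-available-machine rule and return the makespan.

    proc[j][s]  : processing time of job j at stage s
    machines[s] : number of identical parallel machines at stage s
    """
    n_jobs, n_stages = len(proc), len(machines)
    ready = [0.0] * n_jobs              # time each job becomes available
    for s in range(n_stages):
        free = [0.0] * machines[s]      # next free time of each machine at stage s
        # dispatch jobs in order of availability (FIFO at each stage)
        for j in sorted(range(n_jobs), key=lambda j: ready[j]):
            m = min(range(machines[s]), key=lambda m: free[m])
            start = max(ready[j], free[m])
            ready[j] = free[m] = start + proc[j][s]
    return max(ready)
```

A BD implementation would iterate between a master problem that fixes job-to-machine assignments and scheduling subproblems that sequence each machine, adding cuts until the lower and upper bounds converge.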

Keywords: hybrid flow shop, mixed integer linear programming, Benders decomposition, makespan

Procedia PDF Downloads 190
1926 A New Approach towards the Development of Next Generation CNC

Authors: Yusri Yusof, Kamran Latif

Abstract:

Computer Numerical Control (CNC) machines have been widely used in industry since their inception. CNC technology is currently used for various operations such as milling, drilling, packing, and welding. With the rapid growth of the manufacturing world, the demand for flexibility in CNC machines has increased sharply. Commercial CNCs have traditionally failed to provide this flexibility because their closed architecture does not allow access to the inner features of the controller. In addition, the ISO data interface model on which CNCs operate has proven limited. To overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model have been introduced. At present, the personal computer (PC) is the preferred platform for the development of open CNC systems. In this paper, the interpretation, verification, and execution of the ISO data interface model are addressed through the introduction of new techniques. The proposed system is composed of ISO data interpretation, 3D simulation, and machine motion control modules. The system was tested on an old 3-axis CNC milling machine and performed satisfactorily. This implementation has successfully enabled a sustainable manufacturing environment.
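As a minimal illustration of the ISO 6983 interpretation step (the first module of the proposed system), the sketch below parses one G-code block into address-value pairs. A real interpreter must also handle modal state, comments, subprograms, and canned cycles; this is not the authors' LabVIEW implementation:

```python
import re

def parse_block(block):
    """Parse one ISO 6983 (G-code) block into {address: value} pairs,
    e.g. 'G01 X10.5 Y-3 F200' -> {'G': 1.0, 'X': 10.5, 'Y': -3.0, 'F': 200.0}.
    If an address letter repeats in a block, the last occurrence wins."""
    words = re.findall(r"([A-Z])\s*([-+]?\d*\.?\d+)", block.upper())
    return {addr: float(val) for addr, val in words}
```

Chaining such parsed blocks into a toolpath (and from there into motion commands) is essentially what the interpretation and motion control modules of an open CNC do, whether the input is ISO 6983 blocks or STEP-NC workingsteps.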

Keywords: CNC, ISO 6983, ISO 14649, LabVIEW, open architecture control, reconfigurable manufacturing systems, sustainable manufacturing, Soft-CNC

Procedia PDF Downloads 516
1925 A Stochastic Diffusion Process Based on the Two-Parameter Weibull Density Function

Authors: Meriem Bahij, Ahmed Nafidi, Boujemâa Achchab, Sílvio M. A. Gama, José A. O. Matos

Abstract:

Stochastic modeling concerns the use of probability to model real-world situations in which uncertainty is present. The purpose of stochastic modeling is therefore to estimate the probability of outcomes within a forecast, i.e. to predict what conditions or decisions might arise under different situations. In the present study, we present a model of a stochastic diffusion process based on the bi-Weibull distribution function (its trend is proportional to the bi-Weibull probability density function). In general, the Weibull distribution can assume the characteristics of many different types of distributions. This has made it very popular among engineers and quality practitioners, who consider it the most commonly used distribution for studying problems such as modeling reliability data, accelerated life testing, and maintainability modeling and analysis. In this work, we start by obtaining the probabilistic characteristics of the model, namely the explicit expression of the process, its trends, and its distribution, by transforming the diffusion process into a Wiener process as shown by Ricciardi's theorem. Then, we develop the statistical inference of the model using the maximum likelihood methodology. Finally, we use simulated data to analyze the computational problems associated with the parameters, applying convergence analysis methods, an issue of great importance for application to real data. Overall, the use of a stochastic model reflects only a pragmatic decision on the part of the modeler: given the available data and the universe of models known to the modeler, this model represents the best currently available description of the phenomenon under consideration.
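The maximum likelihood step can be illustrated on the plain two-parameter Weibull density. A minimal sketch, assuming SciPy is available; the sample size and true parameters are arbitrary, and the full diffusion-process inference developed in the paper is considerably more involved:

```python
import numpy as np
from scipy.stats import weibull_min

# simulate a sample from a two-parameter Weibull(shape=2, scale=3)
rng = np.random.default_rng(0)
shape_true, scale_true = 2.0, 3.0
data = weibull_min.rvs(shape_true, scale=scale_true, size=5000, random_state=rng)

# maximum likelihood fit with the location parameter fixed at zero,
# leaving the two free parameters (shape and scale)
shape_hat, loc, scale_hat = weibull_min.fit(data, floc=0)
```

With several thousand observations, the fitted shape and scale recover the simulated values closely, which is the kind of convergence behavior the paper examines for its diffusion parameters.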

Keywords: diffusion process, discrete sampling, likelihood estimation method, simulation, stochastic diffusion process, trend functions, two-parameter Weibull density function

Procedia PDF Downloads 309
1924 Stability or Instability? Triplet Deficit Analysis in Turkey

Authors: Zeynep Karaçor, Volkan Alptekin, Gökhan Akar, Tuba Akar

Abstract:

This paper examines the phenomenon of the triplet deficit for the case of Turkey, i.e. the interaction of the three balances that make up the overall balance of the economy: the budget balance, the savings-investment balance, and the current account balance. The triplet deficit in the Turkish economy is analyzed with a vector autoregressive (VAR) model and the Granger causality test, using data covering the period 1980-2010. According to the VAR results, an increase in the current account deficit feeds into the public sector borrowing requirement, and the two variables influence each other bilaterally: the current account deficit increases the public deficit, and the public deficit in turn increases the current account deficit. However, it is not possible to establish a short-term Granger causality between the variables at issue.
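The Granger causality test used here compares a restricted autoregression of one series against an unrestricted one that adds lags of the other series. A self-contained sketch with synthetic data; the lag order and simulated coefficients are illustrative, not the paper's 1980-2010 series:

```python
import numpy as np
from scipy.stats import f as f_dist

def granger_f_test(y, x, lags=2):
    """p-value of the F-test that lagged x adds no predictive power for y
    beyond y's own lags (the bivariate Granger non-causality test)."""
    n = len(y)
    rows = n - lags
    Y = y[lags:]
    # restricted model: constant + own lags of y
    Xr = np.column_stack([np.ones(rows)] + [y[lags - k:n - k] for k in range(1, lags + 1)])
    # unrestricted model: add lags of x
    Xu = np.column_stack([Xr] + [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda X: float(np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2))
    df2 = rows - Xu.shape[1]
    F = ((rss(Xr) - rss(Xu)) / lags) / (rss(Xu) / df2)
    return float(f_dist.sf(F, lags, df2))

# synthetic bivariate series in which x drives y with one lag
rng = np.random.default_rng(42)
n = 500
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()
```

By construction, the test rejects non-causality from x to y; in the paper's bilateral case, rejection in both directions is what supports the feedback between the current account and the public deficit.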

Keywords: internal and external deficit, stability, triplet deficit, Turkey economy

Procedia PDF Downloads 342
1923 Analytical Study of Holographic Polymer Dispersed Liquid Crystals Using the Finite Difference Time Domain Method

Authors: N. R. Mohamad, H. Ono, H. Haroon, A. Salleh, N. M. Z. Hashim

Abstract:

In this research, we have studied and analyzed the modulation of light by the liquid crystal in HPDLCs using the Finite-Difference Time-Domain (FDTD) method. HPDLCs are modeled as a mixture of polymer and liquid crystals (LCs), categorized as an anisotropic medium. The FDTD method solves Maxwell's equations directly with few approximations, so it offers a flexible and general approach for arbitrary anisotropic media. The FDTD simulations show that the highest diffraction efficiency occurs at ±19° (the Bragg angle) for a p-polarized beam incident on a Bragg grating with Q > 10 and a pitch of 1 µm. Under these parameters, the liquid crystal is assumed to be aligned parallel to the grating vector.
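The core of any FDTD solver is a leapfrog update of staggered electric and magnetic fields. A minimal one-dimensional free-space sketch follows; it is scalar and isotropic, whereas the HPDLC analysis above requires the full anisotropic permittivity tensor, and the grid size, step count, and source parameters are arbitrary:

```python
import numpy as np

def fdtd_1d(steps=300, size=200, src=50):
    """Minimal 1D FDTD (Yee) scheme in free space with a soft Gaussian
    point source. Ez and Hy are staggered in space and time; units are
    normalized so the Courant number is 0.5 (stable for <= 1)."""
    ez = np.zeros(size)
    hy = np.zeros(size)
    c = 0.5  # Courant number
    for t in range(steps):
        hy[:-1] += c * (ez[1:] - ez[:-1])         # update H from the curl of E
        ez[1:] += c * (hy[1:] - hy[:-1])          # update E from the curl of H
        ez[src] += np.exp(-((t - 30) / 10) ** 2)  # Gaussian pulse source
    return ez
```

Extending this to the HPDLC problem means promoting the permittivity to a position-dependent tensor for the LC-rich and polymer-rich grating layers, from which quantities such as the angular dependence of the diffraction efficiency can be extracted.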

Keywords: birefringence, diffraction efficiency, finite-difference time-domain, nematic liquid crystals

Procedia PDF Downloads 460
1922 Exploring the Role of Hydrogen to Achieve the Italian Decarbonization Targets Using an Open-Science Energy System Optimization Model

Authors: Alessandro Balbo, Gianvito Colucci, Matteo Nicoli, Laura Savoldi

Abstract:

Hydrogen is expected to become an undisputed player in the ecological transition throughout the next decades. The decarbonization potential offered by this energy vector provides various opportunities for the so-called "hard-to-abate" sectors, including the industrial production of iron and steel, glass, refineries, and heavy-duty transport. In this regard, Italy, in the framework of the decarbonization plans of the whole European Union, has been considering a wider use of hydrogen as an alternative to fossil fuels in hard-to-abate sectors. This work aims to assess and compare different options concerning the pathway to be followed in the development of the future Italian energy system in order to meet the decarbonization targets established by the Paris Agreement and the European Green Deal, and to carry out a techno-economic analysis of the asset alternatives required in that perspective. To accomplish this objective, the energy system optimization model TEMOA-Italy is used, based on the open-source platform TEMOA and developed at PoliTo as a tool for technology assessment and energy scenario analysis. The adopted assessment strategy includes two different scenarios to be compared with a business-as-usual one, which considers the application of current policies in a time horizon up to 2050. The studied scenarios are based on the up-to-date hydrogen-related targets and planned investments included in the National Hydrogen Strategy and in the Italian National Recovery and Resilience Plan, with the purpose of providing a critical assessment of what they propose. One scenario imposes decarbonization objectives for the years 2030, 2040, and 2050, without any other specific target. The second one (inspired by the national objectives on the development of the sector) promotes the deployment of the hydrogen value chain.
These scenarios provide feedback about the applications hydrogen could have in the Italian energy system, including transport, industry, and synfuel production. Furthermore, the decarbonization scenario in which hydrogen production is not imposed makes use of this energy vector as well, showing the necessity of its exploitation in order to meet the pledged targets by 2050. The distance of the planned policies from the optimal conditions for achieving the Italian objectives is clarified, revealing possible improvements at various steps of the decarbonization pathway, for which Carbon Capture and Utilization technologies appear to be a fundamental element. In line with the European Commission's open science guidelines, the transparency and robustness of the presented results are ensured by the adoption of an open-source, open-data model such as TEMOA-Italy.
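At its core, an energy system optimization model of this kind minimizes total system cost subject to demand and emission constraints. A deliberately tiny two-technology sketch follows; the costs, emission factors, and cap are invented for illustration, and TEMOA itself is a multi-period, technology-explicit framework far richer than this:

```python
from scipy.optimize import linprog

demand = 100.0            # energy demand to supply (arbitrary units)
cost = [1.0, 3.0]         # unit cost: [fossil carrier, green hydrogen]
emis = [0.5, 0.0]         # unit CO2 emissions of each carrier
cap = 20.0                # emissions cap (the "decarbonization target")

# minimize total cost s.t. supply >= demand, emissions <= cap, x >= 0
res = linprog(
    c=cost,
    A_ub=[[-1.0, -1.0],   # -fossil - h2 <= -demand
          emis],          # 0.5*fossil + 0.0*h2 <= cap
    b_ub=[-demand, cap],
    bounds=[(0, None), (0, None)],
)
fossil, h2 = res.x
```

Here the emissions cap forces 60 of the 100 units of demand onto the clean but more expensive carrier; this is the same mechanism by which the decarbonization constraints in the studied scenarios pull hydrogen into the cost-optimal mix.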

Keywords: decarbonization, energy system optimization models, hydrogen, open-source modeling, TEMOA

Procedia PDF Downloads 73
1921 Quantum Kernel Based Regressor for Prediction of Non-Markovianity of Open Quantum Systems

Authors: Diego Tancara, Raul Coto, Ariel Norambuena, Hoseein T. Dinani, Felipe Fanchini

Abstract:

Quantum machine learning is a growing research field that aims to perform machine learning tasks assisted by a quantum computer. Kernel-based quantum machine learning models are paradigmatic examples in which the kernel involves quantum states, and the Gram matrix is calculated from the overlap between these states. With the kernel at hand, a regular machine learning model is used for the learning process. In this paper, we investigate quantum support vector machine and quantum kernel ridge models to predict the degree of non-Markovianity of a quantum system. We perform digital quantum simulation of amplitude damping and phase damping channels to create our quantum dataset. We elaborate on different kernel functions to map the data and on kernel circuits to compute the overlap between quantum states. We observe a good performance of the models.
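Once the Gram matrix of state overlaps is available, the kernel ridge step is entirely classical. A classical stand-in sketch that computes fidelities directly instead of estimating them on a quantum device; the single-qubit dataset and the smooth target function are invented placeholders for the non-Markovianity degree:

```python
import numpy as np

def fidelity_kernel(A, B):
    """Gram matrix of state fidelities |<a|b>|^2 between rows of A and B,
    each row being a normalized (quantum) state vector."""
    return np.abs(A @ B.conj().T) ** 2

def kernel_ridge_fit(K, y, lam=1e-3):
    """Solve (K + lam*I) alpha = y for the dual coefficients."""
    return np.linalg.solve(K + lam * np.eye(len(K)), y)

def kernel_ridge_predict(K_test_train, alpha):
    return K_test_train @ alpha

# toy dataset: single-qubit states |psi(theta)> = (cos t, sin t),
# with a smooth target standing in for the degree of non-Markovianity
theta = np.linspace(0, np.pi / 2, 30)
X = np.column_stack([np.cos(theta), np.sin(theta)])
y = np.sin(2 * theta)

K = fidelity_kernel(X, X)
alpha = kernel_ridge_fit(K, y)
pred = kernel_ridge_predict(K, alpha)
```

On a quantum device the entries of `K` would come from overlap (e.g. swap-test or inversion-test) circuits, but the downstream regression is unchanged.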

Keywords: quantum, machine learning, kernel, non-markovianity

Procedia PDF Downloads 182
1920 Seismic Loss Assessment for Peruvian University Buildings with Simulated Fragility Functions

Authors: Jose Ruiz, Jose Velasquez, Holger Lovon

Abstract:

Peruvian university buildings are critical structures for which very little research on their seismic vulnerability is available. This paper develops a probabilistic methodology that predicts seismic loss for university buildings with simulated fragility functions. Two university buildings located in the city of Cusco were analyzed. Fragility functions were developed considering the uncertainty of seismic and structural parameters. The fragility functions were generated with the Latin hypercube technique, an improved Monte Carlo-based method that optimizes the sampling of structural parameters and provides at least 100 reliable samples for every level of seismic demand. Concrete compressive strength, maximum concrete strain, and yield stress of the reinforcing steel were considered as the key structural parameters. The seismic demand is defined by synthetic records compatible with the elastic Peruvian design spectrum. Acceleration records are scaled based on the peak ground acceleration on rigid soil (PGA), which ranges from 0.05g to 1.00g. A total of 2000 structural models were considered to account for both structural and seismic variability. These functions represent the overall building behavior because they give rational information regarding damage ratios for defined levels of seismic demand. The two university buildings show expected Mean Damage Factors of 8.80% and 19.05%, respectively, for the 0.22g PGA scenario, which was amplified by the soil-type coefficient to 0.26g PGA. These ratios were computed for a seismic demand with a 10% probability of exceedance in 50 years, as required by the Peruvian seismic code. These results show an acceptable seismic performance for both buildings.
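Latin hypercube sampling stratifies each parameter's range so that even a small sample covers it evenly, which is why it needs far fewer structural models than plain Monte Carlo. A minimal sketch with SciPy; the three parameter ranges are illustrative placeholders, not the values used for the Cusco buildings:

```python
from scipy.stats import qmc

# three uncertain structural parameters (illustrative ranges):
# concrete compressive strength [MPa], max concrete strain [-],
# reinforcing steel yield stress [MPa]
l_bounds = [17.0, 0.003, 392.0]
u_bounds = [25.0, 0.005, 448.0]

sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(n=100)                   # 100 samples in [0, 1)^3
samples = qmc.scale(unit, l_bounds, u_bounds)  # map to the physical ranges
```

Each of the 100 samples falls in a distinct stratum of every parameter's range; feeding each sampled parameter set into a structural model at each demand level yields the damage-ratio data from which the fragility functions are fitted.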

Keywords: fragility functions, university buildings, loss assessment, Monte Carlo simulation, Latin hypercube

Procedia PDF Downloads 144