Search results for: Data Retention Voltage
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8400

7080 Content Based Sampling over Transactional Data Streams

Authors: Mansour Tarafdar, Mohammad Saniee Abade

Abstract:

This paper investigates the problem of sampling from transactional data streams. We introduce CFISDS, a content-based sampling algorithm that works on a landmark window model of data streams and preserves a more informative sample in the sample space. The algorithm, which is based on closed frequent itemset mining, first initiates a concept lattice using the initial data and then updates the lattice structure using an incremental mechanism that inserts, updates and deletes nodes in the concept lattice in a batch manner. The algorithm extracts the final sample on user demand. Experimental results on synthetic and real datasets show the accuracy of CFISDS, although it is not faster than existing sampling algorithms such as Z and DSS.

Keywords: Sampling, data streams, closed frequent itemset mining.
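
The paper's CFISDS algorithm itself rests on an incrementally maintained concept lattice of closed frequent itemsets, which is not reproduced here. Below is a much-simplified Python sketch of the general idea of content-based sampling over a landmark window, biasing the retained sample toward transactions that contain currently frequent items; the scoring rule and the min_support threshold are illustrative assumptions, not the authors' design:

```python
from collections import Counter

def content_based_sample(stream, sample_size, min_support=0.1):
    """Keep the transactions that best cover currently frequent items.

    `stream` is an iterable of transactions (sets of items) arriving after
    the landmark; all of them contribute to the item counts, but only
    `sample_size` transactions are retained in the sample space.
    """
    item_counts = Counter()      # item frequencies since the landmark
    seen = 0
    sample = []                  # list of (score, transaction)

    for txn in stream:
        seen += 1
        item_counts.update(txn)
        # Score = how many currently frequent items the transaction holds.
        threshold = min_support * seen
        score = sum(1 for item in txn if item_counts[item] >= threshold)
        sample.append((score, txn))
        # Evict the least informative transaction when over budget.
        if len(sample) > sample_size:
            sample.remove(min(sample, key=lambda st: st[0]))
    return [txn for _, txn in sample]

# Example: retain 3 transactions from a small stream.
stream = [{"a", "b"}, {"a", "c"}, {"b", "c"}, {"a", "b", "c"}, {"d"}]
print(content_based_sample(stream, sample_size=3))
```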

7079 Quantitative Assessment of Different Formulations of Antimalarials in Sentinel Sites of India

Authors: Taruna Katyal Arora, Geeta Kumari, Hari Shankar, Neelima Mishra

Abstract:

Substandard and counterfeit antimalarials are a major problem in malaria endemic areas. The availability of counterfeit/substandard medicines not only decreases efficacy in patients, but is also one of the contributing factors in the development of antimalarial drug resistance. Owing to this, a pilot study was conducted to survey the quality of drugs collected from different malaria endemic areas of India. Artesunate+Sulphadoxine-Pyrimethamine (AS+SP), Artemether-Lumefantrine (AL) and Chloroquine (CQ) tablets were randomly picked from public health facilities in selected states of India. The quality of antimalarial drugs from these areas was assessed using the Global Pharma Health Fund Minilab test kit, which includes physical/visual inspection and a disintegration test. Thin-layer chromatography (TLC) was carried out for semi-quantitative assessment of the active pharmaceutical ingredients. A total of 45 brands, of which 21 were for CQ, 14 for AL and 10 for AS+SP, were tested from the states of Uttar Pradesh (U.P.), Mizoram, Meghalaya and Gujarat. One of the 45 samples showed a variable disintegration and retention factor, which could have been due to substandard quality or to other factors, including storage. However, HPLC analysis confirmed a standard content of active pharmaceutical ingredient, so humid temperature and moisture during storage may account for the observed result.

Keywords: Antimalarial medicines, counterfeit, substandard, thin layer chromatography.
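
For reference, the TLC retention factor mentioned above is conventionally defined as (standard chromatography background, not specific to this study):

    R_f = \frac{\text{distance travelled by the compound}}{\text{distance travelled by the solvent front}}

Under fixed plate and solvent conditions a genuine formulation should reproduce the reference R_f, which is why a variable retention factor flags a potentially substandard sample.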

7078 Analysis of Current Mirror in 32nm MOSFET and CNTFET Technologies

Authors: Mohini Polimetla, Rajat Mahapatra

Abstract:

There is a need to explore emerging technologies based on carbon nanotube electronics as MOS technology approaches its limits. As MOS devices scale into the nano range, increased short channel effects and process variations considerably affect device and circuit designs. As a promising new transistor, the Carbon Nanotube Field Effect Transistor (CNTFET) avoids most of the fundamental limitations of traditional MOSFET devices. In this paper we present the analysis and comparison of a CNTFET-based 10 µA current mirror with its MOSFET counterpart at the 32nm technology node. The comparison shows the superiority of the former in terms of a 97% increase in output resistance, a 24% decrease in power dissipation and a 40% decrease in the minimum voltage required for constant saturation current. Furthermore, the effect of a change in the chirality vector of the CNT on the performance of the current mirror has also been investigated. The circuit simulations are carried out using HSPICE models.

Keywords: Carbon Nanotube Field Effect Transistor, Chirality Vector, Current Mirror
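
As background on the figures of merit compared above, the first-order relations for a simple two-transistor current mirror are standard textbook results (not taken from the paper):

    I_{out} \approx I_{ref}\,\frac{(W/L)_{out}}{(W/L)_{ref}}, \qquad
    R_{out} \approx r_o = \frac{1}{\lambda I_{out}}, \qquad
    V_{out,\min} \approx V_{ov} = V_{GS} - V_{TH}

A higher output resistance and a lower minimum output voltage correspond directly to the reported 97% and 40% improvements of the CNTFET mirror.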

7077 Influence of Measurement System on Negative Bias Temperature Instability Characterization: Fast BTI vs Conventional BTI vs Fast Wafer Level Reliability

Authors: Vincent King Soon Wong, Hong Seng Ng, Florinna Sim

Abstract:

Negative Bias Temperature Instability (NBTI) is one of the critical degradation mechanisms in semiconductor device reliability that causes a shift in the threshold voltage (Vth). However, a thorough understanding of this reliability failure mechanism is still out of reach due to a recovery characteristic known as NBTI recovery. This paper demonstrates the severity of NBTI recovery as well as one of the effective methods used to mitigate it, namely the minimization of measurement system delays. A comparison was made between two measurement systems with significantly different measurement delays to show how NBTI recovery causes result deviations and how fast measurement systems can mitigate NBTI recovery. Another method to minimize NBTI recovery without the influence of the measurement system, known as Fast Wafer Level Reliability (FWLR) NBTI measurement, was also performed to serve as a reference.

Keywords: Fast vs slow BTI, Fast wafer level reliability, Negative bias temperature instability, NBTI measurement system, metal-oxide-semiconductor field-effect transistor, MOSFET, NBTI recovery, reliability.
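
For background, the stress phase discussed above is commonly described in the NBTI literature by an empirical power law (a widely used model, not a result of this paper):

    \Delta V_{th}(t_{\text{stress}}) \approx A \, t_{\text{stress}}^{\,n}, \qquad n \approx 0.1\text{--}0.25

Because a significant fraction of this shift relaxes within microseconds to milliseconds once the stress is removed, any delay between stress and sense lets recovery proceed and understates the true degradation, which is exactly what fast BTI and FWLR measurements are designed to avoid.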

7076 An Automatic Tool for Checking Consistency between Data Flow Diagrams (DFDs)

Authors: Rosziati Ibrahim, Siow Yen Yen

Abstract:

The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for a smooth system development process. However, manually checking the consistency from the context diagram down to the lower-level data flow diagrams using a checklist is time-consuming. At the same time, the limitations of human ability to validate the errors are among the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded inside the tool to overcome the manual checking problem.

Keywords: Data Flow Diagram, Context Diagram, Consistency Check, Syntax and Semantic Rules
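
The paper embeds the DFD rules in an editor-plus-checker tool; a minimal Python sketch of the core balancing check is shown below. The rule illustrated is that the data flows entering and leaving a process in the parent diagram must match the external flows of its child diagram; the data structures and messages here are simplified assumptions, not the tool's actual design:

```python
def check_balancing(parent_flows, child_flows):
    """Compare a process's flows in the parent diagram with the external
    flows of its decomposition (child DFD).

    Each argument: {"in": set_of_flow_names, "out": set_of_flow_names}.
    Returns a list of human-readable inconsistency messages.
    """
    errors = []
    for direction in ("in", "out"):
        missing = parent_flows[direction] - child_flows[direction]
        extra = child_flows[direction] - parent_flows[direction]
        if missing:
            errors.append(f"{direction}-flows missing in child DFD: {sorted(missing)}")
        if extra:
            errors.append(f"{direction}-flows absent from parent diagram: {sorted(extra)}")
    return errors

# Context diagram: process "Order System" with its inputs/outputs.
parent = {"in": {"order", "payment"}, "out": {"receipt", "invoice"}}
# Level-1 DFD of the same process.
child = {"in": {"order", "payment"}, "out": {"receipt"}}
print(check_balancing(parent, child))   # "invoice" is unbalanced
```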

7075 Effects of Fermentation Techniques on the Quality of Cocoa Beans

Authors: Monday O. Ale, Adebukola A. Akintade, Olasunbo O. Orungbemi

Abstract:

Fermentation, an important operation in the processing of cocoa beans, is now affected by the recent climate change across the globe. The major requirement for effective fermentation is the ability of the material used to retain sufficient heat for the required microbial activities. Apart from the effects of climate on the rate of heat retention, the material used for fermentation plays an important role. Most farmers still restrict fermentation activities to the use of traditional methods. Improving cocoa fermentation in this era of climate change makes it necessary to examine other materials that can be suitable for cocoa fermentation. Therefore, the objective of this study was to determine the effects of fermentation techniques on the quality of cocoa beans. The materials used in this fermentation research were heap-leaves (traditional), stainless steel, plastic tin, plastic basket and a wooden box. The fermentation period varied from zero to 10 days. Physical and chemical tests were carried out on the samples to determine quality variables. The weight per bean varied from 1.0-1.2 g after drying across the samples, and the major colour of the dry beans observed was brown, except for the samples from stainless steel. The moisture content varied from 5.5-7%. The mineral content and the heavy metals decreased with an increase in the fermentation period. A wooden box can conclusively be used as an alternative to heap-leaves, as there was no significant difference in the physical features of the samples fermented with the two methods. The use of a wooden box as an alternative for cocoa fermentation is therefore recommended for cocoa farmers.

Keywords: Effects, fermentation, fermentation materials, period, quality.

7074 Real-Time Implementation of STANAG 4539 High-Speed HF Modem

Authors: S. Saraç, F. Kara, C. Vural

Abstract:

High-frequency (HF) communications have been used by military organizations for more than 90 years. The possibility of very long range communications without the need for advanced equipment makes HF a convenient and inexpensive alternative to satellite communications. Despite these advantages, voice and data transmission over HF is a challenging task, because the HF channel generally suffers from Doppler shift and spread, multi-path, co-channel interference, and many other sources of noise. In constructing an HF data modem, all these effects must be taken into account. STANAG 4539 is a NATO standard for high-speed data transmission over HF. It allows data rates up to 12800 bps over an HF channel of 3 kHz. In this work, an efficient implementation of STANAG 4539 on a single Texas Instruments TMS320C6747 DSP chip is described. The state-of-the-art algorithms used in the receiver and the efficiency of the implementation enable real-time high-speed data / digitized voice transmission over poor HF channels.

Keywords: High frequency, modem, STANAG 4539.

7073 Detection of Efficient Enterprises via Data Envelopment Analysis

Authors: S. Turkan

Abstract:

In this paper, the data of Turkey’s Top 500 Industrial Enterprises in 2014 were analyzed by data envelopment analysis (DEA). DEA is used to detect efficient decision-making units such as universities, hospitals and schools by using inputs and outputs. The decision-making units in this study are enterprises. To detect efficient enterprises, some financial ratios are determined as inputs and outputs. For this reason, financial indicators related to the productivity of enterprises are considered. The efficient enterprises with foreign-weighted owned capital are detected via the super-efficiency model. According to the results, Mercedes-Benz is the most efficient enterprise with foreign-weighted owned capital in Turkey.

Keywords: Data envelopment analysis, super efficiency, financial ratios, BCC model.
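
As an illustration of the method, here is a minimal input-oriented CCR-style super-efficiency model solved as a linear program with SciPy. The paper uses the BCC variant and real financial ratios, so the toy data and the CCR simplification below are assumptions:

```python
import numpy as np
from scipy.optimize import linprog

def super_efficiency(X, Y, o):
    """Input-oriented super-efficiency score of DMU `o`.

    X: (n_dmus, n_inputs), Y: (n_dmus, n_outputs). The evaluated DMU is
    excluded from its own reference set, so efficient units score above 1.
    """
    n, m = X.shape
    s = Y.shape[1]
    peers = [j for j in range(n) if j != o]
    # Decision variables: theta, then lambda_j for each peer DMU.
    c = np.r_[1.0, np.zeros(len(peers))]
    A_ub, b_ub = [], []
    for i in range(m):   # sum_j lam_j * x_ij <= theta * x_io
        A_ub.append(np.r_[-X[o, i], X[peers, i]])
        b_ub.append(0.0)
    for r in range(s):   # sum_j lam_j * y_rj >= y_ro
        A_ub.append(np.r_[0.0, -Y[peers, r]])
        b_ub.append(-Y[o, r])
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (1 + len(peers)))
    return res.fun  # optimal theta

# Toy data: 4 enterprises, 2 inputs (e.g. assets, staff), 1 output (profit).
X = np.array([[20., 151], [40., 100], [40., 160], [60., 50]])
Y = np.array([[100.], [100.], [100.], [100.]])
for o in range(4):
    print(o, round(super_efficiency(X, Y, o), 3))
```

A BCC version would add the convexity constraint sum(lambda) = 1 as an equality (A_eq/b_eq in linprog).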

7072 System Security Impact on the Dynamic Characteristics of Measurement Sensors in Smart Grids

Authors: Yiyang Su, Jörg Neumann, Jan Wetzlich, Florian Thiel

Abstract:

Smart grid is a term used to describe the next generation power grid. New challenges, such as the integration of renewable and decentralized energy sources, the requirement for continuous grid estimation and optimization, and the use of two-way flows of energy, have been brought to the power grid. In order to achieve efficient, reliable, sustainable and secure delivery of electric power, more and more information and communication technologies are used for the monitoring and control of power grids. Consequently, the need for cybersecurity has dramatically increased and has converged into several standards, which will be presented here. These standards for the smart grid must be designed to satisfy both performance and reliability requirements. An in-depth investigation of the effect of retrospectively embedded security on the dynamic behavior of existing grids is required. Therefore, a retrofitting plan for existing meters is offered, and its performance in a test low-voltage microgrid is investigated. As a result, integrating security measures into the measurement architectures of smart grids at the design phase is strongly recommended.

Keywords: Cyber security, performance, protocols, security standards, smart grid.

7071 Design of Ultra Fast Polymer Electro-Optic Waveguide Switch for Intelligent Optical Networks

Authors: S. Ponmalar, S. Sundaravadivelu

Abstract:

Traditional optical networks are gradually evolving towards intelligent optical networks due to the need for faster bandwidth provisioning and for protection and restoration of the network, which can be accomplished with devices like optical switches, add-drop multiplexers and cross connects. Since dense wavelength division multiplexing forms the physical layer for intelligent optical networking, the role of high-speed all-optical switches is important. This paper analyzes such an ultra-high-speed polymer electro-optic switch. The performance of a 2x2 optical waveguide switch with rectangular, triangular and trapezoidal grating profiles is analyzed with respect to various device parameters. The simulation results show that the trapezoidal grating is the optimized structure, with a coupling length of 81 μm and a switching voltage of 11 V at the operating wavelength of 1550 nm. The switching time of the proposed switch is 0.47 picoseconds. This makes the proposed switch an important element in the intelligent optical network.

Keywords: Intelligent optical network, optical switch, electro-optic effect, coupled mode theory, waveguide grating structures
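
For context, in coupled mode theory the quoted coupling length is the distance over which optical power transfers completely between the coupled guides (a standard relation, not derived in the abstract):

    L_c = \frac{\pi}{2\kappa}

where \kappa is the coupling coefficient. The applied switching voltage detunes the coupler through the electro-optic index change and thereby routes light to the desired output port.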

7070 Fuzzy Logic Controller Based Shunt Active Filter with Different MFs for Current Harmonics Elimination

Authors: Shreyash Sinai Kunde, Siddhang Tendulkar, Shiv Prakash Gupta, Gaurav Kumar, Suresh Mikkili

Abstract:

One of the major power quality concerns in modern times is the problem of current harmonics. Current harmonics are caused by the increase in non-linear loads, which are largely dominated by power electronics devices. Shunt active filtering is one of the best solutions for mitigating current harmonics. This paper describes a fuzzy logic controller (FLC) based three-phase shunt active filter designed to achieve low current total harmonic distortion (THD) and reactive power compensation. The performance of the fuzzy logic controller is analysed under both balanced and unbalanced sinusoidal source conditions. The controller also serves the purpose of keeping the DC capacitor voltage constant. The proposed shunt active filter uses a hysteresis current controller for the current control of the IGBT-based PWM inverter. Simulation results of the model in MATLAB/Simulink are satisfactory.

Keywords: Shunt active filter, Current harmonics, Fuzzy logic controller, Hysteresis current controller.
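
A minimal Python sketch of the per-phase hysteresis current controller mentioned above (bang-bang logic; the band width and signal names are illustrative assumptions):

```python
import math

def hysteresis_controller(i_ref, i_meas, band, last_state):
    """Return the new switch state for one inverter leg.

    Turns the upper switch on when the measured current falls below the
    lower band and off when it exceeds the upper band; inside the band the
    previous state is kept, which is what limits the switching frequency.
    """
    error = i_ref - i_meas
    if error > band / 2:
        return 1        # push the current up
    if error < -band / 2:
        return 0        # let the current fall
    return last_state   # stay inside the band

# Example: track a 50 Hz sinusoidal reference current.
state, band, i_meas = 0, 0.2, 0.0
for k in range(5):
    t = k * 1e-4
    i_ref = 10 * math.sin(2 * math.pi * 50 * t)
    state = hysteresis_controller(i_ref, i_meas, band, state)
    print(f"t={t:.4f}s  ref={i_ref:6.3f}  state={state}")
```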

7069 Fusion of ETM+ Multispectral and Panchromatic Texture for Remote Sensing Classification

Authors: Mahesh Pal

Abstract:

This paper proposes to use ETM+ multispectral data and the panchromatic band, as well as texture features derived from the panchromatic band, for land cover classification. Four texture features, including one 'internal texture' and three GLCM-based textures, namely correlation, entropy and inverse difference moment, were used in combination with the ETM+ multispectral data. Two datasets involving combinations of the multispectral data, the panchromatic band and its texture were used, and the results were compared with those obtained by using the multispectral data alone. A decision tree classifier, with and without boosting, was used to classify the different datasets. Results from this study suggest that the dataset consisting of the panchromatic band, four of its texture features and the multispectral data was able to increase the classification accuracy by about 2%. In comparison, a boosted decision tree was able to increase the classification accuracy by about 3% with the same dataset.

Keywords: Internal texture, GLCM, decision tree, boosting, classification accuracy.
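
A sketch of how the three GLCM texture measures named above can be computed with scikit-image; the window size and GLCM parameters used in the paper are not given, so these are assumptions. Note that scikit-image's 'homogeneity' property is the inverse difference moment, and entropy is computed by hand from the normalized matrix:

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

rng = np.random.default_rng(0)
window = rng.integers(0, 64, size=(32, 32), dtype=np.uint8)  # stand-in for a PAN window

# Normalized co-occurrence matrix, 1-pixel offset, 0 degrees.
glcm = graycomatrix(window, distances=[1], angles=[0],
                    levels=64, symmetric=True, normed=True)

correlation = graycoprops(glcm, "correlation")[0, 0]
idm = graycoprops(glcm, "homogeneity")[0, 0]     # inverse difference moment
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))  # GLCM entropy

print(correlation, idm, entropy)
```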

7068 Envelope Echo Signal of Metal Sphere in the Fresh Water

Authors: A. Mahfurdz, Sunardi, H. Ahmad

Abstract:

An envelope echo signal measurement is proposed in this paper, based on observation of the echo signal from a 200 kHz echo sounder receiver. The envelope signal without any object is compared with the envelope signal of a sphere. Two solid steel balls (3.1 cm and 2.2 cm in diameter) and two air-filled stainless steel balls (4.8 cm and 7.4 cm in diameter) were used in the experiment. The target was positioned about 0.5 m and 1.0 m from the transducer face using a nylon rope. From the echo observation in the time domain, it is clearly shown that the echo signal structure differs with the size, distance and type of the metal sphere. The envelope voltage amplitude for the bigger sphere is higher than for the small sphere, confirming that the bigger sphere has a higher target strength. Although the signal structure without any object differs from the signal with a sphere present, the signal reflected from the tank floor increases linearly with the sphere size. We consider that this happens because of the proximity of the object to the tank floor.

Keywords: echo sounder, target strength, sphere, echo signal
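
The observed size dependence is consistent with the standard sonar expression for the target strength of a rigid sphere in the geometric scattering regime, ka >> 1 (textbook background, not a formula from the paper):

    TS = 10 \log_{10}\!\left(\frac{a^{2}}{4}\right)\ \text{dB re 1 m}

For the 7.4 cm ball (a = 0.037 m) this gives TS of roughly -34.7 dB, about 10 dB above the 2.2 cm ball (a = 0.011 m, roughly -45.2 dB); air-filled shells can deviate from this rigid-sphere estimate.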

7067 A Formal Approach for Instructional Design Integrated with Data Visualization for Learning Analytics

Authors: Douglas A. Menezes, Isabel D. Nunes, Ulrich Schiel

Abstract:

Most Virtual Learning Environments do not provide support mechanisms for the integrated planning, construction and follow-up of an Instructional Design supported by Learning Analytics results. The present work presents an authoring tool responsible for constructing the structure of an Instructional Design (ID) in such a way that the data are not altered during the execution of the course. The visual interface presents the critical situations in this ID, serving as a support tool for course follow-up and for possible improvements, which can be made during its execution or in the planning of a new edition of the course. The model for the ID is based on High-Level Petri Nets, and the forms of visualization are determined by the specific kind of data generated by an e-course: a population of students generating sequentially dependent data.

Keywords: Educational data visualization, high-level Petri nets, instructional design, learning analytics.

7066 Visual Text Analytics Technologies for Real-Time Big Data: Chronological Evolution and Issues

Authors: Siti Azrina B. A. Aziz, Siti Hafizah A. Hamid

Abstract:

New approaches for analyzing and visualizing streaming data in real time are important for prompt decision making. Financial market trading and surveillance, large-scale emergency response and crowd control are some example scenarios that require real-time analytics and data visualization. This situation has led to the development of techniques and tools that support humans in analyzing the source data. With the emergence of Big Data and social media, new techniques and tools are required to process the streaming data. Today, a range of tools implementing some of these functionalities is available. In this paper, we present a chronological evaluation of the evolution of technologies supporting real-time analytics and visualization of data streams. Based on research papers published from 2002 to 2014, we gathered the general information, main techniques, challenges and open issues. The techniques for streaming text visualization are identified in chronological order based on the Text Visualization Browser. This paper aims to review the evolution of streaming text visualization techniques and tools, as well as to discuss the problems and challenges of each of the identified tools.

Keywords: Information visualization, visual analytics, text mining, visual text analytics tools, big data visualization.

7065 Churn Prediction for Telecommunication Industry Using Artificial Neural Networks

Authors: Ulas Vural, M. Ergun Okay, E. Mesut Yildiz

Abstract:

Telecommunication service providers demand accurate and precise prediction of customer churn probabilities to increase the effectiveness of their customer relation services. The large amount of customer data owned by the service providers is suitable for analysis by machine learning methods. In this study, expenditure data of customers are analyzed using an artificial neural network (ANN). The ANN model is applied to the data of customers with different billing durations. The proposed model successfully predicts the churn probabilities at 83% accuracy with only three months' expenditure data, and the prediction accuracy increases up to 89% when nine months' data are used. The experiments also show that the accuracy of the ANN model increases on an extended feature set with information on the changes in the bill amounts.

Keywords: Customer relationship management, churn prediction, telecom industry, deep learning, Artificial Neural Networks, ANN.
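
The exact network is not specified in the abstract; below is a minimal Keras sketch of the kind of ANN described, with monthly expenditure features in and a churn probability out. Layer sizes, training settings and the synthetic data are assumptions:

```python
import numpy as np
from tensorflow import keras

n_months = 9  # one expenditure feature per billed month
model = keras.Sequential([
    keras.Input(shape=(n_months,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(8, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),  # churn probability
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

# Synthetic stand-in data: 1000 customers x 9 monthly bill amounts.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, n_months)).astype("float32")
y = (X.mean(axis=1) + 0.5 * rng.normal(size=1000) > 0).astype("float32")

model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())  # churn probabilities
```

The extended feature set the abstract mentions could be emulated by appending month-to-month bill differences as extra input columns.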

7064 A Technical Perspective on Roadway Safety in Eastern Province: Data Evaluation and Spatial Analysis

Authors: Muhammad Farhan, Sayed Faruque, Amr Mohammed, Sami Osman, Omar Al-Jabari, Abdul Almojil

Abstract:

Saudi Arabia has in recent years seen a drastic increase in traffic-related crashes. With a population of over 29 million, Saudi Arabia is considered a fast-growing and emerging economy. The rapid population increase and economic growth have resulted in a rapid expansion of the transportation infrastructure, which has led to an increase in road crashes. The Saudi Ministry of Interior reported more than 7,000 people killed and 68,000 injured in 2011, ranking Saudi Arabia among the worst countries worldwide in traffic safety. The traffic safety issues in the country also cause distress to road users and an economic loss exceeding 3.7 billion Euros annually. Keeping this in view, researchers in Saudi Arabia are investigating ways to improve traffic safety conditions in the country. This paper presents a multilevel approach for collecting the traffic safety related data required for traffic safety studies in the region. Two highway corridors, the 39-kilometre King Fahd Highway and the 42-kilometre Gulf Cooperation Council Highway, connecting the cities of Dammam and Khobar, were selected as the study area. The traffic data collected included traffic counts, crash data, travel time data and speed data. The collected data were analysed using a geographic information system to evaluate any correlation. Further research is needed to investigate the effectiveness of traffic safety related data when collected in a concerted effort.

Keywords: Crash Data, Data Collection, Traffic Safety.

7063 Machine Scoring Model Using Data Mining Techniques

Authors: Wimalin S. Laosiritaworn, Pongsak Holimchayachotikul

Abstract:

This article proposes a methodology for computer numerical control (CNC) machine scoring. The case study company is a manufacturer of hard disk drive parts in Thailand. In this company, samples of parts manufactured on CNC machines are usually taken randomly for quality inspection. These inspection data are used to decide whether to shut down a machine if it has a tendency to produce parts that are out of specification. A large amount of data is produced in this process, and data mining can be a very useful technique for analyzing it. In this research, data mining techniques were used to construct a machine scoring model called the 'machine priority assessment model (MPAM)'. This model helps to ensure that machines with a higher risk of producing defective parts are inspected before those with a lower risk. If defect-prone machines are identified sooner, defective parts and rework can be reduced, hence improving overall productivity. The results showed that the proposed method can be successfully implemented, and approximately 351,000 baht of opportunity cost could have been saved in the case study company.

Keywords: Computer Numerical Control, Data Mining, Hard Disk Drive.

7062 The Impact of Seasonality on Rainfall Patterns: A Case Study

Authors: Priti Kaushik, Randhir Singh Baghel, Somil Khandelwal

Abstract:

This study uses whole-year data from Rajasthan, India, at the meteorological divisional level to analyze and evaluate long-term spatiotemporal trends in rainfall, and looks at the data from each of the thirteen tehsils in the Jaipur district to see how the rainfall pattern has altered over the last 10 years. Daily rainfall data from the Indian Meteorological Department (IMD) in Jaipur are available for the years 2012 through 2021. We mainly focus on comparing tehsil-wise data within the Jaipur district, Rajasthan, India. The analysis also shows that July and August always see higher rainfall than any other month, and that rainfall usually starts to rise around week 25 and peaks in week 32 or 33. Over the 10-year span, 2017 saw the least rainfall, while the greatest rainfall between 2012 and 2021 fell in 2013, 2019 and 2020.

Keywords: Data analysis, extreme events, rainfall, descriptive case studies, precipitation temperature.
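
The aggregations behind these observations can be sketched in pandas; the column names and CSV layout below are assumptions, since the IMD data format is not given:

```python
import pandas as pd

# Assumed layout: one row per tehsil per day.
df = pd.read_csv("jaipur_daily_rainfall.csv",
                 parse_dates=["date"])            # columns: date, tehsil, rain_mm

df["year"] = df["date"].dt.year
df["month"] = df["date"].dt.month
df["week"] = df["date"].dt.isocalendar().week

# Monthly totals per tehsil: should peak in July-August.
monthly = df.groupby(["tehsil", "year", "month"])["rain_mm"].sum()

# Annual totals: locate wet years (2013, 2019, 2020) and the dry year 2017.
annual = df.groupby("year")["rain_mm"].sum().sort_values()

# Mean weekly profile: rise near week 25, peak near weeks 32-33.
weekly = df.groupby("week")["rain_mm"].mean()
print(annual, weekly.idxmax())
```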

7061 Enhance the Power of Sentiment Analysis

Authors: Yu Zhang, Pedro Desouza

Abstract:

Since big data has become substantially more accessible and manageable due to the development of powerful tools for dealing with unstructured data, people are eager to mine information from social media resources that could not be handled in the past. Sentiment analysis, as a novel branch of text mining, has in the last decade become increasingly important in marketing analysis, customer risk prediction and other fields. Scientists and researchers have undertaken significant work in creating and improving their sentiment models. In this paper, we present a concept for selecting appropriate classifiers based on the features and qualities of data sources, by comparing the performances of five classifiers on three popular social media data sources: Twitter, Amazon Customer Reviews and Movie Reviews. We introduce a couple of innovative models that outperform traditional sentiment classifiers on these data sources, and provide insights on how to further improve the predictive power of sentiment analysis. The modeling and testing work was done in R and Greenplum in-database analytic tools.

Keywords: Sentiment Analysis, Social Media, Twitter, Amazon, Data Mining, Machine Learning, Text Mining.
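
The five classifiers and the R/Greenplum pipeline are not detailed in the abstract; below is a scikit-learn sketch in Python of the comparison idea, cross-validating several text classifiers per data source. The particular models and the toy corpus are assumptions:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# Tiny stand-in corpus; in practice: tweets, Amazon reviews, movie reviews.
texts = ["great phone, love it", "terrible battery", "best purchase ever",
         "waste of money", "works fine", "broke after a week"] * 10
labels = [1, 0, 1, 0, 1, 0] * 10

classifiers = {
    "logreg": LogisticRegression(max_iter=1000),
    "naive_bayes": MultinomialNB(),
    "linear_svm": LinearSVC(),
}
for name, clf in classifiers.items():
    pipe = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    scores = cross_val_score(pipe, texts, labels, cv=5)
    print(f"{name}: {scores.mean():.3f}")   # pick the best model per source
```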

7060 Lineup Optimization Model of Basketball Players Based on the Prediction of Recursive Neural Networks

Authors: Wang Yichen, Haruka Yamashita

Abstract:

In recent years, in the field of sports, decision making, such as the choice of game members and game strategy based on the analysis of accumulated sports data, is widely attempted. In fact, in the NBA basketball league, where the world's highest level players gather, teams analyze data using various statistical techniques in order to win games. However, it is difficult to analyze the game data of each play, such as ball tracking or the motion of the players, because the situation of the game changes rapidly and the structure of the data is complicated. Therefore, an analysis method for real-time game play data is needed. In this research, we propose an analytical model for 'determining the optimal lineup composition' using real-time play data, a task that is considered difficult for all coaches. Because replacing the entire lineup is too complicated, the actual questions for player replacement are 'whether or not the lineup should be changed' and 'whether or not a Small Ball lineup should be adopted'. Therefore, we propose an analytical model for the optimal player selection problem based on Small Ball lineups. In basketball, scoring data can be accumulated for each play, indicating a player's contribution to the game, and these scoring data can be considered time series data. In order to compare the importance of players in different situations and lineups, we combine an RNN (Recurrent Neural Network) model, which can analyze time series data, with an NN (Neural Network) model, which can analyze the situation on the court, to build a score prediction model capable of identifying the current optimal lineup for different situations. We collected all the accumulated NBA data from the 2019-2020 season and applied the method to actual basketball play data to verify the reliability of the proposed model.

Keywords: Recurrent Neural Network, players lineup, basketball data, decision making model.
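
A Keras sketch of the RNN-plus-NN combination described above: an LSTM branch summarizes the per-play score series while a dense branch encodes the lineup/court situation, and the merged representation predicts the score. The input sizes and layer widths are assumptions, not the authors' architecture:

```python
from tensorflow import keras
from tensorflow.keras import layers

T, F_SEQ, F_SIT = 20, 4, 10   # plays, per-play features, situation features

seq_in = keras.Input(shape=(T, F_SEQ), name="score_series")   # time series branch
sit_in = keras.Input(shape=(F_SIT,), name="situation")        # lineup/court state

h_seq = layers.LSTM(32)(seq_in)            # RNN summarizes the recent plays
h_sit = layers.Dense(16, activation="relu")(sit_in)
h = layers.concatenate([h_seq, h_sit])
out = layers.Dense(1, name="predicted_score")(h)

model = keras.Model([seq_in, sit_in], out)
model.compile(optimizer="adam", loss="mse")
model.summary()
# Lineup choice then reduces to comparing the model's predictions for the
# current five versus a candidate Small Ball unit and keeping the higher one.
```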

7059 New Multisensor Data Fusion Method Based on Probabilistic Grids Representation

Authors: Zhichao Zhao, Yi Liu, Shunping Xiao

Abstract:

A new data fusion method called the joint probability density matrix (JPDM) is proposed, which can associate and fuse measurements from spatially distributed heterogeneous sensors to identify the real target in a surveillance region. Using a probabilistic grids representation, we numerically combine the uncertainty regions of all the measurements in a general framework. The NP-hard multisensor data fusion problem is thereby converted into a peak-picking problem on the grid map. Unlike most existing data fusion methods, the JPDM method does not need association processing and will not lead to combinatorial explosion. Its convergence to the CRLB with a diminishing grid size has been proved. Simulation results are presented to illustrate the effectiveness of the proposed technique.

Keywords: Cramer-Rao lower bound (CRLB), data fusion, probabilistic grids, joint probability density matrix, localization, sensor network.
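
A numpy sketch of the grid-based fusion idea: each sensor's measurement uncertainty is rasterized onto a common grid, the grids are combined multiplicatively, and the target is located by peak picking. Gaussian range-only uncertainty is an assumption for illustration:

```python
import numpy as np

# Common surveillance grid.
xs, ys = np.meshgrid(np.linspace(0, 100, 201), np.linspace(0, 100, 201))

def likelihood_grid(sensor_xy, measured_range, sigma):
    """Probability grid for one range-only measurement: high where a grid
    point's distance to the sensor matches the measured range."""
    d = np.hypot(xs - sensor_xy[0], ys - sensor_xy[1])
    return np.exp(-0.5 * ((d - measured_range) / sigma) ** 2)

# Three spatially distributed sensors observing a target at (60, 40).
target = np.array([60.0, 40.0])
sensors = [np.array([0.0, 0.0]), np.array([100.0, 0.0]), np.array([50.0, 100.0])]
jpdm = np.ones_like(xs)
for s in sensors:
    r = np.linalg.norm(target - s) + np.random.normal(0, 1.0)
    jpdm *= likelihood_grid(s, r, sigma=2.0)   # joint probability density matrix

# Peak picking replaces explicit measurement association.
i, j = np.unravel_index(np.argmax(jpdm), jpdm.shape)
print("estimate:", xs[i, j], ys[i, j])
```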

7058 Sampled-Data Model Predictive Tracking Control for Mobile Robot

Authors: Wookyong Kwon, Sangmoon Lee

Abstract:

In this paper, a sampled-data model predictive tracking control method is presented for mobile robots, which are modeled as constrained continuous-time linear parameter varying (LPV) systems. The presented sampled-data predictive controller is designed by a linear matrix inequality approach. Based on the input delay approach, a controller design condition is derived by constructing a new Lyapunov function. Finally, a numerical example is given to demonstrate the effectiveness of the presented method.

Keywords: Model predictive control, sampled-data control, linear parameter varying systems, LPV.

7057 Current Controlled Current Conveyor (CCCII) and Application Using 65nm CMOS Technology

Authors: Zia Abbas, Giuseppe Scotti, Mauro Olivieri

Abstract:

Current-mode circuits like current conveyors are getting significant attention in today's analog IC design due to their higher bandwidth, greater linearity, larger dynamic range, simpler circuitry, lower power consumption and smaller chip area. The second generation current controlled conveyor (CCCII) has the advantage of electronic adjustability over the CCII, i.e., in the CCCII, adjustment of the X-terminal intrinsic resistance via a bias current is possible. The presented approach is based on the CMOS implementation of the second generation positive (CCCII+), negative (CCCII-) and dual output (DOCCCII) current controlled conveyors and their application as a universal filter. All the circuits have been designed and simulated using 65nm CMOS technology model parameters on Cadence Virtuoso/Spectre with a 1V supply voltage. Various simulations have been carried out to verify the linearity between the output and input ports, the range of operating frequency, etc. The outcomes show good agreement between expected and experimental results.

Keywords: CCCII+, CCCII-, DOCCCII, Electronic tunability, Universal filter
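
For reference, the defining port relations of the CCCII (standard in the current conveyor literature; the sign of I_Z distinguishes CCCII+ from CCCII-) are:

    I_Y = 0, \qquad V_X = V_Y + I_X R_X, \qquad I_Z = \pm I_X

The intrinsic resistance R_X is what the bias current I_0 tunes: in a bipolar realization R_X = V_T/(2 I_0), while in CMOS realizations R_X varies roughly as 1/\sqrt{I_0}. This is the electronic tunability the abstract exploits.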

7056 Talent Management through Integration of Talent Value Chain and Human Capital Analytics Approaches

Authors: Wuttigrai Ngamsirijit

Abstract:

Talent management in today's modern organizations has become data-driven due to the demand for objective human resource decision making and the development of analytics technologies. HR managers face several obstacles in exploiting data and information to reach effective talent management decisions. These include process-based data and records; insufficient human capital-related measures and metrics; a lack of capability for modeling data in strategic manners; and the time consumed in adding up numbers to make decisions. This paper proposes a framework for talent management through the integration of talent value chain and human capital analytics approaches. It encompasses key data, measures and metrics for strategic talent management decisions along the organizational and talent value chain. Moreover, specific predictive and prescriptive models incorporating these data and information are recommended to help managers understand the state of talent, the gaps in managing talent and the organization, and the ways to develop optimized talent strategies.

Keywords: Decision making, human capital analytics, talent management, talent value chain.

7055 A Case Study on Optimization of Contractor’s Financing through Allocation of Subcontractors

Authors: Helen S. Ghali, Engy Serag, A. Samer Ezeldin

Abstract:

In many countries, the construction industry relies heavily on outsourcing models in executing projects and expanding businesses to fit the diverse market. Such extensive integration of subcontractors is becoming an influential factor in contractors' cash flow management. Accordingly, subcontractors' financial terms are important phenomena and pivotal components for the well-being of the contractor's cash flow. The aim of this research is to study the contractor's cash flow with respect to the owner's and subcontractors' payment management plans, considering variable advance payment, payment frequency, and lag and retention policies. The model is developed to provide contractors with a decision support tool that can assist in selecting the optimum subcontracting plan to minimize the contractor's financing limits and optimize the profit values. The model is built using Microsoft Excel VBA coding, and a genetic algorithm is utilized as the optimization tool. Three objective functions are investigated: minimizing the highest negative overdraft value, minimizing the net present worth of the overdraft, and maximizing the project's net profit. The model is validated on a full-scale project which includes both self-performed and subcontracted work packages. The results show the model's potential in optimizing the contractor's negative cash flow values while assisting contractors in selecting suitable subcontractors to achieve the objective function.

Keywords: Cash flow optimization, payment plan, procurement management, subcontracting plan.
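
The paper implements the model in Excel VBA; as a language-neutral illustration in Python, here is a toy genetic-algorithm loop that picks one payment lag per subcontractor to address the first objective (minimizing the largest overdraft). The cash-flow model is drastically simplified and all numbers are invented:

```python
import random

# Toy data: monthly cost per subcontractor and candidate payment lags (months).
costs = [[30, 40, 30], [20, 20, 20], [50, 10, 10]]   # 3 subs x 3 months
receipts = [60, 60, 60]                              # owner payments per month
LAGS = [0, 1, 2]

def worst_overdraft(lags):
    """Largest negative cumulative balance for a given lag per subcontractor."""
    horizon = len(receipts) + max(LAGS)
    balance, worst = 0.0, 0.0
    for m in range(horizon):
        balance += receipts[m] if m < len(receipts) else 0
        for sub, lag in enumerate(lags):
            if 0 <= m - lag < len(costs[sub]):
                balance -= costs[sub][m - lag]       # pay sub `lag` months late
        worst = min(worst, balance)
    return worst

# Minimal GA: elitist selection plus mutation-only reproduction.
pop = [[random.choice(LAGS) for _ in costs] for _ in range(20)]
for _ in range(50):
    pop.sort(key=worst_overdraft, reverse=True)      # less negative is better
    survivors = pop[:10]
    children = [[g if random.random() > 0.2 else random.choice(LAGS) for g in p]
                for p in survivors]
    pop = survivors + children
print(pop[0], worst_overdraft(pop[0]))
```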

7054 Enhancing K-Means Algorithm with Initial Cluster Centers Derived from Data Partitioning along the Data Axis with the Highest Variance

Authors: S. Deelers, S. Auwatanamongkol

Abstract:

In this paper, we propose an algorithm to compute initial cluster centers for K-means clustering. The data in a cell are partitioned using a cutting plane that divides the cell into two smaller cells. The plane is perpendicular to the data axis with the highest variance and is positioned so as to reduce the sum of squared errors of the two cells as much as possible, while at the same time keeping the two cells as far apart as possible. Cells are partitioned one at a time until the number of cells equals the predefined number of clusters, K. The centers of the K cells become the initial cluster centers for K-means. The experimental results suggest that the proposed algorithm is effective and converges to better clustering results than those of the random initialization method. The research also indicates that the proposed algorithm greatly improves the likelihood of every cluster containing some data.

Keywords: Clustering algorithm, K-means algorithm, Data partitioning, Initial cluster centers.
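
A Python sketch of the proposed initialization: cells are split recursively along the axis of highest variance until K cells remain, and the cell means become the initial centers. For brevity the cut is placed at the cell mean on that axis, whereas the paper optimizes the cut position against the two cells' summed squared errors:

```python
import numpy as np

def variance_partition_centers(data, k):
    """Initial K-means centers from data partitioning along the
    axis of highest variance (assumes continuous-valued data)."""
    cells = [data]
    while len(cells) < k:
        # Split the cell with the largest total squared error.
        idx = max(range(len(cells)),
                  key=lambda i: ((cells[i] - cells[i].mean(0)) ** 2).sum())
        cell = cells.pop(idx)
        axis = np.argmax(cell.var(axis=0))       # highest-variance axis
        cut = cell[:, axis].mean()               # simplified cut position
        cells += [cell[cell[:, axis] <= cut], cell[cell[:, axis] > cut]]
    return np.array([c.mean(axis=0) for c in cells])

rng = np.random.default_rng(0)
data = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in (0, 2, 5)])
centers = variance_partition_centers(data, k=3)
print(centers)   # feed these to K-means instead of random initialization
```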

7053 Semi-Supervised Outlier Detection Using a Generative and Adversary Framework

Authors: Jindong Gu, Matthias Schubert, Volker Tresp

Abstract:

In many outlier detection tasks, only training data belonging to one class, i.e., the positive class, is available. The task is then to predict a new data point as belonging either to the positive class or to the negative class, in which case the data point is considered an outlier. For this task, we propose a novel corrupted Generative Adversarial Network (CorGAN). In the adversarial process of training CorGAN, the Generator generates outlier samples for the negative class, and the Discriminator is trained to distinguish the positive training data from the generated negative data. The proposed framework is evaluated using an image dataset and a real-world network intrusion dataset. Our outlier-detection method achieves state-of-the-art performance on both tasks.

Keywords: Outlier detection, generative adversarial networks, semi-supervised learning.
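
A compact Keras sketch of the training idea described above: the generator proposes negative-class samples, the discriminator separates them from the positive training data, and at test time a low discriminator score flags an outlier. The corruption mechanism and architecture of CorGAN are not reproduced; this is the plain GAN skeleton under assumed sizes:

```python
import numpy as np
from tensorflow import keras

DIM, Z = 2, 8
disc = keras.Sequential([keras.Input(shape=(DIM,)),
                         keras.layers.Dense(32, activation="relu"),
                         keras.layers.Dense(1, activation="sigmoid")])
disc.compile(optimizer="adam", loss="binary_crossentropy")

gen = keras.Sequential([keras.Input(shape=(Z,)),
                        keras.layers.Dense(32, activation="relu"),
                        keras.layers.Dense(DIM)])
disc.trainable = False                      # freeze D inside the stacked model;
gan = keras.Sequential([gen, disc])         # disc was compiled while trainable,
gan.compile(optimizer="adam", loss="binary_crossentropy")  # so it still trains below

pos = np.random.normal(0, 1, size=(512, DIM)).astype("float32")  # positive class only
for step in range(200):
    z = np.random.normal(size=(64, Z)).astype("float32")
    fake = gen.predict(z, verbose=0)        # generated "negative" samples
    disc.train_on_batch(pos[np.random.choice(512, 64)], np.ones((64, 1)))
    disc.train_on_batch(fake, np.zeros((64, 1)))
    gan.train_on_batch(z, np.ones((64, 1))) # push G toward the decision boundary

score = disc.predict(np.array([[0.0, 0.0], [8.0, 8.0]]), verbose=0)
print(score.ravel())                        # low score => predicted outlier
```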

7052 Harmonic Elimination of Hybrid Multilevel Inverters Using Particle Swarm Optimization

Authors: N. Janjamraj, A. Oonsivilai

Abstract:

This paper presents harmonic elimination for hybrid multilevel inverters (HMI), which can increase the number of output voltage levels. Total Harmonic Distortion (THD) is one of the most important performance indices. Because of the many output levels of an HMI, the set of nonlinear equations for eliminating undesired individual harmonics and minimizing THD has numerous unknown variables. The optimized harmonic stepped waveform (OHSW) is the conventional method for solving the switching angles, but it becomes ever more complicated as levels are added. Artificial intelligence techniques have therefore been considered for solving this problem. This paper presents the Particle Swarm Optimization (PSO) technique for solving the switching angles of a 15-level hybrid multilevel inverter so as to obtain minimum THD and eliminate undesired individual harmonics. Consequently, the approach handles many variables and can eliminate numerous harmonics. Both advantages, the high number of inverter levels and PSO, are used as powerful tools for harmonic elimination.

Keywords: Multilevel Inverters, Particle Swarms Optimization, Harmonic Elimination.
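
A self-contained Python sketch of the approach: PSO searches the quarter-wave-symmetric switching angles of a stepped multilevel waveform to minimize THD. The swarm settings and an equal-DC-source 15-level waveform (7 angles per quarter wave) are assumptions:

```python
import numpy as np

S = 7  # switching angles per quarter wave -> 15-level line waveform

def harmonic(theta, n):
    """n-th odd harmonic magnitude of an equal-step staircase waveform."""
    return (4 / (np.pi * n)) * np.sum(np.cos(n * theta))

def thd(theta):
    v1 = harmonic(theta, 1)
    hs = [harmonic(theta, n) for n in range(3, 50, 2)]
    return np.sqrt(np.sum(np.square(hs))) / abs(v1)

# Standard PSO over ordered angles in (0, pi/2).
rng = np.random.default_rng(1)
P = 40
pos = np.sort(rng.uniform(0.01, np.pi / 2 - 0.01, size=(P, S)), axis=1)
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([thd(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)]

for it in range(300):
    r1, r2 = rng.random((P, S)), rng.random((P, S))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(np.sort(pos + vel, axis=1), 1e-3, np.pi / 2 - 1e-3)
    f = np.array([thd(p) for p in pos])
    better = f < pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[np.argmin(pbest_f)]

print("angles (deg):", np.degrees(gbest).round(2), " THD:", round(thd(gbest), 4))
```

Eliminating specific low-order harmonics (5th, 7th, ...) can be added by penalizing those harmonic magnitudes in the objective.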

7051 Methodology of the Turkey’s National Geographic Information System Integration Project

Authors: Buse A. Ataç, Doğan K. Cenan, Arda Çetinkaya, Naz D. Şahin, Köksal Sanlı, Zeynep Koç, Akın Kısa

Abstract:

With their spatial data reliability and their interpretation and querying capabilities, Geographic Information Systems (GIS) make significant contributions for scientists, planners and practitioners. Geographic information systems have received great attention in today's digital world, growing rapidly and increasing the efficiency of their use. Access to, and the use of, current and accurate geographic data, the most important components of a Geographic Information System, has become a necessity for sustainable and economic development. This project aims to enable the sharing of data collected by public institutions and organizations on a web-based platform. Within the scope of the project, the INSPIRE (Infrastructure for Spatial Information in the European Community) data specifications are taken as a road map. In this context, Turkey's National Geographic Information System (TUCBS) Integration Project supports the sharing of spatial data across 61 pilot public institutions in compliance with defined national standards. In this paper, prepared by the members of the TUCBS Integration Project team, the technical process is explained with a detailed methodology. The main technical processes of the project consist of Geographic Data Analysis, Geographic Data Harmonization (Standardization), Web Service Creation (WMS, WFS) and Metadata Creation-Publication, and the paper conveys, with a detailed methodology, the integration process carried out to make the data produced by the 61 institutions available through the National Geographic Data Portal (GEOPORTAL).

Keywords: Data specification, geoportal, GIS, INSPIRE, TUCBS, Turkey’s National Geographic Information System.
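
Once WMS/WFS services like those described above are published, they can be consumed with standard OGC clients. A minimal sketch using the OWSLib Python package; the endpoint URL and layer name are placeholders, not the real GEOPORTAL addresses:

```python
from owslib.wms import WebMapService

# Hypothetical endpoint standing in for an institution's published WMS.
wms = WebMapService("https://example.gov.tr/geoserver/wms", version="1.3.0")

print(list(wms.contents))           # layers advertised by the service

img = wms.getmap(layers=["tucbs:parcels"],        # placeholder layer name
                 srs="EPSG:4326",
                 bbox=(26.0, 36.0, 45.0, 42.0),   # rough extent of Turkey
                 size=(800, 400),
                 format="image/png")
with open("map.png", "wb") as f:
    f.write(img.read())             # rendered map tile
```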
