Search results for: data mining techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29144

28394 Destination Port Detection for Vessels: An Analytic Tool for Optimizing Port Authorities' Resources

Authors: Lubna Eljabu, Mohammad Etemad, Stan Matwin

Abstract:

Port authorities face many challenges in congested ports when allocating their resources to provide a safe and secure loading/unloading procedure for cargo vessels. Selecting a destination port is the decision of a vessel master, based on many factors such as weather, wave conditions and changes of priorities. Having access to a tool which leverages AIS messages to monitor vessels' movements and accurately predict their next destination port promotes an effective resource allocation process for port authorities. In this research, we propose a method, namely Reference Route of Trajectory (RRoT), to assist port authorities in predicting inflow and outflow traffic in their local environment by monitoring Automatic Identification System (AIS) messages. Our RRoT method creates a reference route based on historical AIS messages and utilizes established trajectory similarity measures to identify the destination of a vessel from its recent movement. We evaluated five similarity measures: Discrete Fréchet Distance (DFD), Dynamic Time Warping (DTW), Partial Curve Mapping (PCM), Area between two curves (Area) and Curve Length (CL). Our experiments show that our method identifies the destination port with an accuracy of 98.97% and an F-measure of 99.08% using the DTW similarity measure.
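
To make the route-matching step concrete, below is a minimal sketch of the DTW computation the abstract reports as its best-performing measure, applied to matching a recent AIS track against stored reference routes. The port names, random coordinates and matching loop are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic Time Warping distance between two trajectories.

    a, b: arrays of shape (n, 2) and (m, 2) holding (lat, lon) points.
    """
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])  # Euclidean point cost
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical usage: pick the reference route (one per destination port)
# closest to a vessel's recent AIS track.
reference_routes = {"PortA": np.random.rand(50, 2), "PortB": np.random.rand(60, 2)}
recent_track = np.random.rand(20, 2)
predicted = min(reference_routes,
                key=lambda p: dtw_distance(recent_track, reference_routes[p]))
print("Predicted destination:", predicted)
```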

Keywords: spatial temporal data mining, trajectory mining, trajectory similarity, resource optimization

Procedia PDF Downloads 107
28393 An Information-Based Approach for Preference Method in Multi-Attribute Decision Making

Authors: Serhat Tuzun, Tufan Demirel

Abstract:

Multi-Criteria Decision Making (MCDM) is the modelling of real-life situations to solve the problems we encounter. It is a discipline that aids decision makers who are faced with conflicting alternatives to make an optimal decision. MCDM problems can be classified into two main categories, Multi-Attribute Decision Making (MADM) and Multi-Objective Decision Making (MODM), based on their purposes and data types. Although various MADM techniques have been developed for the problems encountered, their methodologies are limited in modelling real life. Moreover, objective results are hard to obtain, and the findings are generally derived from subjective data. New and modified techniques have been developed around approaches such as fuzzy logic; yet these more comprehensive techniques, even though they model real life better, have not found a place in real-world applications because their complex structure makes them hard to apply. These constraints restrict the development of MADM. This study aims to conduct a comprehensive analysis of preference methods in MADM and to propose an approach based on information. For this purpose, a detailed literature review has been conducted, and current approaches with their advantages and disadvantages have been analyzed. Then, the approach is introduced. In this approach, performance values of the criteria are calculated in two steps: first by determining the distribution of each attribute and standardizing it, then by calculating the information of each attribute as informational energy.
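
As an illustration of that final step, the following sketch computes Onicescu-style informational energy (the sum of squared empirical probabilities) for each standardized attribute of a hypothetical decision matrix. The binning choice and the random data are assumptions, since the abstract does not spell out its estimator.

```python
import numpy as np

def informational_energy(values, bins=10):
    """Onicescu informational energy E = sum(p_i^2) of an attribute's
    empirical distribution, estimated from a histogram."""
    z = (values - values.mean()) / values.std()   # standardize the attribute
    counts, _ = np.histogram(z, bins=bins)
    p = counts / counts.sum()                     # empirical probabilities
    return float(np.sum(p ** 2))

# Hypothetical decision matrix: 8 alternatives x 3 attributes.
X = np.random.rand(8, 3)
energies = [informational_energy(X[:, j]) for j in range(X.shape[1])]
print(energies)  # one informational-energy score per attribute
```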

Keywords: literature review, multi-attribute decision making, operations research, preference method, informational energy

Procedia PDF Downloads 208
28392 Hybrid Approach for Software Defect Prediction Using Machine Learning with Optimization Technique

Authors: C. Manjula, Lilly Florence

Abstract:

Software technology is developing rapidly, which drives the growth of various industries. Nowadays, software-based applications are widely adopted for business purposes. For any software company, developing reliable software is a challenging task, because a faulty software module may harm the growth of the industry and its business. Hence there is a need for techniques that enable early prediction of software defects. Due to the complexity of manual prediction, automated software defect prediction techniques have been introduced. These techniques learn patterns from previous software versions and find the defects in the current version. They have attracted researchers due to their significant impact on industrial growth through early identification of bugs. Several studies have been carried out on this basis, but achieving desirable defect prediction performance remains a challenging task. To address this issue, we present a machine-learning-based hybrid technique for software defect prediction. First, a Genetic Algorithm (GA) is presented in which an improved fitness function is used for better optimization of features in the data sets. These features are then processed through a Decision Tree (DT) classification model. Finally, an experimental study is presented in which results from the proposed GA-DT hybrid approach are compared with those from the DT classification technique alone. The results show that the proposed hybrid approach achieves better classification accuracy.
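
A minimal sketch of the GA-DT idea follows: a genetic algorithm searches over binary feature masks and scores each mask by the cross-validated accuracy of a decision tree. The paper's improved fitness function is not specified, so plain accuracy stands in for it, and the dataset is synthetic.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def fitness(mask):
    """Fitness of a feature subset: cross-validated DT accuracy
    (a stand-in for the paper's improved fitness function)."""
    if not mask.any():
        return 0.0
    clf = DecisionTreeClassifier(random_state=0)
    return cross_val_score(clf, X[:, mask], y, cv=5).mean()

# Minimal GA: truncation selection, uniform crossover, bit-flip mutation.
pop = rng.random((20, X.shape[1])) < 0.5
for generation in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]          # keep the best half
    children = []
    for _ in range(len(pop) - len(parents)):
        p1, p2 = parents[rng.integers(10, size=2)]
        child = np.where(rng.random(X.shape[1]) < 0.5, p1, p2)  # crossover
        child ^= rng.random(X.shape[1]) < 0.05                   # mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("Selected features:", np.flatnonzero(best))
```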

Keywords: decision tree, genetic algorithm, machine learning, software defect prediction

Procedia PDF Downloads 323
28391 Statistical Analysis to Select Evacuation Route

Authors: Zaky Musyarof, Dwi Yono Sutarto, Dwima Rindy Atika, R. B. Fajriya Hakim

Abstract:

Each country is responsible for the safety of its people, especially those living in disaster-prone areas. One such responsibility is providing evacuation routes. Until now, however, the selection of evacuation routes has not appeared well organized: when a disaster happens, people accumulate at points along the evacuation route. That condition is dangerous because it hampers the evacuation process. Using several methods of statistical analysis, the authors suggest how to prepare an evacuation route that is organized and based on people's habits. Those methods are association rules, sequential pattern mining, hierarchical cluster analysis and fuzzy logic.
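
As a small illustration of the first of these methods, the sketch below computes support and confidence for association rules over hypothetical checkpoint "transactions"; the data and rule format are assumptions for illustration only.

```python
from itertools import combinations

# Hypothetical "transactions": sets of checkpoints people passed
# during past evacuation drills (not the authors' data).
transactions = [
    {"gate_A", "bridge", "shelter_1"},
    {"gate_A", "bridge", "shelter_2"},
    {"gate_B", "bridge", "shelter_1"},
    {"gate_A", "market", "shelter_1"},
]

def support(itemset):
    """Fraction of transactions containing every item in the itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

# One-antecedent rules X -> Y with their support and confidence.
items = set().union(*transactions)
for x, y in combinations(sorted(items), 2):
    s = support({x, y})
    if s > 0:
        conf = s / support({x})
        print(f"{x} -> {y}: support={s:.2f}, confidence={conf:.2f}")
```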

Keywords: association rules, sequential pattern mining, cluster analysis, fuzzy logic, evacuation route

Procedia PDF Downloads 490
28390 Development of Typical Meteorological Year for Passive Cooling Applications Using World Weather Data

Authors: Nasser A. Al-Azri

Abstract:

The effectiveness of passive cooling techniques is assessed using bioclimatic charts, which require the typical meteorological year (TMY) for a specified location for their development. However, TMYs are not always available, mainly due to the scarcity of solar radiation records; solar radiation is an essential component of the common TMYs intended for general use. Since solar radiation is not required in the development of the bioclimatic chart, this work suggests developing TMYs based solely on the relevant parameters. This approach improves the accuracy of the developed TMY, since only the relevant parameters are considered, and it also makes the development of the TMY more accessible, since solar radiation data are not used. The paper also discusses the development of the TMY from the raw data available in the NOAA-NCDC archive of world weather data and the construction of bioclimatic charts for some randomly selected locations around the world.
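
One standard way to assemble a TMY is to select, for each calendar month, the historical month whose empirical CDF is closest to the long-term CDF under the Finkelstein-Schafer statistic. A minimal single-parameter sketch follows; the multi-parameter weighting and the NOAA-NCDC parsing used in the paper are omitted, and the data are synthetic.

```python
import numpy as np

def fs_statistic(month_values, longterm_values):
    """Finkelstein-Schafer statistic: mean absolute difference between a
    candidate month's empirical CDF and the long-term CDF."""
    grid = np.sort(longterm_values)
    cdf_month = np.searchsorted(np.sort(month_values), grid,
                                side="right") / len(month_values)
    cdf_long = np.arange(1, len(grid) + 1) / len(grid)
    return np.mean(np.abs(cdf_month - cdf_long))

# Hypothetical example: choose the most typical January for dry-bulb
# temperature from 30 years of daily records (one row per year).
years = np.random.normal(20, 5, size=(30, 31))
longterm = years.ravel()
fs = [fs_statistic(year, longterm) for year in years]
print("Most typical January comes from year index", int(np.argmin(fs)))
```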

Keywords: bioclimatic charts, passive cooling, TMY, weather data

Procedia PDF Downloads 232
28389 Timely Detection and Identification of Abnormalities for Process Monitoring

Authors: Hyun-Woo Cho

Abstract:

The detection and identification of abnormalities in multivariate manufacturing processes are quite important for maintaining good product quality. Unusual behaviors or events encountered during operation can have a serious impact on the process and product quality, so they should be detected and identified as soon as possible. This paper focuses on the efficient representation of process measurement data for detecting and identifying abnormalities. The qualitative method used is effective in representing fault patterns of process data. In addition, it is quite insensitive to measurement noise, so reliable outcomes can be obtained. To evaluate its performance, a simulated process was utilized, and the effect of adopting linear and nonlinear methods in the detection and identification was tested with different simulation data sets. The use of a nonlinear technique produced more satisfactory and more robust results. This monitoring framework can help operating personnel detect the occurrence of process abnormalities and identify their assignable causes on an on-line or real-time basis.
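
The abstract does not name its representation method, so as a hedged illustration of a standard linear baseline for multivariate process monitoring, the sketch below builds a PCA model of normal operation and flags new samples using Hotelling's T² (distance within the model plane) and SPE (distance off it).

```python
import numpy as np

# Hypothetical training data: 500 samples of a 10-variable process
# under normal operation.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
Xc = X - X.mean(axis=0)

# PCA via SVD; keep k principal components.
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
P = Vt[:k].T                          # loadings
lam = (s[:k] ** 2) / (len(X) - 1)     # component variances

def monitor(x):
    """Return Hotelling's T^2 and SPE (Q) statistics for one new sample."""
    xc = x - X.mean(axis=0)
    t = P.T @ xc                      # scores in the PCA subspace
    T2 = np.sum(t ** 2 / lam)         # distance within the model plane
    residual = xc - P @ t
    SPE = residual @ residual         # distance off the model plane
    return T2, SPE

print(monitor(rng.normal(size=10)))        # normal sample
print(monitor(rng.normal(size=10) + 4))    # shifted (faulty) sample
```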

Keywords: detection, monitoring, identification, measurement data, multivariate techniques

Procedia PDF Downloads 225
28388 Study of the Colloidal Transport of ²²⁶Ra in a Mining Context Using a Multi-Disciplinary Approach

Authors: Marine Reymond, Michael Descostes, Marie Muguet, Clemence Besancon, Martine Leermakers, Catherine Beaucaire, Sophie Billon, Patricia Patrier

Abstract:

²²⁶Ra is one of the radionuclides produced in the decay chain of ²³⁸U. Due to its half-life (1600 y) and its high specific activity (3.7 × 10¹⁰ Bq/g), ²²⁶Ra is found at the ultra-trace level in the natural environment (usually below 1 Bq/L, i.e. 10⁻¹³ mol/L). Because it decays into ²²²Rn, a radioactive gas with a shorter half-life (3.8 days) which is difficult to control and dangerous for humans when inhaled, ²²⁶Ra is subject to dedicated monitoring in surface waters, especially in the context of uranium mining. In natural waters, radionuclides occur in dissolved, colloidal or particulate forms. Given the size of colloids, generally ranging between 1 nm and 1 µm, and their high specific surface areas, the colloidal fraction can be involved in the transport of trace elements, including radionuclides, in the environment. The colloidal fraction is not always easy to determine, and few existing studies focus on ²²⁶Ra. In the present study, a complete multidisciplinary approach is proposed to assess the colloidal transport of ²²⁶Ra. It includes water sampling by conventional filtration (0.2 µm) and the innovative Diffusive Gradient in Thin Films technique to measure the dissolved fraction (<10 nm), from which the colloidal fraction can be estimated. Suspended matter in these waters was also sampled and characterized mineralogically by X-ray diffraction, infrared spectroscopy and scanning electron microscopy. All of these data, acquired on a rehabilitated former uranium mine, were used to build a geochemical model with the geochemical calculation code PhreeqC to describe, as accurately as possible, the colloidal transport of ²²⁶Ra. Colloidal transport of ²²⁶Ra was found, at some sampling points, to account for up to 95% of the total ²²⁶Ra measured in the water. Mineralogical characterization and the associated geochemical modelling highlight the role of barite, a barium sulfate mineral well known to trap ²²⁶Ra in its structure. Barite was shown to be responsible for the colloidal ²²⁶Ra fraction despite the presence of kaolinite and ferrihydrite, which are also known to retain ²²⁶Ra by sorption.
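
The colloidal fraction follows by difference between the 0.2 µm filtered concentration (dissolved plus colloidal) and the DGT-measured truly dissolved concentration (<10 nm); a one-line illustration with made-up activities:

```python
# Hypothetical activities (Bq/L) at one sampling point.
ra226_filtered_02um = 0.80   # conventional 0.2 um filtration: dissolved + colloidal
ra226_dgt_dissolved = 0.04   # DGT, truly dissolved (<10 nm)

colloidal = ra226_filtered_02um - ra226_dgt_dissolved
print(f"Colloidal fraction: {100 * colloidal / ra226_filtered_02um:.0f}%")  # 95%
```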

Keywords: colloids, mining context, radium, transport

Procedia PDF Downloads 144
28387 AI-Powered Models for Real-Time Fraud Detection in Financial Transactions to Improve Financial Security

Authors: Shanshan Zhu, Mohammad Nasim

Abstract:

Financial fraud continues to be a major threat to financial institutions across the world, causing colossal monetary losses and undermining public trust. Fraud prevention techniques based on hard rules have become ineffective against the evolving patterns of fraud seen in recent times. Against this background, the present study examines methodologies that exploit emergent AI-driven techniques to further strengthen fraud detection. We compare the performance of generative adversarial networks and graph neural networks with other popular techniques, such as gradient boosting, random forests, and neural networks, and recommend integrating these state-of-the-art models into one robust, flexible, and smart system for real-time anomaly and fraud detection. We designed synthetic data and then conducted pattern recognition and unsupervised and supervised learning analyses on the transaction data to identify which activities were suspicious. Using actual financial statistics, we compare the performance of our model against conventional models in accuracy, speed, and adaptability. The results illustrate a strong need to integrate state-of-the-art, AI-driven fraud detection solutions into frameworks relevant to the financial domain, and underline the urgency for banks and related financial institutions to implement these advanced technologies rapidly in order to maintain a high level of security.
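
As a hedged sketch of the supervised-plus-unsupervised setup described above (the authors' GAN and GNN models are not reproduced here), the following compares a random forest with an unlabeled isolation-forest anomaly detector on synthetic, imbalanced transactions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import IsolationForest, RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Synthetic, highly imbalanced "transactions": roughly 1% fraud.
X, y = make_classification(n_samples=20000, n_features=12, weights=[0.99],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Supervised baseline.
rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# Unsupervised anomaly detector (no labels used during fitting).
iso = IsolationForest(contamination=0.01, random_state=0).fit(X_tr)
anomaly = (iso.predict(X_te) == -1).astype(int)   # -1 marks anomalies

print(classification_report(y_te, rf.predict(X_te), digits=3))
print(classification_report(y_te, anomaly, digits=3))
```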

Keywords: AI-driven fraud detection, financial security, machine learning, anomaly detection, real-time fraud detection

Procedia PDF Downloads 6
28386 Improved Network Construction Methods Based on Virtual Rails for Mobile Sensor Network

Authors: Noritaka Shigei, Kazuto Matsumoto, Yoshiki Nakashima, Hiromi Miyajima

Abstract:

Although Mobile Wireless Sensor Networks (MWSNs), which consist of mobile sensor nodes (MSNs), can cover a wide observation region using a small number of sensor nodes, they need to construct a network that collects the sensing data at the base station by moving the MSNs. As an effective method, the network construction method based on Virtual Rails (VRs), referred to as the VR method, has been proposed. In this paper, we propose two effective techniques for the VR method. Both can prolong the operation time of the network, which is limited by the MSNs' battery capacity and energy consumption. The first technique, an effective arrangement of VRs, almost equalizes the number of MSNs belonging to each VR. The second technique, an adaptive movement method for MSNs, takes the residual battery energy into account. In simulation, we demonstrate that each technique improves the network lifetime and that the combination of both techniques is the most effective.
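
A toy sketch of the two techniques follows: round-robin assignment to equalize MSN counts per VR, and mover selection by residual battery energy. Field names and numbers are illustrative assumptions; the paper's actual arrangement and movement rules are more involved.

```python
import random

random.seed(0)
# Hypothetical mobile sensor nodes with residual battery energy (0..1).
msns = [{"id": i, "energy": random.uniform(0.2, 1.0)} for i in range(12)]
num_vrs = 3

# Technique 1: round-robin assignment so each virtual rail gets an
# (almost) equal number of MSNs.
rails = {vr: [] for vr in range(num_vrs)}
for i, node in enumerate(msns):
    rails[i % num_vrs].append(node)

# Technique 2: within each VR, the node with the most residual energy
# performs the next data-relay movement.
for vr, nodes in rails.items():
    mover = max(nodes, key=lambda n: n["energy"])
    print(f"VR {vr}: {len(nodes)} MSNs, next mover = node {mover['id']}")
```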

Keywords: mobile sensor node, relay of sensing data, residual energy, virtual rail, wireless sensor network

Procedia PDF Downloads 322
28385 Comparing ITV Definitions From 4D CT-PET and Breath-Hold Technique with Abdominal Compression

Authors: R. D. Esposito, P. Dorado Rodriguez, D. Planes Meseguer

Abstract:

In this work, we compare the contours of the Internal Target Volume (ITV) for Stereotactic Body Radiation Therapy (SBRT) of a patient affected by a single liver metastasis, obtained from two different patient data acquisition techniques. The first technique consists of a free-breathing Computed Tomography (CT) scan acquisition, followed by exhalation breath-hold and inhalation breath-hold CT scans, all applying abdominal compression, while the second technique consists of a free-breathing 4D CT-PET (Positron Emission Tomography) scan. Results obtained with these two methods are consistent, which demonstrates that, at least for this specific case, both techniques are adequate for ITV contouring in SBRT treatments.

Keywords: 4D CT-PET, abdominal compression, ITV, SBRT

Procedia PDF Downloads 433
28384 Load Management Using Multiple Sequential Load Shaping Techniques

Authors: Amira M. Attia, Karim H. Youssef, Nabil H. Abbasi

Abstract:

Demand Side Management (DSM) is an essential characteristic of current and future smart grid systems. As one of the DSM functions, load management aims to control customers' total electric consumption and the utility's load factor by using various load shaping techniques. However, applying load shaping techniques such as load shifting, peak clipping, or strategic conservation individually does not provide the desired level of improvement in load factor and/or customer bill reduction. In this paper, two load shaping techniques are simulated as constrained optimization problems. The purpose is to study the application of a combined load shifting and strategic conservation model, and of a combined load shifting and peak clipping model. The problems are formulated and solved using disciplined convex programming (the CVX package) in MATLAB® R2013b. Simulation results are evaluated and compared to determine which multi-technique model is most effective in improving the load curve.
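
The paper uses CVX in MATLAB; a Python analogue of the load-shifting piece using CVXPY is sketched below, with a hypothetical 24-hour load profile. Minimizing the peak while conserving total energy is one plausible formulation, not necessarily the authors' exact objective or constraints.

```python
import cvxpy as cp
import numpy as np

# Hypothetical 24-hour base load (kW); the paper's data are not public.
base = np.array([30, 28, 27, 27, 28, 35, 50, 65, 70, 68, 66, 64,
                 63, 62, 64, 68, 75, 85, 90, 80, 60, 45, 38, 33])

shift = cp.Variable(24)            # load added (+) or removed (-) per hour
new_load = base + shift

objective = cp.Minimize(cp.max(new_load))    # clip the peak
constraints = [
    cp.sum(shift) == 0,                      # pure shifting: energy conserved
    new_load >= 0.5 * base,                  # comfort limit on curtailment
    cp.abs(shift) <= 15,                     # per-hour shifting capability
]
cp.Problem(objective, constraints).solve()

print("Peak before: %.1f kW, after: %.1f kW"
      % (base.max(), cp.max(new_load).value))
```

A strategic-conservation variant would relax the energy-conservation equality and penalize total consumption instead.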

Keywords: convex programming, demand side management, load shaping, multiple, building energy optimization

Procedia PDF Downloads 301
28383 The Use of Piezocone Penetration Test Data for the Assessment of Iron Ore Tailings Liquefaction Susceptibility

Authors: Breno M. Castilho

Abstract:

The Iron Ore Quadrangle, located in the state of Minas Gerais, Brazil, is responsible for most of the country's iron ore production. As a result, some of the biggest tailings dams in the country are located in this area. In recent years, several major failure events have occurred at Tailings Storage Facilities (TSF) in the Iron Ore Quadrangle, some of them caused by liquefaction flowslides. This paper presents Piezocone Penetration Test (CPTu) data that were used, by applying the Olson and Peterson methods, to assess the liquefaction susceptibility of the iron ore tailings typically found in most TSF in the area. Piezocone data were also used to determine the steady-state strength of the tailings, allowing comparison with their drained strength. The results show great susceptibility to liquefaction in the studied tailings and, more importantly, a large reduction in their strength. These results are key to understanding the failures that have taken place over the last few years.
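
For reference, the Olson-Stark (2002) CPT correlation is a commonly cited way to estimate the liquefied (steady-state) strength ratio from normalized tip resistance; the sketch below uses that published form as an assumption and should be checked against the original correlation before any engineering use.

```python
def liquefied_strength_ratio(qc1_mpa):
    """Olson & Stark (2002) CPT correlation (commonly cited form):
    su(liq)/sigma'v0 = 0.03 + 0.0143 * qc1 (+/- 0.03), for qc1 <= 6.5 MPa."""
    if qc1_mpa > 6.5:
        raise ValueError("correlation calibrated only for qc1 <= 6.5 MPa")
    return 0.03 + 0.0143 * qc1_mpa

# Hypothetical tailings layer: qc1 = 2.0 MPa, effective stress 150 kPa.
ratio = liquefied_strength_ratio(2.0)
print(f"su(liq) ~ {ratio * 150:.1f} kPa (strength ratio {ratio:.3f})")
```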

Keywords: Piezocone Penetration Test CPTu, iron ore tailings, mining, liquefaction susceptibility assessment

Procedia PDF Downloads 222
28382 Cross-Validation of the Data Obtained for ω-6 Linoleic and ω-3 α-Linolenic Acids Concentration of Hemp Oil Using Jackknife and Bootstrap Resampling

Authors: Vibha Devi, Shabina Khanam

Abstract:

Hemp (Cannabis sativa) possesses a rich content of ω-6 linoleic and ω-3 linolenic essential fatty acids in the ratio of 3:1, a rare and highly desirable ratio that enhances the quality of hemp oil. These components are beneficial for cell development and body growth, strengthen the immune system, possess anti-inflammatory action, lower the risk of heart problems owing to their anti-clotting property, and serve as a remedy for arthritis and various disorders. The present study applies supercritical fluid extraction (SFE) to hemp seed at various parameter conditions: temperature (40–80 °C), pressure (200–350 bar), flow rate (5–15 g/min), particle size (0.430–1.015 mm) and amount of co-solvent (0–10% of solvent flow rate), arranged through a central composite design (CCD). The CCD suggested 32 sets of experiments, which were carried out. As the SFE process includes a large number of variables, the present study recommends the application of resampling techniques for cross-validation of the obtained data. Cross-validation refits the model on each resample to obtain information regarding the error, variability, deviation, etc. Bootstrap and jackknife are the most popular resampling techniques; they create a large number of datasets by resampling from the original dataset and analyze these datasets to check the validity of the obtained data. Jackknife resampling is based on eliminating one observation from the original sample of size N without replacement; here the resample size is therefore 31 (one observation eliminated), repeated 32 times. Bootstrap is the frequently used statistical approach for estimating the sampling distribution of an estimator by resampling with replacement from the original sample; here the resample size is 32, repeated 100 times. The estimands considered for these resampling techniques are the mean, standard deviation, coefficient of variation and standard error of the mean. For the ω-6 linoleic acid concentration, the mean value was approximately 58.5 for both resampling methods, which is the average (central value) of the sample means. Similarly, for the ω-3 linolenic acid concentration, the mean was observed as 22.5 through both resamplings. Variance measures the spread of the data around the mean; a greater variance indicates a larger range of output data, namely 18 for ω-6 linoleic acid (ranging from 48.85 to 63.66%) and 6 for ω-3 linolenic acid (ranging from 16.71 to 26.2%). Further, the low standard deviation (approximately 1%), low standard error of the mean (<0.8) and low coefficient of variation (<0.2) reflect the accuracy of the sample for prediction. All estimator values of the coefficient of variation, standard deviation and standard error of the mean are found within the 95% confidence interval.
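
A compact sketch of both resampling schemes as described above (leave-one-out jackknife repeated 32 times; bootstrap with replacement repeated 100 times) is shown below; the synthetic sample merely stands in for the paper's 32 measured concentrations.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical stand-in for the 32 CCD runs of omega-6 linoleic acid
# concentration (%); the paper's measured values are not reproduced here.
sample = rng.normal(58.5, 1.0, size=32)
n = len(sample)

# Jackknife: leave one observation out, repeated n (=32) times.
jack_means = np.array([np.delete(sample, i).mean() for i in range(n)])
jack_se = np.sqrt((n - 1) / n * np.sum((jack_means - jack_means.mean()) ** 2))

# Bootstrap: resample n values with replacement, 100 times.
boot_means = np.array([rng.choice(sample, size=n, replace=True).mean()
                       for _ in range(100)])
boot_se = boot_means.std(ddof=1)

print(f"mean={sample.mean():.2f}  SD={sample.std(ddof=1):.2f}  "
      f"CV={sample.std(ddof=1) / sample.mean():.3f}")
print(f"SE of mean: jackknife={jack_se:.3f}  bootstrap={boot_se:.3f}")
```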

Keywords: resampling, supercritical fluid extraction, hemp oil, cross-validation

Procedia PDF Downloads 131
28381 Seasonal Variation of the Impact of Mining Activities on Ga-Selati River in Limpopo Province, South Africa

Authors: Joshua N. Edokpayi, John O. Odiyo, Patience P. Shikwambana

Abstract:

Water is a scarce natural resource in South Africa. The Ga-Selati River is used for both domestic and industrial purposes. This study was carried out to assess the quality of the Ga-Selati River in a mining area of Limpopo Province (Phalaborwa). The pH, Electrical Conductivity (EC) and Total Dissolved Solids (TDS) were determined using a Crinson multimeter, while turbidity was measured using a Labcon turbidimeter. The concentrations of Al, Ca, Cd, Cr, Fe, K, Mg, Mn, Na and Pb were analysed in triplicate using a Varian 520 flame atomic absorption spectrometer (AAS) supplied by PerkinElmer, after acid digestion with nitric acid in a fume cupboard. The average pH of the river across eight sampling sites was 8.00 in the wet season and 9.38 in the dry season. Higher EC values were determined in the dry season (138.7 mS/m) than in the wet season (96.93 mS/m). Similarly, TDS values were higher in the dry season (929.29 mg/L) than in the wet season (640.72 mg/L). These values exceeded the South African Department of Water Affairs and Forestry (DWAF) guideline for domestic water use (70 mS/m) and the World Health Organization (WHO) guideline (600 mg/L), respectively. Turbidity varied between 1.78–5.20 NTU in the wet season and 0.95–2.37 NTU in the dry season. Total hardness, expressed as the concentration of CaCO₃, was computed as 312.50 mg/L in the wet season and 297.75 mg/L in the dry season, and the river water was categorised as very hard. Mean concentrations of the metals studied in the wet and dry seasons, respectively, are: Na (94.06 mg/L and 196.3 mg/L), K (11.79 mg/L and 13.62 mg/L), Ca (45.60 mg/L and 41.30 mg/L), Mg (48.41 mg/L and 44.71 mg/L), Al (0.31 mg/L and 0.38 mg/L), Cd (0.01 mg/L and 0.01 mg/L), Cr (0.02 mg/L and 0.09 mg/L), Pb (0.05 mg/L and 0.06 mg/L), Mn (0.31 mg/L and 0.11 mg/L) and Fe (0.76 mg/L and 0.69 mg/L). Results from this study reveal that most of the metals were present in concentrations higher than the recommended DWAF and WHO guidelines for domestic use and the protection of aquatic life.
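
Total hardness as CaCO₃ follows from the Ca and Mg concentrations through the standard equivalence factors (hardness ≈ 2.497·Ca + 4.118·Mg); the quick check below lands near the reported figures when computed from the published means (residual differences presumably reflect rounding or other hardness contributors).

```python
def total_hardness(ca_mg_l, mg_mg_l):
    """Total hardness as mg/L CaCO3 from Ca and Mg concentrations,
    using the standard equivalence factors 2.497 and 4.118."""
    return 2.497 * ca_mg_l + 4.118 * mg_mg_l

wet = total_hardness(45.60, 48.41)   # reported: 312.50 mg/L
dry = total_hardness(41.30, 44.71)   # reported: 297.75 mg/L
for label, h in [("wet", wet), ("dry", dry)]:
    category = "very hard" if h > 180 else "hard"  # >180 mg/L CaCO3: very hard
    print(f"{label}: {h:.1f} mg/L CaCO3 ({category})")
```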

Keywords: contamination, mining activities, surface water, trace metals

Procedia PDF Downloads 305
28380 Cluster Analysis of Students’ Learning Satisfaction

Authors: Purevdolgor Luvsantseren, Ajnai Luvsan-Ish, Oyuntsetseg Sandag, Javzmaa Tsend, Akhit Tileubai, Baasandorj Chilhaasuren, Jargalbat Puntsagdash, Galbadrakh Chuluunbaatar

Abstract:

Student satisfaction is one of the indicators of the quality of university services. Aim: To study the satisfaction level of first-year premedical students in the Medical Physics course using cluster analysis. Materials and Methods: Questionnaires were collected from a total of 324 students who took the Medical Physics course in the first year of the premedical program at the Mongolian National University of Medical Sciences. Satisfaction was recorded on five levels: "excellent", "good", "medium", "bad" and "very bad". The questionnaire comprised 39 items: 8 on course evaluation, 19 on teacher evaluation, and 12 on student self-evaluation. From this survey, a database with 39 fields and 324 records was created. Results: Cluster analysis was performed on this database in MATLAB and R using the k-means method of data mining. The Hopkins statistics calculated on the database are 0.88, 0.87, and 0.97, which shows that cluster analysis methods can be applied. The course evaluation sub-base is divided into three clusters: cluster I has 150 objects with a "good" rating (46.2%), cluster II has 119 objects with a "medium" rating (36.7%), and cluster III has 54 objects with a "good" rating (16.6%). The teacher evaluation sub-base divides into three clusters: cluster II has 179 objects with a "good" rating (55.2%), cluster III has 108 objects with an "average" rating (33.3%), and cluster I has 36 objects with an "excellent" rating (11.1%). The student evaluation sub-base divides into two clusters: cluster II has 215 objects with an "excellent" rating (66.3%), and cluster I has 108 objects with an "excellent" rating (33.3%). Evaluating the resulting clusters with the silhouette coefficient gives 0.32 for the course evaluation clustering, 0.31 for the teacher evaluation clustering, and 0.30 for the student evaluation clustering, which shows statistical significance. Conclusion: The cluster analysis yielded 46.2% "good", 36.7% "middle" and 16.6% "bad" in the course evaluation model; 55.2% "good", 33.3% "middle" and 11.1% "bad" in the teacher evaluation model; and 66.3% "good" and 33.3% "bad" in the student evaluation model.
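
A minimal reproduction of the clustering step (k-means with k = 3 plus the silhouette coefficient) is sketched below, with scikit-learn standing in for the MATLAB/R workflow; the random ratings are a stand-in for the actual questionnaire database.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Hypothetical stand-in for the course-evaluation sub-base:
# 324 students x 8 items rated 1 ("very bad") to 5 ("excellent").
rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(324, 8)).astype(float)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = km.labels_

print("cluster sizes:", np.bincount(labels))
print("silhouette coefficient: %.2f" % silhouette_score(X, labels))
```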

Keywords: questionnaire, data mining, k-means method, silhouette coefficient

Procedia PDF Downloads 33
28379 Water End-Use Classification with Contemporaneous Water-Energy Data and Deep Learning Network

Authors: Khoi A. Nguyen, Rodney A. Stewart, Hong Zhang

Abstract:

‘Water-related energy’ is energy use which is directly or indirectly influenced by changes to water use. Informatics, applying a range of mathematical, statistical and rule-based approaches, can be used to reveal important information on demand from the available data provided at second, minute or hourly intervals. This study combines these two concepts to improve current water end-use disaggregation by applying a wide range of advanced pattern recognition techniques to concurrent high-resolution water-energy consumption data. The results show that the recognition accuracies of all end uses increased significantly, especially for mechanised categories, including the clothes washer, dishwasher and evaporative air cooler, where over 95% of events were correctly classified.
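
As a hedged sketch of the classification setup (the study's actual network and features are not public here), the following trains a small multilayer perceptron on hypothetical per-event water-energy features; the feature names and toy labels are assumptions made so the example has learnable signal.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical event features extracted from concurrent water-energy
# traces: [volume_L, duration_s, mean_flow_Lpm, energy_Wh].
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([rng.gamma(2, 20, n), rng.gamma(2, 120, n),
                     rng.gamma(2, 4, n), rng.gamma(2, 50, n)])
# Toy labels derived from the features (0=tap, 1=shower,
# 2=clothes washer, 3=dishwasher) so the classifier has signal.
y = (X[:, 3] > np.median(X[:, 3])).astype(int) * 2 \
    + (X[:, 0] > np.median(X[:, 0]))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500,
                    random_state=0).fit(X_tr, y_tr)
print("held-out accuracy: %.2f" % net.score(X_te, y_te))
```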

Keywords: deep learning network, smart metering, water end use, water-energy data

Procedia PDF Downloads 293
28378 Improving Grade Control Turnaround Times with In-Pit Hyperspectral Assaying

Authors: Gary Pattemore, Michael Edgar, Andrew Job, Marina Auad, Kathryn Job

Abstract:

As critical commodities become scarcer, significant time and resources have been devoted to better understanding complicated ore bodies and extracting their full potential. These challenging ore bodies present several pain points for geologists and engineers to overcome; poor handling of these issues flows downstream to the processing plant, affecting throughput rates and recovery. Many open-cut mines utilise blast hole drilling to extract additional information to feed back into the modelling process. This method requires samples to be collected during or after blast hole drilling and sent for assay, with turnaround times varying from 1 to 12 days. It is time-consuming and costly, requires human exposure on the bench, and collects elemental data only. To address this challenge, research has been undertaken to utilise hyperspectral imaging across a broad spectrum to scan samples and collars, or to take down-hole measurements, for mineralogy, moisture content and grade abundances. Automation of this process using unmanned vehicles and on-board processing reduces human in-pit exposure, ensuring ongoing safety. On-board processing allows data to be integrated into modelling workflows immediately. Preliminary results demonstrate numerous direct and indirect benefits of this new technology, including rapid and accurate estimates of grade, moisture content and mineralogy. These benefits allow faster geological model updates, better-informed mine scheduling, and improved downstream blending and processing practices. The paper presents recommendations for implementing the technology in open-cut mining environments.

Keywords: grade control, hyperspectral scanning, artificial intelligence, autonomous mining, machine learning

Procedia PDF Downloads 99
28377 Testing the Change in Correlation Structure across Markets: High-Dimensional Data

Authors: Malay Bhattacharyya, Saparya Suresh

Abstract:

The correlation structure associated with a portfolio tends to vary across time. Studying the structural breaks in the time-dependent correlation matrix associated with a collection of assets has been a subject of interest for a better understanding of market movements, portfolio selection, etc. This paper proposes a methodology for testing the change in the time-dependent correlation structure of a portfolio in high-dimensional data, using the techniques of the generalized inverse, singular value decomposition and multivariate distribution theory, which has not been addressed so far. The asymptotic properties of the proposed test are derived, and the performance and validity of the method are tested on a real data set. The proposed test performs well at detecting changes in the dependence structure of global markets in the context of high-dimensional data.
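
The authors' test statistic is not reproduced here; as a generic illustration of detecting a correlation-structure change in high dimensions, the sketch below compares two windows via the largest singular value of the difference of their correlation matrices and calibrates it with a permutation null.

```python
import numpy as np

rng = np.random.default_rng(0)

def corr_change_stat(A, B):
    """Largest singular value of the difference between the two
    windows' correlation matrices."""
    return np.linalg.svd(np.corrcoef(A.T) - np.corrcoef(B.T),
                         compute_uv=False)[0]

# Hypothetical returns: 60 assets; the second window gains a common
# factor, i.e. a genuine change in dependence.
A = rng.normal(size=(250, 60))
B = rng.normal(size=(250, 60)) + 0.8 * rng.normal(size=(250, 1))

obs = corr_change_stat(A, B)
pooled = np.vstack([A, B])
null = []
for _ in range(200):                  # permutation null distribution
    rng.shuffle(pooled)
    null.append(corr_change_stat(pooled[:250], pooled[250:]))
p = np.mean(np.array(null) >= obs)
print(f"statistic={obs:.2f}, permutation p-value={p:.3f}")
```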

Keywords: correlation structure, high dimensional data, multivariate distribution theory, singular value decomposition

Procedia PDF Downloads 113
28376 Applying Theory of Inventive Problem Solving to Develop Innovative Solutions: A Case Study

Authors: Y. H. Wang, C. C. Hsieh

Abstract:

Good service design can increase organizational revenue and consumer satisfaction while reducing labor and time costs. The problems facing consumers in the original service model for the eyewear and optical industry include the following: 1. insufficient information on eyewear products; 2. passive dependence on recommendations and insufficient selection; 3. incomplete records of the progression of vision conditions; 4. lack of complete customer records. This study investigates the case of Kobayashi Optical, applying the Theory of Inventive Problem Solving (TRIZ) to develop innovative solutions for the eyewear and optical industry. The analysis leads to the following conclusions and management implications. In order to provide customers with improved professional information and recommendations, Kobayashi Optical should establish customer purchasing records; overall service efficiency can then be enhanced by applying data mining techniques to analyze past consumer preferences and purchase histories. Furthermore, Kobayashi Optical should continue to develop a 3D virtual trial service that allows customers to easily browse different frame styles and colors. This 3D virtual trial service will reduce customer waiting times during peak service periods at stores.

Keywords: theory of inventive problem solving (TRIZ), service design, augmented reality (AR), eyewear and optical industry

Procedia PDF Downloads 272
28375 Low Voltage Ride through Capability Techniques for DFIG-Based Wind Turbines

Authors: Sherif O. Zain Elabideen, Ahmed A. Helal, Ibrahim F. El-Arabawy

Abstract:

Due to the drastic increase in installed wind turbine capacity, grid codes are tightening their restrictions, aiming to treat wind turbines like other conventional sources sooner. In this paper, an intensive review is presented of the different techniques used to add low voltage ride through capability to Doubly Fed Induction Generator (DFIG) wind turbines. A system model with a 1.5 MW DFIG wind turbine is constructed and simulated using MATLAB/SIMULINK to explore the effectiveness of the reviewed techniques.

Keywords: DFIG, grid side converters, low voltage ride through, wind turbine

Procedia PDF Downloads 412
28374 A QoS Aware Cluster Based Routing Algorithm for Wireless Mesh Network Using LZW Lossless Compression

Authors: J. S. Saini, P. P. K. Sandhu

Abstract:

The multi-hop nature of Wireless Mesh Networks and the rapid progression of throughput demands result in multi-channel, multi-radio structures in mesh networks, but the main problem of co-channel interference reduces the total throughput, specifically in multi-hop networks. Quality of Service (QoS) refers to a vast collection of networking technologies and techniques that guarantee the ability of a network to make desired services available with predictable results. QoS can be directed at a network interface, at a specific server's or router's performance, or at specific applications. Due to interference among the various transmissions, QoS routing in multi-hop wireless networks is a formidable task; in a multi-channel wireless network, two transmissions using the same channel may interfere with each other. This paper considers the Destination Sequenced Distance Vector (DSDV) routing protocol to locate a secure and optimised path. The proposed technique also utilizes Lempel–Ziv–Welch (LZW) lossless data compression and intra-cluster data aggregation to enhance communication between the source and the destination. Clustering makes it possible to aggregate multiple packets and locate a single route through the clusters, improving intra-cluster data aggregation. LZW lossless data compression reduces the data packet size, and hence the energy consumed, thus increasing the network QoS. The MATLAB tool has been used to evaluate the effectiveness of the proposed technique, and the comparative analysis shows that it outperforms the existing techniques.
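
For reference, the LZW step can be stated compactly; below is a textbook LZW compressor (emitting dictionary codes for the longest known prefix), applied to a hypothetical repetitive sensor payload to show why compression pays off in this setting.

```python
def lzw_compress(data: bytes):
    """Textbook LZW: emit dictionary codes for the longest known prefix,
    growing the dictionary as new byte sequences appear."""
    dictionary = {bytes([i]): i for i in range(256)}
    w, out = b"", []
    for byte in data:
        wc = w + bytes([byte])
        if wc in dictionary:
            w = wc
        else:
            out.append(dictionary[w])
            dictionary[wc] = len(dictionary)   # register the new sequence
            w = bytes([byte])
    if w:
        out.append(dictionary[w])
    return out

# Sensor readings repeat heavily, so LZW shortens them markedly.
payload = b"temp=21.5;temp=21.5;temp=21.6;temp=21.5;" * 8
codes = lzw_compress(payload)
print(len(payload), "bytes ->", len(codes), "codes")
```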

Keywords: WMNs, QoS, flooding, collision avoidance, LZW, congestion control

Procedia PDF Downloads 327
28373 Blind Data Hiding Technique Using Interpolation of Subsampled Images

Authors: Singara Singh Kasana, Pankaj Garg

Abstract:

In this paper, a blind data hiding technique based on interpolation of sub-sampled versions of a cover image is proposed. A sub-sampled image is taken as a reference image, and an interpolated image is generated from this reference image. The difference between the original cover image and the interpolated image is then used to embed the secret data. Comparisons with existing interpolation-based techniques show that the proposed technique provides higher embedding capacity and better visual quality of the marked images. Moreover, the performance of the proposed technique is more stable across different images.
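
A toy numeric sketch of the interpolation-embedding idea follows: keep sub-sampled reference pixels intact, interpolate the rest, and let the cover/interpolated difference carry secret bits. The nearest-neighbour interpolation and LSB placement here are simplifying assumptions (pixel-range clamping is omitted), not the paper's exact embedding rule.

```python
import numpy as np

rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(8, 8)).astype(np.int32)

# Reference image: the 2x-subsampled cover, kept intact at even positions
# so the receiver can regenerate the same interpolated image (blindness).
ref = cover[::2, ::2]

# Interpolated image: nearest-neighbour blow-up of the reference
# (real schemes use neighbouring-pixel means; this keeps the sketch short).
interp = np.repeat(np.repeat(ref, 2, axis=0), 2, axis=1)

# Embed one secret bit per non-reference pixel: the magnitude of the
# cover/interpolated difference is kept, but its LSB carries the bit.
stego = interp.copy()
bits = rng.integers(0, 2, size=cover.size)
k = 0
for i in range(8):
    for j in range(8):
        if i % 2 == 0 and j % 2 == 0:
            continue                         # reference pixels stay untouched
        diff = int(cover[i, j] - interp[i, j])
        stego[i, j] = interp[i, j] + (abs(diff) & ~1) + bits[k]
        k += 1
print("embedded", k, "bits")
```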

Keywords: interpolation, image subsampling, PSNR, SIM

Procedia PDF Downloads 568
28372 To Study the New Invocation of Biometric Authentication Technique

Authors: Aparna Gulhane

Abstract:

Biometrics is the science and technology of measuring and analyzing biological data, and it forms the basis of research into biological measuring techniques for the purpose of people identification and recognition. In information technology, biometrics refers to technologies that measure and analyze human body characteristics, such as DNA, fingerprints, eye retinas and irises, voice patterns, facial patterns and hand measurements. Biometric systems are used to authenticate a person's identity; the idea is to use the special characteristics of a person to identify him or her. This paper presents biometric authentication techniques and their actual deployment potential through the overall invocation of biometric recognition, together with independent testing of various biometric authentication products and technologies.

Keywords: types of biometrics, importance of biometric, review for biometrics and getting a new implementation, biometric authentication technique

Procedia PDF Downloads 306
28371 A Study of Using Different Printed Circuit Board Design Methods on Ethernet Signals

Authors: Bahattin Kanal, Nursel Akçam

Abstract:

Data transmission size and frequency requirements are increasing rapidly in electronic communication protocols, and increasing data transmission speeds have made the design of printed circuit boards much more important. It is important to carefully examine the requirements and perform analyses both before and after the design of a digital electronic circuit board. This paper delves into impedance matching techniques, signal trace routing considerations, and the impact of layer stacking on signal performance. It extensively explores techniques for minimizing crosstalk and interference, presenting a holistic perspective on design strategies to optimize the quality of high-speed signals. Through a comprehensive review of these design methodologies, this study aims to provide insights into achieving reliable and high-performance printed circuit board layouts for such signals. In this study, the effect of different design methods on Ethernet signals was examined in terms of S-parameters. Siemens' HyperLynx software tool was used for the analyses.

Keywords: HyperLynx, printed circuit board, S-parameters, Ethernet

Procedia PDF Downloads 17
28370 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques

Authors: John Onyima, Ikechukwu Ezepue

Abstract:

Virtualization and cloud computing are among the fastest-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of organizational infrastructure, more efficient resource utilization and cost reduction. However, this technology brings new security threats and challenges concerning safety, reliability and data confidentiality. Evidently, no single security technique can guarantee protection against malicious attacks on a cloud computing network; hence, an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using Snort. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
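
A minimal sketch of the two integrated layers follows; the signature rules are illustrative, Snort's role is reduced to substring matching, and scikit-learn's Local Outlier Factor stands in for the LDFGB algorithm (both are local-density anomaly detectors, but they are not the same method).

```python
import numpy as np
from sklearn.neighbors import LocalOutlierFactor

# Signature-based layer: simple Snort-like payload rules (illustrative).
signatures = [b"/etc/passwd", b"<script>", b"' OR 1=1"]

def signature_alert(payload: bytes) -> bool:
    return any(sig in payload for sig in signatures)

# Anomaly-based layer: local-density outlier scoring on flow features
# such as [bytes_per_s / 1000, inter-arrival time].
rng = np.random.default_rng(0)
normal_flows = rng.normal(loc=[500, 0.2], scale=[50, 0.05], size=(500, 2))
lof = LocalOutlierFactor(n_neighbors=20, novelty=True).fit(normal_flows)

def inspect(payload: bytes, flow_features):
    if signature_alert(payload):
        return "block: signature match"
    if lof.predict([flow_features])[0] == -1:    # -1 flags an outlier
        return "alert: anomalous flow"
    return "allow"

print(inspect(b"GET /index.html", [510, 0.21]))
print(inspect(b"GET /../../etc/passwd", [510, 0.21]))
print(inspect(b"GET /index.html", [5000, 0.9]))
```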

Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection

Procedia PDF Downloads 289
28369 Framework for Performance Measure of Super Resolution Imaging

Authors: Varsha Hemant Patil, Swati A. Bhavsar, Abolee H. Patil

Abstract:

Image quality assessment plays an important role in image evaluation. This paper presents an investigation of the classic techniques in use for image quality assessment, especially for super-resolution imaging. Researchers have contributed much towards the development of super-resolution imaging techniques; however, not much attention has been paid to the development of metrics for testing the performance of the developed techniques. This paper gives a study report of existing image quality measures and classifies the reviewed approaches according to functionality and suitability for super-resolution imaging. Probable modifications and improvements of these measures to suit super-resolution imaging are presented. The prime goal of the paper is to provide a comprehensive reference source for researchers working on super-resolution imaging and to suggest a better framework for measuring the performance of super-resolution imaging techniques.
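
For the metrics named in the keywords, the sketch below computes MSE and PSNR directly and points to scikit-image for SSIM; the random image pair merely stands in for a ground-truth/super-resolved pair.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images."""
    return np.mean((a.astype(float) - b.astype(float)) ** 2)

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10*log10(peak^2 / MSE)."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * np.log10(peak ** 2 / m)

# Hypothetical check: a ground-truth high-resolution image against a
# super-resolved estimate (random data stand in for real images).
rng = np.random.default_rng(0)
hr = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
sr = np.clip(hr.astype(int) + rng.integers(-5, 6, size=hr.shape),
             0, 255).astype(np.uint8)

print(f"MSE={mse(hr, sr):.1f}, PSNR={psnr(hr, sr):.1f} dB")
# SSIM needs a structural comparison; scikit-image provides one:
# from skimage.metrics import structural_similarity
# print(structural_similarity(hr, sr))
```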

Keywords: interpolation, MSE, PSNR, SSIM, super resolution

Procedia PDF Downloads 84
28368 Comparison of Various Landfill Ground Improvement Techniques for Redevelopment of Closed Landfills to Cater Transport Infrastructure

Authors: Michael D. Vinod, Hadi Khabbaz

Abstract:

Construction of infrastructure above or adjacent to landfills is becoming more common to capitalize on the limited space available within urban areas. However, development above landfills is a challenging task due to large voids, the presence of organic matter, the heterogeneous nature of waste, and the ambiguity surrounding landfill settlement prediction. Prior to the construction of infrastructure above landfills, ground improvement techniques are employed to improve the geotechnical properties of the landfill material. Although ground improvement techniques have little impact on long-term biodegradation and creep-related landfill settlement, a variety of techniques have shown notable short-term success, along with methods for verifying their level of effectiveness. This paper provides geotechnical and landfill engineers with a guideline for the selection of landfill ground improvement techniques and their suitability to project-specific sites. Ground improvement methods assessed and compared in this paper include concrete injected columns (CIC), dynamic compaction, rapid impact compaction (RIC), preloading, high energy impact compaction (HEIC), vibro compaction, vibro replacement, chemical stabilization and the inclusion of geosynthetics such as geocells. For each ground improvement technique, a summary of the existing theory, benefits, limitations, suitable modern monitoring methods, the applicability of the technique to landfills and supporting case studies is provided. The authors highlight the importance of implementing cost-effective monitoring techniques to allow observation and, where necessary, remediation of the subsidence effects associated with long-term landfill settlement. These ground improvement techniques are primarily intended for construction above closed landfills to cater for transport infrastructure loading.

Keywords: closed landfills, ground improvement, monitoring, settlement, transport infrastructure

Procedia PDF Downloads 205
28367 Experiencing Negative Thoughts? Write It, Crumple It and Throw It

Authors: Yasmin Othman Mydin

Abstract:

When one experiences problems, this may lead to negative thoughts, and these thoughts may occur repetitively. The present study investigates the effectiveness of cognitive and behavioural techniques in reducing negative thoughts. Twenty undergraduate university students participated as the sample in these experimental therapy sessions: ten students received the intervention, while the other ten were in the control group. The 15-item Perseverative Thinking Questionnaire was administered before and after the intervention to test the effectiveness of the techniques. The behavioural techniques applied were to write down the negative thoughts, crumple the paper and throw it away, while the cognitive technique was to imagine the thoughts being taken out of the mind as the paper is thrown away. Paired-samples t-test analysis revealed a significant reduction (t = 4.245, df = 9, p = .003) in negative thoughts in the group that received the intervention compared to the control group. This indicates that these techniques are effective in reducing repetitive negative thoughts.

Keywords: behaviour and cognitive intervention, negative thoughts, writing, psychology

Procedia PDF Downloads 362
28366 Earthquake Risk Assessment Using Out-of-Sequence Thrust Movement

Authors: Rajkumar Ghosh

Abstract:

Earthquakes are natural disasters that pose a significant risk to human life and infrastructure. Effective earthquake mitigation measures require a thorough understanding of the dynamics of seismic occurrences, including thrust movement. Traditionally, estimating thrust movement has relied on typical techniques that may not capture the full complexity of these events. Therefore, investigating alternative approaches, such as incorporating out-of-sequence thrust movement data, could enhance earthquake mitigation strategies. This review aims to provide an overview of the applications of out-of-sequence thrust movement in earthquake mitigation. By examining existing research and studies, the objective is to understand how precise estimation of thrust movement can contribute to improving structural design, analyzing infrastructure risk, and developing early warning systems. The study demonstrates how to estimate out-of-sequence thrust movement using multiple data sources, including GPS measurements, satellite imagery, and seismic recordings. By analyzing and synthesizing these diverse datasets, researchers can gain a more comprehensive understanding of thrust movement dynamics during seismic occurrences. The review identifies potential advantages of incorporating out-of-sequence data in earthquake mitigation techniques. These include improving the efficiency of structural design, enhancing infrastructure risk analysis, and developing more accurate early warning systems. By considering out-of-sequence thrust movement estimates, researchers and policymakers can make informed decisions to mitigate the impact of earthquakes. This study contributes to the field of seismic monitoring and earthquake risk assessment by highlighting the benefits of incorporating out-of-sequence thrust movement data. By broadening the scope of analysis beyond traditional techniques, researchers can enhance their knowledge of earthquake dynamics and improve the effectiveness of mitigation measures. The study collects data from various sources, including GPS measurements, satellite imagery, and seismic recordings. These datasets are then analyzed using appropriate statistical and computational techniques to estimate out-of-sequence thrust movement. The review integrates findings from multiple studies to provide a comprehensive assessment of the topic. The study concludes that incorporating out-of-sequence thrust movement data can significantly enhance earthquake mitigation measures. By utilizing diverse data sources, researchers and policymakers can gain a more comprehensive understanding of seismic dynamics and make informed decisions. However, challenges exist, such as data quality difficulties, modelling uncertainties, and computational complications. To address these obstacles and improve the accuracy of estimates, further research and advancements in methodology are recommended. Overall, this review serves as a valuable resource for researchers, engineers, and policymakers involved in earthquake mitigation, as it encourages the development of innovative strategies based on a better understanding of thrust movement dynamics.

Keywords: earthquake, out-of-sequence thrust, disaster, human life

Procedia PDF Downloads 66
28365 Evaluation of Three Digital Graphical Baseflow Separation Techniques in the Tekeze Water Basin in Ethiopia

Authors: Alebachew Halefom, Navsal Kumar, Arunava Poddar

Abstract:

The purpose of this work is to specify the parameter values and the base flow index (BFI), and to rank the methods that should be used for base flow separation. Three different digital graphical approaches are chosen and compared in this study. Daily time-series discharge data were collected from the site for a period of 30 years (1986 to 2015) and used to evaluate the algorithms. To separate the base flow from the surface runoff, the daily recorded streamflow (m³/s) data were used to calibrate the procedures and obtain parameter values for the basin. Additionally, the performance of the methods was assessed using the standard error (SE), the coefficient of determination (R²), the flow duration curve (FDC) and the base flow indices. The findings indicate that, in general, each strategy can be used worldwide to separate base flow; however, the Sliding Interval Method (SIM) performs significantly better than the other two techniques in this basin. The average base flow index was calculated to be 0.72 using the local minimum method, 0.76 using the fixed interval method, and 0.78 using the sliding interval method.
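
A minimal sketch of the best-performing method, the sliding interval separation, is given below together with the BFI computation; the window width (2N* − 1, derived from drainage area in the HYSEP family of methods) and the synthetic hydrograph are illustrative assumptions.

```python
import numpy as np

def sliding_interval_baseflow(q, width):
    """HYSEP-style sliding interval: base flow on each day is the minimum
    discharge within a window of `width` days centred on that day."""
    half = width // 2
    qpad = np.pad(q, half, mode="edge")
    return np.array([qpad[i:i + width].min() for i in range(len(q))])

# Hypothetical daily discharge (m^3/s): slow seasonal base flow plus
# occasional storm spikes.
rng = np.random.default_rng(0)
days = np.arange(365)
q = (5 + 2 * np.sin(2 * np.pi * days / 365)
     + rng.exponential(1.0, 365) * (rng.random(365) < 0.1) * 20)

bf = sliding_interval_baseflow(q, width=11)   # width = 2N*-1, illustrative
bfi = bf.sum() / q.sum()                      # base flow index
print(f"Base flow index (BFI) = {bfi:.2f}")
```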

Keywords: baseflow index, digital graphical methods, streamflow, Emba Madre Watershed

Procedia PDF Downloads 64