Search results for: data analysis.
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 13541


11021 Tracking Activity of Real Individuals in Web Logs

Authors: Sándor Juhász, Renáta Iváncsy

Abstract:

This paper describes an enhanced cookie-based method for counting the visitors of web sites by using a web log processing system that aims to cope with the ambitious goal of creating countrywide statistics about the browsing practices of real human individuals. The focus is on describing a new, more efficient way of detecting the human beings behind web users by placing different identifiers on the client computers. We briefly introduce our processing system, designed to handle the massive amount of data records continuously gathered from the most important content providers in Hungary. We conclude by presenting statistics over different time spans that compare the efficiency of multiple visitor counting methods to the one presented here, along with some interesting charts about content providers and web usage based on real data recorded in 2007.

Keywords: Cookie based identification, real data, user activity tracking, web auditing, web log processing

11020 Microgrid: Low Power Network Topology and Control

Authors: Amit Sachan

Abstract:

Network design and data modeling are the two significant research tasks required to enable power control of a microgrid using IEC 61850 data models and services. The current coverage of IEC 61850 includes substation automation systems, communication between substations, and communication to DERs. Therefore, for LV microgrid power control, before the IEC 61850 services can be used to control smart electrical devices, those devices must be modeled as IEC 61850 data models, and a network topology must be designed to support seamless communication among them. In addition, although IEC 61850 partly supports such modeling by providing several object models for common functions such as measurement, metering, and monitoring, there are still missing pieces for building a variety of functions for household appliances, such as tuning the temperature of an electric heater or refrigerator.

Keywords: IEC 61850, RCMC, HCMC, DER Unit Controller.

11019 Identification of Wideband Sources Using Higher Order Statistics in Noisy Environment

Authors: S. Bourennane, A. Bendjama

Abstract:

This paper deals with the localization of wideband sources. We develop a new approach for estimating the parameters of wideband sources. This method is based on the higher-order statistics of the recorded data in order to eliminate the Gaussian components from the signals received on the various hydrophones; in fact, the sea-bottom noise is regarded as Gaussian. Thanks to the coherent signal subspace algorithm based on the cumulant matrix of the received data, instead of the cross-spectral matrix, the wideband correlated sources are accurately located even in a very noisy environment. We demonstrate the performance of the proposed algorithm on real data recorded during an underwater acoustics experiment.

Keywords: Higher-order statistics, high resolution array processing techniques, localization of acoustics sources, wide band sources.

11018 An AK-Chart for the Non-Normal Data

Authors: Chia-Hau Liu, Tai-Yue Wang

Abstract:

Traditional multivariate control charts assume that measurements from manufacturing processes follow a multivariate normal distribution. However, this assumption may not hold or may be difficult to verify because, in practice, not all measurements from manufacturing processes are normally distributed. This study develops a new multivariate control chart for monitoring processes with non-normal data. We propose a mechanism that integrates a one-class classification method with an adaptive technique. The adaptive technique is used to improve the sensitivity of one-class classification to small shifts in statistical process control. In addition, this design provides an easy way to allocate the type I error rate, so it is easier to implement. Finally, a simulation study and real data from industry are used to demonstrate the effectiveness of the proposed control chart.
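
For readers unfamiliar with the idea, the following is a minimal illustrative sketch, not the authors' AK-chart: it trains a generic one-class classifier (here scikit-learn's OneClassSVM, a stand-in choice) on in-control, non-normal Phase I data and flags later observations whose decision score falls below zero. The data, the nu setting, and the shift size are all invented for illustration.

```python
# Hypothetical sketch (not the authors' AK-chart): monitoring a process with a
# one-class classifier trained on in-control, possibly non-normal data.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Phase I: in-control reference data (skewed, non-normal, 2 quality characteristics).
in_control = rng.gamma(shape=2.0, scale=1.0, size=(500, 2))

# nu roughly plays the role of the type I error rate alpha in this sketch.
monitor = OneClassSVM(kernel="rbf", gamma="scale", nu=0.01).fit(in_control)

# Phase II: new observations, the last 50 with a small mean shift.
new_obs = rng.gamma(shape=2.0, scale=1.0, size=(200, 2))
new_obs[-50:] += 0.8

# decision_function < 0 flags an out-of-control signal.
scores = monitor.decision_function(new_obs)
signals = np.where(scores < 0)[0]
print(f"{len(signals)} points signalled, first at index {signals[0] if len(signals) else None}")
```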

Keywords: Multivariate control chart, statistical process control, one-class classification method.

11017 Selection of the Optimum Cooling Scheme for Generators Based on Electro-Thermal Analysis

Authors: Diako Azizi, Ahmad Gholami, Vahid Abbasi

Abstract:

Optimal selection of electrical insulation in electrical machinery ensures reliability during operation. From the insulation point of view, the stator is the most important part of an electrical machine. This fact reveals the need to inspect the electrical machine insulation together with the electro-thermal stresses. In the first step of the study, a part of the whole machine structure that covers the general characteristics of the machine is chosen; then, based on electromagnetic analysis (the finite element method), the machine operation is simulated. In the simulation results, the temperature distribution of the total structure is presented simultaneously by using electro-thermal analysis. The results of the electro-thermal analysis can be used to design an optimal cooling system. In order to design, review, and compare cooling systems, four wiring structures in the stator slots are presented. The structures are compared to each other in terms of electrical and thermal distribution and remaining insulation life using finite element analysis. Based on the steps of the study, an optimization algorithm is presented for selecting the appropriate structure.

Keywords: Electrical field, field distribution, insulation, winding, finite element method, electro thermal

11016 Condition Monitoring for Controlling the Stability of the Rotating Machinery

Authors: A. Chellil, I. Gahlouz, S. Lecheb, A. Nour, S. Chellil, H. Mechakra, H. Kebir

Abstract:

In this paper, an experimental study of the instability of a separator rotor is presented under dynamic loading response in the harmonic analysis condition. Global measurement and analysis of vibration on the cement separator RC500 were carried out; the measurement points used were radial, vertical, horizontal, and oblique. Trend measurements and spectral analysis were used to identify the main anomalies and defects in the separator and their manifestation. The results show that these defects have a negative effect on the stability of the rotor. Experimentally, the study of the rotor in the transient regime allowed the vibratory responses due to unbalance and various excitations to be determined.

Keywords: Rotor, experimental, defect, frequency, spectrum.

11015 Shannon-Weaver Biodiversity of Neutrophils in Fractal Networks of Immunofluorescence for Medical Diagnostics

Authors: N.E.Galich

Abstract:

We develop new nonlinear methods of immunofluorescence analysis for a sensitive technology of the respiratory burst reaction of DNA fluorescence due to oxidative activity in peripheral blood neutrophils. Histograms in flow cytometry experiments represent the frequency of fluorescence flashes as a function of fluorescence intensity. We used the Shannon-Weaver index to define the neutrophils' biodiversity and the Hurst index to define the fractal correlations in immunofluorescence for different donors, as the basic quantitative criteria for medical diagnostics of health status. We analyze the frequencies of flashes, information, Shannon entropies, and their fractals in immunofluorescence networks due to reduction of the histogram range. We found a number of simple universal correlations between biodiversity, information, and the Hurst index in the diagnostics and classification of pathologies for a wide spectrum of diseases. In addition, a clear criterion of common immunity and human health status is determined in the form of yes/no answers. These answers are based on peculiarities of information in immunofluorescence networks and the biodiversity of neutrophils. Experimental data analysis has shown the existence of homeostasis for the information entropy of the oxidative activity of DNA in neutrophil nuclei for all donors.
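
As a point of reference, the Shannon-Weaver index used here is the Shannon entropy of the histogram bin probabilities. A minimal sketch follows, with simulated flash intensities standing in for the flow cytometry data; the bin count and distribution are illustrative only.

```python
# Minimal sketch: Shannon-Weaver index of a fluorescence histogram.
# The histogram, bin count, and data are illustrative, not the authors' data.
import numpy as np

def shannon_weaver(counts):
    """H = -sum(p_i * ln p_i) over non-empty histogram bins."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

# Simulated flash intensities for one donor, binned as in flow cytometry.
rng = np.random.default_rng(1)
intensities = rng.lognormal(mean=2.0, sigma=0.5, size=10_000)
counts, _ = np.histogram(intensities, bins=256)

print(f"Shannon-Weaver index H = {shannon_weaver(counts):.3f} nats")
```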

Keywords: blood and cells fluorescence in diagnostics of diseases, cytometric histograms, entropy and information in fractal networks of oxidative activity of DNA, long-range chromosomal correlations in living cells.

11014 Estimation Model of Dry Docking Duration Using Data Mining

Authors: Isti Surjandari, Riara Novita

Abstract:

Maintenance is one of the most important activities in the shipyard industry. However, it is sometimes not supported by adequate services from the shipyard, and inaccuracy in estimating the duration of ship maintenance is still common. This makes estimation of ship maintenance duration crucial. This study uses a data mining approach, namely CART (Classification and Regression Tree), to estimate the duration of ship maintenance that is limited to dock works, also known as dry docking. Using the volume of dock works as an input to estimate the maintenance duration, four classes of dry docking duration were obtained, with a different linear model and job criteria for each class. These linear models can then be used to estimate the duration of dry docking based on job criteria.
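
A hedged sketch of the modeling idea, using scikit-learn's CART implementation rather than the authors' tooling: a regression tree on a synthetic dock-work-volume feature, constrained to four leaves to mirror the four duration classes reported above. The feature name, units, and data are invented for illustration.

```python
# Illustrative sketch (synthetic data): a CART regression tree that splits
# dry-docking jobs into duration classes based on the volume of dock works.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(2)

# Hypothetical feature: volume of dock works (arbitrary work-volume units).
volume = rng.uniform(100, 5000, size=300).reshape(-1, 1)
duration_days = 5 + 0.004 * volume.ravel() + rng.normal(0, 1.5, size=300)

# max_leaf_nodes=4 mirrors the four duration classes reported in the abstract.
tree = DecisionTreeRegressor(max_leaf_nodes=4, random_state=0)
tree.fit(volume, duration_days)

print(export_text(tree, feature_names=["dock_work_volume"]))
print("Predicted duration for volume 3000:", tree.predict([[3000]])[0])
```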

Keywords: Classification and regression tree (CART), data mining, dry docking, maintenance duration.

11013 Extended Low Power Bus Binding Combined with Data Sequence Reordering

Authors: Jihyung Kim, Taejin Kim, Sungho Park, Jun-Dong Cho

Abstract:

In this paper, we address the problem of reducing the switching activity (SA) in on-chip buses through the use of a bus binding technique in high-level synthesis. While many binding techniques to reduce the SA exist, we present yet another technique for further reducing the switching activity. Our proposed method combines bus binding and data sequence reordering to explore a wider solution space. The problem is formulated as a multiple traveling salesman problem and solved using a simulated annealing technique. The experimental results reveal that a binding solution obtained with the proposed method reduces the switching activity by 5.6-27.2% (18.0% on average) and 2.6-12.7% (6.8% on average) when compared with conventional binding-only and hybrid binding-encoding methods, respectively.
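
The cost being minimised here is the switching activity, i.e. the Hamming distance between consecutive words on the bus. The sketch below computes that cost and applies a simple greedy nearest-neighbour reordering as a stand-in for the paper's simulated-annealing solution of the multiple traveling salesman formulation; the data words are illustrative.

```python
# Sketch of the cost being minimised: switching activity = Hamming distance
# between consecutive words on the bus. The greedy nearest-neighbour reordering
# below is a simple stand-in for the paper's simulated-annealing mTSP solver.
def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

def switching_activity(seq):
    return sum(hamming(x, y) for x, y in zip(seq, seq[1:]))

def greedy_reorder(seq):
    remaining = list(seq)
    ordered = [remaining.pop(0)]
    while remaining:
        nxt = min(remaining, key=lambda w: hamming(ordered[-1], w))
        remaining.remove(nxt)
        ordered.append(nxt)
    return ordered

data = [0b1010_1010, 0b0101_0101, 0b1010_1011, 0b0101_0100, 0b1110_1010]
print("original SA :", switching_activity(data))
print("reordered SA:", switching_activity(greedy_reorder(data)))
```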

Keywords: low power, bus binding, switching activity, multiple traveling salesman problem, data sequence reordering

11012 Accessibility and Visibility through Space Syntax Analysis of the Linga Raj Temple in Odisha, India

Authors: S. Pramanik

Abstract:

Since the early ages, Hindu temples have been interpreted through various Vedic philosophies. These temples are visited by pilgrims, and they demonstrate the rituals and religious beliefs of communities, reflecting a variety of actions and behaviors. Darsana, a direct seeing, is part of the pilgrimage activity. During the process of Darsana, a devotee is prepared for entry into the temple to realize the cognizing Truth, culminating in visualizing the idol of God placed at the Garbhagriha (sanctum sanctorum). For this, the pilgrim must pass through a sequential arrangement of spaces. As they progress, the pilgrims perceive the spaces differently from various points of view. These viewpoints create a variety of spatial patterns in the minds of pilgrims, coherent with Hindu philosophies. The spatial organization and its order are perceived through various techniques of spatial analysis. A temple, as an example of Kalinga stylistic variation, has been chosen for the study. This paper intends to demonstrate some visual patterns generated during the process of Darsana (visibility) and its accessibility by Point Isovist Studies and Visibility Graph Analysis from the entrance (Simha Dwara) to the sanctum sanctorum (Garbhagriha).

Keywords: Hindu Temple Architecture, Point Isovist, space syntax analysis, visibility graph analysis.

11011 Bayesian Network Model for Students' Laboratory Work Performance Assessment: An Empirical Investigation of the Optimal Construction Approach

Authors: Ifeyinwa E. Achumba, Djamel Azzi, Rinat Khusainov

Abstract:

There are three approaches to complete Bayesian Network (BN) model construction: total expert-centred, total data-centred, and semi data-centred. These three approaches constitute the basis of the empirical investigation undertaken and reported in this paper. The objective is to determine which of these three approaches is the optimal approach for constructing a BN-based model for the performance assessment of students' laboratory work in a virtual electronic laboratory environment. BN models were constructed using all three approaches, with respect to the focus domain, and compared using a set of optimality criteria. In addition, the impact of the size and source of the training data on the performance of the total data-centred and semi data-centred models was investigated. The results of the investigation provide additional insight for BN model constructors and contribute to the literature by providing supportive evidence for the conceptual feasibility and efficiency of structure and parameter learning from data. In addition, the results highlight other interesting themes.

Keywords: Bayesian networks, model construction, parameter learning, structure learning, performance index, model comparison.

11010 Iteration Acceleration for Nonlinear Coupled Parabolic-Hyperbolic System

Authors: Xia Cui, Guang-wei Yuan, Jing-yan Yue

Abstract:

A Picard-Newton iteration method is studied to accelerate the numerical solution procedure of a class of two-dimensional nonlinear coupled parabolic-hyperbolic systems. The Picard-Newton iteration is designed by adding higher-order terms of small quantity to an existing Picard iteration. Discrete functional analysis and inductive hypothesis reasoning techniques are used to overcome difficulties arising from nonlinearity and coupling, and a theoretical analysis is made of the convergence and approximation properties of the iteration scheme. The Picard-Newton iteration has a quadratic convergence rate, and its solution has second-order spatial approximation and first-order temporal approximation to the exact solution of the original problem. Numerical tests verify the results of the theoretical analysis and show that the Picard-Newton iteration is more efficient than the Picard iteration.
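
To make the convergence claim concrete on a much simpler problem than the coupled parabolic-hyperbolic system, the sketch below contrasts a plain Picard (fixed-point) iteration with a Newton iteration on the scalar equation x = cos(x); this illustrates linear versus quadratic convergence only, and is not the authors' scheme.

```python
# Scalar illustration only (not the authors' PDE scheme): Picard fixed-point
# iteration vs. Newton iteration for the nonlinear equation x = cos(x).
import math

def picard(x0, tol=1e-12, max_iter=200):
    x, n = x0, 0
    while abs(math.cos(x) - x) > tol and n < max_iter:
        x, n = math.cos(x), n + 1      # x_{k+1} = g(x_k), linear convergence
    return x, n

def newton(x0, tol=1e-12, max_iter=50):
    x, n = x0, 0
    while abs(math.cos(x) - x) > tol and n < max_iter:
        f, fp = math.cos(x) - x, -math.sin(x) - 1.0
        x, n = x - f / fp, n + 1       # quadratic convergence near the root
    return x, n

print("Picard:", picard(1.0))   # converges, but needs dozens of iterations
print("Newton:", newton(1.0))   # typically 4-5 iterations
```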

Keywords: Nonlinearity, iterative acceleration, coupled parabolic hyperbolic system, quadratic convergence, numerical analysis.

11009 Educational Data Mining: The Case of Department of Mathematics and Computing in the Period 2009-2018

Authors: M. Sitoe, O. Zacarias

Abstract:

University education is influenced by several factors, ranging from the adoption of strategies to strengthen the whole process to the improvement of students' academic performance. This work uses data mining techniques to develop a predictive model to identify students with a tendency toward evasion or retention. To this end, a database of real students' data from the Department of University Admission (DAU) and the Department of Mathematics and Informatics (DMI) was used. The data comprised 388 undergraduate students admitted in the years 2009 to 2014. The Weka tool was used for model building, using three different techniques, namely: k-nearest neighbor, random forest, and logistic regression. To allow for training on multiple train-test splits, a cross-validation approach was employed with a varying number of folds. To reduce bias and variance and improve the performance of the models, the ensemble methods Bagging and Stacking were used. After comparing the results obtained by the three classifiers, logistic regression using Bagging with seven folds obtained the best performance, showing results above 90% in all evaluated metrics: accuracy, true positive rate, and precision. Retention is the most common tendency.
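
A hedged sketch of the winning configuration, written with scikit-learn rather than Weka: Bagging wrapped around logistic regression, scored with seven-fold cross-validation. The synthetic dataset merely stands in for the 388 student records, so the accuracy printed bears no relation to the reported results.

```python
# Sketch in scikit-learn (the study itself used Weka) of the winning setup:
# Bagging around logistic regression, evaluated with k-fold cross-validation.
# The data here are synthetic stand-ins for the 388 student records.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=388, n_features=10, n_informative=6,
                           random_state=0)

model = BaggingClassifier(
    estimator=LogisticRegression(max_iter=1000),
    n_estimators=10,
    random_state=0,
)

# Seven folds, matching the configuration reported as best in the abstract.
scores = cross_val_score(model, X, y, cv=7, scoring="accuracy")
print(f"mean CV accuracy: {scores.mean():.3f}")
```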

Keywords: Evasion and retention, cross validation, bagging, stacking.

11008 A CTL Specification of Serializability for Transactions Accessing Uniform Data

Authors: Rafat Alshorman, Walter Hussak

Abstract:

Existing work in temporal logic on representing the execution of infinitely many transactions uses linear-time temporal logic (LTL) and only models two-step transactions. In this paper, we use the comparatively efficient branching-time computational tree logic CTL and extend the transaction model to a class of multi-step transactions, by introducing distinguished propositional variables to represent the read and write steps of n multi-step transactions accessing m data items infinitely many times. We prove that the well-known correspondence between acyclicity of conflict graphs and serializability for finite schedules extends to infinite schedules. Furthermore, in the case of transactions accessing the same set of data items in (possibly) different orders, serializability corresponds to the absence of cycles of length two. This result is used to give an efficient encoding of the serializability condition into CTL.
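
The classical finite-schedule result that the paper extends can be illustrated directly: build the conflict graph of a schedule and test it for cycles. The sketch below uses a simple triple encoding of read/write steps; it demonstrates the acyclicity test only, not the CTL encoding.

```python
# Sketch of the classical finite-schedule test the paper builds on: a schedule is
# conflict-serializable iff its conflict graph is acyclic. A schedule is encoded as
# a list of (transaction_id, operation, data_item) triples, e.g. (1, "r", "x").
def conflict_graph(schedule):
    edges = set()
    for i, (t1, op1, x1) in enumerate(schedule):
        for t2, op2, x2 in schedule[i + 1:]:
            if t1 != t2 and x1 == x2 and "w" in (op1, op2):
                edges.add((t1, t2))
    return edges

def has_cycle(edges):
    # Depth-first search over the transaction graph.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
    visited, on_stack = set(), set()
    def dfs(u):
        visited.add(u); on_stack.add(u)
        for v in adj.get(u, ()):
            if v in on_stack or (v not in visited and dfs(v)):
                return True
        on_stack.discard(u)
        return False
    return any(dfs(u) for u in list(adj) if u not in visited)

s = [(1, "r", "x"), (2, "w", "x"), (1, "w", "y"), (2, "r", "y")]
print("serializable:", not has_cycle(conflict_graph(s)))
```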

Keywords: computational tree logic, serializability, multi-step transactions.

11007 Fast Approximate Bayesian Contextual Cold Start Learning (FAB-COST)

Authors: Jack R. McKenzie, Peter A. Appleby, Thomas House, Neil Walton

Abstract:

Cold-start is a notoriously difficult problem which can occur in recommendation systems, and arises when there is insufficient information to draw inferences for users or items. To address this challenge, a contextual bandit algorithm, the Fast Approximate Bayesian Contextual Cold Start Learning algorithm (FAB-COST), is proposed, which is designed to provide improved accuracy compared to the traditionally used Laplace approximation in the logistic contextual bandit, while controlling both algorithmic complexity and computational cost. To this end, FAB-COST uses a combination of two moment projection variational methods: Expectation Propagation (EP), which performs well at the cold start but becomes slow as the amount of data increases; and Assumed Density Filtering (ADF), which has slower growth of computational cost with data size but requires more data to obtain an acceptable level of accuracy. By switching from EP to ADF when the dataset becomes large, it is able to exploit their complementary strengths. The empirical justification for FAB-COST is presented, and it is systematically compared to other approaches on simulated data. In a benchmark against the Laplace approximation on real data consisting of over 670,000 impressions from autotrader.co.uk, FAB-COST demonstrates at one point an increase of over 16% in user clicks. On the basis of these results, it is argued that FAB-COST is likely to be an attractive approach to cold-start recommendation systems in a variety of contexts.

Keywords: Cold-start, expectation propagation, multi-armed bandits, Thompson sampling, variational inference.

11006 Evaluating the Baseline Characteristics of Static Balance in Young Adults

Authors: K. Abuzayan, H. Alabed, K. Zarug

Abstract:

The objectives of this study (a baseline study, n = 20) were to implement Matlab procedures for quantifying selected static balance variables, to establish baseline data for selected variables which characterize static balance activities in a population of healthy young adult males, and to examine any trial effects on these variables. The results indicated that the implementation of Matlab procedures for quantifying selected static balance variables was practical and enabled baseline data to be established for the selected variables. There was no significant trial effect. Recommendations were made for suitable tests to be used in later studies. Specifically, it was found that the one-foot tiptoes test in static balance is too challenging for most participants in normal circumstances. A one-foot flat, eyes-open test was considered to be representative and challenging for static balance.

Keywords: Static Balance, Base of support, Baseline Data.

11005 A Qualitative Study of Health-Related Beliefs and Practices among Vegetarians

Authors: Lorena Antonovici, Maria Nicoleta Turliuc

Abstract:

The process of becoming a vegetarian involves changes in several life aspects, including health. Despite its relevance, however, little research has been carried out to analyze vegetarians' self-perceived health, and even less empirical attention has been paid to the Romanian population. This study aimed to assess health-related beliefs and practices among vegetarian adults in a Romanian sample. We undertook 20 semi-structured interviews (10 males, 10 females) based on a snowball sample with a mean age of 31 years. The interview guide was divided into three sections: causes of adopting the diet, general aspects (beliefs, practices, tensions, and conflicts), and consequences of adopting the diet (significant changes, positive aspects and difficulties, physical and mental health). Additional anamnestic data were collected by means of a questionnaire. Data analyses were performed using Tropes text analysis software (v. 8.2) and SPSS software (v. 24.0). Findings showed that most of the participants considered a vegetarian diet a natural and healthy choice, as opposed to meat-eating, which they saw as unhealthy and whose consumption should be moderated among omnivores. A higher proportion of participants (65%) had an average body mass index (BMI), and several women even reported certain health conditions that no longer occurred after following a vegetarian diet. Moreover, participants admitted having better moods and mental health status, given their self-contentment with the dietary choice. Relatives were perceived as more skeptical about their practices than others, and women especially held this view. This study provides valuable insight into health-related beliefs and practices and how they might interact with a vegetarian diet.

Keywords: Health-related beliefs, health, practices, vegetarians.

11004 Analysis of Knowledge Management Trend by Bibliometric Approach

Authors: Hsu-Hao Tsai, Jiann-Min Yang

Abstract:

The analysis mainly concentrates on the productivity trend of knowledge management literature, identified by the subject "knowledge management" in the SSCI database. The purpose of the analysis is to summarize trend information for knowledge management researchers, since core knowledge will be concentrated in core categories. The results indicate that the productivity of literature on the topic of "knowledge management" is still increasing strongly; the trend is demonstrated by different categories, including author, country/territory, institution name, document type, language, publication year, and subject area. By focusing on the right categories, researchers can capture the core research information. This implies that the phenomenon "success breeds success" is more common in higher quality publications.

Keywords: Knowledge Management, SSCI, Bibliometric, Lotka's Law

11003 Performance Analysis and Optimization for Diagonal Sparse Matrix-Vector Multiplication on Machine Learning Unit

Authors: Qiuyu Dai, Haochong Zhang, Xiangrong Liu

Abstract:

Efficient matrix-vector multiplication with diagonal sparse matrices is pivotal in a multitude of computational domains, ranging from scientific simulations to machine learning workloads. When encoded in the conventional Diagonal (DIA) format, these matrices often induce computational overheads due to extensive zero-padding and non-linear memory accesses, which can hamper the computational throughput, and elevate the usage of precious compute and memory resources beyond necessity. The ’DIA-Adaptive’ approach, a methodological enhancement introduced in this paper, confronts these challenges head-on by leveraging the advanced parallel instruction sets embedded within Machine Learning Units (MLUs). This research presents a thorough analysis of the DIA-Adaptive scheme’s efficacy in optimizing Sparse Matrix-Vector Multiplication (SpMV) operations. The scope of the evaluation extends to a variety of hardware architectures, examining the repercussions of distinct thread allocation strategies and cluster configurations across multiple storage formats. A dedicated computational kernel, intrinsic to the DIA-Adaptive approach, has been meticulously developed to synchronize with the nuanced performance characteristics of MLUs. Empirical results, derived from rigorous experimentation, reveal that the DIA-Adaptive methodology not only diminishes the performance bottlenecks associated with the DIA format but also exhibits pronounced enhancements in execution speed and resource utilization. The analysis delineates a marked improvement in parallelism, showcasing the DIA-Adaptive scheme’s ability to adeptly manage the interplay between storage formats, hardware capabilities, and algorithmic design. The findings suggest that this approach could set a precedent for accelerating SpMV tasks, thereby contributing significantly to the broader domain of high-performance computing and data-intensive applications.
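
For context, the baseline DIA-format kernel that DIA-Adaptive sets out to improve can be sketched in a few lines of NumPy; this is not the MLU kernel from the paper, and it follows SciPy's column-indexed DIA layout.

```python
# Minimal NumPy sketch of a baseline DIA-format SpMV (y = A @ x); this is not
# the MLU kernel from the paper. It uses SciPy's DIA layout, in which
# data[d, j] holds A[j - offsets[d], j].
import numpy as np
import scipy.sparse as sp

def dia_spmv(data, offsets, x):
    n = x.shape[0]
    y = np.zeros(n)
    for d, k in enumerate(offsets):
        lo, hi = max(0, -k), min(n, n - k)          # valid row range for this diagonal
        y[lo:hi] += data[d, lo + k: hi + k] * x[lo + k: hi + k]
    return y

A = sp.diags([[1.0] * 5, [2.0] * 4], offsets=[0, 1], format="dia")
x = np.arange(5, dtype=float)
print(dia_spmv(A.data, A.offsets, x))   # [ 2.  5.  8. 11.  4.]
print(A @ x)                            # same result via SciPy
```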

Keywords: Adaptive method, DIA, diagonal sparse matrices, MLU, sparse matrix-vector multiplication.

11002 Risk Assessment of Musculoskeletal Disorders in an Electronic Components Company

Authors: Sara Bragança, Eric Costa

Abstract:

The work presented in this paper was performed for a workstation of an assembly section in a company that manufactures radio modules and air conditioning for cars. After performing a workstation analysis and administering a questionnaire to the operators, it was possible to understand the need to investigate the risk of musculoskeletal disorders originating both from the handling of loads and from the incorrect dimensioning of the workstation. Regarding the handling of loads, the NIOSH equation was used, and it was verified that there was no risk of musculoskeletal disorders. As the operators expressed dissatisfaction regarding back pain due to the adopted posture, appropriate workstation dimensions were established (to satisfy 97.5% of the population, using the table of anthropometric data of the Portuguese population) and the provision of a chair for the workers was proposed.
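
A hedged sketch of the revised NIOSH lifting equation used for the load-handling assessment: the multiplier formulas are the standard published metric ones, while the task geometry, load, and the frequency and coupling multipliers below are illustrative values rather than the study's measurements.

```python
# Hedged sketch of the revised NIOSH lifting equation (metric form). The input
# values are illustrative; FM and CM are normally read from the published tables.
def niosh_rwl(H, V, D, A, FM=1.0, CM=1.0):
    """Recommended Weight Limit in kg.
    H: horizontal distance (cm), V: vertical height (cm),
    D: vertical travel distance (cm), A: asymmetry angle (degrees),
    FM: frequency multiplier, CM: coupling multiplier (table values)."""
    LC = 23.0                      # load constant, kg
    HM = min(1.0, 25.0 / H)        # horizontal multiplier
    VM = 1.0 - 0.003 * abs(V - 75.0)
    DM = min(1.0, 0.82 + 4.5 / D)
    AM = 1.0 - 0.0032 * A
    return LC * HM * VM * DM * AM * FM * CM

load = 8.0                               # kg actually lifted (illustrative)
rwl = niosh_rwl(H=30, V=70, D=40, A=0, FM=0.94, CM=0.95)
print(f"RWL = {rwl:.1f} kg, lifting index = {load / rwl:.2f}")
# A lifting index below 1 indicates the lift is within the recommended limit.
```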

Keywords: Anthropometry, Musculoskeletal disorders, NIOSH Equation.

11001 Artificial Neural Network Modeling and Genetic Algorithm Based Optimization of Hydraulic Design Related to Seepage under Concrete Gravity Dams on Permeable Soils

Authors: Muqdad Al-Juboori, Bithin Datta

Abstract:

Hydraulic structures such as gravity dams are classified as essential structures and play a vital role in providing strong and safe water resource management. Three major aspects must be considered to achieve an effective design of such a structure: 1) the building cost, 2) safety, and 3) accurate analysis of seepage characteristics. Due to the complexity and non-linear relationships of the seepage process, many approximation theories have been developed; however, the application of these theories results in noticeable errors. The analytical solution, which includes the difficult conformal mapping procedure, can be applied only to simple and symmetrical problems. Therefore, the objectives of this paper are to: 1) develop a surrogate model based on numerical data simulated using SEEPW software to approximately simulate the seepage process related to a hydraulic structure, and 2) develop and solve a linked simulation-optimization model based on the developed surrogate model to describe the seepage occurring under a concrete gravity dam, in order to obtain an optimum and safe design at minimum cost. The result shows that the linked simulation-optimization model provides an efficient and optimum design of concrete gravity dams.

Keywords: Artificial neural network, concrete gravity dam, genetic algorithm, seepage analysis.

11000 Dynamic Stability of Beams with Piezoelectric Layers Located on a Continuous Elastic Foundation

Authors: A. R. Nezamabadi, M. Karami Khorramabadi

Abstract:

This paper studies the dynamic stability of homogeneous beams with piezoelectric layers that are simply supported at both ends, rest on a continuous elastic foundation, and are subjected to a periodic axial compressive load. The displacement field of the beam is assumed based on Bernoulli-Euler beam theory. Applying Hamilton's principle, the governing dynamic equation is established. The influences of the applied voltage, foundation coefficient, and piezoelectric thickness on the unstable regions are presented. To investigate the accuracy of the present analysis, a comparison study is carried out with known data.

Keywords: Dynamic stability, Homogeneous graded beam, Piezoelectric layer, Harmonic balance method.

10999 Being a Lay Partner in Jesuit Higher Education in the Philippines: A Grounded Theory Application

Authors: Janet B. Badong-Badilla

Abstract:

In Jesuit universities, laypersons, who come from the same or different faith backgrounds or traditions, are considered as collaborators in mission. The Jesuits themselves support the contributions of the lay partners in realizing the mission of the Society of Jesus and recognize the important role that they play in education. This study aims to investigate and generate particular notions and understandings of lived experiences of being a lay partner in Jesuit universities in the Philippines, particularly those involved in higher education. Using the qualitative approach as introduced by grounded theorist Barney Glaser, the lay partners’ concept of being a partner, as lived in higher education, is generated systematically from the data collected in the field primarily through in-depth interviews, field notes and observations. Glaser’s constant comparative method of analysis of data is used going through the phases of open coding, theoretical coding, and selective coding from memoing to theoretical sampling to sorting and then writing. In this study, Glaser’s grounded theory as a methodology will provide a substantial insight into and articulation of the layperson’s actual experience of being a partner of the Jesuits in education. Such articulation provides a phenomenological approach or framework to an understanding of the meaning and core characteristics of Jesuit-Lay partnership in Jesuit educational institution of higher learning in the country. This study is expected to provide a framework or model for lay partnership in academic institutions that have the same practice of having lay partners in mission.

Keywords: Grounded theory, Jesuit mission in higher education, lay partner, lived experience.

10998 Program Memories Error Detection and Correction On-Board Earth Observation Satellites

Authors: Y. Bentoutou

Abstract:

Memory error detection and correction (EDAC) aims to secure the transfer of data between the central processing unit of a satellite's onboard computer and its local memory. In this paper, the application of a double-bit error detection and correction method is described and implemented in Field Programmable Gate Array (FPGA) technology. The performance of the proposed EDAC method is measured and compared with two different EDAC devices using the same FPGA technology. A statistical analysis of single-event upset (SEU) and multiple-bit upset (MBU) activity in commercial memories onboard the first Algerian microsatellite, Alsat-1, is given.

Keywords: Error Detection and Correction, On-board computer, small satellite missions.

10997 Analytical Studies on Volume Determination of Leg Ulcer using Structured Light and Laser Triangulation Data Acquisition Techniques

Authors: M. Abdul-Rani, K. K. Chong, A. F. M. Hani, Y. B. Yap, A. Jamil

Abstract:

Imaging is defined as the process of obtaining geometric images, either two dimensional or three dimensional, by scanning or digitizing existing objects or products. In this research, imaging is applied to retrieve 3D information of the human skin surface in a medical application. This research focuses on analyzing and determining the volume of leg ulcers using imaging devices. Volume determination is one of the important criteria in the clinical assessment of leg ulcers. The volume and size of the leg ulcer wound give an indication of the response to treatment, whether healing or worsening. Different imaging techniques are expected to give different results (and accuracies) in generating data and images. A midpoint projection algorithm was used to reconstruct the cavity into a solid model and compute the volume. Misinterpretation of the results can affect treatment efficacy. The objective of this paper is to compare the accuracy of two 3D data acquisition methods, laser triangulation and structured light. It was shown, using models with known volume, that the structured-light-based 3D technique produces better accuracy than the laser triangulation data acquisition method for leg ulcer volume determination.

Keywords: Imaging, Laser Triangulation, Structured Light, Volume Determination.

10996 Multi-Temporal Mapping of Built-up Areas Using Daytime and Nighttime Satellite Images Based on Google Earth Engine Platform

Authors: S. Hutasavi, D. Chen

Abstract:

The built-up area is a significant proxy to measure regional economic growth and reflects the Gross Provincial Product (GPP). However, an up-to-date and reliable database of built-up areas is not always available, especially in developing countries. Cloud-based geospatial analysis platforms such as Google Earth Engine (GEE) provide an opportunity, with accessibility and computational power, for those countries to generate built-up data. Therefore, this study aims to extract the built-up areas in the Eastern Economic Corridor (EEC), Thailand, using daytime and nighttime satellite imagery based on GEE facilities. Normalized indices were generated from the Landsat 8 surface reflectance dataset, including the Normalized Difference Built-up Index (NDBI), the Built-up Index (BUI), and the Modified Built-up Index (MBUI). These indices were applied to identify built-up areas in the EEC. The result shows that MBUI performs better than BUI and NDBI, with the highest accuracy of 0.85 and a Kappa of 0.82. Moreover, the overall accuracy of the classification was improved from 79% to 90%, and the error in the total built-up area was decreased from 29% to 0.7%, after adding nighttime light data from the Visible Infrared Imaging Radiometer Suite (VIIRS) Day/Night Band (DNB). The results suggest that MBUI with nighttime light imagery is appropriate for built-up area extraction and can be utilized for further study of the socioeconomic impacts of regional development policy over the EEC region.
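
The index arithmetic behind this workflow can be written without the GEE API. The sketch below computes NDVI, NDBI, and BUI from illustrative Landsat 8 surface-reflectance values (B4 = red, B5 = NIR, B6 = SWIR1); the paper's exact MBUI formulation and thresholding are not reproduced here.

```python
# Index arithmetic behind the built-up mapping, written with NumPy rather than
# the Google Earth Engine API. NDBI and BUI follow their standard definitions;
# the reflectance values are illustrative, not real Landsat 8 pixels.
import numpy as np

red  = np.array([[0.18, 0.10], [0.08, 0.05]])
nir  = np.array([[0.22, 0.25], [0.40, 0.45]])
swir = np.array([[0.32, 0.28], [0.20, 0.15]])

ndvi = (nir - red) / (nir + red)
ndbi = (swir - nir) / (swir + nir)
bui  = ndbi - ndvi                     # positive values indicate built-up pixels

built_up_mask = bui > 0                # simple fixed threshold for illustration
print(np.round(bui, 2))
print(built_up_mask)
```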

Keywords: Built-up area extraction, Google earth engine, adaptive thresholding method, rapid mapping.

10995 Hybrid Neural Network Methods for Lithology Identification in the Algerian Sahara

Authors: S. Chikhi, M. Batouche, H. Shout

Abstract:

In this paper, we combine a probabilistic neural method with radial basis functions in order to construct the lithofacies of the wells DF01, DF02 and DF03, situated in the Triassic province of Algeria (Sahara). Lithofacies identification is a crucial problem in reservoir characterization. Our objective is to facilitate the experts' work in the geological domain and to allow them to quickly obtain the structure and nature of the formations around the borehole. This study intends to design a tool that supports automatic deduction from numerical data. We used a probabilistic formalism to enhance the classification process initiated by a Self-Organizing Map procedure. Our system derives the lithofacies of the concerned reservoir wells from well-log data, in a form that is easy for a geology expert to read, who identifies the potential for oil production at a given source and so forms the basis for estimating the financial returns and economic benefits.

Keywords: Classification, Lithofacies, Probabilistic formalism, Reservoir characterization, Well-log data.

10994 An Automatic Bayesian Classification System for File Format Selection

Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan

Abstract:

This paper presents an approach for the classification of unstructured format descriptions for the identification of file formats. The main contribution of this work is the employment of data mining techniques to support file format selection using just the unstructured text description that comprises the most important format features for a particular organisation. Subsequently, the file format identification method employs a file format classifier and associated configurations to support digital preservation experts with an estimation of the required file format. Our goal is to make use of a format specification knowledge base aggregated from different Web sources in order to select a file format for a particular institution. Using the naive Bayes method, the decision support system recommends to an expert the file format for his or her institution. The proposed methods facilitate file format selection and improve the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives using domain expert knowledge and specifications of file formats. To facilitate decision making, the aggregated information about the file formats is presented as a file format vocabulary that comprises the most common terms that are characteristic of all researched formats. The goal is to suggest a particular file format based on this vocabulary for analysis by an expert. A sample file format calculation and the calculation results, including probabilities, are presented in the evaluation section.
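
A minimal sketch of the naive Bayes idea on unstructured format descriptions, using a bag-of-words representation in scikit-learn; the toy descriptions, labels, and query are invented and are not the paper's knowledge base or vocabulary.

```python
# Sketch of the naive Bayes idea: classify a short, unstructured format description
# into a file format using bag-of-words features. The toy descriptions and labels
# below are invented for illustration only.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

descriptions = [
    "lossless raster image, supports transparency, widely used on the web",
    "document layout fixed, embedded fonts, suitable for long-term archiving",
    "plain text rows and columns separated by commas, tabular research data",
    "compressed raster photo format with lossy encoding",
]
formats = ["PNG", "PDF/A", "CSV", "JPEG"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(descriptions, formats)

query = ["archival document with fixed layout and embedded fonts"]
probs = model.predict_proba(query)[0]
for fmt, p in sorted(zip(model.classes_, probs), key=lambda t: -t[1]):
    print(f"{fmt}: {p:.2f}")
```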

Keywords: Data mining, digital libraries, digital preservation, file format.

10993 Molecular Dynamics and Circular Dichroism Studies on Aurein 1.2 and Retro Analog

Authors: Safyeh Soufian, Hoosein Naderi-Manesh, Abdoali Alizadeh, Mohammad Nabi Sarbolouki

Abstract:

Aurein 1.2 is a 13-residue amphipathic peptide with antibacterial and anticancer activity. Aurein 1.2 and its retro analog were synthesized to study the activity of the peptides in relation to their structure. The antibacterial test results showed that the retro analog is inactive. Secondary structural analysis by CD spectra indicated that both peptides adopt an alpha-helical conformation in TFE/water. MD simulations were performed on aurein 1.2 and the retro analog in water and TFE in order to analyse the factors that are involved in the activity difference between the retro and the native peptide. The simulation results are discussed and validated in the light of experimental data from the CD experiment. Both peptides showed a relatively similar pattern in their hydrophobicity, hydrophilicity, solvent accessible surfaces, and solvent accessible hydrophobic surfaces. However, they differed in the direction of the peptide dipole moment. Our results further indicate that reversing the amino acid sequence affects flexibility. The data also show that factors causing structural rigidity may decrease the activity. Consequently, our findings suggest that, in the case of a sequence-reversed peptide strategy, one has to pay attention to the role of amino acid sequence order in conferring flexibility and to the role of dipole moment direction in peptide activity.

Keywords: Antimicrobial peptides, retro, molecular dynamic, circular dichroism.

10992 Computing Entropy for Ortholog Detection

Authors: Hsing-Kuo Pao, John Case

Abstract:

Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of Kolmogorov complexity or entropy of biological sequences are already well known to be useful in extracting similarity information between such sequences, in the interest, for example, of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one can approximate it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein is suggested a new approach to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described shows the new set of entropy-based machine learning attributes provides good separation between positive (ortholog) and negative (non-ortholog) data, better than with good, previously known alternatives (which do not employ some means to handle short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction C5.0, and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion: the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for use in improving the other, standard attribute sets when conjoined with them.
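
The compression-based approximation referred to above is commonly realised as the normalized compression distance (NCD); a minimal sketch with zlib follows. It illustrates the baseline idea only, not the authors' conditional-entropy attribute set, and the toy DNA strings are invented.

```python
# Sketch of the standard compression-based approximation discussed in the abstract:
# the normalized compression distance (NCD) between two sequences, using zlib as
# the compressor.
import random
import zlib

def compressed_size(s: bytes) -> int:
    return len(zlib.compress(s, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

motif = b"ATGGCGTACGTTAGC"
seq_a = motif * 20                                       # toy "ortholog" pair
seq_b = motif * 19 + b"ATGGCGTACGATAGC"                  # one point mutation
seq_c = bytes(random.Random(0).choices(b"ACGT", k=300))  # unrelated random sequence

print("NCD(a, b) =", round(ncd(seq_a, seq_b), 3))    # small: sequences are similar
print("NCD(a, c) =", round(ncd(seq_a, seq_c), 3))    # larger: sequences are unrelated
```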

Keywords: compression, decision tree, entropy, ortholog, ROC.
