Search results for: psychophysiological methods.
3677 Predicting Dietary Practice Behavior among Type 2 Diabetics Using the Theory of Planned Behavior and Mixed Methods Design
Authors: D.O. Omondi, M.K. Walingo, G.M. Mbagaya, L.O.A. Othuon
Abstract:
This study applied the Theory of Planned Behavior model in predicting dietary behavior among Type 2 diabetics in a Kenyan environment. The study was conducted for three months within the diabetic clinic at Kisii Hospital in Nyanza Province in Kenya and adopted a sequential mixed methods design combining qualitative and quantitative phases. Qualitative data were analyzed using the grounded theory analysis method. Structural equation modeling using maximum likelihood was used to analyze the quantitative data. The results, based on the common fit indices, revealed that the Theory of Planned Behavior fitted the data acceptably well among Type 2 diabetics and within dietary behavior {χ2 = 223.3, df = 77, p = .02, χ2/df = 2.9, n = 237; TLI = .93; CFI = .91; RMSEA (90% CI) = .090 (.039, .146)}. This implies that the Theory of Planned Behavior holds and forms a framework for promoting dietary practice among Type 2 diabetics.
Keywords: Dietary practice, Kenya, Theory of Planned Behavior, Type 2 diabetes, Mixed Methods Design.
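As a rough illustration of how the reported fit indices relate to each other, the sketch below recomputes χ²/df and the RMSEA point estimate from the figures quoted in the abstract (χ² = 223.3, df = 77, n = 237) using the standard formulas; it reproduces χ²/df ≈ 2.9 and RMSEA ≈ .090, though the published values may include software-specific corrections.

```python
import math

def sem_fit_measures(chi2, df, n):
    """Recompute basic SEM fit measures from the chi-square statistic,
    degrees of freedom and sample size (standard point-estimate formulas)."""
    chi2_over_df = chi2 / df
    # RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (n - 1)))
    rmsea = math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))
    return chi2_over_df, rmsea

ratio, rmsea = sem_fit_measures(chi2=223.3, df=77, n=237)
print(f"chi2/df = {ratio:.2f}, RMSEA = {rmsea:.3f}")   # 2.90, 0.090
```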
3676 Study of Remote Sensing and Satellite Images Ability in Preparing Agricultural Land Use Map (ALUM)
Authors: Ali Gholami
Abstract:
In this research, the preparation of a land use map from LISS III scanner satellite data, belonging to the IRS, for the Aghche region in Isfahan province is studied. For this purpose, IRS satellite images of August 2008 were used, and the various land uses in the region, including rangelands, irrigated farming, dry farming, gardens and urban areas, were separated and identified. GPS and Erdas Imagine software were used, and three classification methods, Maximum Likelihood, Mahalanobis Distance and Minimum Distance, were analyzed. For each of these methods, the error matrix and Kappa index were calculated, and accuracies of 53.13%, 56.64% and 48.44% were obtained, respectively. Considering the low accuracy of these methods in separating land uses, visual interpretation of the map was used instead. Finally, 150 randomly selected points were checked in the field and no error was observed, which shows that the map prepared by visual interpretation has high accuracy. Although errors due to visual interpretation and geometric correction may occur, the map meets the required accuracy of more than 85 percent and is therefore reliable.
Keywords: Land use map, Aghche Region, Erdas Imagine, satellite images
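The accuracy assessment described here rests on the error (confusion) matrix and the Kappa index; a minimal sketch of the standard Cohen's kappa computation is given below, using a small hypothetical three-class error matrix purely for illustration.

```python
import numpy as np

def kappa_from_error_matrix(cm):
    """Cohen's kappa from an error (confusion) matrix: rows = reference
    classes, columns = classified classes."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    observed = np.trace(cm) / total                                 # overall accuracy
    expected = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / total**2   # chance agreement
    return (observed - expected) / (1.0 - expected)

# hypothetical 3-class error matrix (not data from the paper)
cm = [[50, 10,  5],
      [12, 40,  8],
      [ 6,  9, 45]]
print(f"kappa = {kappa_from_error_matrix(cm):.3f}")
```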
3675 Transforming Personal Healthcare through Patient Engagement: An In-Depth Analysis of Tools and Methods for the Digital Age
Authors: Emily Hickmann, Peggy Richter, Maren Kählig, Hannes Schlieter
Abstract:
Patient engagement is a cornerstone of high-quality care and essential for patients with chronic diseases to achieve improved health outcomes. Through digital transformation, possibilities to engage patients in their personal healthcare have multiplied. However, the exploitation of this potential is still lagging. To support the translation of patient engagement theory into practice, this paper’s objective is to give a state-of-the-art overview of patient engagement tools and methods. A systematic literature review was conducted. Overall, 56 tools and methods were extracted and synthesized according to the four attributes of patient engagement, i.e., personalization, access, commitment, and therapeutic alliance. The results are discussed in terms of their potential to be implemented in digital health solutions under consideration of the “computers are social actors” (CASA) paradigm. It is concluded that digital health can catalyze patient engagement in practice, and a broad future research agenda is formulated.
Keywords: Chronic diseases, digitalization, patient-centeredness, patient empowerment, patient engagement.
3674 Estimation of Skew Angle in Binary Document Images Using Hough Transform
Authors: Nandini N., Srikanta Murthy K., G. Hemantha Kumar
Abstract:
This paper presents two novel techniques for skew estimation of binary document images. The algorithms are based on connected component analysis and the Hough transform, and both focus on reducing the amount of input data provided to the Hough transform. In the first method, referred to as the word centroid approach, the centroids of selected words are used for skew detection. In the second method, referred to as the dilate & thin approach, the selected characters are blocked and dilated to obtain word blocks, and thinning is then applied. The final image fed to the Hough transform contains the thinned coordinates of the word blocks. The methods have been successful in reducing the computational complexity of Hough transform based skew estimation algorithms. Promising experimental results are provided to prove the effectiveness of the proposed methods.
Keywords: Dilation, Document processing, Hough transform, Optical Character Recognition, Skew estimation, Thinning.
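A minimal sketch of the underlying idea, assuming word centroids have already been extracted: each centroid votes in a (θ, ρ) accumulator, and the dominant peak gives the orientation of the text lines, from which the skew angle follows. The ±15° search range, bin sizes and the synthetic centroids are illustrative assumptions, not values from the paper.

```python
import numpy as np

def estimate_skew(centroids, angle_range=15.0, angle_step=0.1, rho_step=2.0):
    """Estimate document skew (degrees) from word centroids via a standard
    Hough transform restricted to near-horizontal text lines."""
    pts = np.asarray(centroids, dtype=float)
    # candidate skew angles; the normal of a text line skewed by s lies at (90 + s) degrees
    skews = np.arange(-angle_range, angle_range + angle_step, angle_step)
    thetas = np.deg2rad(90.0 + skews)
    rho = pts[:, 0, None] * np.cos(thetas) + pts[:, 1, None] * np.sin(thetas)
    rho_bins = np.round(rho / rho_step).astype(int)
    best_votes, best_skew = -1, 0.0
    for j, skew in enumerate(skews):
        votes = np.bincount(rho_bins[:, j] - rho_bins[:, j].min()).max()
        if votes > best_votes:                 # peak accumulator cell = collinear centroids
            best_votes, best_skew = votes, skew
    return best_skew

# hypothetical centroids of words lying on three text lines skewed by about 3 degrees
rng = np.random.default_rng(0)
x = rng.uniform(0, 500, 200)
y = np.tan(np.deg2rad(3.0)) * x + rng.choice([0, 40, 80], 200) + rng.normal(0, 1, 200)
print(f"estimated skew = {estimate_skew(np.column_stack([x, y])):.1f} degrees")
```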
3673 Application of Process Approach to Evaluate the Information Security Risk and its Implementation in an Iranian Private Bank
Authors: Isa Nakhai Kamal Abadi, Esmaeel Saberi, Ehsan Mirjafari
Abstract:
Every organization is continually subject to new damages and threats, which can result from its operations or the pursuit of its goals. Methods of securing the working environment and its tools have changed considerably with the increasing application and development of information technology (IT). From this viewpoint, information security management systems have evolved to build on proven methods rather than repeatedly reinvent them. In general, a correct response in information security management systems requires correct decision making, which in turn requires the comprehensive effort of managers and everyone involved in each plan or decision. Obviously, not every aspect of a task or decision is defined under all decision-making conditions; therefore, the possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence the decisions. An investigation of different approaches in the field of risk management demonstrates their progress from quantitative to qualitative methods with a process approach.
Keywords: Risk Management, Information Security, Methodology, Probability.
3672 A Study of Agile-Based Approaches to Improve Software Quality
Authors: Gurmeet Kaur, Jyoti Pruthi
Abstract:
Agile software development approaches and techniques are considered efficient, effective, and popular methods for developing software. Agile software development is useful for producing high-quality software that meets client requirements with zero defects and within a short delivery period. In the agile methodology, quality is tied to the code, which means it is managed through approaches such as refactoring, pair programming, test-driven development, behavior-driven development, acceptance test-driven development, and demand-driven development. The quality of software is measured using metrics such as the number of defects found during the development and improvement of the software. Using the above-mentioned approaches reduces the likelihood of defects in the developed software and hence improves quality. This paper focuses on the study of agile-based quality approaches for software development that ensure improved software quality as well as reduced cost and greater customer satisfaction.
Keywords: Agile software development, ASD, Acceptance test-driven development, ATDD, Behavior-driven development, BDD, Demand-driven development, DDD, Test-driven development, TDD.
3671 Standardization of Ayurvedic Formulation (Marichyadi Vati) Using HPLC and HPTLC Methods
Authors: Pathan Imran Khan, Bhandari Anil, Kumar Amit
Abstract:
The present investigation aimed to develop a methodology for the standardization of Marichyadi Vati and its raw materials. Standardization was carried out using systematic pharmacognostical and physicochemical parameters as per WHO guidelines. From the detailed standardization of Marichyadi Vati, it is concluded that no major differences prevail in the quality of marketed products and laboratory samples of Marichyadi Vati. However, the market samples showed a slightly higher piperine content than the laboratory sample by both methods. This is the first attempt to generate a complete set of standards required for Marichyadi Vati.
Keywords: Marichyadi Vati, Standardization, Piperine.
3670 Motivational Orientation of the Methodical System of Teaching Mathematics in Secondary Schools
Authors: M. Rodionov, Z. Dedovets
Abstract:
The article analyses the composition and structure of the motivationally oriented methodological system of teaching mathematics (purpose, content, methods, forms, and means of teaching), viewed through the prism of the student as the subject of the learning process. Particular attention is paid to the problem of methods of teaching mathematics, which are represented in the form of an ordered triad of attributes corresponding to the selected characteristics. A systematic analysis of possible options and their methodological interpretation enriched existing ideas about known methods and technologies of training, and significantly expanded their nomenclature by including previously unstudied combinations of characteristics. In addition, the examples outlined in this article illustrate the possibilities of enhancing the motivational capacity of a particular method or technology in the real practice of teaching mathematics through freer goal-setting and varying the conditions of the problem situations. The authors recommend the implementation of different strategies according to their characteristics in teaching and learning mathematics in secondary schools.
Keywords: Education, methodological system, teaching of mathematics, teachers, lesson, student motivation, secondary school.
3669 Password Cracking on Graphics Processing Unit Based Systems
Authors: N. Gopalakrishna Kini, Ranjana Paleppady, Akshata K. Naik
Abstract:
Password authentication is one of the most widely used methods of authenticating legitimate users of computers and defending against attackers. There are many different ways to authenticate the users of a system, and many password cracking methods have also been developed. This paper proposes how password cracking can best be performed on a CPU-GPGPU based system. The main objective of this work is to show how quickly a password can be cracked, given some knowledge of computer security and password cracking, if sufficient security is not incorporated into the system.
Keywords: GPGPU, password cracking, secret key, user authentication.
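To make the timing argument concrete, the sketch below runs an exhaustive search over a small keyspace on the CPU only; the hash function (MD5), character set and password length are illustrative assumptions, and a GPGPU implementation of the kind discussed here would parallelize exactly this inner loop.

```python
import hashlib
import itertools
import string
import time

def brute_force(target_hash, charset=string.ascii_lowercase, max_len=4):
    """Exhaustively try candidate passwords until one hashes to the target."""
    for length in range(1, max_len + 1):
        for combo in itertools.product(charset, repeat=length):
            candidate = "".join(combo)
            if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
                return candidate
    return None

target = hashlib.md5(b"zzzz").hexdigest()      # worst case for this keyspace
start = time.perf_counter()
found = brute_force(target)
print(f"recovered '{found}' in {time.perf_counter() - start:.2f} s on a single CPU core")
```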
3668 Photomechanical Analysis of Wooden Testing Bodies under Flexural Loadings
Authors: J. Gazzola, I. M. Dal Fabbro, J. Soriano, M. V. G. Silva, S. Rodrigues
Abstract:
Wood has been used in rural construction all around the world since ancient times. However, its inclusion in structural design requires strong support from a broad knowledge of material properties. The pertinent literature reveals the application of optical methods in determining the complete displacement field on bodies exhibiting regular as well as irregular surfaces. The use of moiré techniques in experimental mechanics consists in analyzing the patterns generated on the body surface before and after deformation. The objective of this research work is to study the qualitative deformation behavior of wooden testing specimens under specific loading situations. The experimental setup follows the literature description of shadow moiré methods. Results indicate a strong influence of anisotropy on the generated displacement field. Important qualitative as well as quantitative stress and strain distributions were obtained for wooden members, which are applicable to rural constructions.
Keywords: Moiré methods, wooden structural material, rural constructions.
3667 Fingerprint Verification System Using Minutiae Extraction Technique
Authors: Manvjeet Kaur, Mukhwinder Singh, Akshay Girdhar, Parvinder S. Sandhu
Abstract:
Most fingerprint recognition techniques are based on minutiae matching and have been well studied. However, this technology still suffers from problems associated with the handling of poor quality impressions. One problem besetting fingerprint matching is distortion. Distortion changes both geometric position and orientation, and leads to difficulties in establishing a match among multiple impressions acquired from the same fingertip. Marking all the minutiae accurately as well as rejecting false minutiae is another issue still under research. Our work combines many methods to build a minutia extractor and a minutia matcher. The combination of multiple methods comes from a wide investigation of research papers. Some novel changes are also used in this work: segmentation using morphological operations, improved thinning, false minutiae removal methods, minutia marking with special consideration of triple branch counting, minutia unification by decomposing a branch into three terminations, and matching in a unified x-y coordinate system after a two-step transformation.
Keywords: Biometrics, Minutiae, Crossing number, False Accept Rate (FAR), False Reject Rate (FRR).
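Minutia marking on a thinned ridge image is usually done with the crossing-number test referenced in the keywords; a minimal sketch of that test is given below, assuming a binary skeleton image in which ridge pixels are 1 and one pixel wide (the toy image is a hypothetical example, not fingerprint data).

```python
import numpy as np

def minutiae_from_skeleton(skel):
    """Classify ridge pixels of a thinned binary image by crossing number:
    CN == 1 -> ridge ending, CN == 3 -> bifurcation."""
    endings, bifurcations = [], []
    rows, cols = skel.shape
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if skel[r, c] != 1:
                continue
            # 8-neighbours taken in circular order around the pixel
            p = [skel[r-1, c], skel[r-1, c+1], skel[r, c+1], skel[r+1, c+1],
                 skel[r+1, c], skel[r+1, c-1], skel[r, c-1], skel[r-1, c-1]]
            cn = sum(abs(p[i] - p[(i + 1) % 8]) for i in range(8)) // 2
            if cn == 1:
                endings.append((r, c))
            elif cn == 3:
                bifurcations.append((r, c))
    return endings, bifurcations

# toy skeleton: a horizontal ridge ending at (3, 1) and branching at (3, 5)
skel = np.zeros((7, 7), dtype=int)
skel[3, 1:6] = 1
skel[2, 5] = skel[4, 5] = 1
print(minutiae_from_skeleton(skel))
```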
3666 On Speeding Up Support Vector Machines: Proximity Graphs Versus Random Sampling for Pre-Selection Condensation
Authors: Xiaohua Liu, Juan F. Beltran, Nishant Mohanchandra, Godfried T. Toussaint
Abstract:
Support vector machines (SVMs) are considered to be the best machine learning algorithms for minimizing the predictive probability of misclassification. However, their drawback is that, for large data sets, the computation of the optimal decision boundary is a time-consuming function of the size of the training set. Hence, several methods have been proposed to speed up the SVM algorithm. Here, three methods used to speed up the computation of SVM classifiers are compared experimentally using a musical genre classification problem. The simplest method pre-selects a random sample of the data before the application of the SVM algorithm. Two additional methods use proximity graphs to pre-select data that are near the decision boundary: one uses k-nearest neighbor graphs and the other relative neighborhood graphs to accomplish the task.
Keywords: Machine learning, data mining, support vector machines, proximity graphs, relative-neighborhood graphs, k-nearest neighbor graphs, random sampling, training data condensation.
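A minimal sketch of the simplest of the three strategies, random-sample pre-selection, using scikit-learn on a synthetic dataset that stands in for the musical-genre data (the dataset, the 10% sample fraction and the SVM parameters are illustrative assumptions):

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

def train_and_score(X_sub, y_sub):
    """Train an RBF-kernel SVM on a (possibly condensed) training set."""
    start = time.perf_counter()
    clf = SVC(kernel="rbf").fit(X_sub, y_sub)
    return clf.score(X_test, y_test), time.perf_counter() - start

# full training set versus a 10% random condensation of it
acc_full, t_full = train_and_score(X_train, y_train)
idx = np.random.default_rng(0).choice(len(X_train), size=len(X_train) // 10, replace=False)
acc_sub, t_sub = train_and_score(X_train[idx], y_train[idx])

print(f"full set:   accuracy {acc_full:.3f}, training time {t_full:.1f} s")
print(f"10% sample: accuracy {acc_sub:.3f}, training time {t_sub:.1f} s")
```

The proximity-graph variants would replace the random draw with points selected from k-nearest neighbor or relative neighborhood graphs so that the retained points lie near the decision boundary.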
3665 Prediction of the Lateral Bearing Capacity of Short Piles in Clayey Soils Using Imperialist Competitive Algorithm-Based Artificial Neural Networks
Authors: Reza Dinarvand, Mahdi Sadeghian, Somaye Sadeghian
Abstract:
Prediction of the ultimate bearing capacity of piles (Qu) is one of the basic issues in geotechnical engineering. So far, several methods have been used to estimate Qu, including the recently developed artificial intelligence methods. In recent years, optimization algorithms have been used to minimize artificial neural network errors, such as colony algorithms, genetic algorithms, imperialist competitive algorithms, and so on. In the present research, artificial neural networks based on the imperialist competitive algorithm (ANN-ICA) were used, and their results were compared with other methods. The results of laboratory tests of short piles in clayey soils, with parameters such as pile diameter, pile buried length, eccentricity of load and undrained shear resistance of the soil, were used for modeling and evaluation. The results showed that ICA-based artificial neural networks predicted the lateral bearing capacity of short piles with a correlation coefficient of 0.9865 for training data and 0.975 for test data. Furthermore, the results of the model indicated the superiority of ICA-based artificial neural networks compared to back-propagation artificial neural networks as well as the Broms and Hansen methods.
Keywords: Lateral bearing capacity, short pile, clayey soil, artificial neural network, imperialist competitive algorithm.
3664 Analysing and Classifying VLF Transients
Authors: Ernst D. Schmitter
Abstract:
Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the analysis and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for this task and serve as input to a radial basis function network that is trained to discriminate transient shapes from pulse-like to wave-like. We concentrate on signals in the Very Low Frequency (VLF, 3–30 kHz) range in this paper, but the developed methods are independent of this specific choice.
Keywords: Transient signals, statistics, wavelets, neural networks
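A minimal sketch of the kind of combined wavelet-and-statistics feature vector such a pipeline could use, assuming PyWavelets is available; the wavelet family ('db4'), decomposition depth and the chosen statistics are assumptions, not the paper's actual feature set.

```python
import numpy as np
import pywt

def transient_features(signal, wavelet="db4", level=5):
    """Relative wavelet-level energies plus basic statistics for one transient."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    rel_energy = energies / energies.sum()                 # one value per decomposition level
    stats = [signal.mean(), signal.std(), np.abs(signal).max(),
             ((signal[:-1] * signal[1:]) < 0).mean()]      # zero-crossing rate
    return np.concatenate([rel_energy, stats])

# hypothetical pulse-like transient: damped 9 kHz oscillation plus noise
t = np.linspace(0.0, 1e-2, 1024)
sig = np.exp(-t / 2e-3) * np.sin(2 * np.pi * 9e3 * t) + 0.05 * np.random.randn(t.size)
print(transient_features(sig).round(3))
```

Feature vectors of this kind would then be fed to the radial basis function network for training and classification.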
3663 A Comparison of Energy Calculations for a Single-Family Detached Home with Two Energy Simulation Methods
Authors: Amir Sattari
Abstract:
For newly produced houses and energy renovations, an energy calculation needs to be conducted. This is done to verify whether the energy consumption of the house is in line with the norms set to reach the energy targets for 2020 and 2050. The main purpose of this study is to confirm whether easy-to-use energy calculation software or hand calculations, as used by small companies or individuals, give reasonable results compared to an advanced energy simulation program used by researchers or bigger companies. There are different methods for calculating energy consumption. In this paper, two energy calculation programs are used, and the relation of energy consumption to solar radiation is compared. A hand calculation is also done to check whether hand calculations are still reasonable. The two computer programs are TMF Energi (the simple energy calculation tool used by small companies or individuals) and IDA ICE - Indoor Climate and Energy (the advanced energy simulation program used by researchers or larger companies). The calculations are done for a standard house from the Swedish house supplier Fiskarhedenvillan. The method is based on using the same conditions and inputs in the different calculation forms so that the results can be compared and verified. The house was oriented in different directions to see how orientation affects the calculated energy consumption in each method. The simulation results are close to each other, and the hand calculation differs from the computer programs by only 5%. Even though solar factors differ with the orientation of the house, the energy calculation results from the different computer programs and even the hand calculation method are in line with each other.
Keywords: Energy calculation, energy consumption, energy simulation, IDA ICE, TMF Energi.
3662 3D Shape Knitting: Loop Alignment on a Surface with Positive Gaussian Curvature
Authors: C. T. Cheung, R. K. P. Ng, T. Y. Lo, Zhou Jinyun
Abstract:
This paper aims at manipulating loop alignment in knitting a three-dimensional (3D) shape according to its geometry. Two loop alignment methods are introduced to handle a surface with positive Gaussian curvature. As weft knitting is a two-dimensional (2D) knitting mechanism in which the knitting cam carrying the feeders moves in only two directions, left and right, the knitted fabric grows in width and length but not in depth. Therefore, a 3D shape must be flattened to a 2D plane, with its surface area preserved, for knitting. On this flattened plane, dimensional measurements are taken for loop alignment. The way these measurements are taken gave rise to two different loop alignment methods. In this paper, only the plain knitted structure was considered. Each knitted loop was taken as a basic unit for loop alignment in order to achieve the required geometric dimensions, without the inclusion of other stitches, which give textural dimensions to the fabric. The two loop alignment methods were tested and compared, and only one of them can successfully preserve the dimensions of the shape.
Keywords: 3D knitting, 3D shape, loop alignment, positive Gaussian curvature.
3661 Calculation of Voided Slabs Rigidities
Authors: Gee-Cheol Kim, Joo-Won Kang
Abstract:
A theoretical study of the rigidities of slabs with circular voids oriented in the longitudinal and in the transverse direction is discussed. Equations are presented for predicting the bending and torsional rigidities of voided slabs. This paper summarizes the results of an extensive literature search and an initial review of the current methods of analyzing voided slabs. The various methods of calculating the equivalent plate parameters, which are necessary for two-dimensional analysis, are also reviewed. Static deflections of voided slabs are shown to be in good agreement with the proposed equations.
Keywords: voided slab, bending rigidity, torsional rigidity, orthotropic plate
3660 Rapid Processing Techniques Applied to Sintered Nickel Battery Technologies for Utility Scale Applications
Authors: J. D. Marinaccio, I. Mabbett, C. Glover, D. Worsley
Abstract:
Through the use of novel rapid processing techniques such as screen printing and near-infrared (NIR) radiative curing, the process time for the sintering of nickel plaques, applicable to alkaline nickel battery chemistries, has been drastically reduced from in excess of 200 minutes with conventional convection methods to below 2 minutes using NIR curing. Steps have also been taken to remove the need for forming gas as a reducing agent by implementing carbon as an in-situ reducing agent within the ink formulation.
Keywords: Batteries, energy, iron, nickel, storage.
3659 Earthquake Classification in Molluca Collision Zone Using Conventional Statistical Methods
Authors: H. J. Wattimanela, U. S. Passaribu, N. T. Puspito, S. W. Indratno
Abstract:
The Molluca Collision Zone is located at the junction of the Eurasian, Australian, Pacific and Philippine plates. Between the Sangihe arc, to the west of the collision zone, and the Halmahera arc, to the east, the collision is active and convex toward the Molluca Sea. This research analyzes the behavior of earthquake occurrence in the Molluca Collision Zone: the distribution of earthquakes in each partition region, the type of distribution of earthquake occurrences per partition region, the mean occurrence of earthquakes in each partition region, and the correlation between the partition regions. We count the number of earthquakes using a partition method and analyze their behavior using conventional statistical methods. In this research, we used shallow earthquakes with magnitudes ≥ 4 on the Richter scale (period 1964-2013). From the results, we can classify the partition regions, based on the correlation, into two classes: strong and very strong. This classification can be used for early warning systems in disaster management.
Keywords: Molluca Collision Zone, partition regions, conventional statistical methods, Earthquakes, classifications, disaster management.
3658 Efficient Tools for Managing Uncertainties in Design and Operation of Engineering Structures
Authors: J. Menčík
Abstract:
Actual loads, material characteristics and other quantities often differ from the design values. This can cause impaired function, shorter life or failure of a civil engineering structure, a machine, a vehicle or another appliance. The paper shows the main causes of these uncertainties and deviations and presents a systematic approach and efficient tools for eliminating them or mitigating their consequences. Emphasis is put on the design stage, which is the most important for ensuring reliability. Principles of robust design and important tools are explained, including FMEA, sensitivity analysis and probabilistic simulation methods. The lifetime prediction of long-life objects can be improved by long-term monitoring of the load response and damage accumulation in operation. The condition evaluation of engineering structures, such as bridges, is often based on visual inspection and verbal description; here, methods based on fuzzy logic can reduce the subjective influences.
Keywords: Design, fuzzy methods, Monte Carlo, reliability, robust design, sensitivity analysis, simulation, uncertainties.
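A minimal sketch of the kind of probabilistic simulation referred to here: a Monte Carlo estimate of the failure probability for a simple load-versus-resistance limit state. The distributions and their parameters are illustrative assumptions only, not values from the paper.

```python
import numpy as np

def monte_carlo_failure_probability(n_samples=1_000_000, seed=0):
    """Estimate P(failure) for the limit state g = R - S, with resistance R
    and load effect S sampled from assumed distributions."""
    rng = np.random.default_rng(seed)
    resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n_samples)  # kN
    load = rng.normal(loc=200.0, scale=30.0, size=n_samples)                    # kN
    return np.mean(resistance - load < 0.0)

print(f"estimated failure probability: {monte_carlo_failure_probability():.1e}")
```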
3657 Combining Bagging and Boosting
Authors: S. B. Kotsiantis, P. E. Pintelas
Abstract:
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble using a voting methodology over a bagging and a boosting ensemble with 10 sub-classifiers each. We performed a comparison with simple bagging and boosting ensembles with 25 sub-classifiers, as well as other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
Keywords: data mining, machine learning, pattern recognition.
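A minimal sketch of such a combined ensemble using scikit-learn: a soft-voting committee over a bagging ensemble and a boosting ensemble of 10 decision-tree sub-classifiers each. The dataset, base learner and voting scheme are illustrative assumptions rather than the exact setup of the paper.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score

# bagging and boosting ensembles, each with 10 decision-tree sub-classifiers (the default base learner)
bagging = BaggingClassifier(n_estimators=10, random_state=0)
boosting = AdaBoostClassifier(n_estimators=10, random_state=0)
combined = VotingClassifier(estimators=[("bagging", bagging), ("boosting", boosting)],
                            voting="soft")

X, y = load_breast_cancer(return_X_y=True)
for name, model in [("bagging", bagging), ("boosting", boosting), ("combined", combined)]:
    accuracy = cross_val_score(model, X, y, cv=10).mean()
    print(f"{name:>9}: {accuracy:.3f}")
```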
3656 New Newton's Method with Third-order Convergence for Solving Nonlinear Equations
Authors: Osama Yusuf Ababneh
Abstract:
In recent years, variants of Newton's method with cubic convergence have become popular iterative methods for finding approximate solutions to the roots of nonlinear equations. These methods enjoy cubic convergence at simple roots and do not require the evaluation of second-order derivatives. In this paper, we present a new Newton-type method based on the contraharmonic mean which is cubically convergent. Numerical examples show that the new method can compete with the classical Newton's method.
Keywords: Third-order convergence, non-linear equations, root finding, iterative method.
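One plausible form of such a contraharmonic-mean variant is sketched below for illustration: the derivative in the Newton correction is replaced by the contraharmonic mean of f'(x) and f'(y), where y is the ordinary Newton step. This is an assumption consistent with the abstract, not necessarily the exact scheme of the paper.

```python
def contraharmonic_newton(f, df, x0, tol=1e-12, max_iter=50):
    """Two-step Newton-type iteration that divides f(x) by the contraharmonic
    mean of the derivatives at x and at the ordinary Newton point y."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        d1 = df(x)
        y = x - fx / d1                            # ordinary Newton step
        d2 = df(y)
        mean = (d1**2 + d2**2) / (d1 + d2)         # contraharmonic mean of d1 and d2
        x = x - fx / mean
    return x

# example: root of x^3 - 2x - 5 = 0 (the classical Newton test equation)
root = contraharmonic_newton(lambda x: x**3 - 2*x - 5, lambda x: 3*x**2 - 2, x0=2.0)
print(root)   # approx. 2.0945514815423265
```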
3655 Selecting an Advanced Creep Model or a Sophisticated Time-Integration? A New Approach by Means of Sensitivity Analysis
Authors: Holger Keitel
Abstract:
The prediction of long-term deformations of concrete and reinforced concrete structures has been a field of extensive research, and several different creep models have been developed so far. Most of the models were developed for constant concrete stresses; thus, in the case of varying stresses, a specific superposition principle or time-integration is necessary. Nowadays, when modeling concrete creep, the engineering focus is more on the application of sophisticated time-integration methods than on choosing the more appropriate creep model. For this reason, this paper presents a method to quantify the uncertainties of creep prediction originating from the selection of creep models or from the time-integration methods. By adapting variance-based global sensitivity analysis, a methodology is developed to quantify the influence of creep model selection or the choice of time-integration method. Applying the developed method, general recommendations on how to model creep behavior under varying stresses are given.
Keywords: Concrete creep models, time-integration methods, sensitivity analysis, prediction uncertainty.
3654 Selecting Negative Examples for Protein-Protein Interaction
Authors: Mohammad Shoyaib, M. Abdullah-Al-Wadud, Oksam Chae
Abstract:
Proteomics is one of the largest areas of research in bioinformatics and medical science. An ambitious goal of proteomics is to elucidate the structure, interactions and functions of all proteins within cells and organisms. Predicting Protein-Protein Interaction (PPI) is one of the crucial and decisive problems in current research. Genomic data offer a great opportunity and at the same time many challenges for the identification of these interactions. In the case of in-silico identification, most methods require both positive and negative examples of protein interaction, and the quality of these examples is crucial for the final prediction accuracy. Positive examples are relatively easy to obtain from well-known databases, but the generation of negative examples is not a trivial task. Current PPI identification methods generate negative examples based on assumptions that are likely to affect their prediction accuracy. Hence, if more reliable negative examples are used, PPI prediction methods may achieve even higher accuracy. Focusing on this issue, a graph-based negative example generation method is proposed, which is simple and more accurate than the existing approaches. An interaction graph of the protein sequences is created. The basic assumption is that the longer the shortest path between two protein sequences in the interaction graph, the lower the possibility of their interaction. A well-established PPI detection algorithm is employed with our negative examples, and in most cases it increases the accuracy by more than 10% compared with the previous negative pair selection method.
Keywords: Interaction graph, Negative training data, Protein-Protein interaction, Support vector machine.
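A minimal sketch of that selection rule using NetworkX: build the interaction graph from the known positive pairs, then keep as negative examples only those non-interacting pairs whose shortest-path distance is large. The toy edge list and the distance threshold of 3 are illustrative assumptions.

```python
import itertools
import networkx as nx

def negative_examples(graph, min_distance=3):
    """Protein pairs whose shortest path in the interaction graph is at least
    `min_distance` (or that are disconnected) are taken as negative examples."""
    lengths = dict(nx.all_pairs_shortest_path_length(graph))
    negatives = []
    for a, b in itertools.combinations(graph.nodes, 2):
        if graph.has_edge(a, b):
            continue
        if lengths[a].get(b, float("inf")) >= min_distance:
            negatives.append((a, b))
    return negatives

# hypothetical known (positive) interactions forming the interaction graph
positives = [("P1", "P2"), ("P2", "P3"), ("P3", "P4"), ("P4", "P5"), ("P5", "P6")]
G = nx.Graph(positives)
print(negative_examples(G))
```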
3653 Using Structural Equation Modeling in Causal Relationship Design for Balanced-Scorecards' Strategic Map
Authors: A. Saghaei, R. Ghasemi
Abstract:
Through the 1980s, management accounting researchers described the increasing irrelevance of traditional control and performance measurement systems. The Balanced Scorecard (BSC) is a critical business tool for many organizations. It is a performance measurement system that translates mission and strategy into objectives. The strategy map approach is a variant of the BSC in which the necessary causal relations between objectives must be established. To recognize these relations, experts usually rely on experience; it is also possible to use regression for the same purpose. Structural Equation Modeling (SEM), which is one of the most powerful methods of multivariate data analysis, obtains more appropriate results than traditional methods such as regression. In the present paper, we propose SEM for the first time to identify the relations between objectives in the strategy map, together with a test to measure the importance of the relations. In SEM, factor analysis and hypothesis testing are done in the same analysis. SEM is known to be better than other techniques at supporting analysis and reporting. Our approach provides a framework that permits experts to design the strategy map by applying a comprehensive and scientific method together with their experience. Therefore, this scheme is more reliable than previously established methods.
Keywords: BSC, SEM, Strategy map.
3652 Select-Low and Select-High Methods for the Wheeled Robot Dynamic States Control
Authors: Bogusław Schreyer
Abstract:
The paper examines two methods of wheeled robot braking torque control. These two methods are applied when the adhesion coefficient under the left-side wheels differs from the adhesion coefficient under the right-side wheels. In the select-low (SL) method, the braking torque on both wheels is controlled by the signals originating from the wheels on the side with the lower adhesion. In the select-high (SH) method, the torque is controlled by the signals originating from the wheels on the side with the higher adhesion. The SL method ensures stable and safe robot behavior during the braking process; however, its efficiency is relatively low. The SH method is more efficient in terms of time and braking distance, but in some situations it may cause wheel locking. It is important to monitor the velocity of all wheels and then decide on the braking torque distribution accordingly. In the SH method, the braking torque slope may require a significant decrease in order to avoid wheel locking.
Keywords: Select-high method, select-low method, torque distribution, wheeled robot.
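A minimal sketch of the selection logic itself, with per-side wheel slip standing in for the control signals (the function, threshold and numbers are hypothetical, not from the paper): select-low derives the common braking torque command from the side with the worse adhesion, select-high from the side with the better adhesion.

```python
def braking_torque_command(slip_left, slip_right, torque_request,
                           mode="SL", slip_limit=0.2):
    """Scale a requested braking torque from per-side wheel slip estimates.

    mode "SL": follow the side with the larger slip (lower adhesion) -> stable but conservative.
    mode "SH": follow the side with the smaller slip (higher adhesion) -> efficient,
               but may require a reduced torque slope to avoid wheel locking.
    """
    reference_slip = max(slip_left, slip_right) if mode == "SL" else min(slip_left, slip_right)
    # simple proportional reduction once the reference slip exceeds the limit
    scale = min(1.0, slip_limit / reference_slip) if reference_slip > 0 else 1.0
    return torque_request * scale

# split-adhesion example: left wheels on ice (high slip), right wheels on asphalt
print(braking_torque_command(0.35, 0.10, torque_request=100.0, mode="SL"))  # ~57, torque reduced
print(braking_torque_command(0.35, 0.10, torque_request=100.0, mode="SH"))  # 100, full torque kept
```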
3651 Various Speech Processing Techniques For Speech Compression And Recognition
Authors: Jalal Karam
Abstract:
Extensive research in the field of speech processing for compression and recognition over the last five decades has resulted in intense competition among the various methods and paradigms introduced. In this paper, we cover the different representations of speech in the time-frequency and time-scale domains for the purpose of compression and recognition, and examine these representations across a variety of related work. In particular, we emphasize methods related to Fourier analysis paradigms and wavelet-based ones, along with the advantages and disadvantages of both approaches.
Keywords: Time-Scale, Wavelets, Time-Frequency, Compression, Recognition.
3650 A Study of Various Numerical Turbulence Modeling Methods in Boundary Layer Excitation of a Square Ribbed Channel
Authors: Hojjat Saberinejad, Adel Hashiehbaf, Ehsan Afrasiabian
Abstract:
Among the various cooling processes in industrial applications, such as electronic devices, heat exchangers and gas turbines, gas turbine blade cooling is the most challenging. One of the most common practices is using a ribbed wall, which excites the boundary layer and thereby provides the ultimate cooling. Vortex formation between the rib and the channel wall results in a complicated flow regime. On the other hand, selecting the most efficient method for reproducing experimental results is an important issue. In this paper, four common turbulence modeling methods (the standard k-ε, the rationalized k-ε with enhanced wall boundary layer treatment, the k-ω, and the Reynolds stress model, RSM) are applied to a square ribbed channel to investigate the separation and thermal behavior of the flow in the channel. Finally, all results from the different methods used in this paper are compared with experimental data available in the literature to assess the accuracy of each numerical method.
Keywords: boundary layer, turbulence, numerical method, rib cooling
3649 A Comparative Study of Malware Detection Techniques Using Machine Learning Methods
Authors: Cristina Vatamanu, Doina Cosovan, Dragoş Gavriluţ, Henri Luchian
Abstract:
In the past few years, the amount of malicious software has increased exponentially and, therefore, machine learning algorithms have become instrumental in identifying clean and malware files through (semi-)automated classification. When working with very large datasets, the major challenge is to reach both a very high malware detection rate and a very low false positive rate. Another challenge is to minimize the time needed for the machine learning algorithm to do so. This paper presents a comparative study of different machine learning techniques such as linear classifiers, ensembles, decision trees and various hybrids thereof. The training dataset consists of approximately 2 million clean files and 200,000 infected files, which is a realistic quantitative mixture. The paper investigates the above-mentioned methods with respect to both their performance (detection rate and false positive rate) and their practicability.
Keywords: Detection Rate, False Positives, Perceptron, One Side Class, Ensembles, Decision Tree, Hybrid methods, Feature Selection.
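A minimal sketch of how such a comparison can be scored, using scikit-learn on a small synthetic stand-in dataset (the features, class balance and the three classifiers are illustrative assumptions): each model is judged by its detection rate (recall on the malware class) and its false positive rate on clean files.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import Perceptron
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# synthetic stand-in: label 1 = malware (minority class), label 0 = clean
X, y = make_classification(n_samples=20000, n_features=30, weights=[0.9, 0.1], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y, random_state=1)

models = {"perceptron": Perceptron(),
          "decision tree": DecisionTreeClassifier(max_depth=8),
          "ensemble": RandomForestClassifier(n_estimators=100)}

for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    tn, fp, fn, tp = confusion_matrix(y_te, pred).ravel()
    detection_rate = tp / (tp + fn)            # malware correctly flagged
    false_positive_rate = fp / (fp + tn)       # clean files wrongly flagged
    print(f"{name:>13}: detection {detection_rate:.3f}, FPR {false_positive_rate:.3f}")
```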
3648 Dynamic Safety-Stock Calculation
Authors: Julian Becker, Wiebke Hartmann, Sebastian Bertsch, Johannes Nywlt, Matthias Schmidt
Abstract:
In order to ensure a high service level, industrial enterprises have to maintain safety-stock, which at the same time directly influences economic efficiency. This paper analyses established mathematical methods to calculate safety-stock. The performance, measured in stock level and service level, is appraised and the limits of the various methods are shown. Afterwards, a new dynamic approach is presented that provides a comprehensive method to calculate safety-stock and also takes knowledge of future volatility into account.
Keywords: Inventory dimensioning, material requirement planning, safety-stock calculation.
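For reference, a minimal sketch of one of the established static formulas such a paper would appraise: safety stock sized from a service-level factor and the demand variability over the replenishment lead time. The numbers in the example are illustrative assumptions.

```python
from statistics import NormalDist

def static_safety_stock(service_level, demand_std_per_period, lead_time_periods):
    """Classic static formula: SS = z * sigma_demand * sqrt(lead time)."""
    z = NormalDist().inv_cdf(service_level)            # service-level factor
    return z * demand_std_per_period * lead_time_periods ** 0.5

# example: 95% service level, demand std of 40 units/week, 4-week lead time
print(f"safety stock = {static_safety_stock(0.95, 40.0, 4):.0f} units")
```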