Search results for: Analysis methods
10434 Comparative Canadian Online News Coverage Analysis of Sex Trafficking Reported Cases in Ontario and Nova Scotia
Authors: Alisha Fisher
Abstract:
Sex trafficking is a worldwide crisis that requires trauma-informed and survivor-centered media attention to accurately disseminate information. Much of the previous literature on sex trafficking focuses on the frequency of incidents, interventions, and support strategies for survivors, with few studies examining how the media report sex trafficking cases to the public. Utilizing media reports of sex trafficking cases in the two Canadian provinces with the highest number of cases, Ontario and Nova Scotia, we sought to analyze the similarities and differences in how sex trafficking cases were reported. A total of 20 articles were examined, 10 from the province of Ontario and the remaining 10 from the province of Nova Scotia. We coded in two passes: first, who the article was about, and second, the framing and content inclusion. The results suggest a heavy use of, and reliance on, the voices and images of authorities, with men of color shown as the perpetrators and white women shown as the survivors. These findings can aid in expanding trauma-informed, survivor-centered media literacy around sex trafficking reporting and in developing robust, intersectional approaches to reporting such cases.
Keywords: Sex trafficking, media coverage, Canada sex trafficking, content analysis.
10433 Attribute Weighted Class Complexity: A New Metric for Measuring Cognitive Complexity of OO Systems
Authors: Dr. L. Arockiam, A. Aloysius
Abstract:
In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA), and so on. Researchers have developed several techniques, methods, and metrics based on different factors for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a complexity measure named Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of a class and of its derived classes. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same weight, whereas in general the cognitive load of understanding different types of attributes is not the same. We therefore propose a new metric named Attribute Weighted Class Complexity (AWCC), in which cognitive weights are assigned to attributes based on the effort needed to understand their data types. Through case studies and experiments, the proposed metric has proved to be a better measure of the complexity of classes with attributes.
Keywords: Software complexity, Attribute Weighted Class Complexity, Weighted Class Complexity, data type.
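To make the weighting scheme concrete, here is a minimal Python sketch of how an AWCC-style sum might be computed; the data-type weight table and the method weights are hypothetical placeholders, not the values proposed in the paper.

```python
# A minimal sketch of an AWCC-style computation, assuming a hypothetical
# weight table for data types; the paper's actual weights are not given here.
DATA_TYPE_WEIGHTS = {"int": 1, "float": 1, "str": 1, "list": 2, "dict": 3}  # assumed

def awcc(attribute_types, method_weights):
    """Sum data-type-derived attribute weights and cognitive method weights."""
    attr_weight = sum(DATA_TYPE_WEIGHTS.get(t, 1) for t in attribute_types)
    return attr_weight + sum(method_weights)

# Example: a class with an int and a dict attribute, and two methods of
# cognitive weight 2 and 3.
print(awcc(["int", "dict"], [2, 3]))  # -> 9
```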
10432 Analysis of Aiming Performance for Games Using Mapping Method of Corneal Reflections Based on Two Different Light Sources
Authors: Yoshikazu Onuki, Itsuo Kumazawa
Abstract:
The fundamental motivation of this paper is how gaze estimation can be utilized effectively in games. In games, precise point-of-gaze estimation is not always essential for aiming at targets; the ability to move a cursor to an aiming target accurately is equally significant. From a game production point of view, expressing head movement and gaze movement separately can be advantageous for conveying a sense of presence; a representative example is panning a background image according to head movement while moving a cursor according to gaze movement. The widely used technique for point-of-gaze (POG) estimation is based on the relative position between the center of the corneal reflection of infrared light sources and the center of the pupil. However, calculating the center of the pupil requires relatively complicated image processing, so computational delay is a concern, since minimizing input delay is one of the most significant requirements in games. In this paper, a method to estimate head movement using only the corneal reflections of two infrared light sources in different locations is proposed. Furthermore, a method to control a cursor using gaze movement as well as head movement is proposed. The proposed methods are evaluated using game-like applications; the results confirm performance similar to conventional methods, with aiming control obtained at lower computational cost and with stress-free, intuitive operation.
Keywords: Point-of-gaze, gaze estimation, head movement, corneal reflections, two infrared light sources, game.
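As a rough illustration of how corneal reflections alone can carry head-movement information, the following Python sketch attributes the displacement of the glint midpoint between frames to head motion; the two glint positions per frame and the midpoint heuristic are illustrative assumptions, not the authors' actual mapping method.

```python
import numpy as np

# A minimal sketch, assuming each video frame yields the 2D image positions of
# the two corneal reflections (glints) of the two infrared light sources.
def head_shift(glints_ref, glints_cur):
    """Estimate head movement as the displacement of the glint midpoint.

    glints_ref, glints_cur: (2, 2) arrays of the two glint centers (pixels).
    """
    mid_ref = np.mean(glints_ref, axis=0)
    mid_cur = np.mean(glints_cur, axis=0)
    return mid_cur - mid_ref  # pixel displacement attributed to head motion

print(head_shift(np.array([[100, 120], [140, 122]]),
                 np.array([[104, 125], [144, 127]])))  # -> [4. 5.]
```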
10431 The Vulnerability Analysis of Java Bytecode Based on Points-to Dataflow
Authors: Tang Hong, Zhang Lufeng, Chen Hua, Zhang Jianbo
Abstract:
Today many developers use Java components collected from the Internet as external libraries to design and develop their own software. However, unknown security bugs may exist in these components; for example, an SQL injection bug may come from components that perform no specific checks on user input strings. Detecting these bugs is very difficult without source code. A novel method to check for bugs in Java bytecode based on points-to dataflow analysis is therefore needed, distinct from the common analysis techniques based on vulnerability pattern checking. It can be used as an assistant tool for security analysis of Java bytecode from unknown software that will be used as external libraries.
Keywords: Java bytecode, points-to dataflow, vulnerability analysis.
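As a flavor of what such a dataflow analysis tracks, here is a toy Python sketch that propagates taint from a user-input source to an SQL sink over a straight-line instruction list; the instruction format and the sink name are illustrative assumptions, not the paper's bytecode model.

```python
# A toy sketch of forward taint propagation in the spirit of points-to/dataflow
# analysis; real bytecode analysis must also handle heap objects and branches.
SINKS = {"executeQuery"}                     # assumed SQL sink

def propagate(instructions):
    tainted = set()
    findings = []
    for op, dst, srcs in instructions:
        if op == "input":                    # user-controlled source
            tainted.add(dst)
        elif op == "assign" and tainted & set(srcs):
            tainted.add(dst)                 # taint flows through data deps
        elif op == "call" and dst in SINKS and tainted & set(srcs):
            findings.append((dst, srcs))     # tainted data reaches a sink
    return findings

prog = [("input", "s", []), ("assign", "q", ["s"]), ("call", "executeQuery", ["q"])]
print(propagate(prog))                       # -> [('executeQuery', ['q'])]
```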
10430 Advanced Neural Network Learning Applied to Pulping Modeling
Authors: Z. Zainuddin, W. D. Wan Rosli, R. Lanouette, S. Sathasivam
Abstract:
This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to fully specify the structure of the model. Two different types of neural networks were applied to the pulping problem: three-layer feedforward neural networks, trained using Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the eigenvalue clustering. The idea is to solve the modified problem M^-1 Ax = M^-1 b, where M is a positive-definite preconditioner closely related to A. We mainly focused on Preconditioned Conjugate Gradient-based training methods originating from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP), and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
Keywords: Convergence, pulping modeling, neural networks, preconditioned conjugate gradient.
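For readers unfamiliar with the technique, the following is a minimal Python/NumPy sketch of PCG with a Jacobi (diagonal) preconditioner on a small linear system; it shows the generic linear-algebra form of the iteration for M^-1 Ax = M^-1 b under assumed data, not the authors' network-training variant.

```python
import numpy as np

# A minimal sketch of preconditioned conjugate gradient with a Jacobi
# preconditioner; A and b are illustrative assumptions.
def pcg(A, b, tol=1e-8, max_iter=100):
    M_inv = 1.0 / np.diag(A)                  # Jacobi preconditioner M = diag(A)
    x = np.zeros_like(b)
    r = b - A @ x                             # residual
    z = M_inv * r                             # preconditioned residual
    p = z.copy()
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv * r_new
        beta = (r_new @ z_new) / (r @ z)      # Fletcher-Reeves-style update
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(pcg(A, b))                              # ~[0.0909, 0.6364]
```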
10429 Comparative Analysis of Photovoltaic Systems
Authors: Irtaza M. Syed, Kaamran Raahemifar
Abstract:
This paper presents a comparative analysis of photovoltaic systems (PVS) and proposes practical techniques to improve their operational efficiency. The best engineering and construction practices for PVS are identified and field-oriented recommendations are made. Comparative analyses of central and string inverter based PVS, as well as of 600 and 1000 VDC PVS, are performed. In addition, direct current (DC) and alternating current (AC) photovoltaic (PV) module based systems are compared. The comparison shows that the 1000 VDC string-inverter-based PVS is the best choice.
Keywords: Photovoltaic module, photovoltaic systems, operational efficiency improvement, comparative analysis.
10428 The Incidence of Obesity among Adult Women in Pekanbaru City, Indonesia, Related to High Fat Consumption, Stress Level, and Physical Activity
Authors: Yudia Mailani Putri, Martalena Purba, B. J. Istiti Kandarina
Abstract:
Background: Obesity has been recognized as a global health problem. Individuals classified as overweight and obese are increasing at an alarming rate. This condition is associated with psychological and physiological problems. As a person reaches adulthood, somatic growth ceases; at this stage, the human body has developed fully, to a stable state. As the capital of Riau Province in Indonesia, Pekanbaru is dominated by an ethnic Malay population habitually consuming cholesterol-rich fatty foods as a daily menu, a trigger for the onset of obesity resulting in a high prevalence of degenerative diseases. Research objectives: The aim of this study is to elaborate the relationship between high-fat consumption patterns, stress level, physical activity, and the incidence of obesity in adult women in Pekanbaru city. Research methods: This study combined research methods. The first stage was a quantitative observational, analytical cross-sectional design with adult women aged 20-40 living in Pekanbaru city; the sample consisted of 200 women with BMI≥25, and the sample data were processed with univariate, bivariate (correlation and simple linear regression), and multivariate (multiple linear regression) analyses. The second phase was a qualitative descriptive study with purposive sampling using in-depth interviews; six participants withdrew from the study. Results: According to the bivariate analysis, there are relationships between the incidence of obesity and the pattern of high-fat food consumption: energy intake (p≤0.000; r=0.536), protein intake (p≤0.000; r=0.307), fat intake (p≤0.000; r=0.416), carbohydrate intake (p≤0.000; r=0.430), frequency of fatty food consumption (p≤0.000; r=0.506), and frequency of viscera food consumption (p≤0.000; r=0.535). There is a relationship between physical activity and the incidence of obesity (p≤0.000; r=-0.631). However, there is no relationship between the level of stress (p=0.741; r=-0.019) and the incidence of obesity. Physical activity is the predominant factor in the incidence of obesity in adult women in Pekanbaru city. Conclusion: There are relationships between high-fat food consumption patterns, physical activity, and the incidence of obesity in Pekanbaru city, with physical activity the predominant factor in the occurrence of obesity, alongside the persistent pattern of high-fat food consumption.
Keywords: Obesity, adult, high in fat, stress, physical activity, consumption pattern.
10427 Identification of Coauthors in Scientific Database
Authors: Thiago M. R Dias, Gray F. Moita
Abstract:
The analysis of scientific collaboration networks has contributed significantly to understanding how collaboration between researchers takes place and how the scientific production of researchers or research groups evolves. However, identifying collaborations in large scientific databases is not a trivial task, given the high computational cost of the commonly used methods. This paper proposes a method for identifying collaborations in a large database of researchers' curricula. The proposed method has low computational cost and produces satisfactory results, proving to be an interesting alternative for the modeling and characterization of large scientific collaboration networks.
Keywords: Extraction and data integration, Information Retrieval, Scientific Collaboration.
10426 Investigation of Relationship between Organizational Climate and Organizational Citizenship Behavior: A Research on Health Sector
Authors: Serdar Öge, Pınar Erdogan
Abstract:
The main objective of this research is to describe the relationship between organizational climate and organizational citizenship behavior. In order to examine this relationship, a study is to be carried out in relevant institutions and organizations operating in the health sector in Turkey. Whether there is a statistically significant relationship between organizational climate and organizational citizenship behavior will be investigated through related scientific research methods and statistical analyses. In addition, the relationships between the dimensions of organizational climate and the organizational citizenship behavior subscales will be examined statistically.
Keywords: Organizational climate, organizational citizenship, organizational citizenship behavior, climate.
10425 Effects of Capacitor Bank Defects on Harmonic Distortion and Park's Pattern Analysis in Induction Motors
Authors: G. Das, S. Das, P. Purkait, A. Dasgupta, M. Kumar
Abstract:
Properly sized capacitor banks are connected across induction motors for several reasons, including power factor correction, reducing distortion, and increasing capacity. Total harmonic distortion (THD) and power factor (PF) are used in such cases to quantify the improvements obtained through connection of the external capacitor banks. On the other hand, one of the methods for assessing the motor's internal condition is the use of Park's pattern analysis. In spite of adequate precautionary measures, the capacitor banks may sometimes malfunction, and such a minor fault in the capacitor bank is often not readily discernible. It may, however, give rise to substantial degradation of power factor correction performance and may also degrade the supply profile. The situation is made more severe by the fact that the Park's pattern gets distorted by such external capacitor faults and can give anomalous results in motor internal fault analyses. The aim of this paper is to present simulation and hardware laboratory test results to develop an understanding of the anomalies in harmonic distortion and Park's pattern analyses in induction motors due to capacitor bank defects.
Keywords: Capacitor bank, harmonic distortion, induction motor, Park's pattern, PSCAD simulation.
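For context, THD is the ratio of the RMS of the harmonic components to that of the fundamental. The following Python sketch computes it from an FFT of a synthetic waveform; the sampling rate, fundamental frequency, and harmonic content are illustrative assumptions.

```python
import numpy as np

# A minimal sketch of computing total harmonic distortion (THD) via the FFT.
fs, f0 = 10_000, 50                      # sampling rate (Hz), fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)            # 0.2 s = integer number of cycles
x = np.sin(2*np.pi*f0*t) + 0.1*np.sin(2*np.pi*3*f0*t)  # fundamental + 3rd harmonic

spec = np.abs(np.fft.rfft(x))
k0 = int(round(f0 * len(t) / fs))        # FFT bin of the fundamental
harmonics = [spec[k0 * h] for h in range(2, 10)]
thd = np.sqrt(sum(a**2 for a in harmonics)) / spec[k0]
print(f"THD = {thd:.1%}")                # ~10% for this synthetic signal
```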
10424 Numerical Modelling of Crack Initiation around a Wellbore Due to Explosion
Authors: Meysam Lak, Mohammad Fatehi Marji, Alireza Yarahamdi Bafghi, Abolfazl Abdollahipour
Abstract:
A wellbore is a hole that is drilled to aid in the exploration and recovery of natural resources, including oil and gas. Occasionally, well stimulation methods are used in order to increase the productivity index and the porosity of the wellbore and reservoir; hydraulic fracturing is one of these methods. Moreover, several explosions at the end of the well can stimulate the reservoir and create fractures around it. In this study, crack initiation in the rock around a wellbore due to explosion has been numerically modeled. One, two, three, and four pairs of explosive charges were set at the end of the wellbore, on its wall. After each stage of the explosion, results are presented and discussed. The results show that this method can initiate, and probably propagate, several fractures around the wellbore.
Keywords: Crack initiation, explosion, finite difference modelling, well productivity.
10423 Automatic Intelligent Analysis of Malware Behaviour
Authors: H. Dornhackl, K. Kadletz, R. Luh, P. Tavolato
Abstract:
In this paper, we describe the use of formal methods to model malware behaviour. The modelling of harmful behaviour rests upon syntactic structures that represent malicious procedures inside malware. The malicious activities are modelled by a formal grammar, in which the components of API calls are the terminals and sets of API calls used in combination to achieve a goal are designated as non-terminals. The combination of different non-terminals in various ways and tiers makes up the attack vectors used by harmful software. Based on these syntactic structures, a parser can be generated which takes execution traces as input for pattern recognition.
Keywords: Malware behaviour, modelling, parsing, search, pattern matching.
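To illustrate the idea, here is a toy Python recognizer that checks whether an API-call trace contains the terminal sequences of some non-terminals; the two rules are invented examples, not the paper's grammar, and a generated parser would be far more expressive than this subsequence matcher.

```python
import re

# A toy sketch of grammar-based trace matching with hypothetical rules:
# non-terminal -> ordered sequence of terminals (API calls).
RULES = {
    "SelfReplication": ["GetModuleFileName", "CopyFile"],
    "Persistence": ["RegOpenKey", "RegSetValue"],
}

def match_behaviours(trace):
    """Return the non-terminals whose terminal sequence occurs, in order, in the trace."""
    joined = " ".join(trace)
    return [nt for nt, seq in RULES.items()
            if re.search(r"\b" + r"\b.*\b".join(seq) + r"\b", joined)]

trace = ["GetModuleFileName", "CopyFile", "RegOpenKey", "RegSetValue"]
print(match_behaviours(trace))  # -> ['SelfReplication', 'Persistence']
```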
10422 Comparative Study of QRS Complex Detection in ECG
Authors: Ibtihel Nouira, Asma Ben Abdallah, Ibtissem Kouaja, Mohamed Hèdi Bedoui
Abstract:
The processing of the electrocardiogram (ECG) signal consists essentially of detecting the characteristic points of the signal, which are an important tool in the diagnosis of heart diseases; the most relevant of these is the detection of the R waves. In this paper, we present various mathematical tools used for filtering the ECG, based on digital filtering and Discrete Wavelet Transform (DWT) filtering. In addition, this paper includes two main R-peak detection methods that apply a windowing process: the first method is based on derivative calculations; the second is a time-frequency method based on the Dyadic Wavelet Transform (DyWT).
Keywords: Derivative calculation methods, electrocardiogram, R peaks, Wavelet Transform.
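As a minimal illustration of windowed R-peak picking (not the paper's derivative- or DyWT-based detectors), the following Python sketch finds peaks in a synthetic ECG-like trace; the sampling rate, amplitude threshold, and refractory distance are assumed values.

```python
import numpy as np
from scipy.signal import find_peaks

# A minimal sketch of R-peak detection on a synthetic trace.
fs = 250                                   # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros_like(t)
ecg[fs // 2::fs] = 1.0                     # crude R spikes once per second
ecg += 0.05 * np.random.randn(len(t))      # measurement noise

# Enforce a minimum amplitude and a ~0.4 s refractory period between peaks.
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
print("R peaks at t =", t[peaks])          # ~0.5, 1.5, ..., 9.5 seconds
```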
10421 A Study on the Relation among Primary Care Professionals Serving the Disadvantaged Community, Socioeconomic Status, and Adverse Health Outcome
Authors: Chau-Kuang Chen, Juanita Buford, Colette Davis, Raisha Allen, John Hughes, Jr., James Tyus, Dexter Samuels
Abstract:
During the post-Civil War era, the city of Nashville, Tennessee, had the highest mortality rate in the United States. The elevated death and disease rates among former slaves were attributable to lack of quality healthcare. To address the paucity of healthcare services, Meharry Medical College, an institution with the mission of educating minority professionals and serving the underserved population, was established in 1876. Purpose: The social ecological framework and partial least squares (PLS) path modeling were used to quantify the impact of socioeconomic status and adverse health outcome on primary care professionals serving the disadvantaged community. The study results could thus demonstrate the accomplishment of the College's mission of training primary care professionals to serve in underserved areas. Methods: Various statistical methods were used to analyze alumni data from 1975-2013. K-means cluster analysis was utilized to assign individual medical and dental graduates to clusters of practice communities (disadvantaged or non-disadvantaged). Discriminant analysis was implemented to verify the classification accuracy of the cluster analysis. The independent t-test was performed to detect significant mean differences of the respective clustering and criterion variables. A chi-square test was used to test whether the proportions of primary care and non-primary care specialists are consistent with those of medical and dental graduates practicing in the designated community clusters. Finally, the PLS path model was constructed to explore the construct validity of the analytic model by providing the magnitudes of the effects of socioeconomic status and adverse health outcome on primary care professionals serving the disadvantaged community. Results: Approximately 83% (3,192/3,864) of Meharry Medical College's medical and dental graduates from 1975 to 2013 were practicing in disadvantaged communities. The independent t-test confirmed the content validity of the cluster analysis model. Also, the PLS path modeling demonstrated that alumni served as primary care professionals in communities with significantly lower socioeconomic status and higher adverse health outcome (p < .001). The PLS path modeling exhibited a meaningful interrelation between the communities where primary care professionals practice and the surrounding environment (socioeconomic status and adverse health outcome), which yielded model reliability, validity, and applicability. Conclusion: This study applied social ecological theory and analytic modeling approaches to assess the attainment of Meharry Medical College's mission of training primary care professionals to serve in underserved areas, particularly in communities with low socioeconomic status and high rates of adverse health outcomes. In summary, the majority of medical and dental graduates from Meharry Medical College provided primary care services to disadvantaged communities with low socioeconomic status and high adverse health outcome, which demonstrates that Meharry Medical College has fulfilled its mission. The high reliability, validity, and applicability of this model imply that it could be replicated for comparable universities and colleges elsewhere.
Keywords: Disadvantaged community, K-means cluster analysis, PLS path modeling, primary care.
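As a schematic of the clustering step, here is a Python sketch that groups synthetic communities on two assumed features (median income and an adverse-outcome rate); the study's real variables and alumni data are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

# A minimal sketch of K-means clustering of communities on two illustrative
# features; values are synthetic stand-ins for the study's measures.
rng = np.random.default_rng(0)
income = np.concatenate([rng.normal(30, 5, 50), rng.normal(70, 5, 50)])
outcome = np.concatenate([rng.normal(0.8, 0.1, 50), rng.normal(0.3, 0.1, 50)])
X = np.column_stack([income, outcome])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))  # two community clusters
```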
10420 Estimation of Real Power Transfer Allocation Using Intelligent Systems
Authors: H. Shareef, A. Mohamed, S. A. Khalid, Aziah Khamis
Abstract:
This paper presents the application of artificial intelligence (AI) techniques, namely the artificial neural network (ANN) and the adaptive neuro-fuzzy inference system (ANFIS), to estimate the real power transfer between generators and loads. Since these AI techniques adopt supervised learning, the modified nodal equations (MNE) method is first used to determine the real power contribution from each generator to the loads. The results of the MNE method and load flow information are then utilized to estimate the power transfer using the AI techniques. The 25-bus equivalent system of southern Malaysia is utilized as a test system to illustrate the effectiveness of both AI methods compared to the MNE method. The mean squared errors of the ANN and ANFIS power transfer allocation estimates are 1.19E-05 and 2.97E-05, respectively. Furthermore, the ANN and ANFIS methods compute the generator contributions to loads within 20.99 and 39.37 ms, respectively, whereas the MNE method takes 360 ms to calculate the same real power transfer allocation.
Keywords: Artificial intelligence, Power tracing, Artificial neural network, ANFIS, Power system deregulation.
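The supervised-learning step can be sketched as a regression from load-flow quantities to MNE-derived transfer targets. The Python sketch below uses synthetic stand-ins throughout; the 25-bus system data, the feature set, and the actual MNE targets are not reproduced here.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# A minimal sketch of training an ANN on surrogate load-flow features (X)
# against a surrogate MNE-style power-transfer target (y); all data assumed.
rng = np.random.default_rng(1)
X = rng.uniform(0.9, 1.1, size=(500, 6))      # e.g. bus voltages/loadings
y = X @ rng.uniform(0, 1, size=6)             # surrogate generator-to-load transfer

model = MLPRegressor(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
model.fit(X[:400], y[:400])                   # train on the first 400 samples
mse = np.mean((model.predict(X[400:]) - y[400:]) ** 2)
print(f"test MSE = {mse:.2e}")
```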
10419 A Social Cognitive Investigation in the Context of Vocational Training Performance of People with Disabilities
Authors: Majid A. AlSayari
Abstract:
The study reported here investigated social cognitive theory (SCT) in the context of vocational rehabilitation (VR) for people with disabilities. The prime purpose was to increase knowledge of VR phenomena and make recommendations for improving VR services. The sample consisted of 242 persons with spinal cord injuries (SCI) who completed questionnaires; a further 32 participants were trainers. Analysis of the questionnaire data was carried out using factor analysis, multiple regression analysis, and thematic analysis. The analysis suggested that, in motivational terms, and consistent with research carried out in other academic contexts, self-efficacy was the best predictor of VR performance. The author concludes that VR self-efficacy predicted VR training performance.
Keywords: Social cognitive theory, vocational rehab, self-efficacy, proxy efficacy, people with disabilities.
10418 Methodology Issues and Design Approach of VLE on Mathematical Concepts Acquisition within Secondary Education in England
Authors: Aaron A. R. Nwabude
Abstract:
This study used a positivist quantitative approach to examine the mathematical concepts acquisition of KS4 (14-16) Special Education Needs (SEN) students within the school sector in England. The research is based on a pilot study, and the design is holistic in its approach, mixing methodologies: the study combines qualitative and quantitative methods in gathering formative data for the design process. Although the approach could best be described as mixed-method, it rests fundamentally on a strong positivist paradigm, reflecting my earlier understanding of the differentiation of the students, the student-teacher body, and the various indicators being measured, which requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis; the survey; the interview; observation; and, finally, the analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing a scaffold for the study. The study clearly identified the ideological and philosophical aspects of educational research design for the study of mathematics by Special Education Needs (SEN) students in England using the virtual learning environment (VLE) platform.
Keywords: VLE, Special Education Needs, Key Stage 4, school, mathematics, concepts acquisition.
10417 Development and Validation of a UPLC Method for the Determination of Albendazole Residues on Pharmaceutical Manufacturing Equipment Surfaces
Authors: R. S. Chandan, M. Vasudevan, Deecaraman, B. M. Gurupadayya
Abstract:
In pharmaceutical industries, it is very important to remove drug residues from the equipment and areas used. The cleaning procedure must be validated, so special attention must be devoted to the methods used for the analysis of trace amounts of drugs. A rapid, sensitive, and specific reverse phase ultra performance liquid chromatographic (UPLC) method was developed for the quantitative determination of albendazole in cleaning validation swab samples. The method was validated using an ACQUITY HSS C18, 50 x 2.1 mm, 1.8 µm column with an isocratic mobile phase containing a mixture of 1.36 g of potassium dihydrogen phosphate in 1000 mL Milli-Q water with 2 mL of triethylamine, pH adjusted to 2.3 ± 0.05 with ortho-phosphoric acid, acetonitrile, and methanol (50:40:10 v/v). The flow rate of the mobile phase was 0.5 mL/min, with a column temperature of 35 °C and detection at a wavelength of 254 nm using a PDA detector. The injection volume was 2 µL. Cotton swabs, moistened with acetonitrile, were used to remove any residue of the drug from stainless steel, teflon, rubber, and silicon plates, which mimic the production equipment surfaces, and the mean extraction recovery was found to be 91.8%. The selected chromatographic conditions were found to effectively elute albendazole with a retention time of 0.67 min. The proposed method was found to be linear over the range of 0.2 to 150 µg/mL, with a correlation coefficient of 0.9992. The proposed method was found to be accurate, precise, reproducible, and specific, and it can also be used for routine quality control analysis of these drugs in biological samples, either alone or in combined pharmaceutical dosage forms.
Keywords: Cleaning validation, Albendazole, residues, swab analysis, UPLC.
10416 A Comparison of Some Thresholding Selection Methods for Wavelet Regression
Authors: Alsaidi M. Altaher, Mohd T. Ismail
Abstract:
In wavelet regression, choosing the threshold value is a crucial issue. A too-large value cuts too many coefficients, resulting in oversmoothing; conversely, a too-small threshold value allows many coefficients to be included in the reconstruction, giving a wiggly estimate that results in undersmoothing. The proper choice of threshold can thus be considered a careful balance of these principles. This paper gives a very brief introduction to some threshold selection methods: Universal, SURE, EBayes, two-fold cross-validation, and level-dependent cross-validation. A simulation study over a variety of sample sizes, test functions, and signal-to-noise ratios is conducted to compare their numerical performance using three different noise structures. For Gaussian noise, EBayes outperforms in all cases for all functions used, while two-fold cross-validation provides the best results in the case of long-tailed noise. For large signal-to-noise ratios, level-dependent cross-validation works well in the correlated noise case. As expected, increasing both the sample size and the signal-to-noise ratio increases estimation efficiency.
Keywords: Wavelet regression, simulation, threshold.
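As a concrete reference point, the universal threshold is lambda = sigma * sqrt(2 ln n). The Python sketch below applies it with soft thresholding to a noisy test signal via PyWavelets; the wavelet choice, noise level, and test function are illustrative assumptions.

```python
import numpy as np
import pywt

# A minimal sketch of universal (VisuShrink) thresholding in wavelet regression.
rng = np.random.default_rng(2)
n = 1024
x = np.sin(2 * np.pi * np.linspace(0, 4, n))        # smooth test function
y = x + 0.3 * rng.standard_normal(n)                 # Gaussian noise

coeffs = pywt.wavedec(y, "db4", level=5)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745       # noise estimate (finest level)
lam = sigma * np.sqrt(2 * np.log(n))                 # universal threshold
denoised = pywt.waverec(
    [coeffs[0]] + [pywt.threshold(c, lam, mode="soft") for c in coeffs[1:]], "db4")
print("RMSE:", np.sqrt(np.mean((denoised[:n] - x) ** 2)))
```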
10415 Gamification as a Tool for Influencing Customers' Behaviour
Authors: B. Zatwarnicka-Madura
Abstract:
The objective of the article was to identify the impacts of gamification on customers' behaviour. The most important applications of games in marketing and the mechanisms of gamification are presented in the article. A detailed analysis of the influence of gamification on the customers of two brands, Foursquare and Nike, is also presented. Research studies using auditory survey methods were carried out among 176 young respondents, who are potential targets of gamification. The studies confirmed a large participation of young people in customer loyalty programs, with relatively low participation in other gamification-based marketing activities. The research findings clearly indicate that gamification mechanisms are the most attractive.
Keywords: Customer loyalty, games, gamification, social aspects.
10414 Synthesis of the Robust Regulators on the Basis of the Criterion of the Maximum Stability Degree
Authors: S. A. Gayvoronsky, T. A. Ezangina
Abstract:
Robust control systems for objects with interval-undetermined parameters are considered in this paper. The initial information about the system is its characteristic polynomial with interval coefficients. On the basis of coefficient estimates of the quality indices and the criterion of the maximum stability degree, a method for the parametric synthesis of a robust regulator is developed. An example of the synthesis of a robust rope tension stabilization system is given in this article.
Keywords: Interval polynomial, controller synthesis, analysis of quality factors, maximum degree of stability, robust degree of stability, robust oscillation, system accuracy.
10413 An Estimating Parameter of the Mean in Normal Distribution by Maximum Likelihood, Bayes, and Markov Chain Monte Carlo Methods
Authors: Autcha Araveeporn
Abstract:
This paper compares the estimation of the mean parameter of the normal distribution by the Maximum Likelihood (ML), Bayes, and Markov Chain Monte Carlo (MCMC) methods. The ML estimator is the average of the data; the Bayes estimator is derived from the prior distribution; and the MCMC estimator is approximated by Gibbs sampling from the posterior distribution. After estimation, hypothesis testing is used to check the robustness of the estimators. Data are simulated from a normal distribution with a true mean of 2 and variances of 4, 9, and 16, with sample sizes of 10, 20, 30, and 50. From the results, it can be seen that the MLE and MCMC estimates differ perceivably from the true parameter when the sample size is 10 or 20 with variance 16. Furthermore, the Bayes estimator is computed from a prior distribution with mean 1 and variance 12, and showed a significant difference in the mean with variance 9 at sample sizes 10 and 20.
Keywords: Bayes method, Markov Chain Monte Carlo method, Maximum Likelihood method, normal distribution.
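The ML and conjugate Bayes estimators can be written in closed form. The Python sketch below uses the settings stated in the abstract (true mean 2, prior mean 1, prior variance 12); the conjugate-posterior algebra is the standard textbook form, and the "MCMC" line is a crude stand-in that samples the known posterior rather than running a full Gibbs chain.

```python
import numpy as np

# A minimal sketch comparing ML and Bayes estimators of a normal mean.
rng = np.random.default_rng(3)
mu_true, var, n = 2.0, 9.0, 20
x = rng.normal(mu_true, np.sqrt(var), n)

ml = x.mean()                                    # ML estimator: sample average
mu0, tau2 = 1.0, 12.0                            # prior N(mu0, tau2)
post_var = 1.0 / (n / var + 1.0 / tau2)          # conjugate normal posterior
bayes = post_var * (n * ml / var + mu0 / tau2)   # posterior mean

# Crude MCMC stand-in: average draws from the (here, known) posterior.
mcmc = rng.normal(bayes, np.sqrt(post_var), 5000).mean()
print(f"ML={ml:.3f}  Bayes={bayes:.3f}  MCMC={mcmc:.3f}")
```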
10412 Identification of Factors Influencing Company's Competitiveness
Authors: D. Ščeulovs, E. Gaile-Sarkane
Abstract:
Fast development of technologies, economic globalization, and many other external circumstances stimulate a company's competitiveness. One of the major trends in today's business is the shift to the exploitation of the Internet and the electronic environment for entrepreneurial needs. Recent research confirms that the e-environment provides a range of possibilities and opportunities for companies, especially for micro-, small-, and medium-sized companies, which have limited resources. The usage of e-tools raises the effectiveness and the profitability of an organization, as well as its competitiveness. In the electronic market, as in the classic one, there are factors, such as globalization, the development of new technology, price-sensitive consumers, the Internet, and new distribution and communication channels, that influence entrepreneurship. As a result of e-environment development, e-commerce and e-marketing grow as well.
Objective of the paper: To describe and identify factors influencing a company's competitiveness in the e-environment.
Research methodology: The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc. The theoretical and methodological background of the research is formed from scientific researches and publications, mass media and professional literature, statistical information from legal institutions, and information collected by the authors during the surveying process. Research result: The authors detected and classified factors influencing competitiveness in the e-environment.
In this paper, the authors present their findings based on theoretical, scientific, and field research. The authors have conducted research on e-environment utilization among Latvian enterprises.
Keywords: Competitiveness, e-environment, factors, factor analysis.
10411 Refined Buckling Analysis of Rectangular Plates Under Uniaxial and Biaxial Compression
Authors: V. Piscopo
Abstract:
In the traditional buckling analysis of rectangular plates, the classical thin plate theory is generally applied, thus neglecting the plate shear deformation. It seems quite clear that this method is not fully appropriate for the analysis of thick plates, so in the following the two-variable refined plate theory proposed by Shimpi (2006), which permits taking the transverse shear effects into account, is applied to the buckling analysis of simply supported isotropic rectangular plates compressed in one and two orthogonal directions. The relevant results are compared with the classical ones and, for rectangular plates under uniaxial compression, a new direct expression, similar to the classical Bryan's formula, is proposed for the Euler buckling stress. As buckling analysis is a widely studied topic for a variety of structures, such as ship structures, some applications for plates uniformly compressed in one and two orthogonal directions are presented, and the relevant theoretical results are compared with those obtained by a FEM analysis, carried out with ANSYS, to show the feasibility of the presented method.
Keywords: Buckling analysis, thick plates, biaxial stresses.
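For reference, the classical thin-plate (Bryan) buckling stress against which such refined expressions are compared has the standard textbook form below, for a simply supported plate of thickness t, width b, and length a, with m half-waves in the loading direction; this is the classical result, not the paper's new expression.

```latex
\sigma_{cr} = k_c \, \frac{\pi^2 E}{12\left(1 - \nu^2\right)} \left(\frac{t}{b}\right)^2,
\qquad
k_c = \left(\frac{m b}{a} + \frac{a}{m b}\right)^2
```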
10410 Towards End-To-End Disease Prediction from Raw Metagenomic Data
Authors: Maxence Queyrel, Edi Prifti, Alexandre Templier, Jean-Daniel Zucker
Abstract:
Analysis of the human microbiome using metagenomic sequencing data has demonstrated a high ability to discriminate various human diseases. Raw metagenomic sequencing data require multiple complex and computationally heavy bioinformatics steps prior to data analysis. Such data contain millions of short sequences read from the fragmented DNA and stored as fastq files. Conventional processing pipelines consist of multiple steps, including quality control, filtering, and alignment of sequences against genomic catalogs (genes, species, taxonomic levels, functional pathways, etc.). These pipelines are complex to use, time-consuming, and rely on a large number of parameters that often introduce variability and impact the estimation of the microbiome elements. Training Deep Neural Networks directly on raw sequencing data is a promising approach to bypass some of the challenges associated with mainstream bioinformatics pipelines. Most of these methods use the concept of word and sentence embeddings, which creates a meaningful numerical representation of DNA sequences while extracting features and reducing the dimensionality of the data. In this paper, we present an end-to-end approach that classifies patients into disease groups directly from raw metagenomic reads: metagenome2vec. This approach is composed of four steps: (i) generating a vocabulary of k-mers and learning their numerical embeddings; (ii) learning DNA sequence (read) embeddings; (iii) identifying the genome from which the sequence is most likely to come; and (iv) training a multiple instance learning classifier which predicts the phenotype based on the vector representation of the raw data. An attention mechanism is applied in the network so that the model can be interpreted, assigning a weight to the influence of each genome on the prediction. Using two public real-life data sets as well as a simulated one, we demonstrate that this original approach reaches performance comparable with state-of-the-art methods applied to data processed through mainstream bioinformatics workflows. These results are encouraging for this proof-of-concept work. We believe that, with further dedication, DNN models have the potential to surpass mainstream bioinformatics workflows in disease classification tasks.
Keywords: Metagenomics, phenotype prediction, deep learning, embeddings, multiple instance learning.
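Step (i) starts from nothing more than sliding a window over each read. The Python sketch below builds a k-mer vocabulary from toy reads; k and the reads are illustrative, and the word2vec-style embedding learning itself is not shown.

```python
# A minimal sketch of k-mer vocabulary construction from raw reads.
from collections import Counter

def kmers(read, k=4):
    """Slide a window of length k over a read and yield its k-mers."""
    return (read[i:i + k] for i in range(len(read) - k + 1))

reads = ["ATCGATCGGA", "GGATCCATCG"]          # stand-ins for fastq reads
vocab = Counter(km for r in reads for km in kmers(r))
print(vocab.most_common(3))                   # most frequent k-mers
```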
10409 A Study on Prediction of Cavitation for Centrifugal Pump
Authors: Myung Jin Kim, Hyun Bae Jin, Wui Jun Chung
Abstract:
In this study, numerical analysis was compared with experimental results obtained on a small industrial centrifugal pump in order to reliably predict cavitation of the pump. To improve the validity of the numerical analysis, transient analysis was conducted on a computational domain of the full geometry, matching the experimental apparatus. Based on the results, the numerical analysis was considered a reliable prediction of cavitation.
Keywords: Centrifugal pump, cavitation, NPSH, CFD.
10408 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes
Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono
Abstract:
Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is widely used for LV segmentation, but it suffers from the drawback that the initialization of the shape model is often not sufficiently close to the target, especially when dealing with the abnormal shapes that occur in disease. In this work, a two-step framework is improved to achieve fast and efficient LV segmentation. First, a robust and efficient detector based on a Hough forest localizes cardiac feature points; these feature points are used to predict the initial fit of the LV shape model. Second, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. With the robust initialization, ASM is able to achieve more accurate segmentation. The performance of the proposed method is evaluated on a dataset of 810 cardiac ultrasound images, mostly of abnormal shapes, and the method is compared with several combinations of ASM and existing initialization methods. Our experimental results demonstrate that the accuracy of the proposed feature point detection for initialization was 40% higher than that of the existing methods. Moreover, the proposed method significantly reduces the number of necessary ASM fitting loops and thus speeds up the whole segmentation process. Therefore, the proposed method achieves more accurate and efficient segmentation results and is applicable to unusual heart shapes in cardiac diseases, such as left atrial enlargement.
Keywords: Hough forest, active shape model, segmentation, cardiac left ventricle.
10407 Performance Analysis of MATLAB Solvers in the Case of a Quadratic Programming Generation Scheduling Optimization Problem
Authors: Dávid Csercsik, Péter Kádár
Abstract:
In the proposed method, the problem is parallelized by considering multiple possible mode-of-operation profiles, which determine the range in which the generators operate in each period. For each of these profiles, the optimization is carried out independently, and the best resulting dispatch is chosen. For each such profile, the resulting problem is a quadratic programming (QP) problem with a potentially negative definite quadratic term Q and constraints depending on the actual operation profile. In this paper, we analyze the performance of available MATLAB optimization methods and solvers for the corresponding QP.
Keywords: Economic dispatch, optimization, quadratic programming, MATLAB.
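The per-profile structure can be sketched as a loop of independent bounded QP solves followed by picking the cheapest dispatch. The sketch below is written in Python/SciPy rather than MATLAB, and Q, c, and the two operating-range profiles are invented placeholders, not the paper's generation-scheduling data.

```python
import numpy as np
from scipy.optimize import minimize

# A minimal sketch of the per-profile dispatch loop: each profile fixes bounds
# on the generators, the (possibly non-convex) QP is solved independently, and
# the best dispatch wins.
Q = np.array([[2.0, 0.5], [0.5, -1.0]])          # potentially negative definite
c = np.array([-1.0, -0.5])
profiles = [[(0, 1), (0, 1)], [(1, 2), (0, 2)]]  # operating ranges per profile

def cost(x):
    return 0.5 * x @ Q @ x + c @ x

best = min(
    (minimize(cost, x0=[np.mean(b) for b in bounds], bounds=bounds)
     for bounds in profiles),
    key=lambda r: r.fun,
)
print("best dispatch:", best.x, "cost:", best.fun)
```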
10406 Automata-Based String Analysis for Detecting Malware in Android Programs
Authors: Assad Maalouf, Lunjin Lu, James Lynott
Abstract:
We design and implement a precise model of string operations using finite state machine transformers and state transformers to approximate the values that string variables can take throughout the execution of the program. We use our model to analyze the string variables of Android programs. Our experimental results show that our string analysis is very efficient at detecting the contextual effect of string operations on the string variables. Our model proved to be very useful for verifying statements about the string variables of the program.
Keywords: Abstract interpretation, Android, static analysis, string analysis.
10405 Increased Capacity of Information Hiding in LSB's Method for Text and Image
Authors: H. B. Kekre, Archana Athawale, Pallavi N. Halarnkar
Abstract:
Steganography, derived from Greek, literally means "covered writing". It includes a vast array of secret communication methods that conceal the message's very existence, including invisible inks, microdots, character arrangement, digital signatures, covert channels, and spread spectrum communications. This paper proposes a new, improved version of the Least Significant Bit (LSB) method. The proposed approach is simple to implement compared to the Pixel Value Differencing (PVD) method, yet achieves high embedding capacity and imperceptibility. The proposed method can also be applied to 24-bit color images, achieving an embedding capacity much higher than PVD.
Keywords: Information hiding, LSB matching, PVD steganography.
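As a baseline illustration of classic LSB embedding (the method the paper improves on, not the improved variant itself), here is a minimal Python/NumPy sketch for a grayscale image; the cover image and payload are synthetic stand-ins.

```python
import numpy as np

# A minimal sketch of classic LSB steganography in a grayscale image.
def embed(cover, bits):
    """Write one payload bit into the least significant bit of each pixel."""
    stego = cover.copy().ravel()
    stego[:len(bits)] = (stego[:len(bits)] & 0xFE) | bits  # clear LSB, set bit
    return stego.reshape(cover.shape)

def extract(stego, n):
    """Read back the first n payload bits from the pixel LSBs."""
    return stego.ravel()[:n] & 1

cover = np.random.default_rng(4).integers(0, 256, (4, 4), dtype=np.uint8)
payload = np.array([1, 0, 1, 1], dtype=np.uint8)
stego = embed(cover, payload)
print(extract(stego, 4))                     # -> [1 0 1 1]
```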