Search results for: asset-oriented approach
7607 Physical Training in the Context of Preparation for the Performance of Junior Two: Sports Dance Practitioners
Authors: Rosa Alin Cristian
Abstract:
As in any other sports branch, there is a relationship of dependence between motor qualities and technical skills in sports dance: superior technical and artistic performances can be obtained only on the basis of a certain level of motor qualities and of the morphological and functional indices of the organism. Starting from the premise that physical training is a basic component of the dancers' training process, determining the efficacy and efficiency of athletes in training and competition, its main objectives are to obtain an optimal functional capacity of the body. This is reached through a superior level of development and manifestation of the basic and specific motor qualities and through appropriate values of the morpho-functional indices, all against the background of a perfect state of health. In this paper, we propose to create an inventory of the motor qualities specific to sports dance and of their forms of manifestation, and to establish methodical priorities for their development, in order to support specialists in approaching physical training in the most rigorous and efficient way, according to the characteristics of each age category.
Keywords: physical training, motor skills, sports dance, performance
Procedia PDF Downloads 78
7606 The Role of Organizational Culture, Organizational Commitment, and Styles of Transformational Leadership towards Employee Performance
Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari
Abstract:
This study aims to examine and analyze the influence of organizational culture, organizational commitment, and transformational leadership style on employee performance. The study used a descriptive survey method with a quantitative approach, with questionnaires as the basic data collection tool. The sampling technique used is proportionate stratified random sampling; the sample comprised 70 respondents. The analytical method used is multiple linear regression. A coefficient of determination of 52.3% indicates that organizational culture, organizational commitment, and transformational leadership style simultaneously have a significant influence on employee performance, while the remaining 47.7% is explained by factors outside the research variables. Partially, organizational culture has a strong and positive influence on employee performance, organizational commitment has a moderate and positive effect, and transformational leadership style has a strong and positive influence; the latter is also the variable with the greatest impact on employee performance.
Keywords: organizational culture, organizational commitment, transformational leadership style, employee performance
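The regression setup the abstract describes can be sketched with ordinary least squares and a coefficient of determination; the scores and coefficients below are synthetic stand-ins for illustration, not the study's questionnaire data or estimates:

```python
import numpy as np

# Synthetic stand-in data: three predictor scores and an employee
# performance response, with the study's sample size of 70.
rng = np.random.default_rng(0)
n = 70
culture = rng.normal(3.5, 0.6, n)      # organizational culture score (assumed scale)
commitment = rng.normal(3.2, 0.7, n)   # organizational commitment score
leadership = rng.normal(3.8, 0.5, n)   # transformational leadership score
noise = rng.normal(0, 0.5, n)
performance = 0.8 * culture + 0.4 * commitment + 0.7 * leadership + noise

# Multiple linear regression via ordinary least squares (intercept included).
X = np.column_stack([np.ones(n), culture, commitment, leadership])
beta, *_ = np.linalg.lstsq(X, performance, rcond=None)

# Coefficient of determination: share of variance the predictors explain jointly.
fitted = X @ beta
ss_res = np.sum((performance - fitted) ** 2)
ss_tot = np.sum((performance - performance.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
```

The study's reported 52.3% corresponds to `r_squared` here: the simultaneous explanatory share, with the remainder attributed to factors outside the model.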
Procedia PDF Downloads 229
7605 A Periodogram-Based Spectral Method Approach: The Relationship between Tourism and Economic Growth in Turkey
Authors: Mesut Balibey, Serpil Türkyılmaz
Abstract:
A popular topic in the econometrics and time series literature is the cointegrating relationships among the components of a nonstationary time series. Engle and Granger's least squares method and Johansen's conditional maximum likelihood method are the most widely used methods to determine the relationships among variables. Furthermore, a method proposed to test a unit root based on the periodogram ordinates has certain advantages over conventional tests: periodograms can be calculated without any model specification, and the exact distribution under the assumption of a unit root is obtained. For higher order processes the distribution remains the same asymptotically. In this study, in order to illustrate the advantages of the periodogram method over conventional tests, we examine a possible relationship between tourism and economic growth in Turkey during the period 1999:01-2010:12 by using the periodogram method, Johansen's conditional maximum likelihood method, and Engle and Granger's ordinary least squares method.
Keywords: cointegration, economic growth, periodogram ordinate, tourism
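The periodogram ordinates the test relies on can be computed without any model specification, as the abstract notes. A minimal sketch, using synthetic series of the same length as the study period (144 months) rather than the Turkish tourism/GDP data:

```python
import numpy as np

def periodogram(x):
    """Periodogram ordinates I(w_j) = |DFT(x - mean)|^2 / (2*pi*n)
    at the Fourier frequencies w_j = 2*pi*j/n, j = 1..n//2."""
    x = np.asarray(x, dtype=float)
    n = x.size
    dft = np.fft.fft(x - x.mean())
    return (np.abs(dft[1 : n // 2 + 1]) ** 2) / (2 * np.pi * n)

rng = np.random.default_rng(1)
random_walk = np.cumsum(rng.normal(size=144))  # unit-root (nonstationary) series
stationary = rng.normal(size=144)              # stationary white-noise series

pg_rw = periodogram(random_walk)
pg_st = periodogram(stationary)
```

A unit-root series concentrates spectral mass at the lowest frequencies, so its first ordinate dominates, while a stationary series spreads power across frequencies; this contrast is what the periodogram-based unit root test exploits.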
Procedia PDF Downloads 273
7604 Analyzing the Heat Transfer Mechanism in a Tube Bundle Air-PCM Heat Exchanger: An Empirical Study
Authors: Maria De Los Angeles Ortega, Denis Bruneau, Patrick Sebastian, Jean-Pierre Nadeau, Alain Sommier, Saed Raji
Abstract:
Phase change materials (PCM) present attractive features that make them a passive solution for thermal comfort in buildings during summer. They show a large storage capacity per unit volume in comparison with other structural materials like bricks or concrete. If their use is matched with the peak load periods, they can contribute to the reduction of the primary energy consumption related to cooling applications. Despite these promising characteristics, they present some drawbacks. Commercial PCMs, such as paraffins, offer a low thermal conductivity, which affects the overall performance of the system. In some cases, the material can be enhanced by adding other elements that improve the conductivity, but in general a unit design that optimizes the thermal performance is sought. Material selection is the departing point of the design stage, and it does not leave much room for optimization. The PCM melting point depends highly on the atmospheric characteristics of the building location: the selection must lie between the maximum and the minimum temperatures reached during the day. The geometry of the PCM containers and their geometrical distribution are design parameters as well. They significantly affect the heat transfer, and therefore the underlying phenomena must be studied exhaustively. During its lifetime, an air-PCM unit in a building must cool down the space during the daytime, while the PCM melts. At night, the PCM must be regenerated to be ready for the next use. When the system is not in service, a minimal amount of thermal exchange is desired. These functions imply both sensible and latent heat storage and release, so different types of mechanisms drive the heat transfer phenomena. An experimental test was designed to study the heat transfer phenomena occurring in a circular tube bundle air-PCM exchanger.
An in-line arrangement was selected as the geometrical distribution of the containers. To allow visual identification, the container material and a section of the test bench were transparent. Instruments were placed on the bench for measuring temperature and velocity, and the PCM properties were obtained through differential scanning calorimetry (DSC) tests. The evolution of the temperature during both cycles, melting and solidification, was recorded. The results showed phenomena both at a local level (tubes) and at an overall level (exchanger). Conduction and convection appeared as the main heat transfer mechanisms. From these results, two approaches to analyze the heat transfer were followed. The first approach described the phenomena in a single tube as a series of thermal resistances, where purely conduction-controlled heat transfer was assumed in the PCM. For the second approach, the temperature measurements were used to obtain significant dimensionless numbers and parameters, such as the Stefan, Fourier and Rayleigh numbers, and the melting fraction. These approaches allowed us to identify the heat transfer phenomena during both cycles. The presence of natural convection during melting could be inferred from the influence of the Rayleigh number on the correlations obtained.
Keywords: phase change materials, air-PCM exchangers, convection, conduction
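The dimensionless groups named in the abstract can be evaluated back-of-the-envelope style; the property values below are rough paraffin-like assumptions for illustration, not the paper's measured DSC data or geometry:

```python
# Assumed, paraffin-like illustrative values (not from the study).
g = 9.81           # gravity, m/s^2
beta = 1e-3        # thermal expansion coefficient of liquid PCM, 1/K
nu = 5e-6          # kinematic viscosity, m^2/s
alpha = 1e-7       # thermal diffusivity, m^2/s
cp = 2000.0        # specific heat, J/(kg K)
latent = 200e3     # latent heat of fusion, J/kg
dT = 10.0          # wall-to-melt temperature difference, K
L = 0.02           # tube radius as characteristic length, m
t = 3600.0         # elapsed time, s

stefan = cp * dT / latent                        # sensible vs. latent heat
fourier = alpha * t / L**2                       # dimensionless conduction time
rayleigh = g * beta * dT * L**3 / (nu * alpha)   # buoyancy vs. diffusion

print(f"Ste = {stefan:.2f}, Fo = {fourier:.2f}, Ra = {rayleigh:.2e}")
```

With these assumed values the Rayleigh number comes out well above the classical onset threshold of order 10^3, which is consistent with the abstract's inference of natural convection during melting.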
Procedia PDF Downloads 182
7603 Negative Pressure Waves in Hydraulic Systems
Authors: Fuad H. Veliev
Abstract:
The negative pressure phenomenon appears in many thermodynamic, geophysical and biophysical processes in nature and in technological systems. In more than 100 years of laboratory research, beginning with F. M. Donny's tests, very large values of negative pressure have been achieved. But the phenomenon has found no practical application, remaining a laboratory curiosity because of the special demands on the purity and homogeneity of the liquids required for its appearance. The possibility of creating a direct wave of negative pressure in real heterogeneous liquid systems was confirmed experimentally under certain kinetic and hydraulic conditions. Negative pressure can be considered a factor of both useful and destructive energy. A new approach to generating negative pressure waves in impure, unclean fluids has allowed the creation of fundamentally new energy-saving technologies and installations that increase the effectiveness and efficiency of different production processes. It was shown that negative pressure is one of the main factors causing serious trouble in some technological and natural processes. The results obtained emphasize the necessity of taking into account the role of negative pressure as an energy factor when evaluating many transient thermohydrodynamic processes in nature and in production systems.
Keywords: liquid systems, negative pressure, temperature, wave, metastable state
Procedia PDF Downloads 419
7602 Solutions for Large Diameter Pile Stiffness Used in Offshore Wind Turbine Farms
Authors: M. H. Aissa, Amar Bouzid Dj
Abstract:
Many countries are now planning to build new wind farms with high capacity, up to 5 MW, and the size of the foundations is increasing accordingly. These structures are subject to fatigue damage from environmental loading, mainly due to wind and waves, as well as from cyclic loading imposed through the rotational frequency (1P), through mass and aerodynamic imbalances, and from the blade passing frequency (3P) of the wind turbine, which makes their dynamic behavior very sensitive. That is why the natural frequency must be determined accurately from the existing data on the soil and the foundation stiffness, both sources of uncertainty, in order to avoid resonance of the system. This paper presents analytical expressions for the stiffness of large diameter foundations with linear soil behavior for different soil stiffness profiles. To check the accuracy of the proposed formulas, a mathematical model based on non-dimensional parameters is used to calculate the natural frequency, taking into account soil-structure interaction (SSI), and the results are compared with the p-y method and with frequencies measured in North Sea wind farms.
Keywords: offshore wind turbines, semi analytical FE analysis, p-y curves, piles foundations
Procedia PDF Downloads 469
7601 Risk Factors of Becoming NEET Youth in Iran: A Machine Learning Approach
Authors: Hamed Rahmani, Wim Groot
Abstract:
The term "youth not in employment, education or training (NEET)" refers to a combination of youth unemployment and school dropout. This study investigates the variables that increase the risk of becoming NEET in Iran. A selection bias-adjusted probit model was employed together with machine learning to identify these risk factors. We used cross-sectional data obtained from the Statistical Centre of Iran and the Ministry of Cooperatives, Labour and Social Welfare, taken from the labour force survey conducted in the spring of 2021. We look at years of education, work experience, housework, the number of children under the age of six in the home, family education, birthplace, and the amount of land owned by households. Results show that hours spent performing domestic chores increase the likelihood of youth becoming NEET, while years of education and years of potential work experience decrease the chance of being NEET. The findings also show that female youth born in cities were less likely than those born in rural regions to become NEET.
Keywords: NEET youth, probit, CART, machine learning, unemployment
Procedia PDF Downloads 110
7600 Blind Super-Resolution Reconstruction Based on PSF Estimation
Authors: Osama A. Omer, Amal Hamed
Abstract:
Successful blind image super-resolution algorithms require an exact estimate of the point spread function (PSF). In the absence of any prior information about the imaging system and the true image, this estimation is normally done by trial-and-error experimentation until an acceptable restored image quality is obtained. Multi-frame blind super-resolution algorithms often suffer from slow convergence and sensitivity to complex noise. This paper presents a super-resolution image reconstruction algorithm based on estimating the PSF that yields the optimum restored image quality. The PSF is estimated by the knife-edge method, implemented by measuring the spreading of edges in the reproduced HR image itself during the reconstruction process. The proposed image reconstruction approach uses L1 norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. A series of experimental results shows that the proposed method can outperform previous work robustly and efficiently.
Keywords: blind, PSF, super-resolution, knife-edge, blurring, bilateral, L1 norm
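The knife-edge idea the abstract relies on can be illustrated in one dimension: an ideal step edge blurred by the PSF yields an edge spread function (ESF), and differentiating the ESF recovers the line spread function (LSF), i.e., the PSF profile. This is a minimal sketch with an assumed Gaussian PSF, not the paper's reconstruction pipeline:

```python
import numpy as np

# Assumed ground-truth PSF: a Gaussian of width sigma_true (illustrative).
x = np.arange(-20, 21, dtype=float)
sigma_true = 2.0
psf = np.exp(-x**2 / (2 * sigma_true**2))
psf /= psf.sum()

edge = (x >= 0).astype(float)               # ideal knife edge
esf = np.convolve(edge, psf, mode="same")   # blurred edge as imaged (ESF)
lsf = np.gradient(esf)                      # LSF = derivative of the ESF

# Estimate the blur width from the second moment of the (positive) LSF.
lsf_pos = np.clip(lsf, 0, None)
center = np.sum(x * lsf_pos) / np.sum(lsf_pos)
sigma_est = np.sqrt(np.sum((x - center) ** 2 * lsf_pos) / np.sum(lsf_pos))
```

The recovered width `sigma_est` tracks `sigma_true` up to discretization effects of the finite-difference derivative; in the paper this measurement is taken on edges of the reconstructed HR image itself during iteration.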
Procedia PDF Downloads 367
7599 Banking and Accounting Analysis Researches Effect on Environment
Authors: Marina Magdy Naguib Karas
Abstract:
New methods of providing banking services to the customer have been introduced, such as online banking. Banks have begun to consider electronic banking (e-banking) as a way to replace some traditional branch functions, using the Internet as a new distribution channel. Some consumers have at least one account at multiple banks and access these accounts through online banking. To check their current net worth, clients need to log into each of their accounts, get detailed information, and work toward consolidation. Not only is this time-consuming, but it is also a repeated activity with a certain frequency. To solve this problem, the concept of account aggregation was introduced. Account consolidation in e-banking, as a form of electronic banking, appears to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage their bank accounts held at different institutions via a common online banking platform that places a high priority on security and data protection. This article provides an overview of the account aggregation approach in e-banking as a new service in the area of e-banking.
Keywords: compatibility, complexity, mobile banking, observation, risk banking technology, Internet banks, modernization of banks, banks, account aggregation, security, enterprise development
Procedia PDF Downloads 52
7598 Images Selection and Best Descriptor Combination for Multi-Shot Person Re-Identification
Authors: Yousra Hadj Hassen, Walid Ayedi, Tarek Ouni, Mohamed Jallouli
Abstract:
To re-identify a person is to check whether he or she has already been seen over a camera network. Recently, re-identifying people over large public camera networks has become a crucial task of great importance for public security, and the vision community has investigated this area of research in depth. Most existing research relies only on spatial appearance information from either one or multiple images of a person. In practice, the real person re-identification setting is a multi-shot scenario; however, efficiently modeling a person's appearance and choosing the best samples remain challenging problems. In this work, an extensive comparison of state-of-the-art descriptors combined with the proposed frame selection method is carried out. Specifically, we evaluate the sample selection approach using multiple descriptors. We show the effectiveness and advantages of the proposed method through extensive comparisons with related state-of-the-art approaches using two standard datasets, PRID2011 and iLIDS-VID.
Keywords: camera network, descriptor, model, multi-shot, person re-identification, selection
Procedia PDF Downloads 281
7597 Absolute Quantification of the Bexsero Vaccine Component Factor H Binding Protein (fHbp) by Selected Reaction Monitoring: The Contribution of Mass Spectrometry in Vaccinology
Authors: Massimiliano Biagini, Marco Spinsanti, Gabriella De Angelis, Sara Tomei, Ilaria Ferlenghi, Maria Scarselli, Alessia Biolchi, Alessandro Muzzi, Brunella Brunelli, Silvana Savino, Marzia M. Giuliani, Isabel Delany, Paolo Costantino, Rino Rappuoli, Vega Masignani, Nathalie Norais
Abstract:
The gram-negative bacterium Neisseria meningitidis serogroup B (MenB) is an exclusively human pathogen and the major cause of meningitis and severe sepsis in infants and children, but also in young adults. The pathogen is carried by about 30% of the healthy population, which acts as a reservoir, spreading it through saliva and respiratory fluids during coughing, sneezing and kissing. Among the surface-exposed protein components of this diplococcus, factor H binding protein (fHbp) is a lipoprotein proven to be a protective antigen and used as a component of the recently licensed Bexsero vaccine. fHbp is a highly variable meningococcal protein: to reflect its remarkable sequence variability, it has been classified into three variants (or two subfamilies), with poor cross-protection among the different variants. Furthermore, the level of fHbp expression varies significantly among strains, and this has also been considered an important factor for predicting MenB strain susceptibility to anti-fHbp antisera. Different methods have been used to assess fHbp expression on meningococcal strains; however, all of these methods use anti-fHbp antibodies, and for this reason the results are affected by the different affinities that antibodies can have for different antigenic variants. To overcome the limitations of an antibody-based quantification, we developed a quantitative mass spectrometry (MS) approach. Selected Reaction Monitoring (SRM) has recently emerged as a powerful MS tool for detecting and quantifying proteins in complex mixtures. SRM is based on the targeted detection of proteotypic peptides (PTPs), unique signatures of a protein that can be easily detected and quantified by MS.
This approach, proven to be highly sensitive, quantitatively accurate and highly reproducible, was used to quantify the absolute amount of fHbp antigen in total extracts derived from 105 clinical isolates, evenly distributed among the three main variant groups and selected to be representative of the fHbp subvariants circulating around the world. We extended the study to the genetic level, investigating the correlation between the differential levels of expression and polymorphisms within the genes and their promoter sequences. The implications of fHbp expression for the susceptibility of strains to killing by anti-fHbp antisera are also presented. To date, this is the first comprehensive fHbp expression profiling in a large panel of Neisseria meningitidis clinical isolates driven by an antibody-independent MS-based methodology, opening the door to new applications in vaccine coverage prediction and reinforcing the molecular understanding of licensed vaccines.
Keywords: quantitative mass spectrometry, Neisseria meningitidis, vaccines, bexsero, molecular epidemiology
Procedia PDF Downloads 315
7596 Financial Statement Fraud: The Need for a Paradigm Shift to Forensic Accounting
Authors: Ifedapo Francis Awolowo
Abstract:
The unrelenting series of embarrassing audit failures should stimulate a paradigm shift in accounting. In this age of information revolution, there is a need for constant improvement in the products or services one offers to the market in order to remain relevant. This study explores the perceptions of external auditors, forensic accountants and accounting academics on whether a paradigm shift to forensic accounting can reduce financial statement fraud. Through a neo-empiricist, inductive analytical approach, the findings reveal that a paradigm shift to forensic accounting might be the right step in the right direction to increase the chances of fraud prevention and detection in financial statements. This research has implications for accounting education and the need to incorporate forensic accounting into the present-day accounting curriculum. Accounting professional bodies, accounting standard setters and accounting firms all have roles to play in incorporating forensic accounting education into the accounting curriculum. In particular, there is a need to amend ISA 240 to make the prevention and detection of fraud the responsibility of both those charged with the management and governance of companies and statutory auditors.
Keywords: financial statement fraud, forensic accounting, fraud prevention and detection, auditing, audit expectation gap, corporate governance
Procedia PDF Downloads 368
7595 Sustainable Lessons Learnt from the Attitudes of Language Instructors towards Computer Assisted Language Teaching (CALT)
Authors: Theophilus Adedokun, Sylvia Zulu, Felix Awung, Sam Usadolo
Abstract:
The proliferation of technology into the teaching process has brought about a transformation in the field of education. Language teaching has not been left out of this transformation, which has drastically altered how language is taught. It is, however, appalling that some language instructors seem to hold negative attitudes toward the use of technology in language teaching, referred to in this study as Computer Assisted Language Teaching (CALT). The purpose of this study, therefore, is to explore the sustainable lessons that can be learnt from the attitudes of language instructors towards language teaching in some public universities. The knowledge gained from this study could inform and advance the use of CALT. The study considers the historical progression of CALT and recommends that a fundamental approach is required for institutions to develop and advance its use in teaching. A review of sustainable lessons learnt from the attitudes of language instructors towards CALT is provided, and the CALT experience of three institutions is described. Drawing from this succinct description, the study makes recommendations on how effective CALT could be implemented on a personal and institutional basis.
Keywords: attitudes, language instructors, sustainable lessons, computer assisted language teaching
Procedia PDF Downloads 88
7594 The Truth about Good and Evil: A Mixed-Methods Approach to Color Theory
Authors: Raniya Alsharif
Abstract:
The color theory of good and evil concerns the association of colors with the omnipresent concepts of good and evil: human behavior and perception can be highly influenced by seeing black and white, making these connotations so distinctive that they can be dangerously hard to disentangle. This theory is a human construct that dates back to ancient Egypt and has been used since then in almost all forms of communication and expression, such as art, fashion, literature, and religious manuscripts, helping to implant preconceived ideas that influence behavior and society. This is mixed-methods research that uses surveys to collect quantitative data related to the theory and a vignette to collect qualitative data: in a scenario, participants aged 18-25 style two characters of good and bad character in color-contrasting clothes. Both methods yield results about the nature of the preconceived perceptions associated with 'black and white' and 'good and evil', illustrating the important role of media and communications in human behavior and the subconscious, and also uncovering how far this theory goes in the age of social media enlightenment.
Keywords: color perception, interpretivism, thematic analysis, vignettes
Procedia PDF Downloads 128
7593 Beyond Classic Program Evaluation and Review Technique: A Generalized Model for Subjective Distributions with Flexible Variance
Authors: Byung Cheol Kim
Abstract:
The Program Evaluation and Review Technique (PERT) is widely used for project management, but it struggles with subjective distributions, particularly due to its assumptions of constant variance and light tails. To overcome these limitations, we propose the Generalized PERT (G-PERT) model, which enhances PERT by incorporating variability in three-point subjective estimates. Our methodology extends the original PERT model to cover the full range of unimodal beta distributions, enabling the model to handle thick-tailed distributions, and offers formulas for computing the mean and variance. This maintains the simplicity of PERT while providing a more accurate depiction of uncertainty. Our empirical analysis demonstrates that the G-PERT model significantly improves performance, particularly when dealing with heavy-tailed subjective distributions. In comparative assessments against alternative models such as triangular and lognormal distributions, G-PERT shows superior accuracy and flexibility. These results suggest that G-PERT offers a more robust solution for project estimation while retaining the user-friendliness of the classic PERT approach.
Keywords: PERT, subjective distribution, project management, flexible variance
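For context, the classic PERT moments that G-PERT generalizes: from three-point estimates (optimistic a, most likely m, pessimistic b), the standard assumptions fix a single beta distribution whose mean and variance are given by the well-known formulas below. The task durations are an invented example:

```python
def pert_moments(a, m, b):
    """Classic PERT moments from a three-point estimate.
    The variance depends only on the range (b - a): this is the
    constant-variance assumption that G-PERT relaxes."""
    mean = (a + 4 * m + b) / 6.0
    variance = ((b - a) / 6.0) ** 2
    return mean, variance

# Example task: 2 days optimistic, 4 most likely, 12 pessimistic.
mean, var = pert_moments(2.0, 4.0, 12.0)
print(mean, var)  # 5.0 and (10/6)^2 ~ 2.78
```

Note that any task with the same optimistic/pessimistic spread gets the same variance regardless of where the mode sits; allowing that variance to vary across the full unimodal beta family is precisely the flexibility the abstract describes.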
Procedia PDF Downloads 21
7592 Research on Knowledge Graph Inference Technology Based on Proximal Policy Optimization
Authors: Yihao Kuang, Bowen Ding
Abstract:
With the increasing scale and complexity of knowledge graphs, modern knowledge graphs contain more and more types of entity, relationship, and attribute information. In recent years, it has therefore become a trend for knowledge graph inference to use reinforcement learning to deal with large-scale, incomplete, and noisy knowledge graphs and to improve inference effectiveness and interpretability. The Proximal Policy Optimization (PPO) algorithm utilizes a near-end strategy optimization approach, which allows for more extensive updates of policy parameters while constraining the update extent to maintain training stability. This characteristic enables PPO to converge to improved strategies more rapidly, often demonstrating enhanced performance early in the training process. Furthermore, PPO has the advantage of offline learning, effectively utilizing historical experience data for training and enhancing sample utilization. This means that even with limited resources, PPO can train efficiently for reinforcement learning tasks. Based on these characteristics, this paper aims to obtain a better and more efficient inference effect by introducing PPO into knowledge inference technology.
Keywords: reinforcement learning, PPO, knowledge inference
Procedia PDF Downloads 245
7591 Development of Computational Approach for Calculation of Hydrogen Solubility in Hydrocarbons for Treatment of Petroleum
Authors: Abdulrahman Sumayli, Saad M. AlShahrani
Abstract:
For the hydrogenation process, knowing the solubility of hydrogen (H2) in hydrocarbons is critical to improving the efficiency of the process. We investigated the computation of H2 solubility in four heavy crude oil feedstocks using machine learning techniques. Temperature, pressure, and feedstock type were considered as inputs to the models, while hydrogen solubility was the sole response. Specifically, we employed three different models: support vector regression (SVR), Gaussian process regression (GPR), and Bayesian ridge regression (BRR). To achieve the best performance, the hyperparameters of these models were optimized using the whale optimization algorithm (WOA). We evaluated the models using a dataset of solubility measurements in various feedstocks and compared their performance on several metrics. Our results show that the SVR model tuned with WOA achieves the best overall performance, with an RMSE of 1.38 × 10^-2 and an R-squared of 0.991. These findings suggest that machine learning techniques can provide accurate predictions of hydrogen solubility in different feedstocks, which could be useful in the development of hydrogen-related technologies. In addition, the solubility of hydrogen in the four heavy oil fractions is estimated over temperature and pressure ranges of 150 °C-350 °C and 1.2 MPa-10.8 MPa, respectively.
Keywords: temperature, pressure variations, machine learning, oil treatment
Procedia PDF Downloads 72
7590 Neural Network Based Approach of Software Maintenance Prediction for Laboratory Information System
Authors: Vuk M. Popovic, Dunja D. Popovic
Abstract:
The software maintenance phase starts once a software project has been developed and delivered; after that, any modification to it corresponds to maintenance. Software maintenance involves modifications to keep a software project usable in a changed or changing environment, to correct discovered faults, and to improve performance or maintainability. Software maintenance and the management of software maintenance are recognized as two of the most important and most expensive processes in the life of a software product. This research bases the prediction of maintenance on risk and time evaluation, using them as data sets for working with neural networks. The aim of this paper is to provide support to project maintenance managers: they will be able to pass the issues planned for the next software service patch to experts for risk and working-time evaluation, and afterward feed all the data to neural networks in order to obtain a software maintenance prediction. This process will lead to a more accurate prediction of the working hours needed for the software service patch, which will eventually lead to better budget planning for software maintenance projects.
Keywords: laboratory information system, maintenance engineering, neural networks, software maintenance, software maintenance costs
Procedia PDF Downloads 361
7589 Impact of Hard Limited Clipping Crest Factor Reduction Technique on Bit Error Rate in OFDM Based Systems
Authors: Theodore Grosch, Felipe Koji Godinho Hoshino
Abstract:
In wireless communications, 3GPP LTE is one of the solutions to meet the growing demand for higher transmission data rates. One issue inherent to this technology is the peak-to-average power ratio (PAPR) of orthogonal frequency division multiplexing (OFDM) modulation. This high PAPR affects the efficiency of power amplifiers. One approach to mitigate this effect is the crest factor reduction (CFR) technique. In this work, we simulate the impact of the hard limited clipping crest factor reduction technique on the bit error rate (BER) in OFDM based systems. In general, the results showed that CFR has a stronger effect on higher-order digital modulation schemes, as expected. More importantly, we show the worst-case degradation due to CFR on QPSK, 16QAM, and 64QAM signals in a linear system. For example, hard clipping at 9 dB results in a 2 dB increase in required signal-to-noise energy at a 1% BER for 64-QAM modulation.
Keywords: bit error rate, crest factor reduction, OFDM, physical layer simulation
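The hard-limited clipping operation the abstract studies amounts to capping the magnitude of the OFDM time-domain signal at a threshold set by the target crest factor, while keeping the phase. A minimal sketch; the subcarrier count, QPSK mapping, and 4 dB clipping level are illustrative assumptions, not the paper's simulation parameters:

```python
import numpy as np

rng = np.random.default_rng(42)
n_sc = 256  # assumed number of subcarriers

# Random QPSK symbols on each subcarrier, then IFFT to the time domain
# (scaled so the time-domain signal has unit average power).
symbols = (rng.choice([-1, 1], n_sc) + 1j * rng.choice([-1, 1], n_sc)) / np.sqrt(2)
tx = np.fft.ifft(symbols) * np.sqrt(n_sc)

def papr_db(x):
    """Peak-to-average power ratio in dB."""
    return 10 * np.log10(np.max(np.abs(x) ** 2) / np.mean(np.abs(x) ** 2))

def hard_clip(x, cf_db):
    """Clip |x| at cf_db above the RMS level, preserving the phase."""
    threshold = np.sqrt(np.mean(np.abs(x) ** 2)) * 10 ** (cf_db / 20)
    mag = np.abs(x)
    scale = np.where(mag > threshold, threshold / mag, 1.0)
    return x * scale

clipped = hard_clip(tx, 4.0)  # target crest factor of 4 dB
```

Clipping caps the peaks but distorts the waveform (in-band noise and spectral regrowth), which is the source of the BER degradation the study quantifies against the clipping level.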
Procedia PDF Downloads 367
7588 Gamification of eHealth Business Cases to Enhance Rich Learning Experience
Authors: Kari Björn
Abstract:
The introduction of games has expanded the application area of computer-aided learning tools to a wide variety of learner age groups. Serious games engage learners in a real-world type of simulation and can enrich the learning experience. The institutional background of a Bachelor's level engineering program in Information and Communication Technology is introduced, with a detailed focus on one of its majors, Health Technology. As part of a Customer Oriented Software Application thematic semester, one particular course, 'eHealth Business and Solutions', is described and reflected on in a gamified framework. Conveying a consistent view of the vast literature on business management, strategy, marketing and finance in a very limited time forces a selection of topics relevant to the industry. Health technology is a novel and growing industry with a growing sector in consumer wearable devices and homecare applications. The business sector is attracting new entrepreneurs and impatient investor funds. From an engineering education point of view, the sector is driven by miniaturized electronics, sensors and wireless applications. However, the market is highly consumer-driven, and the usability, safety and data integrity requirements are extremely high. When the same technology is used in the analysis or treatment of patients, very strict regulatory measures are enforced. The paper introduces a course structure that uses gamification as a tool to learn what is most essential in a new market: customer value proposition design, followed by a market entry game. Students analyze the existing market size and pricing structure of the eHealth web-service market and enter the market as the steering group of their company, competing against the legacy players and against each other. The market is growing but has its rules of demand and supply balance. New products can be developed with an R&D investment and targeted to the market with unique quality and price combinations.
Product cost structure can be improved by investing to enhanced production capacity. Investments can be funded optionally by foreign capital. Students make management decisions and face the dynamics of the market competition in form of income statement and balance sheet after each decision cycle. The focus of the learning outcome is to understand customer value creation to be the source of cash flow. The benefit of gamification is to enrich the learning experience on structure and meaning of financial statements. The paper describes the gamification approach and discusses outcomes after two course implementations. Along the case description of learning challenges, some unexpected misconceptions are noted. Improvements of the game or the semi-gamified teaching pedagogy are discussed. The case description serves as an additional support to new game coordinator, as well as helps to improve the method. Overall, the gamified approach has helped to engage engineering student to business studies in an energizing way.Keywords: engineering education, integrated curriculum, learning experience, learning outcomes
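The decision cycle described in this abstract, where each choice feeds an income statement, can be sketched as a toy computation. All rules, parameter names, and numbers below are illustrative assumptions, not the course's actual game model:

```python
def decision_cycle(price, unit_cost, units_sold, rd_investment, capacity_investment):
    """Minimal income statement for one decision cycle (hypothetical rules)."""
    revenue = units_sold * price
    cogs = units_sold * unit_cost                # cost of goods sold
    gross_profit = revenue - cogs
    operating_income = gross_profit - rd_investment - capacity_investment
    return {
        "revenue": revenue,
        "cogs": cogs,
        "gross_profit": gross_profit,
        "operating_income": operating_income,
    }

# One steering-group decision: sell 1000 units at a price of 20 against a
# unit cost of 8, while investing in R&D (a new product) and in production
# capacity (to improve the cost structure in later cycles).
stmt = decision_cycle(price=20.0, unit_cost=8.0, units_sold=1000,
                      rd_investment=3000.0, capacity_investment=2000.0)
print(stmt["operating_income"])  # 7000.0
```

A real game round would additionally clear demand against all competitors' price/quality offers before computing each company's `units_sold`.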
Procedia PDF Downloads 242
7587 A Clustering-Based Approach for Weblog Data Cleaning
Authors: Amine Ganibardi, Cherif Arab Ali
Abstract:
This paper addresses the data cleaning issue as part of web usage data preprocessing within the scope of Web Usage Mining. Weblog data recorded by web servers within log files reflect usage activity, i.e., end-users' clicks and underlying user-agents' hits. As Web Usage Mining is interested in end-users' behavior, user-agents' hits are regarded as noise to be cleaned off before mining. Filtering hits from clicks is not trivial for two reasons: a server records requests interlaced in sequential order regardless of their source or type, and website resources may be set up as requestable interchangeably by end-users and user-agents. Current methods are content-centric, based on heuristics that filter relevant/irrelevant items in terms of some cleaning attributes, i.e., the filetype extensions of website resources, the resources pointed to by hyperlinks/URIs, HTTP methods, user-agents, etc. These methods need exhaustive extra-weblog data and prior knowledge of the relevant and/or irrelevant items to be assumed as clicks or hits within the filtering heuristics. Such methods are not appropriate for the dynamic/responsive Web for three reasons: resources may be set up as clickable by end-users regardless of their type, website resources may be indexed by frame names without filetype extensions, and web contents are generated and rendered differently from one end-user to another. In order to overcome these constraints, a clustering-based cleaning method centered on the logging structure is proposed. This method focuses on the statistical properties of the logging structure at the level of the requested and referring resources attributes. It is insensitive to logging content and does not need extra-weblog data. The statistical property used captures the structure of the logging trace generated by webpage requests in terms of clicks and hits.
Since a webpage consists of a single URI and several components, this structure results in a single-click-to-multiple-hits ratio in terms of the requested and referring resources. Thus, the clustering-based method is meant to identify two clusters by applying an appropriate distance to the frequency matrix of the requested and referring resources levels. As the ratio of clicks to hits is single to multiple, the clicks cluster is the smaller one in number of requests. Hierarchical Agglomerative Clustering based on a pairwise distance (Gower) and average linkage has been applied to four logfiles of dynamic/responsive websites whose click-to-hits ratios range from 1/2 to 1/15. The optimal clustering, on the basis of average linkage and maximum inter-cluster inertia, always results in two clusters. The evaluation of the smaller cluster, referred to as the clicks cluster, in terms of confusion matrix indicators results in a 97% true positive rate. The content-centric cleaning methods, i.e., conventional and advanced cleaning, resulted in a lower rate of 91%. Thus, the proposed clustering-based cleaning outperforms the content-centric methods for dynamic and responsive web design, without the need for any extra-weblog data. Such an improvement in cleaning quality is likely to refine dependent analyses.
Keywords: clustering approach, data cleaning, data preprocessing, weblog data, web usage data
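The clustering step this abstract describes, average-linkage HAC over a pairwise distance on the frequency matrix, cutting into two clusters and taking the smaller as the clicks cluster, can be sketched as follows. The frequency matrix here is a tiny hypothetical stand-in, and the Gower distance is applied in its numeric-feature special case (a range-normalized Manhattan distance):

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform

# Toy frequency matrix over requested/referring resource attributes
# (hypothetical numbers standing in for the paper's weblog-derived features).
X = np.array([
    [50.0, 48.0],    # webpage URIs: requested about as often as they refer
    [47.0, 45.0],
    [52.0, 50.0],
    [300.0, 2.0],    # embedded components: many hits, rarely act as referrers
    [280.0, 1.0],
    [310.0, 3.0],
    [295.0, 2.0],
])

# For purely numeric features, the Gower distance reduces to a
# range-normalized Manhattan distance.
ranges = X.max(axis=0) - X.min(axis=0)
n = len(X)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = np.mean(np.abs(X[i] - X[j]) / ranges)

# Average-linkage hierarchical agglomerative clustering, cut at two clusters.
Z = linkage(squareform(D), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")

# The clicks cluster is the smaller one (single click -> multiple hits).
sizes = {lab: int((labels == lab).sum()) for lab in set(labels)}
clicks_label = min(sizes, key=sizes.get)
clicks_rows = sorted(np.flatnonzero(labels == clicks_label).tolist())
print(clicks_rows)  # [0, 1, 2]: the webpage (click) resources
```

On real logfiles the rows would be per-request feature vectors parsed from the requested and referring resource fields, and the evaluation would compare the clicks cluster against a labeled ground truth.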
Procedia PDF Downloads 171
7586 Flourishing in Marriage among Arab Couples in Israel: The Impact of Capitalization Support and Accommodation on Positive and Negative Affect
Authors: Niveen Hassan-Abbas, Tammie Ronen-Rosenbaum
Abstract:
Background and purpose: 'Flourishing in marriage' is a concept that refers to married individuals' high positivity ratio regarding their marriage, namely greater reported positive than negative emotions. The study proposes a different approach to marriage, one which emphasizes the individual as largely responsible for his or her own flourishing within marriage. Accordingly, the individual's desire to preserve and strengthen the marriage largely determines marital behavior in a way that contributes to the marriage's success (Actor Effect), regardless of the partner's contribution to it (Partner Effect). Another assumption was that flourishing in marriage can be achieved by two separate processes, where capitalization support increases positive evaluations of the marriage and accommodation decreases negative ones. A theoretical model was constructed whereby individuals committed to their marriage were hypothesized to employ self-control skills by way of two dynamic processes. First, a higher degree of 'capitalization supportive responses' (supportive responses to the partner's sharing of positive personal experiences) was hypothesized to increase one's positive evaluations of marriage and thereby one's positivity ratio. Second, a higher degree of 'accommodation' responses (the ability during conflict situations to control the impulse to respond destructively and instead respond constructively) was hypothesized to decrease one's negative evaluations of marriage and thereby increase one's positivity ratio. Methods: Participants were 156 heterosexual Arab couples from different regions of Israel. The mean period of marriage was 10.19 years (SD=7.83); mean ages were 31.53 years for women (SD=8.12) and 36.80 years for men (SD=8.07). Mean years of education were 13.87 for women (SD=2.84) and 13.23 for men (SD=3.45).
Each participant completed seven questionnaires: socio-demographic, self-control skills, commitment, capitalization support, accommodation, marital quality, and positive and negative affect. Using statistical analyses adapted to a dyadic research design, descriptive statistics were first calculated and preliminary tests were performed. Next, a dyadic model based on the Actor-Partner Interdependence Model (APIM) was tested using structural equation modeling (SEM). Results: The assumption that flourishing in marriage can be achieved by two processes was confirmed. All of the Actor Effect hypotheses were confirmed. Participants with higher self-control used more capitalization support and accommodation responses. Among husbands, unlike wives, these correlations were stronger when the individual's commitment level was higher. More capitalization supportive responses were found to increase positive evaluations of marriage, and greater spousal accommodation was found to decrease negative evaluations of marriage. High positive evaluations and low negative evaluations were found to increase the positivity ratio. Contrary to expectation, four Partner Effect paths were found to be significant. Conclusions and implications: The present findings coincide with the positive psychology approach, which emphasizes human strengths. The uniqueness of this study is its proposal that individuals are largely responsible for their personal flourishing in marriage. This study demonstrated that marital flourishing can be achieved by two processes, where capitalization increases the positive and accommodation decreases the negative. Practical implications include the need to construct interventions that enhance self-control skills for the employment of capitalization responsiveness and accommodation processes.
Keywords: accommodation, capitalization support, commitment, flourishing in marriage, positivity ratio, self-control skills
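The positivity ratio at the center of this abstract, more reported positive than negative emotion, is a simple quantity. A minimal sketch, with hypothetical rating data (the actual questionnaires and scoring are not reproduced here):

```python
def positivity_ratio(positive_ratings, negative_ratings):
    """Ratio of summed positive-affect to summed negative-affect ratings."""
    return sum(positive_ratings) / sum(negative_ratings)

# Hypothetical affect ratings for one spouse (e.g., 1-5 Likert items).
pos = [4, 5, 3, 4]
neg = [2, 1, 2, 1]
ratio = positivity_ratio(pos, neg)
print(ratio)               # 16/6, i.e. about 2.67
flourishing = ratio > 1.0  # greater positive than negative, per the definition above
```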
Procedia PDF Downloads 161
7585 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals
Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh
Abstract:
Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data's distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables closely matches the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We delve into the utilization of Random Matrix Theory to analyze the behavior of our test statistic in a high-dimensional context. Specifically, we show that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of our proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we apply our method to the detection of frailty in elderly individuals.
Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly
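The paper's RMT-based statistic is not reproduced here, but the underlying task, locating a change in the covariance structure of a multivariate series, can be illustrated with a naive scan using a Frobenius-norm discrepancy between pre- and post-split sample covariances. The simulated data and all names are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated 10-dimensional series whose covariance changes at t = 100.
p, n1, n2 = 10, 100, 100
A = rng.normal(size=(p, p))
cov2 = A @ A.T / p + np.eye(p)  # post-change covariance
seg1 = rng.multivariate_normal(np.zeros(p), np.eye(p), size=n1)
seg2 = rng.multivariate_normal(np.zeros(p), cov2, size=n2)
X = np.vstack([seg1, seg2])

def cov_discrepancy(X, t):
    """Frobenius-norm gap between the sample covariances before and after t."""
    S1 = np.cov(X[:t], rowvar=False)
    S2 = np.cov(X[t:], rowvar=False)
    return np.linalg.norm(S1 - S2, ord="fro")

# Scan candidate split points (keeping a minimum window on each side)
# and take the maximizer as the estimated change point.
candidates = list(range(20, len(X) - 20))
stats = [cov_discrepancy(X, t) for t in candidates]
t_hat = candidates[int(np.argmax(stats))]
print(t_hat)
```

Unlike this sketch, the paper's approach calibrates the statistic's null distribution via Random Matrix Theory, which matters precisely when `p` is comparable to the series length and sample covariances are badly biased.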
Procedia PDF Downloads 61
7584 Solving Dimensionality Problem and Finding Statistical Constructs on Latent Regression Models: A Novel Methodology with Real Data Application
Authors: Sergio Paez Moncaleano, Alvaro Mauricio Montenegro
Abstract:
This paper presents a novel statistical methodology for measuring and finding constructs in Latent Regression Analysis. The approach uses the qualities of Factor Analysis on binary data, with interpretations from Item Response Theory (IRT). In addition, based on the fundamentals of submodel theory, and drawing together many ideas from IRT, we propose an algorithm that not only solves the dimensionality problem (nowadays an open discussion) but also opens a new research field that promises fairer and more realistic qualifications for examinees and a revolution in IRT and educational research. Finally, the methodology is applied to a real data set, presenting impressive results in coherence, speed, and precision. Acknowledgments: This research was financed by Colciencias through the project 'Multidimensional Item Response Theory Models for Practical Application in Large Test Designed to Measure Multiple Constructs'; both authors belong to the SICS Research Group of Universidad Nacional de Colombia.
Keywords: item response theory, dimensionality, submodel theory, factorial analysis
Procedia PDF Downloads 374
7583 A Refinement Strategy Coupling Event-B and Planning Domain Definition Language (PDDL) for Planning Problems
Authors: Sabrine Ammar, Mohamed Tahar Bhiri
Abstract:
Automatic planning has a de facto standard language, the Planning Domain Definition Language (PDDL), for describing planning problems. It formalizes planning problems through the concept of a state space. PDDL-related dynamic analysis tools, namely planners and validators, are insufficient for verifying and validating PDDL descriptions: these tools only make it possible to detect errors a posteriori, by means of testing. In this paper, we recommend a formal approach coupling the two languages Event-B and PDDL for automatic planning. Event-B is used for the formal modeling of planning problems by stepwise refinement with mathematical proofs. Thus, this paper proposes a refinement strategy for obtaining reliable PDDL descriptions from a final Event-B model that is correct by construction. This final model, designed to be translatable into PDDL, is automatically translated into PDDL using our MDE tool, Event-B2PDDL.
Keywords: code generation, event-b, PDDL, refinement strategy, translation rules
Procedia PDF Downloads 200
7582 On the Hirota Bilinearization of Fokas-Lenells Equation to Obtain Bright N-Soliton Solution
Authors: Sagardeep Talukdar, Gautam Kumar Saharia, Riki Dutta, Sudipta Nandy
Abstract:
In nonlinear optics, the Fokas-Lenells equation (FLE) is a well-known integrable equation that describes how ultrashort pulses propagate through optical fiber. Like any other integrable equation, it admits localized wave solutions. We apply the Hirota bilinearization method to obtain the soliton solutions of the FLE. The proposed bilinearization makes use of an auxiliary function. We apply the method to the FLE with a vanishing boundary condition, that is, to obtain bright solitons. We have obtained bright 1-soliton and 2-soliton solutions, and we propose a scheme for obtaining the N-soliton solution. We have used an additional parameter that is responsible for the shift in the position of the soliton. Further analysis of the 2-soliton solution is carried out by asymptotic analysis. We find that the suggested bilinearization approach, which makes use of the auxiliary function, greatly simplifies the process while still producing the desired outcome. We believe that the present analysis will be helpful in understanding the applications of the FLE in nonlinear optics and other areas of physics.
Keywords: asymptotic analysis, fokas-lenells equation, hirota bilinearization method, soliton
Procedia PDF Downloads 126
7581 H∞ Fuzzy Integral Power Control for DFIG Wind Energy System
Authors: N. Chayaopas, W. Assawinchaichote
Abstract:
In order to maximize the energy captured from the wind, the doubly fed induction generator (DFIG) must be controlled to extract optimal power; generator speed and output electrical power control in a wind energy system are therefore of great importance due to the nonlinear behavior of wind velocities. This paper proposes the design of a control scheme for power control of a wind energy system via an H∞ fuzzy integral controller. First, the nonlinear system is represented by a Takagi-Sugeno (TS) fuzzy model, and a linear matrix inequality (LMI) approach is used to derive the optimal controller achieving H∞ performance. The proposed control method extracts the maximum energy from the wind and overcomes the nonlinearity and disturbance problems of the wind energy system, giving good tracking performance and high-efficiency power output of the DFIG.
Keywords: doubly fed induction generator, H-infinity fuzzy integral control, linear matrix inequality, wind energy system
Procedia PDF Downloads 350
7580 The Functions of “Question” and Its Role in Education Process: Quranic Approach
Authors: Sara Tusian, Zahra Salehi Motaahed, Narges Sajjadie, Nikoo Dialame
Abstract:
One of the methods that has frequently been used in the Quran is the "question". In the Quran, in addition to the content, methods are also important. Using an analysis-interpretation method, the present study has investigated Quranic questions and extracted their functions from an educational perspective. In so doing, it has first investigated all the questions in the Quran and then, taking the three-stage classification of education into account, it has set out the functions of questions. The results obtained from this study suggest that question functions in the Quran fall into three categories: the preparation stage (including preparing the audience, revising insights, and internal evolution); the main body (including granting insight, eliminating intellectual negligence, questioning innate and logical axioms, introducing the realm of thinking, creating emotional arousal, and challenging claims); and the third stage of modification and revision (including inviting movement within the framework of tasks, using the individual's beliefs to reveal contradictions, and error detection and contribution to change), each of which has a special role in the education process.
Keywords: education, question, Quranic questions, Quran
Procedia PDF Downloads 504
7579 Holistic Approach for Natural Results in Facial Aesthetics
Authors: R. Denkova
Abstract:
Nowadays, aesthetic and psychological research in some countries shows that the aesthetic ideal for women is built on the same pattern of big volumes: lips, cheeks, facial disproportions. They all look as if they were made from a matrix, and they lose the unique and emotional aspects of their beauty. How can we escape this matrix and find the balance? The secret to being a unique injector is good assessment, creating a treatment plan, and a flawless injection strategy. The newest concepts of this new injection era, which meet the requirements of a modern society and deliver balanced and natural-looking results, are based on the idea of injecting not the consequence but the reason. Three case studies are presented with full-face assessment, treatment plan, and before/after pictures, using different approaches and techniques of the MD Codes concept and the lights-and-shadows concept in order to preserve the emotional beauty and identity of the women. In conclusion, the cases demonstrate that beauty exists even beyond the matrix, and it is the injector's mission and responsibility to preserve and highlight the natural beauty and unique identity of every patient.
Keywords: beyond the matrix, emotional beauty, face assessment, injector, treatment plan
Procedia PDF Downloads 124
7578 Parkinson's Disease Gene Identification Using Physicochemical Properties of Amino Acids
Authors: Priya Arora, Ashutosh Mishra
Abstract:
Gene identification, the pursuit of the mutated genes that lead to Parkinson's disease, poses a challenge on the way to a proactive cure for the disorder itself. Computational analysis is an effective technique for exploring genes in the form of protein sequences, as theoretical and manual analysis is infeasible. The limitations and effectiveness of a particular computational method depend entirely on the previous data available for disease identification. This article presents a sequence-based classification method for the identification of genes responsible for Parkinson's disease. In the initial phase, the physicochemical properties of amino acids transform protein sequences into a feature vector. The second phase of the method employs Jaccard distances to select negative genes from the candidate population. The third phase involves artificial neural networks for making the final predictions. The proposed approach is compared with state-of-the-art methods on the basis of F-measure. The results confirm the efficiency of the method.
Keywords: disease gene identification, Parkinson's disease, physicochemical properties of amino acid, protein sequences
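The first two phases described in this abstract, physicochemical features from protein sequences, then Jaccard-distance selection of negative genes, can be sketched in miniature. The property scale (Kyte-Doolittle hydropathy) is a standard choice but an assumption here, as are the toy sequences and the exact feature and distance definitions:

```python
# Kyte-Doolittle hydropathy, a standard physicochemical property of amino acids.
HYDROPATHY = {
    "A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
    "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
    "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "V": 4.2,
    "W": -0.9, "Y": -1.3,
}

def feature_vector(seq):
    """Toy features: mean hydropathy and fraction of hydrophobic residues."""
    vals = [HYDROPATHY[a] for a in seq]
    return (sum(vals) / len(vals), sum(v > 0 for v in vals) / len(vals))

def jaccard_distance(a, b):
    """1 - |A ∩ B| / |A ∪ B| over the sets of distinct amino acids used."""
    sa, sb = set(a), set(b)
    return 1.0 - len(sa & sb) / len(sa | sb)

positives = ["MILVA", "MLIVF"]             # hypothetical disease-gene sequences
candidates = ["MILVF", "RNDQE", "KRHDE"]   # hypothetical candidate pool

# Phase 1: featurize a positive sequence.
print(feature_vector("MILVA"))             # (3.24, 1.0)

# Phase 2: keep as negatives the candidates farthest from every positive.
def min_dist_to_positives(c):
    return min(jaccard_distance(c, p) for p in positives)

negatives = sorted(candidates, key=min_dist_to_positives, reverse=True)[:2]
print(negatives)                           # ['RNDQE', 'KRHDE']
```

Phase 3 would then train a neural-network classifier on the positive feature vectors against these selected negatives.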
Procedia PDF Downloads 142