Search results for: variable precision rough sets theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8649

7869 Housing Price Dynamics: Comparative Study of 1980-1999 and the New Millennium

Authors: Janne Engblom, Elias Oikarinen

Abstract:

The understanding of housing price dynamics is of importance to a great number of agents: to portfolio investors, banks, real estate brokers and construction companies as well as to policy makers and households. A panel dataset is one that follows a given sample of individuals over time, and thus provides multiple observations on each individual in the sample. Panel data models include a variety of fixed and random effects models which form a wide range of linear models. A special case of panel data models is dynamic in nature. A complication regarding a dynamic panel data model that includes the lagged dependent variable is the endogeneity bias of estimates. Several approaches have been developed to account for this problem. In this paper, the panel models were estimated using the Common Correlated Effects (CCE) estimator for dynamic panel data, which also accounts for the cross-sectional dependence caused by common structures of the economy. In the presence of cross-sectional dependence, standard OLS gives biased estimates. In this study, U.S. housing price dynamics were examined empirically using the dynamic CCE estimator, with the first difference of housing price as the dependent variable and the first differences of per capita income, interest rate, housing stock and lagged price, together with the deviation of housing prices from their long-run equilibrium level, as independent variables. These deviations were also estimated from the data. The aim of the analysis was to compare estimates between 1980-1999 and 2000-2012. Based on data from 50 U.S. cities over 1980-2012, the differences in short-run housing price dynamics estimates were mostly significant when the two time periods were compared. Significance tests of the differences were provided by a model containing interaction terms of the independent variables and a time dummy variable. Residual analysis showed very low cross-sectional correlation of the model residuals compared with the standard OLS approach. 
This indicates a good fit of the CCE estimator model. Estimates of the dynamic panel data model were in line with the theory of housing price dynamics. The results also suggest that housing market dynamics evolve over time.
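The period comparison described above can be illustrated with a minimal sketch. This is not the paper's CCE estimator: it is a toy single-regressor OLS on simulated data, where the slope coefficients (chosen here as 0.5 and 1.2, both hypothetical) differ between two periods, and their difference is what an interaction term with a time dummy would test.

```python
import random

def ols_slope(x, y):
    """Closed-form OLS slope for a single regressor: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

random.seed(42)
# Simulate two periods with different short-run price responses.
x1 = [random.gauss(0, 1) for _ in range(2000)]
y1 = [0.5 * xi + random.gauss(0, 0.1) for xi in x1]   # "1980-1999" slope ~0.5
x2 = [random.gauss(0, 1) for _ in range(2000)]
y2 = [1.2 * xi + random.gauss(0, 0.1) for xi in x2]   # "2000-2012" slope ~1.2

b1, b2 = ols_slope(x1, y1), ols_slope(x2, y2)
# The slope difference b2 - b1 is what the interaction-term model estimates.
```

Estimating one pooled model with a time dummy interacted with the regressor recovers the same difference and supplies a significance test for it.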

Keywords: dynamic model, panel data, cross-sectional dependence, interaction model

Procedia PDF Downloads 242
7868 Developing a Theory for Study of Transformation of Historic Cities

Authors: Sana Ahrar

Abstract:

Cities are undergoing rapid transformation with the change in lifestyle and technological advancements. These transformations may be experienced or physically visible in the built form. This paper focuses on the relationship between the social, physical environment, change in lifestyle and the interrelated factors influencing the transformation of any historic city. Shahjahanabad as a city has undergone transformation under the various political powers as well as the various policy implementations after independence. These visible traces of transformation diffused throughout the city may be due to socio-economic, historic, political factors and due to the globalization process. This study shall enable evolving a theory for the study of transformation of Historic cities such as Shahjahanabad: which has been plundered, rebuilt, and which still thrives as a ‘living heritage city’. The theory developed will be the process of studying the transformation and can be used by planners, policy makers and researchers in different urban contexts.

Keywords: heritage, historic cities, Shahjahanabad, transformation

Procedia PDF Downloads 374
7867 Precision Grinding of Titanium (Ti-6Al-4V) Alloy Using Nanolubrication

Authors: Ahmed A. D. Sarhan, Hong Wan Ping, M. Sayuti

Abstract:

In this current era of competitive machinery production, industries are designed to place more emphasis on product quality and cost reduction whilst abiding by pollution-preventing policies. In attempting to address these concerns, the industries are aware that the effectiveness of existing lubrication systems must be improved to achieve power-efficient and pollution-preventing machining processes. As such, this research studies a plausible solution to the issue in grinding titanium alloy (Ti-6Al-4V) by using nanolubrication as an alternative to flood grinding. The aim of this research is to evaluate the optimum condition of grinding force and surface roughness using an MQL lubricating system to deliver nano-oil at different levels of weight concentration of silicon dioxide (SiO2) mixed with normal mineral oil. The Taguchi Design of Experiment (DoE) method is carried out using a standard Taguchi orthogonal array of L16(4^3) to find the optimized combination of SiO2 weight concentration, nozzle orientation and MQL pressure. Surface roughness and grinding force are also analyzed using the signal-to-noise (S/N) ratio to determine the best level of each tested factor. Consequently, the best combination of parameters is tested for a period of time and the results are compared with the conventional grinding methods of dry and flood conditions. The results show a positive performance of MQL nanolubrication.
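The S/N analysis mentioned above can be sketched briefly. For responses where smaller is better (surface roughness, grinding force), the standard Taguchi ratio is S/N = -10*log10(mean(y^2)); the replicate values below are illustrative numbers, not the paper's measurements.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi signal-to-noise ratio for 'smaller is better' responses
    (e.g. surface roughness, grinding force): -10*log10(mean(y^2))."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical surface-roughness replicates (micrometres) for two trial runs.
run_a = [0.42, 0.45, 0.40]
run_b = [0.30, 0.28, 0.33]

sn_a = sn_smaller_is_better(run_a)
sn_b = sn_smaller_is_better(run_b)
# The factor level giving the higher S/N ratio is closer to the optimum.
best = "b" if sn_b > sn_a else "a"
```

Averaging such ratios over all L16 runs at each level of a factor gives the response table from which the optimal combination is read off.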

Keywords: grinding, MQL, precision grinding, Taguchi optimization, titanium alloy

Procedia PDF Downloads 262
7866 Using Computer Vision to Detect and Localize Fractures in Wrist X-ray Images

Authors: John Paul Q. Tomas, Mark Wilson L. de los Reyes, Kirsten Joyce P. Vasquez

Abstract:

The most frequent type of fracture is a wrist fracture, which is often difficult for medical professionals to detect and localize. In this study, fractures in wrist x-ray images were localized and identified using deep learning and computer vision. The researchers used image filtering, masking, morphological operations, and data augmentation for the image preprocessing, and trained the RetinaNet and Faster R-CNN models with ResNet50 backbones and Adam optimizers separately for each image filtering technique and projection. The RetinaNet model with the Anisotropic Diffusion Smoothing filter trained for 50 epochs obtained the greatest accuracy of 99.14%, precision of 100%, sensitivity/recall of 98.41%, specificity of 100%, and an IoU score of 56.44% for the posteroanterior projection utilizing augmented data. For the lateral projection using augmented data, the RetinaNet model with an Anisotropic Diffusion filter trained for 50 epochs produced the highest accuracy of 98.40%, precision of 98.36%, sensitivity/recall of 98.36%, specificity of 98.43%, and an IoU score of 58.69%. When comparing the test results of the different individual projections, models, and image filtering techniques, the Anisotropic Diffusion filter trained for 50 epochs produced the best classification and regression scores for both projections.
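The IoU scores quoted above measure overlap between a predicted fracture bounding box and the ground-truth box. A minimal sketch of the standard computation for axis-aligned boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Intersection rectangle (clamped to zero if the boxes do not overlap).
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

For example, boxes (0, 0, 2, 2) and (1, 1, 3, 3) intersect in a unit square, giving IoU = 1/7.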

Keywords: artificial intelligence, computer vision, wrist fracture, deep learning

Procedia PDF Downloads 65
7865 Exploring the Compatibility of The Rhizome and Complex Adaptive System (CAS) Theory as a Hybrid Urban Strategy Via Aggregation, Nonlinearity, and Flow

Authors: Sudaff Mohammed, Wahda Shuker Al-Hinkawi, Nada Abdulmueen Hasan

Abstract:

The compatibility of the Rhizome and Complex Adaptive System (CAS) theory as a strategy within the urban context is the essential interest of this paper, since there are only a few attempts to establish a hybrid, multi-scalar, and developable strategy based on the concepts of the Rhizome and CAS theory. This paper aims to establish a Rhizomic CAS strategy for different urban contexts by investigating the principles, characteristics, properties, and mechanisms of the Rhizome and Complex Adaptive Systems. The research focuses mainly on analyzing three properties, aggregation, non-linearity, and flow, through the lens of the Rhizome, i.e., a 'Rhizomatization' of CAS properties. The most intriguing result is that the principal and well-investigated characteristics of Complex Adaptive Systems can be 'Rhizomatized' in two ways: by highlighting commonalities between the Rhizome and Complex Adaptive Systems, and by using Rhizome-related concepts. This paper attempts to emphasize the potency of the Rhizome as an apparently stochastic and barely anticipatable structure that can be developed to analyze cities of distinctive contexts for formulating better customized urban strategies.

Keywords: rhizome, complex adaptive system (CAS), system theory, urban system, rhizomatic CAS, assemblage, human occupation impulses (HOI)

Procedia PDF Downloads 26
7864 The Influence of Variable Geometrical Modifications of the Trailing Edge of Supercritical Airfoil on the Characteristics of Aerodynamics

Authors: P. Lauk, K. E. Seegel, T. Tähemaa

Abstract:

The fuel consumption of modern, high wing loading, commercial aircraft in the first stage of flight is high because the usable flight level is lower and the weather conditions (jet stream) have a great impact on aircraft performance. To reduce fuel consumption, it is necessary to raise the L/D ratio during the first stage of flight within the Cl range of 0.55-0.65. Different variable geometrical wing trailing edge modifications of the SC(2)-410 airfoil were compared at M 0.78 using simulations in the CFD software STAR-CCM+ based on the Reynolds-averaged Navier-Stokes (RANS) equations. The numerical results obtained show that by increasing the width of the airfoil by 4% and by modifying the trailing edge of the airfoil, it is possible to decrease airfoil drag at Cl 0.70 by up to 26.6% and at the same time to increase the commercial aircraft L/D ratio by up to 5.0%. Fuel consumption can be reduced in proportion to the increase in L/D ratio.

Keywords: L/D ratio, miniflaps, mini-TED, supercritical airfoil

Procedia PDF Downloads 192
7863 The Theory of Number "0"

Authors: Iryna Shevchenko

Abstract:

The science of mathematics originated in the counting of objects and subsequently in the measurement of the size and quality of objects using logical or abstract means. The laws of mathematics are based on the study of absolute values. The number 0, or "nothing", is a purely logical (as opposed to absolute) value, as the "nothing" should always assume the space for the something that had existed there; otherwise the "something" would never come to existence. In this work we are going to prove that the number "0" is an abstract (logical) and not an absolute number, and that it has the absolute value of "∞" (infinity). Therefore, the number "0" might not stand in the row of numbers that symbolically represents the absolute values, as that would be mathematically incorrect. The symbolical value of the number "0" in the row of numbers could be represented with the symbol "∞" (infinity). As a result, we have the mathematical row of numbers: epsilon, ...4, 3, 2, 1, ∞. As the conclusions of the theory of number "0", we present the statements: multiplication and division by fractions of numbers is an illegal operation, and mathematical division by the number "0" is allowed.

Keywords: illegal operation of division and multiplication by fractions of number, infinity, mathematical row of numbers, theory of number “0”

Procedia PDF Downloads 541
7862 Reinforcement Learning the Born Rule from Photon Detection

Authors: Rodrigo S. Piera, Jailson Sales Araújo, Gabriela B. Lemos, Matthew B. Weiss, John B. DeBrota, Gabriel H. Aguilar, Jacques L. Pienaar

Abstract:

The Born rule was historically viewed as an independent axiom of quantum mechanics until Gleason derived it in 1957 by assuming the Hilbert space structure of quantum measurements [1]. In subsequent decades there have been diverse proposals to derive the Born rule starting from even more basic assumptions [2]. In this work, we demonstrate that a simple reinforcement-learning algorithm, having no pre-programmed assumptions about quantum theory, will nevertheless converge to a behaviour pattern that accords with the Born rule, when tasked with predicting the output of a quantum optical implementation of a symmetric informationally-complete measurement (SIC). Our findings support a hypothesis due to QBism (the subjective Bayesian approach to quantum theory), which states that the Born rule can be thought of as a normative rule for making decisions in a quantum world [3].
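The learning target described above can be illustrated with a toy sketch. This is not the paper's reinforcement-learning algorithm: it is a simple Robbins-Monro-style frequency tracker with no quantum assumptions, whose estimates nevertheless converge to the Born probabilities |amplitude|^2 of a simulated two-outcome measurement.

```python
import random

random.seed(0)

# A qubit state with real amplitudes (0.6, 0.8) measured in a fixed basis;
# the Born rule predicts outcome probabilities 0.36 and 0.64.
amplitudes = (0.6, 0.8)
born_probs = [a * a for a in amplitudes]

# Naive agent: keep a running estimate of each outcome's probability and
# nudge it toward each observation with a decaying learning rate.
estimates = [0.5, 0.5]
for trial in range(1, 20001):
    outcome = 0 if random.random() < born_probs[0] else 1  # simulated detector
    lr = 1.0 / trial  # 1/t schedule makes the estimate the empirical frequency
    for i in (0, 1):
        estimates[i] += lr * ((1.0 if i == outcome else 0.0) - estimates[i])
```

After many trials the estimates track the Born probabilities, which is the behavioural pattern the paper's agent is tasked to discover for a full SIC measurement.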

Keywords: quantum Bayesianism, quantum theory, quantum information, quantum measurement

Procedia PDF Downloads 93
7861 High Precision 65nm CMOS Rectifier for Energy Harvesting using Threshold Voltage Minimization in Telemedicine Embedded System

Authors: Hafez Fouad

Abstract:

Telemedicine applications operate at very low voltages, which requires a high-precision rectifier design with high sensitivity that can operate at a minimum input voltage. In this work, we targeted a 0.2 V input voltage using a 65 nm CMOS rectifier for an energy harvesting telemedicine application. The proposed rectifier, designed at 2.4 GHz using a two-stage structure, was found to perform better than previously published designs: its minimum operating voltage is lower, and the rectifier can work over a wide range of low input voltage amplitudes. Performance summary of the full-wave fully gate cross-coupled rectifier (FWFR) at F = 2.4 GHz: the minimum and maximum output voltages generated using an input voltage amplitude of 2 V are 490.9 mV and 1.997 V, with a maximum VCE of 99.85% and a maximum PCE of 46.86%. Performance summary of the differential-drive CMOS rectifier with an external bootstrapping circuit at F = 2.4 GHz: the minimum and maximum output voltages generated using an input voltage amplitude of 2 V are 265.5 mV and 1.467 V respectively, with a maximum VCE of 93.9% and a maximum PCE of 15.8%.
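The VCE figure quoted for the FWFR rectifier is consistent with the common definition of voltage conversion efficiency as the ratio of output voltage to input amplitude; a minimal sketch under that assumption (the paper may use a different convention, as the differential-drive numbers suggest):

```python
def voltage_conversion_efficiency(v_out, v_in):
    """VCE (%) assuming the ratio of rectified output voltage to the
    input voltage amplitude -- an assumed definition, not quoted above."""
    return 100.0 * v_out / v_in

# FWFR figures quoted above at F = 2.4 GHz: 1.997 V out from a 2 V input.
vce_max = voltage_conversion_efficiency(1.997, 2.0)
# 1.997 / 2.0 gives 99.85 %, matching the reported maximum VCE.
```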

Keywords: energy harvesting, embedded system, IoT telemedicine system, threshold voltage minimization, differential drive cmos rectifier, full-wave fully gate cross-coupled rectifiers CMOS rectifier

Procedia PDF Downloads 141
7860 Mecano-Reliability Approach Applied to a Water Storage Tank Placed on Ground

Authors: Amar Aliche, Hocine Hammoum, Karima Bouzelha, Arezki Ben Abderrahmane

Abstract:

Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to take into account the uncertainties related to hazards in the properties of the materials used and the applied loads. However, the use of these safety factors in the design process does not assure an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of construction safety, can respond in an adapted manner. It allows constructing a model in which uncertain data are represented by random variables, and therefore allows a better appreciation of safety margins with confidence indicators. The work presented in this paper consists of a mecano-reliability analysis of a concrete storage tank placed on ground. The classical method of Monte Carlo simulation is used to evaluate the failure probability of the concrete tank by considering the seismic acceleration as a random variable.
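The Monte Carlo step described above can be sketched in a few lines. All distributions and numbers below are illustrative stand-ins, not the tank studied in the paper: capacity and seismic demand are drawn as normal variables, and the failure probability is the fraction of samples where demand exceeds capacity.

```python
import random

random.seed(1)

def monte_carlo_failure_probability(n_samples=200000):
    """Estimate P(failure) = P(resistance < load effect) by sampling,
    treating the seismic demand as a random variable."""
    failures = 0
    for _ in range(n_samples):
        resistance = random.gauss(5.0, 1.0)    # structural capacity (illustrative)
        demand = random.gauss(3.0, 1.0)        # seismic load effect (illustrative)
        if resistance < demand:
            failures += 1
    return failures / n_samples

pf = monte_carlo_failure_probability()
# Analytic check for these toy numbers: P(N(5,1) < N(3,1)) = Phi(-2/sqrt(2)) ~ 0.0786
```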

Keywords: reliability approach, storage tanks, Monte Carlo simulation, seismic acceleration

Procedia PDF Downloads 298
7859 Antecedent and Outcome of New Product Development in Leather Industry, Bangkok and Vicinity, Thailand

Authors: Bundit Pungnirund

Abstract:

The purposes of this research were to develop and to monitor the antecedent factors which directly affected the success rate of new product development. This was a case study of the leather industry in Bangkok, Thailand. A total of 350 leather factories were used as a sample group. The findings revealed that the new product development model was harmonized with the empirical data at an acceptable level; the statistic values are: χ² = 6.45, df = 7, p-value = .48856; RMSEA = .000; RMR = .0029; AGFI = .98; GFI = 1.00. The independent variable that directly influenced the dependent variable at the highest level was marketing outcome, which had an influence coefficient of 0.32, and the independent variable that indirectly influenced the dependent variable at the highest level was a clear organization policy, which had an influence coefficient of 0.17, whereas all independent variables together can predict the model at 48 percent.

Keywords: antecedent, new product development, leather industry, Thailand

Procedia PDF Downloads 290
7858 Theoretical and Experimental Analysis of Hard Material Machining

Authors: Rajaram Kr. Gupta, Bhupendra Kumar, T. V. K. Gupta, D. S. Ramteke

Abstract:

Machining of hard materials is a recent technology for the direct production of work-pieces. The primary challenge in machining these materials is the selection of cutting tool inserts which facilitates an extended tool life and high-precision machining of the component. These materials are widely used for making precision parts for the aerospace industry. Nickel-based alloys are typically used in extreme environment applications where a combination of strength, corrosion resistance and oxidation resistance material characteristics is required. The present paper reports the theoretical and experimental investigations carried out to understand the influence of machining parameters on the response parameters. Considering the basic machining parameters (speed, feed and depth of cut), a study has been conducted to observe their influence on material removal rate, surface roughness, cutting forces and the corresponding tool wear. Experiments are designed and conducted with the help of the Central Composite Rotatable Design technique. The results reveal that, for the given range of process parameters, higher depths of cut are favorable for material removal rate and low feed rates are favorable for cutting forces. Low feed rates and high rotational speeds are suitable for better finish and higher tool life.

Keywords: speed, feed, depth of cut, roughness, cutting force, flank wear

Procedia PDF Downloads 275
7857 Efficient Human Motion Detection Feature Set by Using Local Phase Quantization Method

Authors: Arwa Alzughaibi

Abstract:

Human motion detection is a challenging task due to a number of factors including variable appearance, posture and a wide range of illumination conditions and backgrounds. So, the first need of such a model is a reliable feature set that can discriminate between a human and a non-human form with a fair amount of confidence even under difficult conditions. By having richer representations, the classification task becomes easier and improved results can be achieved. The aim of this paper is to investigate reliable and accurate human motion detection models that are able to detect human motion accurately under varying illumination levels and backgrounds. Different sets of features are tried and tested including Histogram of Oriented Gradients (HOG), Deformable Parts Model (DPM), Local Decorrelated Channel Feature (LDCF) and Aggregate Channel Feature (ACF). However, we propose an efficient and reliable human motion detection approach by combining Histogram of Oriented Gradients (HOG) and Local Phase Quantization (LPQ) as the feature set, and implementing a search pruning algorithm based on optical flow to reduce the number of false positives. Experimental results show the effectiveness of combining the local phase quantization descriptor and the histogram of oriented gradients, which perform well over a large range of illumination conditions and backgrounds compared with state-of-the-art human detectors. The Area under the ROC Curve (AUC) of the proposed method achieved 0.781 for the UCF dataset and 0.826 for the CDW dataset, which indicates that it performs comparably better than the HOG, DPM, LDCF and ACF methods.
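The AUC values reported above can be computed from raw detector scores without tracing a full ROC curve, via the Mann-Whitney formulation: AUC is the probability that a randomly chosen positive outscores a randomly chosen negative (ties count half). A minimal sketch with hypothetical scores:

```python
def auc_from_scores(pos_scores, neg_scores):
    """AUC as the probability that a random positive outscores a random
    negative (ties count half) -- the Mann-Whitney formulation."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical detector scores: 3 of 4 positive-negative pairs are ordered
# correctly, so AUC = 0.75.
auc = auc_from_scores([0.8, 0.3], [0.5, 0.2])
```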

Keywords: human motion detection, histograms of oriented gradient, local phase quantization

Procedia PDF Downloads 247
7856 Enhancing Coping Strategies of Student: A Case Study of 'Choice Theory' Group Counseling

Authors: Warakorn Supwirapakorn

Abstract:

The purpose of this research was to study the effects of choice theory group counseling on the coping strategies of students. The sample consisted of 16 students at a boarding school who had the lowest scores on coping strategies. The sample was randomly assigned to an experimental group and a control group, with eight members each. The instruments were the Adolescent Coping Scale and a choice theory group counseling program. The data collection procedure was divided into three phases: the pre-test, the post-test, and the follow-up. The data were analyzed by repeated-measures analysis of variance with one between-subjects factor and one within-subjects factor. The results revealed that the interaction between the methods and the duration of the experiment was statistically significant at the 0.05 level. The students in the experimental group demonstrated significantly higher coping strategies scores, at the 0.05 level, in both the post-test and the follow-up than in the pre-test and than the control group. No significant difference in coping strategies was found between the post-test phase and the follow-up phase of the experimental group.

Keywords: coping strategies, choice theory, group counseling, boarding school

Procedia PDF Downloads 200
7855 Relative Composition of Executive Compensation Packages, Corporate Governance and Financial Reporting Quality

Authors: Philemon Rakoto

Abstract:

Most executive compensation packages consist of four major components: base fixed salary, annual and long-term non-equity incentive plans, share-based and option-based awards and pension value. According to agency theory, the relative composition of executive compensation packages is one of the mechanisms that firms use to align the interests of executives and shareholders in order to mitigate agency costs. This paper tests the effect of the relative composition of executive compensation packages on financial reporting quality. Financial reporting quality is measured by the value relevance of accounting earnings. Corporate governance is a moderating variable in the model. Using data from Canadian firms composing S&P/TSX index of the year 2013 and governance scores based on Board Games, the analysis shows that, only for firms with good governance, there is an optimal level of the proportion of executive equity-based compensation in relation to total compensation that enhances the quality of financial reporting.

Keywords: Canada, corporate governance, executive compensation packages, financial reporting quality

Procedia PDF Downloads 335
7854 Composite Laminate and Thin-Walled Beam Correlations for Aircraft Wing Box Design

Authors: S. J. M. Mohd Saleh, S. Guo

Abstract:

Composite materials have become an important option for the primary structure of aircraft due to their design flexibility and ability to improve the overall performance. At present, the option for composite usage in aircraft component is largely based on experience, knowledge, benchmarking and partly market driven. An inevitable iterative design during the design stage and validation process will increase the development time and cost. This paper aims at presenting the correlation between laminate and composite thin-wall beam structure, which contains the theoretical and numerical investigations on stiffness estimation of composite aerostructures with applications to aircraft wings. Classical laminate theory and thin-walled beam theory were applied to define the correlation between 1-dimensional composite laminate and 2-dimensional composite beam structure, respectively. Then FE model was created to represent the 3-dimensional structure. A detailed study on stiffness matrix of composite laminates has been carried out to understand the effects of stacking sequence on the coupling between extension, shear, bending and torsional deformation of wing box structures for 1-dimensional, 2-dimensional and 3-dimensional structures. Relationships amongst composite laminates and composite wing box structures of the same material have been developed in this study. These correlations will be guidelines for the design engineers to predict the stiffness of the wing box structure during the material selection process and laminate design stage.
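The classical lamination theory step described above can be sketched for the extensional (in-plane) stiffness terms: reduced stiffnesses Q of each unidirectional ply are rotated to the ply angle and summed through the thickness, A_ij = Σ Qbar_ij * t_k. The material values below are illustrative carbon/epoxy numbers, not the paper's wing-box laminate.

```python
import math

def q_matrix(E1, E2, G12, v12):
    """Reduced stiffness terms of a unidirectional ply (plane stress)."""
    v21 = v12 * E2 / E1
    d = 1.0 - v12 * v21
    return {"Q11": E1 / d, "Q22": E2 / d, "Q12": v12 * E2 / d, "Q66": G12}

def qbar(q, theta_deg):
    """Transformed (off-axis) reduced stiffnesses for a ply at angle theta."""
    m, n = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    Q11, Q22, Q12, Q66 = q["Q11"], q["Q22"], q["Q12"], q["Q66"]
    return {
        "Q11": Q11*m**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*n**4,
        "Q22": Q11*n**4 + 2*(Q12 + 2*Q66)*m**2*n**2 + Q22*m**4,
        "Q12": (Q11 + Q22 - 4*Q66)*m**2*n**2 + Q12*(m**4 + n**4),
        "Q66": (Q11 + Q22 - 2*Q12 - 2*Q66)*m**2*n**2 + Q66*(m**4 + n**4),
        "Q16": (Q11 - Q12 - 2*Q66)*m**3*n + (Q12 - Q22 + 2*Q66)*m*n**3,
        "Q26": (Q11 - Q12 - 2*Q66)*m*n**3 + (Q12 - Q22 + 2*Q66)*m**3*n,
    }

def a_matrix(plies):
    """Extensional stiffness terms A_ij = sum over plies of Qbar_ij * t_k."""
    A = {k: 0.0 for k in ("Q11", "Q22", "Q12", "Q66", "Q16", "Q26")}
    for q, theta, t in plies:
        qb = qbar(q, theta)
        for k in A:
            A[k] += qb[k] * t
    return A

# Illustrative carbon/epoxy ply (moduli in GPa) in a [0/90] cross-ply layup
# of 0.125 mm plies; the extension-shear coupling terms A16, A26 vanish.
q = q_matrix(E1=140.0, E2=10.0, G12=5.0, v12=0.3)
A = a_matrix([(q, 0.0, 0.125), (q, 90.0, 0.125)])
```

Checking which A (and, in the full theory, B and D) terms vanish for a candidate stacking sequence is exactly the coupling analysis the abstract describes for wing-box design.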

Keywords: aircraft design, aircraft structures, classical lamination theory, composite structures, laminate theory, structural design, thin-walled beam theory, wing box design

Procedia PDF Downloads 219
7853 Non-Cooperative Game Theory Approach for Ensuring Community Satisfaction on Public-Private Partnership Projects

Authors: Jason Salim, Zhouyang Lu

Abstract:

Private sector involvement in Public-Private Partnership (PPP) projects may raise public suspicion, as PPP is often mistaken as merely a partnership between private and government agencies without consideration for the greater "public" (community). This public marginalization is crucial to deal with, because undermining the opinion of the majority may cause problems such as protests and/or low demand. The game theory approach applied in this paper shows that the probability of public acceptance of a project is affected by the overall public perception of the private sector's possible profit accumulation from the project. On the contrary, the goodwill of the government and private coalition alone is not enough to minimize the probability of public opposition towards a PPP project. Additionally, the threat of loss or damage arising from public opposition does not affect the profit-maximizing behavior of the private sector.
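The non-cooperative reasoning above rests on checking best responses: an outcome is a Nash equilibrium when neither player gains from a unilateral deviation. A minimal sketch for a 2x2 bimatrix game with hypothetical payoffs (not the paper's model), where mutual self-interested play is the unique equilibrium even though cooperation pays both sides more:

```python
def pure_nash_equilibria(payoff_row, payoff_col):
    """Pure-strategy Nash equilibria of a 2x2 bimatrix game, found by
    checking that neither player profits from a unilateral deviation."""
    equilibria = []
    for i in (0, 1):        # row player's strategy (e.g. public coalition)
        for j in (0, 1):    # column player's strategy (e.g. private sector)
            best_i = all(payoff_row[i][j] >= payoff_row[k][j] for k in (0, 1))
            best_j = all(payoff_col[i][j] >= payoff_col[i][k] for k in (0, 1))
            if best_i and best_j:
                equilibria.append((i, j))
    return equilibria

# Prisoner's-dilemma-like illustrative payoffs: strategy 0 = cooperate,
# strategy 1 = act in pure self-interest.
row = [[3, 0], [5, 1]]
col = [[3, 5], [0, 1]]
eq = pure_nash_equilibria(row, col)  # [(1, 1)]: mutual self-interest
```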

Keywords: community satisfaction, game theory, non-cooperative, PPP, public policy

Procedia PDF Downloads 681
7852 2D Ferromagnetism in Van der Waals Bonded Fe₃GeTe₂

Authors: Ankita Tiwari, Jyoti Saini, Subhasis Ghosh

Abstract:

For many years, researchers have been fascinated by the question of how properties evolve as dimensionality is lowered. Early on, it was shown that the presence of a significant magnetic anisotropy might compensate for the lack of long-range (LR) magnetic order in a low-dimensional system (d < 3) with continuous symmetry, as proposed by Hohenberg, Mermin and Wagner (HMW). Strong magnetic anisotropy allows an LR magnetic order to stabilize in two dimensions (2D) even in the presence of the stronger thermal fluctuations which are responsible for the absence of Heisenberg ferromagnetism in 2D. Van der Waals (vdW) ferromagnets, including CrI₃, CrTe₂, Cr₂X₂Te₆ (X = Si and Ge) and Fe₃GeTe₂, offer a nearly ideal platform for studying ferromagnetism in 2D. Fe₃GeTe₂ is the subject of extensive investigation due to its tunable magnetic properties, high Curie temperature (Tc ~ 220 K), and perpendicular magnetic anisotropy. These appealing features of Fe₃GeTe₂ have spurred considerable activity in spintronic device development. Although it is known that LR-driven ferromagnetism is necessary to get around the HMW theorem in 2D, the experimental realization of Heisenberg 2D ferromagnetism remains elusive in condensed matter systems. Here, we show that Fe₃GeTe₂ hosts both localized and delocalized spins, resulting in itinerant and local-moment ferromagnetism. The presence of LR itinerant interaction facilitates the stabilization of a Heisenberg ferromagnet in 2D. With the help of Rhodes-Wohlfarth (RW) and generalized RW-based analysis, Fe₃GeTe₂ has been shown to be a 2D ferromagnet with itinerant magnetism that can be modulated by an external magnetic field. Hence, the presence of both local-moment and itinerant magnetism has made this system interesting for research in low dimensions. We have also rigorously performed critical analysis using an improvised method. We show that the variable critical exponents are typical signatures of 2D ferromagnetism in Fe₃GeTe₂. The spontaneous magnetization exponent β changes the universality class from mean-field to 2D Heisenberg with field. We have also confirmed the range of interaction via renormalization group (RG) theory. According to RG theory, Fe₃GeTe₂ is a 2D ferromagnet with LR interactions.

Keywords: Van der Waal ferromagnet, 2D ferromagnetism, phase transition, itinerant ferromagnetism, long range order

Procedia PDF Downloads 60
7851 Urban Planning Compilation Problems in China and the Corresponding Optimization Ideas under the Vision of the Hyper-Cycle Theory

Authors: Hong Dongchen, Chen Qiuxiao, Wu Shuang

Abstract:

Systematic science reveals the complex nonlinear mechanisms of behaviour in urban systems. However, in China, when current city planners face this system, most still apply simple linear thinking to an open, complex giant system. This paper introduces the hyper-cycle theory, one of the basic theories of systematic science, analyses the reasons why current urban planning has failed, and proposes optimization ideas for how urban planning compilation should change: from controlling quantities to shaping relationships, from blueprint planning to progressive planning based on nonlinear characteristics, and from management control to dynamic monitoring and feedback.

Keywords: systematic science, hyper-cycle theory, urban planning, urban management

Procedia PDF Downloads 387
7850 Minimum-Fuel Optimal Trajectory for Reusable First-Stage Rocket Landing Using Particle Swarm Optimization

Authors: Kevin Spencer G. Anglim, Zhenyu Zhang, Qingbin Gao

Abstract:

Reusable launch vehicles (RLVs) present a more environmentally-friendly approach to accessing space when compared to traditional launch vehicles that are discarded after each flight. This paper studies the recyclable nature of RLVs by presenting a solution method for determining minimum-fuel optimal trajectories using principles from optimal control theory and particle swarm optimization (PSO). This problem is formulated as a minimum-landing error powered descent problem where it is desired to move the RLV from a fixed set of initial conditions to three different sets of terminal conditions. However, unlike other powered descent studies, this paper considers the highly nonlinear effects caused by atmospheric drag, which are often ignored for studies on the Moon or on Mars. Rather than optimizing the controls directly, the throttle control is assumed to be bang-off-bang with a predetermined thrust direction for each phase of flight. The PSO method is verified in a one-dimensional comparison study, and it is then applied to the two-dimensional cases, the results of which are illustrated.
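The PSO machinery referenced above can be sketched generically. This is not the paper's trajectory-optimization setup (no dynamics, drag, or bang-off-bang throttle): it is a minimal particle swarm with the standard inertia, cognitive, and social terms, minimizing a simple test function.

```python
import random

random.seed(7)

def pso_minimize(f, dim=2, n_particles=30, iters=200, bounds=(-5.0, 5.0)):
    """Minimal particle swarm optimizer: each particle tracks its own best
    position, and the swarm shares a global best that attracts everyone."""
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive, and social weights
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere function as a stand-in objective; in the paper's setting f would
# score a candidate trajectory's fuel use and landing error.
best, best_val = pso_minimize(lambda p: sum(x * x for x in p))
```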

Keywords: minimum-fuel optimal trajectory, particle swarm optimization, reusable rocket, SpaceX

Procedia PDF Downloads 267
7849 Analysis of Cardiac Health Using Chaotic Theory

Authors: Chandra Mukherjee

Abstract:

The prevalent knowledge of biological systems is based on the standard scientific perception of natural equilibrium, determinism and predictability. Recently, a rethinking of concepts was presented and a new scientific perspective emerged that combines complexity theory with deterministic chaos theory, nonlinear dynamics and the theory of fractals. The unpredictability of chaotic processes would probably change our understanding of diseases and their management. Mathematically, chaos is defined as deterministic behavior with irregular patterns that obey mathematical equations which are critically dependent on initial conditions. Chaos theory is the branch of science with an interest in nonlinear dynamics, fractals, bifurcations, periodic oscillations and complexity. Recently, the biomedical interest in this scientific field has made these mathematical concepts available to medical researchers and practitioners. Any biological network system is considered to have a nominal state, which is recognized as a homeostatic state. In reality, the different physiological systems are not, under normal conditions, in a stable state of homeostatic balance; rather, they are in a dynamically stable state with chaotic behavior and complexity. Biological systems like heart rhythm and brain electrical activity are dynamical systems that can be classified as chaotic systems with sensitive dependence on initial conditions. In biological systems, the state of a disease is characterized by a loss of complexity and chaotic behavior, and by the presence of pathological periodicity and regulatory behavior. The failure or collapse of nonlinear dynamics is an indication of disease rather than a characteristic of health.
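The "sensitive dependence on initial conditions" invoked above is easy to demonstrate with the textbook logistic map, a standard toy chaotic system (not a cardiac model): two trajectories starting one part in 10^8 apart diverge to order-one separation within a few dozen iterations.

```python
def logistic_trajectory(x0, r=4.0, steps=100):
    """Iterate the logistic map x -> r*x*(1-x), chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two initial conditions differing by one part in 10^8.
a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-8)
max_divergence = max(abs(x - y) for x, y in zip(a, b))
# Despite the tiny initial difference, the trajectories separate completely.
```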

Keywords: HRV, HRVI, LF, HF, DII

Procedia PDF Downloads 409
7848 Balancing Resources and Demands in Activation Work with Young Adults: Exploring Potentials of the Job Demands-Resources Theory

Authors: Gurli Olsen, Ida Bruheim Jensen

Abstract:

Internationally, many young adults not in education, employment, or training (NEET) remain in temporary solutions such as labour market measures or other forms of welfare arrangements. These trends have been associated with ineffective labour market measures, an underfunded theoretical foundation for activation work, limited competence among social workers and labour market employees in using ordinary workplaces as job inclusion measures, and an overemphasis on young adults’ personal limitations such as health challenges and lack of motivation. Two competing models have been prominent in activation work: Place‐Then‐Train and Train‐Then‐Place. A traditional strategy for labour market measures has been to first motivate NEETs to sheltered work and training and then to the regular labour market (train then place). Measures such as Supported Employment (SE) and Individual Placement and Support (IPS) advocate for rapid entry into paid work at the regular labour market with close supervision and training from social workers, employees, and others (place then train). None of these models demonstrate unquestionable results. In this web of working life measures, young adults (NEETs) experience a lack of confidence in their own capabilities and coping strategies vis-á-vis labour market- and educational demands. Drawing on young adults’ own experiences, we argue that the Job Demands-Resources (JD-R) Theory can contribute to the theoretical and practical dimensions of activation work. This presentation will focus on what the JD-R theory entails and how it can be fruitful in activation work with NEETs (what and how). The overarching rationale of the JD-R theory is that an enduring balance between demands (e.g., deadlines, working hours) and resources (e.g., social support, enjoyable work tasks) is important for job performance for people in any job and potentially in other meaningful activities. 
Extensive research has demonstrated that a balance between demands and resources increases motivation and decreases stress. Nevertheless, we have not identified literature on the JD-R theory in activation work with young adults.

Keywords: activation work, job demands-resources theory, social work, theory development

Procedia PDF Downloads 69
7847 Investigating the Effectiveness of Multilingual NLP Models for Sentiment Analysis

Authors: Othmane Touri, Sanaa El Filali, El Habib Benlahmar

Abstract:

Natural Language Processing (NLP) has gained significant attention lately. It has proved its ability to analyze and extract insights from unstructured text data in various languages. It is found that one of the most popular NLP applications is sentiment analysis which aims to identify the sentiment expressed in a piece of text, such as positive, negative, or neutral, in multiple languages. While there are several multilingual NLP models available for sentiment analysis, there is a need to investigate their effectiveness in different contexts and applications. In this study, we aim to investigate the effectiveness of different multilingual NLP models for sentiment analysis on a dataset of online product reviews in multiple languages. The performance of several NLP models, including Google Cloud Natural Language API, Microsoft Azure Cognitive Services, Amazon Comprehend, Stanford CoreNLP, spaCy, and Hugging Face Transformers are being compared. The models based on several metrics, including accuracy, precision, recall, and F1 score, are being evaluated and compared to their performance across different categories of product reviews. In order to run the study, preprocessing of the dataset has been performed by cleaning and tokenizing the text data in multiple languages. Then training and testing each model has been applied using a cross-validation approach where randomly dividing the dataset into training and testing sets and repeating the process multiple times has been used. A grid search approach to optimize the hyperparameters of each model and select the best-performing model for each category of product reviews and language has been applied. The findings of this study provide insights into the effectiveness of different multilingual NLP models for Multilingual Sentiment Analysis and their suitability for different languages and applications. 
The strengths and limitations of each model were identified, and recommendations for selecting the most performant model based on the specific requirements of a project were provided. This study contributes to the advancement of research methods in multilingual NLP and provides a practical guide for researchers and practitioners in the field.

Keywords: NLP, multilingual, sentiment analysis, texts

Procedia PDF Downloads 85
7846 An Interpretive Study of Entrepreneurial Experience towards Achieving Business Growth Using the Theory of Planned Behaviour as a Lens

Authors: Akunna Agunwah, Kevin Gallimore, Kathryn Kinmond

Abstract:

Entrepreneurship is widely associated and seen as a vehicle for economic growth; however, different scholars have studied entrepreneurship from various perspectives, resulting in multiple definitions. It is surprising to know most entrepreneurship definition does not incorporate growth as part of their definition of entrepreneurship. Economic growth is engineered by the activities of the entrepreneurs. The purpose of the present theoretical study is to explore the working practices of the successful entrepreneurs towards achieving business growth by understanding the experiences of the entrepreneur using the Theory of Planned Behaviour (TPB) as a lens. Ten successful entrepreneurs in the North West of England in various business sectors were interviewed using semi-structured interview method. The recorded audio interviews transcribed and subsequently evaluated using the thematic deductive technique (qualitative approach). The themes were examined using Theory of Planned Behaviour to ascertain the presence of the three intentional antecedents (attitude, subjective norms, and perceived behavioural control). The findings categorised in two folds, firstly, it was observed that the three intentional antecedents, which make up Theory of Planned Behaviour were evident in the transcript. Secondly, the entrepreneurs are most concerned with achieving a state of freedom and realising their visions and ambitions. Nevertheless, the entrepreneur employed these intentional antecedents to enhance business growth. In conclusion, the work presented here showed a novel way of understanding the working practices and experiences of the entrepreneur using the theory of planned behaviour in qualitative approach towards enhancing business growth. There exist few qualitative studies in entrepreneurship research. 
In addition, this work applies a novel approach to studying the experience of the entrepreneurs by examining the working practices of the successful entrepreneurs in the North-West England through the lens of the theory of planned behaviour. Given the findings regarding TPB as a lens in the study, the entrepreneur does not differentiate between the categories of the antecedents reasonably sees them as processes that can be utilised to enhance business growth.

Keywords: business growth, experience, interpretive, theory of planned behaviour

Procedia PDF Downloads 200
7845 Organizational Culture and Its Internalization of Change in the Manufacturing and Service Sector Industries in India

Authors: Rashmi Uchil, A. H. Sequeira

Abstract:

Post-liberalization era in India has seen an unprecedented growth of mergers, both domestic as well as cross-border deals. Indian organizations have slowly begun appreciating this inorganic method of growth. However, all is not well as is evidenced in the lowering value creation of organizations after mergers. Several studies have identified that organizational culture is one of the key factors that affects the success of mergers. But very few studies have been attempted in this realm in India. The current study attempts to identify the factors in the organizational culture variable that may be unique to India. It also focuses on the difference in the impact of organizational culture on merger of organizations in the manufacturing and service sectors in India. The study uses a mixed research approach. An exploratory research approach is adopted to identify the variables that constitute organizational culture specifically in the Indian scenario. A few hypotheses were developed from the identified variables and tested to arrive at the Grounded Theory. The Grounded Theory approach used in the study, attempts to integrate the variables related to organizational culture. Descriptive approach is used to validate the developed grounded theory with a new empirical data set and thus test the relationship between the organizational culture variables and the success of mergers. Empirical data is captured from merged organizations situated in major cities of India. These organizations represent significant proportions of the total number of organizations which have adopted mergers. The mix of industries included software, banking, manufacturing, pharmaceutical and financial services. Mixed sampling approach was adopted for this study. The first phase of sampling was conducted using the probability method of stratified random sampling. The study further used the non-probability method of judgmental sampling. 
Adequate sample size was identified for the study which represents the top, middle and junior management levels of the organizations that had adopted mergers. Validity and reliability of the research instrument was ensured with appropriate tests. Statistical tools like regression analysis, correlation analysis and factor analysis were used for data analysis. The results of the study revealed a strong relationship between organizational culture and its impact on the success of mergers. The study also revealed that the results were unique to the extent that they highlighted a marked difference in the manner of internalization of change of organizational culture after merger by the organizations in the manufacturing sector. Further, the study reveals that the organizations in the service sector internalized the changes at a slower rate. The study also portrays the industries in the manufacturing sector as more proactive and can contribute to a change in the perception of the said organizations.

Keywords: manufacturing industries, mergers, organizational culture, service industries

Procedia PDF Downloads 286
7844 Antecedents of Online Trust Towards E-Retailers for Repeat Buyers: An Empirical Study in Indian Context

Authors: Prageet Aeron, Shilpi Jain

Abstract:

The present work explores the trust building mechanisms in the context of e-commerce vendors and reconciles trust as a cognitive as well as a knowledge-based mechanism in the framework which is developed. The paper conducts an empirical examination of the variables integrity, benevolence, and ability with trust as the dependent variable and propensity to trust as the mediating variable. Authors establish ability and integrity as primary antecedents as well as establish the central role of trust propensity in the online context for Indian buyers. Authors further identify that benevolence in the context of Indian buyers online behaviour seems insignificant, and this seems counter-intutive given the role of discounts in the Indian market. Lastly, authors conclude that the role of media and social influencers in building a perception of trust seems of little consequence.

Keywords: e-commerce, trust, e-retailers, propensity to trust

Procedia PDF Downloads 341
7843 DeepNIC a Method to Transform Each Tabular Variable into an Independant Image Analyzable by Basic CNNs

Authors: Nguyen J. M., Lucas G., Ruan S., Digonnet H., Antonioli D.

Abstract:

Introduction: Deep Learning (DL) is a very powerful tool for analyzing image data. But for tabular data, it cannot compete with machine learning methods like XGBoost. The research question becomes: can tabular data be transformed into images that can be analyzed by simple CNNs (Convolutional Neuron Networks)? Will DL be the absolute tool for data classification? All current solutions consist in repositioning the variables in a 2x2 matrix using their correlation proximity. In doing so, it obtains an image whose pixels are the variables. We implement a technology, DeepNIC, that offers the possibility of obtaining an image for each variable, which can be analyzed by simple CNNs. Material and method: The 'ROP' (Regression OPtimized) model is a binary and atypical decision tree whose nodes are managed by a new artificial neuron, the Neurop. By positioning an artificial neuron in each node of the decision trees, it is possible to make an adjustment on a theoretically infinite number of variables at each node. From this new decision tree whose nodes are artificial neurons, we created the concept of a 'Random Forest of Perfect Trees' (RFPT), which disobeys Breiman's concepts by assembling very large numbers of small trees with no classification errors. From the results of the RFPT, we developed a family of 10 statistical information criteria, Nguyen Information Criterion (NICs), which evaluates in 3 dimensions the predictive quality of a variable: Performance, Complexity and Multiplicity of solution. A NIC is a probability that can be transformed into a grey level. The value of a NIC depends essentially on 2 super parameters used in Neurops. By varying these 2 super parameters, we obtain a 2x2 matrix of probabilities for each NIC. We can combine these 10 NICs with the functions AND, OR, and XOR. The total number of combinations is greater than 100,000. In total, we obtain for each variable an image of at least 1166x1167 pixels. 
The intensity of the pixels is proportional to the probability of the associated NIC. The color depends on the associated NIC. This image actually contains considerable information about the ability of the variable to make the prediction of Y, depending on the presence or absence of other variables. A basic CNNs model was trained for supervised classification. Results: The first results are impressive. Using the GSE22513 public data (Omic data set of markers of Taxane Sensitivity in Breast Cancer), DEEPNic outperformed other statistical methods, including XGBoost. We still need to generalize the comparison on several databases. Conclusion: The ability to transform any tabular variable into an image offers the possibility of merging image and tabular information in the same format. This opens up great perspectives in the analysis of metadata.

Keywords: tabular data, CNNs, NICs, DeepNICs, random forest of perfect trees, classification

Procedia PDF Downloads 104
7842 Possibility Theory Based Multi-Attribute Decision-Making: Application in Facility Location-Selection Problem under Uncertain and Extreme Environment

Authors: Bezhan Ghvaberidze

Abstract:

A fuzzy multi-objective facility location-selection problem (FLSP) under uncertain and extreme environments based on possibility theory is developed. The model’s uncertain parameters in the q-rung orthopair fuzzy values are presented and transformed in the Dempster-Shaper’s belief structure environment. An objective function – distribution centers’ selection ranking index as an extension of Dempster’s extremal expectations under discrimination q-rung orthopair fuzzy information is constructed. Experts evaluate each humanitarian aid from distribution centers (HADC) against each of the uncertain factors. HADCs location problem is reduced to the bicriteria problem of partitioning the set of customers by the set of centers: (1) – Minimization of transportation costs; (2) – Maximization of centers’ selection ranking indexes. Partitioning type constraints are also constructed. For an illustration of the obtained results, a numerical example is created from the facility location-selection problem.

Keywords: FLSP, multi-objective combinatorial optimization problem, evidence theory, HADC, q-rung orthopair fuzzy set, possibility theory

Procedia PDF Downloads 106
7841 Songs from the Cradle: An Analysis of Some Selected Nupe Songs

Authors: Zainab Zendana Shafii

Abstract:

Lullabies have been broadly defined as songs that are sung to calm and soothe children. While this is true, this paper intends to show that lullabies exceed these functions. The paper, in exploring Nupe lullabies, examines the various functions that lullabies perform in terms of language development, cultural enrichment and also the retelling of history as it relates to the culture of the Nupe people of northern Nigeria. The theoretical framework used is the functionalist theory. This theory postulates that all cultural or social phenomena have a positive function and that all are indispensable. The functionalist theory is based on the premise that all aspects of a society—institutions, roles, norms, etc.—serve a purpose and that all are indispensable for the long-term survival of the society. To this end, this paper dissects the various lullabies in Nupeland with a view to exploring the meaning that these songs generate and why they are even sung at all. The qualitative research methodology has been used to gather materials.

Keywords: Nupe, lullabies, Nigeria, northern

Procedia PDF Downloads 183
7840 Disassociating Preferences from Evaluations Towards Pseudo Drink Brands

Authors: Micah Amd

Abstract:

Preferences towards unfamiliar drink brands can be predictably influenced following correlations of subliminally-presented brands (CS) with positively valenced attributes (US). Alternatively, evaluations towards subliminally-presented CS may be more variable, suggesting that CS-evoked evaluations may disassociate from CS-associated preferences following subliminal CS-US conditioning. We assessed this hypothesis over three experiments (Ex1, Ex2, Ex3). Across each experiment, participants first provided preferences and evaluations towards meaningless trigrams (CS) as a baseline, followed by conditioning and a final round of preference and evaluation measurements. During conditioning, four pairs of subliminal and supraliminal/visible CS were respectively correlated with four US categories varying along aggregate valence (e.g., 100% positive, 80% positive, 40% positive, 0% positive – for Ex1 and Ex2). Across Ex1 and Ex2, presentation durations for subliminal CS were 34 and 17 milliseconds, respectively. Across Ex3, aggregate valences of the four US categories were altered (75% positive, 55% positive, 45% positive, 25% positive). Valence across US categories was manipulated to address a supplemental query of whether US-to-CS valence transfer was summative or integrative. During analysis, we computed two sets of difference scores reflecting pre-post preference and evaluation performances, respectively. These were subjected to Bayes tests. Across all experiments, results illustrated US-to-CS valence transfer was most likely to shift evaluations for visible CS, but least likely to shift evaluations for subliminal CS. Alternatively, preferences were likely to shift following correlations with single-valence categories (e.g., 100% positive, 100% negative) across both visible and subliminal CS. Our results suggest that CS preferences can be influenced through subliminal conditioning even as CS evaluations remain unchanged, supporting our central hypothesis. 
As for whether transfer effects are summative/integrative, our results were more mixed; a comparison of relative likelihoods revealed that preferences are more likely to reflect summative effects whereas evaluations reflect integration, independent of visibility condition.

Keywords: subliminal conditioning, evaluations, preferences, valence transfer

Procedia PDF Downloads 144