Search results for: random matrix theory
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 8554

7834 Generalized Mean-Field Theory of Phase Unwrapping via Multiple Interferograms

Authors: Yohei Saika

Abstract:

On the basis of Bayesian inference using the maximizer of the posterior marginal estimate, we carry out phase unwrapping from multiple interferograms via generalized mean-field theory. Numerical calculations for a typical wave-front in remote sensing using synthetic aperture radar interferometry show, via the phase diagram in hyper-parameter space, that the present method achieves perfect phase unwrapping under the surface-consistency constraint if the interferograms are not corrupted by noise. We also find that the prior is useful for extending the region in which phase unwrapping succeeds under the surface-consistency constraint. These results are quantitatively confirmed by Monte Carlo simulation.
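The core operation, recovering an absolute phase from principal values wrapped into (−π, π], can be illustrated in one dimension by integrating wrapped phase differences (a minimal NumPy sketch, not the paper's mean-field method):

```python
import numpy as np

# True (unwrapped) phase: a steadily increasing ramp.
true_phase = np.linspace(0, 6 * np.pi, 200)

# Wrapped phase, as an interferometer would measure it (principal values in (-pi, pi]).
wrapped = np.angle(np.exp(1j * true_phase))

# 1D unwrapping: correct jumps larger than pi by adding multiples of 2*pi.
recovered = np.unwrap(wrapped)
assert np.allclose(recovered, true_phase)
```

This succeeds because neighboring samples differ by less than π; the noisy 2D case, which is what the Bayesian formulation addresses, is much harder.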

Keywords: Bayesian inference, generalized mean-field theory, phase unwrapping, multiple interferograms, statistical mechanics

Procedia PDF Downloads 474
7833 Dark Gravity Confronted with Supernovae, Baryonic Oscillations and Cosmic Microwave Background Data

Authors: Frederic Henry-Couannier

Abstract:

Dark Gravity is a natural extension of general relativity in the presence of a flat, non-dynamical background. Matter and radiation fields from its dark sector, as soon as their gravity dominates over the gravity of the fields on our side, produce a constant-acceleration law for the scale factor. After a brief reminder of the foundations of the Dark Gravity theory, the confrontation with the main cosmological probes is carried out. We show that, remarkably, the sudden transition between the usual matter-dominated decelerated expansion law a(t) ∝ t²/³ and the accelerated expansion law a(t) ∝ t² predicted by the theory should be able to fit the main cosmological probes (SN, BAO, CMB, and the ages of the oldest stars), as well as direct H₀ measurements, with only two free parameters: H₀ and the transition redshift.
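For a power-law scale factor a(t) ∝ tⁿ one has ȧ = n tⁿ⁻¹ and ä = n(n−1) tⁿ⁻², so the deceleration parameter q = −ä·a/ȧ² reduces to (1 − n)/n; the sign flip between the two expansion laws quoted above is then a one-line check (an illustrative sketch, not part of the paper's fits):

```python
def deceleration_parameter(n):
    """For a power-law scale factor a(t) ∝ t^n, q = -a''*a / a'^2 = (1 - n)/n."""
    return (1 - n) / n

q_matter = deceleration_parameter(2 / 3)  # usual matter-dominated era, a ∝ t^(2/3)
q_dark = deceleration_parameter(2)        # Dark Gravity accelerated era, a ∝ t^2

assert q_matter > 0   # q > 0: decelerating expansion
assert q_dark < 0     # q < 0: accelerating expansion
```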

Keywords: anti-gravity, negative energies, time reversal, field discontinuities, dark energy theory

Procedia PDF Downloads 48
7832 The Philosophy of Language Theory in the Standard Malay Primary School Curriculum in Malaysia

Authors: Mohd Rashid Bin Hj. Md Idris, Lajiman Bin Janoory, Abdullah Bin Yusof, Mahzir Bin Ibrahim

Abstract:

The Malay language curriculum at primary school level in Malaysia is instrumental in ensuring the status of the language as the official and national language, the language of instruction, and the language that unites the various ethnic groups in Malaysia. Research addressing issues related to the curriculum standard is, therefore, essential to add quality to the existing National Education Philosophy in ongoing efforts to produce individuals who are balanced in intellectual, spiritual, emotional, and physical development. The objectives of this study are to examine the Philosophy of Language Theory, to review the content of the Malay language subject in relation to the Standard Curriculum for Primary Schools (KSSR), and to identify aspects of the Philosophy of Language Theory in that curriculum. The Malay language primary school curriculum is designed to enable students to be competent speakers and communicators of the language in order to gain knowledge, skills, information, values, and ideas and to enhance their skills in social relations. This study is therefore designed to help educators achieve all the stated goals. At the same time, students at primary school level are expected to be able to apply the principle of language perfection as stated in the Philosophy of Language Theory, enabling them to understand, appreciate, and take pride in being Malaysians who speak the language well.

Keywords: language, philosophy, theory, curriculum, standard, national education philosophy

Procedia PDF Downloads 590
7831 Coupling Random Demand and Route Selection in the Transportation Network Design Problem

Authors: Shabnam Najafi, Metin Turkay

Abstract:

The network design problem (NDP) is used to determine the set of optimal values for certain pre-specified decision variables, such as the capacity expansion of nodes and links, by optimizing various system performance measures including safety, congestion, and accessibility. The designed transportation network should improve the objective functions defined for the system while accounting for the route choice behavior of network users. NDP studies have mostly investigated random demand and route selection constraints separately because of computational challenges. In this work, we consider both random demand and route selection constraints simultaneously. We present a nonlinear stochastic model for the land use and road network design problem that addresses the development of different functional zones in urban areas by considering both a cost function and air pollution. The model minimizes cost and air pollution simultaneously, with random demand and a stochastic route selection constraint, and aims to optimize network performance via road capacity expansion. The Bureau of Public Roads (BPR) link impedance function is used to determine the travel time on each link. We consider a city with origin and destination nodes, each of which can be residential, employment, or both, and a set of existing paths between origin-destination (O-D) pairs. The case of an increasing employed population is analyzed to determine road capacities and origin zones simultaneously. Minimizing the travel and expansion cost of routes and origin zones on the one hand and minimizing CO emissions on the other are considered at the same time. In this work, demand between O-D pairs is random, and the network flow pattern is subject to stochastic user equilibrium, specifically a logit route choice model. Treating both demand and route choice as random makes the model more applicable to urban network design.
The epsilon-constraint method is one way to solve both linear and nonlinear multi-objective problems, and it is the method used in this work. The problem is solved by keeping the first objective (the cost function) as the objective function and turning the second objective into a constraint bounded above by an epsilon, where epsilon is an upper bound on the emission function. The value of epsilon is varied from the worst to the best value of the emission function to generate the family of solutions representing the Pareto set. A numerical example with 2 origin zones, 2 destination zones, and 7 links is solved with GAMS, and the set of Pareto points is obtained; there are 15 efficient solutions. Across these solutions, as the cost function value increases, the emission function value decreases, and vice versa.
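The two ingredients named above, the BPR link impedance function and the epsilon-constraint sweep, can be sketched on toy inputs (a grid-search illustration with hypothetical objectives, not the GAMS model solved in the paper):

```python
def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4):
    """BPR link impedance: t = t0 * (1 + alpha * (volume / capacity)^beta)."""
    return t0 * (1 + alpha * (volume / capacity) ** beta)

def epsilon_constraint(f1, f2, candidates, eps):
    """Minimise f1 over candidates subject to f2(x) <= eps (grid-search sketch)."""
    feasible = [x for x in candidates if f2(x) <= eps]
    return min(feasible, key=f1) if feasible else None

# Free-flow link: travel time equals t0; at volume == capacity the delay is alpha * t0.
assert bpr_travel_time(10.0, 0, 1000) == 10.0
assert abs(bpr_travel_time(10.0, 1000, 1000) - 11.5) < 1e-9

# Toy bi-objective problem (f1 = "cost", f2 = "emission"): sweeping eps from the
# worst to the best emission value traces the Pareto set f1 + f2 = 1.
xs = [i / 100 for i in range(101)]
f1 = lambda x: x
f2 = lambda x: 1 - x
assert [epsilon_constraint(f1, f2, xs, e) for e in (1.0, 0.5, 0.25)] == [0.0, 0.5, 0.75]
```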

Keywords: epsilon-constraint, multi-objective, network design, stochastic

Procedia PDF Downloads 637
7830 Classification of Contexts for Mentioning Love in Interviews with Victims of the Holocaust

Authors: Marina Yurievna Aleksandrova

Abstract:

Research on the Holocaust retains value not only for history but also for sociology and psychology. One of the most important questions is how people coped during and after this traumatic event. The aim of this paper is to identify the main contexts in which the topic of love appears and to determine which contexts are more characteristic of different groups of Holocaust victims (by gender, nationality, and age). Transcripts of interviews with Holocaust victims, collected in 1946 for the "Voices of the Holocaust" project, were used as data. The main contexts were analyzed with network analysis and latent semantic analysis and classified by gender, age, and nationality with a random forest. The results show that love is articulated and described significantly differently by male and female informants, whereas classification by nationality, as well as by age, yields lower values of the quality metrics.

Keywords: Holocaust, latent semantic analysis, network analysis, text-mining, random forest

Procedia PDF Downloads 176
7829 Predictive Modelling of Curcuminoid Bioaccessibility as a Function of Food Formulation and Associated Properties

Authors: Kevin De Castro Cogle, Mirian Kubo, Maria Anastasiadi, Fady Mohareb, Claire Rossi

Abstract:

Background: The bioaccessibility of bioactive compounds is a critical determinant of the nutritional quality of various food products. Despite its importance, comprehensive studies assessing how the composition of a food matrix influences the bioaccessibility of a compound of interest remain scarce. This knowledge gap has prompted a growing need to investigate the intricate relationship between food matrix formulations and the bioaccessibility of bioactive compounds. One class of bioactive compounds that has attracted considerable attention is the curcuminoids. These naturally occurring phytochemicals, extracted from the roots of Curcuma longa, have gained popularity owing to their purported health benefits, but they are also well known for their poor bioaccessibility. Project aim: The primary objective of this research is to systematically assess the influence of matrix composition on the bioaccessibility of curcuminoids. Additionally, the study aims to develop a series of predictive models for bioaccessibility, providing insights for optimising the formulation of functional foods and more descriptive nutritional information for consumers. Methods: Food formulations enriched with curcuminoids were subjected to simulated in vitro digestion, and their bioaccessibility was characterized with chromatographic and spectrophotometric techniques. The resulting data served as the foundation for predictive models capable of estimating bioaccessibility from specific physicochemical properties of the food matrices. Results: One striking finding was the strong correlation between the concentration of macronutrients in the food formulations and the bioaccessibility of curcuminoids. Macronutrient content emerged as a highly informative explanatory variable and was used, alongside other variables, as a predictor in a Bayesian hierarchical model that predicted curcuminoid bioaccessibility accurately (optimisation R² of 0.97) for the majority of cross-validated test formulations (LOOCV R² of 0.92). These preliminary results open the door to further exploration of a broader spectrum of food matrix types and additional properties that may influence bioaccessibility. Conclusions: This research sheds light on the intricate interplay between food matrix composition and curcuminoid bioaccessibility, laying a foundation for future investigations and offering a promising avenue for advancing our understanding of bioactive compound bioaccessibility and its implications for the food industry and informed consumer choices.

Keywords: bioactive bioaccessibility, food formulation, food matrix, machine learning, probabilistic modelling

Procedia PDF Downloads 65
7828 Crystalline Particles Dispersed Cu-Based Metallic Glassy Composites Fabricated by Spark Plasma Sintering

Authors: Sandrine Cardinal, Jean-Marc Pelletier, Guang Xie, Florian Mercier, Florent Delmas

Abstract:

Bulk metallic glasses exhibit several properties superior to those of their crystalline counterparts, such as high strength, a high elastic limit, and good corrosion resistance, so they can be considered good candidates for structural applications in many sectors. However, they are generally brittle and do not exhibit plastic deformation at room temperature. These materials are mainly obtained by rapid cooling from the liquid state to prevent crystallization, which limits their size. To overcome these two drawbacks, brittleness and limited dimensions, composites consisting of a metallic glass matrix reinforced by a second phase, whose role is to slow crack growth, are being developed; the size limitation is addressed by consolidating amorphous powders under load. In this study, Cu50Zr45Al5 bulk metallic glassy matrix composites (MGMCs) containing different volume fractions (Vf) of crystalline Zr particles were manufactured by spark plasma sintering (SPS). The microstructure, thermal stability, and mechanical properties of the MGMCs were investigated. The matrix of the composites remains fully amorphous after consolidation at 420°C under 600 MPa, and a good dispersion of the particles in the glassy matrix is obtained. Results show that the compressive strength decreases with Vf, from 1670 MPa (Vf = 0%) to 1300 MPa (Vf = 30%); the elastic modulus decreases only slightly (97.3 GPa and 94.5 GPa, respectively); and the plasticity improves from 0 to 4%. Fractographic investigation indicates good bonding between the amorphous matrix and the crystalline particles. In conclusion, the present study demonstrates that the SPS method is useful for the synthesis of bulk glassy composites: compared with other methods, large specimens with a controlled microstructure and interesting ductility can be obtained.

Keywords: composite, mechanical properties, metallic glasses, spark plasma sintering

Procedia PDF Downloads 275
7827 Neuron Imaging in Lateral Geniculate Nucleus

Authors: Sandy Bao, Yankang Bao

Abstract:

Understanding the information being processed in the brain, especially in the lateral geniculate nucleus (LGN), has proven challenging for modern neuroscience and for researchers focused on how neurons process signals and images. In this paper, we propose a method to process different colors within different layers of the LGN, that is, green information in layers 4 & 6 and red & blue information in layers 3 & 5, based on the surface dimension of the layers. We take into consideration the images in the LGN and the visual cortex: the edge information detected in the visual cortex is fed back to the layers of the LGN and combined with the LGN image to form a new image that is clearer and sharper, making it easier to identify objects. A Matrix Laboratory (MATLAB) simulation is performed, and the results show that the clarity of the output image improves significantly.
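A minimal sketch of the layer-wise idea, splitting color channels and fusing a crude edge map back into a layer image (a NumPy stand-in for the MATLAB simulation; the gradient-magnitude edge operator and the 0.5 weighting are illustrative assumptions, not the paper's pipeline):

```python
import numpy as np

# Toy 4x4 RGB image standing in for the LGN input.
rng = np.random.default_rng(0)
img = rng.random((4, 4, 3))

green = img[:, :, 1]          # green information (layers 4 & 6 in the paper's scheme)
red_blue = img[:, :, [0, 2]]  # red & blue information (layers 3 & 5)
assert red_blue.shape == (4, 4, 2)

# Crude edge map standing in for the edge information detected in the visual cortex.
gy, gx = np.gradient(green)
edges = np.hypot(gx, gy)

# Fuse the edge map back with the layer image to sharpen it.
enhanced = np.clip(green + 0.5 * edges, 0, 1)
assert enhanced.shape == green.shape
assert (enhanced >= green).all()  # adding a non-negative edge map never darkens a pixel
```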

Keywords: lateral geniculate nucleus, matrix laboratory, neuroscience, visual cortex

Procedia PDF Downloads 267
7826 Belief-Based Games: An Appropriate Tool for Uncertain Strategic Situation

Authors: Saied Farham-Nia, Alireza Ghaffari-Hadigheh

Abstract:

Game theory is a mathematical tool for studying the behavior of rational, strategic decision-makers; it analyzes the equilibria that exist in situations of conflicting interests and provides appropriate mechanisms for cooperation between two or more players. Game theory is applicable to any strategic, interest-conflict situation in politics, management, economics, sociology, and so on. Real-world decisions are usually made under indeterminacy, and players often lack information about the other players' payoffs, or even their own, which leads to games in uncertain environments. When historical data for estimating the distributions of decision parameters are unavailable, we may have no choice but to use an expert's belief degree, which represents the strength with which we believe an event will happen. To deal with belief degrees, we use uncertainty theory, introduced and developed by Liu on the basis of the normality, duality, subadditivity, and product axioms for modeling personal belief degrees. The personal belief degree heavily depends on the personal knowledge concerning the event, and changes in that knowledge cause changes in the belief degree. Uncertainty theory is not only theoretically self-consistent but also well suited among competing theories for modeling belief degrees in practical problems. In this work, we first reintroduce the expected utility function in an uncertain environment according to the axioms of uncertainty theory in order to derive payoffs. We then employ Nash equilibrium to investigate the solutions. For more practical issues, the Stackelberg leader-follower game and the Bertrand game are discussed as benchmark models. Compared to existing articles on similar topics, the game models and solution concepts introduced here can serve as a framework for problems in uncertain competitive situations based on experienced experts' belief degrees.
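As background for the Stackelberg benchmark, the classical complete-information leader-follower solution with linear inverse demand p = a − bQ and constant marginal cost c has a closed form: the follower's best response is q_F = (a − c − b·q_L)/(2b), and backward induction gives q_L = (a − c)/(2b). This sketch covers only that deterministic textbook case, not the belief-degree version developed in the paper:

```python
def follower_best_response(q_leader, a, b, c):
    """Follower's profit-maximising quantity given the leader's output (p = a - b*Q)."""
    return (a - c - b * q_leader) / (2 * b)

def stackelberg_equilibrium(a, b, c):
    """Closed-form Stackelberg quantities from backward induction."""
    q_leader = (a - c) / (2 * b)
    return q_leader, follower_best_response(q_leader, a, b, c)

# Hypothetical market: a = 10, b = 1, marginal cost c = 2.
qL, qF = stackelberg_equilibrium(a=10, b=1, c=2)
assert (qL, qF) == (4.0, 2.0)   # leader commits to twice the follower's output
```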

Keywords: game theory, uncertainty theory, belief degree, uncertain expected value, Nash equilibrium

Procedia PDF Downloads 411
7825 Theoretical Studies on the Structural Properties of 2,3-Bis(Furan-2-Yl)Pyrazino[2,3-F][1,10]Phenanthroline Derivatives

Authors: Zahra Sadeghian

Abstract:

This paper reports the optimized geometrical parameters of the stationary point of 2,3-Bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline. The calculations are performed using the density functional theory (DFT) method at the B3LYP/LanL2DZ level. We determine bond-length and bond-angle values for the compound and also calculate the bond hybridization according to natural bond orbital (NBO) theory. The energies of the frontier orbitals (HOMO and LUMO) are computed. In addition, the calculated data are carefully compared with the experimental results; this comparison shows that our theoretical data are in reasonable agreement with the experimental values.

Keywords: 2,3-Bis(furan-2-yl)pyrazino[2,3-f][1,10]phenanthroline, density functional theory, theoretical calculations, LanL2DZ level, B3LYP level

Procedia PDF Downloads 365
7824 Rounding Technique's Application in the Schnorr Signature Algorithm: Partially Known Most Significant Bits of the Nonce

Authors: Wenjie Qin, Kewei Lv

Abstract:

In 1996, Boneh and Venkatesan proposed the Hidden Number Problem (HNP) and proved that the most significant bits (MSBs) of the computational Diffie-Hellman key exchange scheme and related schemes are unpredictable bits. They also gave a lattice rounding technique to solve the HNP in the non-uniform model. In this paper, we put forward a new concept, the Schnorr-MSB-HNP. We reduce the problem of recovering the Schnorr signature private key, given a few consecutive most significant bits of the random nonce used at each signature generation, to the Schnorr-MSB-HNP, and then use the rounding technique to solve it. We conclude that if there is a 'miraculous box' which takes the random nonce as input and outputs its 2 log log q most significant bits (where q is a prime), the signature private key can be obtained by choosing 2 log q signature messages at random. Thus we obtain an attack on the Schnorr signature private key.
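For reference, Schnorr signing and verification, in which the nonce k targeted by the attack appears, can be sketched with toy parameters (a deliberately tiny group for illustration only; real deployments use cryptographic sizes, and this sketch omits the MSB leakage and lattice step themselves):

```python
import hashlib

# Toy Schnorr group: p = 2039, q = 1019 with q | p - 1; g = 4 generates the order-q subgroup.
p, q, g = 2039, 1019, 4

def h(r, msg):
    """Hash to an exponent mod q (models e = H(r || m))."""
    return int.from_bytes(hashlib.sha256(f"{r}|{msg}".encode()).digest(), "big") % q

def sign(x, k, msg):
    """Schnorr signature with private key x; k is the secret random nonce."""
    r = pow(g, k, p)
    e = h(r, msg)
    s = (k + x * e) % q          # leaking MSBs of k at many signatures exposes x
    return e, s

def verify(y, msg, sig):
    e, s = sig
    r = (pow(g, s, p) * pow(y, q - e, p)) % p   # r = g^s * y^(-e) mod p
    return h(r, msg) == e

x = 123                          # private key
y = pow(g, x, p)                 # public key
assert verify(y, "hello", sign(x, k=77, msg="hello"))
```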

Keywords: rounding technique, most significant bits, Schnorr signature algorithm, nonce, Schnorr-MSB-HNP

Procedia PDF Downloads 229
7823 Influence of Hygro-Thermo-Mechanical Loading on Buckling and Vibrational Behavior of FG-CNT Composite Beam with Temperature Dependent Characteristics

Authors: Puneet Kumar, Jonnalagadda Srinivas

Abstract:

The authors report vibration and buckling analyses of functionally graded carbon nanotube-polymer composite (FG-CNTPC) beams under hygro-thermo-mechanical environments using higher-order shear deformation theory. The material properties of CNTs and the polymer matrix are often affected by temperature and moisture content. A micromechanical model with an agglomeration effect is employed to compute the elastic, thermal, and moisture properties of the composite beam. The governing differential equation of the FG-CNTPC beam is developed using higher-order shear deformation theory to account for shear deformation effects. The elastic, thermal, and hygroscopic strain terms are derived from variational principles, and the thermal and hygroscopic loads are determined by considering uniform, linear, and sinusoidal variations of temperature and moisture content through the thickness. The differential equations of motion are formulated as an eigenvalue problem using appropriate displacement fields and solved by finite element modeling. The obtained natural frequencies and critical buckling loads show good agreement with published data. The numerical illustrations elaborate the dynamic as well as buckling behavior under uniaxial load for different environmental conditions, boundary conditions, volume fraction distribution profiles, and beam slenderness ratios. Further comparisons are presented across boundary conditions, temperatures, degrees of moisture content, volume fractions, CNT agglomeration, and slenderness ratios for different shear deformation theories.
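As a classical point of reference for the critical buckling loads discussed, the Euler load of a slender elastic column is P_cr = π²EI/(KL)², with K the end-condition factor (a textbook baseline with hypothetical numbers, not the higher-order shear deformation model with hygro-thermal loading used in the paper):

```python
import math

def euler_critical_load(E, I, L, K=1.0):
    """Classical Euler buckling load P_cr = pi^2 * E * I / (K * L)^2."""
    return math.pi ** 2 * E * I / (K * L) ** 2

# Hypothetical beam: E = 200 GPa, I = 1e-8 m^4, L = 1 m, pinned-pinned ends (K = 1).
P_cr = euler_critical_load(E=200e9, I=1e-8, L=1.0)
assert abs(P_cr - 19739.2088) < 1e-3   # about 19.7 kN
```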

Keywords: hygrothermal effect, free vibration, buckling load, agglomeration

Procedia PDF Downloads 260
7822 Early Talent Identification and Its Impact on Children's Growth and Development: An Examination through "The Social Learning Theory" by Albert Bandura

Authors: Michael Subbey, Kwame Takyi Danquah

Abstract:

Finding a child's exceptional skills and abilities at a young age and nurturing them is a challenging process. The Social Learning Theory (SLT) of Albert Bandura is used to analyze the effects of early talent identification on children's growth and development. The study examines both the advantages and disadvantages of early talent identification and stresses the significance of a moral strategy that puts the welfare of the child first. The paper emphasizes the value of a balanced approach to early talent identification that takes into account individual differences, cultural considerations, and the child's social environment.

Keywords: early talent development, social learning theory, child development, child welfare

Procedia PDF Downloads 95
7821 Efficient Credit Card Fraud Detection Based on Multiple ML Algorithms

Authors: Neha Ahirwar

Abstract:

In the contemporary digital era, the rise of credit card fraud poses a significant threat to both financial institutions and consumers. As fraudulent activities become more sophisticated, there is an escalating demand for robust and effective fraud detection mechanisms. Advanced machine learning algorithms have become crucial tools in addressing this challenge. This paper conducts a thorough examination of the design and evaluation of a credit card fraud detection system, utilizing four prominent machine learning algorithms: random forest, logistic regression, decision tree, and XGBoost. The surge in digital transactions has opened avenues for fraudsters to exploit vulnerabilities within payment systems. Consequently, there is an urgent need for proactive and adaptable fraud detection systems. This study addresses this imperative by exploring the efficacy of machine learning algorithms in identifying fraudulent credit card transactions. The selection of random forest, logistic regression, decision tree, and XGBoost for scrutiny in this study is based on their documented effectiveness in diverse domains, particularly in credit card fraud detection. These algorithms are renowned for their capability to model intricate patterns and provide accurate predictions. Each algorithm is implemented and evaluated for its performance in a controlled environment, utilizing a diverse dataset comprising both genuine and fraudulent credit card transactions.
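Because fraudulent transactions are rare, raw accuracy is a misleading score for the classifiers compared here; precision and recall on the fraud class are more informative (a small pure-Python illustration with made-up labels, not the paper's evaluation code):

```python
def precision_recall(y_true, y_pred):
    """Precision and recall for a binary fraud label (1 = fraud)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Tiny hypothetical evaluation: 6 transactions, 2 of them fraudulent.
y_true = [0, 0, 1, 0, 1, 0]
y_pred = [0, 1, 1, 0, 0, 0]   # one true positive, one false positive, one miss
assert precision_recall(y_true, y_pred) == (0.5, 0.5)
```

The same two numbers would be computed for each of the four models (random forest, logistic regression, decision tree, XGBoost) to compare them fairly on imbalanced data.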

Keywords: efficient credit card fraud detection, random forest, logistic regression, XGBoost, decision tree

Procedia PDF Downloads 57
7820 Role of Interlayer Coupling for the Power Factor of CuSbS2 and CuSbSe2

Authors: Najebah Alsaleh, Nirpendra Singh, Udo Schwingenschlogl

Abstract:

The electronic and transport properties of bulk and monolayer CuSbS2 and CuSbSe2 are determined using density functional theory and semiclassical Boltzmann transport theory in order to investigate the role of interlayer coupling in the thermoelectric properties. The calculated band gaps of the bulk compounds are in agreement with experiments and significantly higher than those of the monolayers, which thus show lower Seebeck coefficients. Since the electrical conductivity is also lower, the monolayers are characterized by lower power factors. Interlayer coupling, even though it is weak, is therefore found to be essential for the excellent thermoelectric response of CuSbS2 and CuSbSe2.
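The quantity being compared is the thermoelectric power factor PF = S²σ, so a lower Seebeck coefficient S and a lower electrical conductivity σ both depress it (a one-line sketch with hypothetical magnitudes, not the computed values for CuSbS₂/CuSbSe₂):

```python
def power_factor(seebeck, conductivity):
    """Thermoelectric power factor PF = S^2 * sigma (W / m / K^2 in SI units)."""
    return seebeck ** 2 * conductivity

# Hypothetical illustration: S in V/K, sigma in S/m.
bulk = power_factor(seebeck=200e-6, conductivity=1e5)
mono = power_factor(seebeck=150e-6, conductivity=5e4)
assert bulk > mono   # lower S and lower sigma both reduce the power factor
```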

Keywords: density functional theory, thermoelectric, electronic properties, monolayer

Procedia PDF Downloads 316
7819 Grounded Theory of Consumer Loyalty: A Perspective through Video Game Addiction

Authors: Bassam Shaikh, R. S. A. Jumain

Abstract:

Game addiction has become an extremely important topic among psychology researchers, particularly for understanding and explaining why individuals become addicted to video games. Previous studies have discussed the effects of online game addiction on social responsibilities, health problems, government action, and individuals' purchasing behavior, as well as the causes of video game addiction. Extending these concepts to marketing, it could be argued that the phenomenon can enlighten and extend our understanding of consumer loyalty. This study took a grounded theory approach and found motivation, satisfaction, fulfillment, exploration, and achievement to be among the important elements that build consumer loyalty.

Keywords: grounded theory, consumer loyalty, video games, video game addiction

Procedia PDF Downloads 527
7818 Performance Optimization on Waiting Time Using Queuing Theory in an Advanced Manufacturing Environment: Robotics to Enhance Productivity

Authors: Ganiyat Soliu, Glen Bright, Chiemela Onunka

Abstract:

Performance optimization plays a key role in controlling waiting time during manufacturing in an advanced manufacturing environment and thereby improving productivity. Queuing theory was used to examine the performance of a multi-stage production line. Robotics, as a disruptive technology, was introduced into a virtual manufacturing scenario during the packaging process to study the effect of waiting time on productivity. The queuing model was used to determine the optimum service rate required of the robots during the packaging stage to yield an optimum production cost. Different production rates were assumed in the virtual manufacturing environment, and the cost of packaging was estimated together with the optimum production cost. An equation was generated using queuing theory, and the method adopted for the analysis of the scenario is the Newton-Raphson method. The queuing analysis presented here determines the number of robots required to regulate waiting time in order to increase output. The arrival rate of the product was high, and the queuing model proved effective in minimizing service cost and waiting time during manufacturing. At the reduced waiting time, the number of products obtained per hour improved. Overall productivity improved under the assumptions used in the queuing model implemented in the virtual manufacturing scenario.
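The trade-off analyzed above can be illustrated with the simplest queuing result: for an M/M/1 station with arrival rate λ and service rate μ (λ < μ), the mean time a job waits in queue is W_q = λ/(μ(μ − λ)) (a single-server sketch; the paper's multi-stage, multi-robot model is more elaborate):

```python
def mm1_waiting_time(arrival_rate, service_rate):
    """Mean queue waiting time for an M/M/1 station: Wq = lambda / (mu * (mu - lambda))."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: need lambda < mu")
    return arrival_rate / (service_rate * (service_rate - arrival_rate))

# A faster robot (higher service rate) cuts the expected waiting time sharply.
slow = mm1_waiting_time(arrival_rate=8, service_rate=10)   # Wq = 8 / (10 * 2) = 0.4
fast = mm1_waiting_time(arrival_rate=8, service_rate=16)   # Wq = 8 / (16 * 8) = 0.0625
assert slow == 0.4 and fast == 0.0625
```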

Keywords: performance optimization, productivity, queuing theory, robotics

Procedia PDF Downloads 144
7817 Examining the Attitude and Behavior Towards Household Waste in China With the Theory of Planned Behavior and PEST Analysis

Authors: Yuxuan Liu, Jianli Hao, Ruoyu Zhang, Lin Lin, Nelsen Andreco Muljadi, Yu Song, Guobin Gong

Abstract:

With the increase in municipal waste in China, household waste management (HWM) has become a key issue for sustainable development. In this study, an online survey questionnaire was conducted to assess the current attitudes and behaviors of households in China towards waste separation and recycling practices. Related influential factors were also determined within the context of the theory of planned behavior and a PEST analysis. The survey received a total of 551 valid responses. Results showed that the sample has overall positive attitudes and behavior toward participating in HWM, but only 16.3% of respondents regularly segregate their waste. Society and policy were also found to be the two most impactful factors.

Keywords: household waste management, theory of planned behavior, attitude, behavior

Procedia PDF Downloads 193
7816 Finite Element Modelling of a 3D Woven Composite for Automotive Applications

Authors: Ahmad R. Zamani, Luigi Sanguigno, Angelo R. Maligno

Abstract:

A 3D woven composite, designed for automotive applications, is studied using Abaqus Finite Element (FE) software suite. Python scripts were developed to build FE models of the woven composite in Complete Abaqus Environment (CAE). They can read TexGen or WiseTex files and automatically generate consistent meshes of the fabric and the matrix. A user menu is provided to help define parameters for the FE models, such as type and size of the elements in fabric and matrix as well as the type of matrix-fabric interaction. Node-to-node constraints were imposed to guarantee periodicity of the deformed shapes at the boundaries of the representative volume element of the composite. Tensile loads in three axes and biaxial loads in x-y directions have been applied at different Fibre Volume Fractions (FVFs). A simple damage model was implemented via an Abaqus user material (UMAT) subroutine. Existing tools for homogenization were also used, including voxel mesh generation from TexGen as well as Abaqus Micromechanics plugin. Linear relations between homogenised elastic properties and the FVFs are given. The FE models of composite exhibited balanced behaviour with respect to warp and weft directions in terms of both stiffness and strength.
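The node-to-node periodicity constraints mentioned above require matching nodes on opposite faces of the representative volume element; the pairing step can be sketched as follows (a hypothetical NumPy illustration with made-up coordinates, not the paper's Abaqus/TexGen scripts):

```python
import numpy as np

# Hypothetical 2D node set on a unit cell: two nodes on each of the x = 0 and x = 1 faces.
nodes = np.array([
    [0.0, 0.2], [0.0, 0.7],   # left face  (x = 0)
    [1.0, 0.2], [1.0, 0.7],   # right face (x = 1)
])
left = np.where(np.isclose(nodes[:, 0], 0.0))[0]
right = np.where(np.isclose(nodes[:, 0], 1.0))[0]

# Match each left-face node to the right-face node with the closest y-coordinate;
# each pair then receives a node-to-node periodicity constraint in the FE model.
pairs = [(l, right[np.argmin(np.abs(nodes[right, 1] - nodes[l, 1]))]) for l in left]
assert pairs == [(0, 2), (1, 3)]
```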

Keywords: 3D woven composite (3DWC), meso-scale finite element model, homogenisation of elastic material properties, Abaqus Python scripting

Procedia PDF Downloads 136
7815 Image Rotation Using an Augmented 2-Step Shear Transform

Authors: Hee-Choul Kwon, Heeyong Kwon

Abstract:

Image rotation is one of the main pre-processing steps in image processing and image pattern recognition. It is usually implemented with a rotation matrix multiplication, which requires many floating-point arithmetic operations and trigonometric calculations and therefore takes a long time to execute. There has thus been a need for a high-speed image rotation algorithm that avoids these two time-consuming operations. However, images rotated by such algorithms suffer from a drawback, namely distortion. We solve this problem using an augmented two-step shear transform. We compare the presented algorithm with conventional rotation on images of various sizes. Experimental results show that the presented algorithm is superior to conventional rotation.
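For background, the classical result that a rotation factors exactly into shear transforms, which is what lets shear-based algorithms avoid the full rotation-matrix multiplication, can be verified numerically (this is the well-known three-shear decomposition, not the paper's augmented two-step variant):

```python
import numpy as np

theta = np.deg2rad(30)
t, s = np.tan(theta / 2), np.sin(theta)

shear_x = np.array([[1, -t], [0, 1]])   # horizontal shear by -tan(theta/2)
shear_y = np.array([[1, 0], [s, 1]])    # vertical shear by sin(theta)

# Composing shear-x, shear-y, shear-x reproduces the rotation matrix exactly.
rotation = shear_x @ shear_y @ shear_x
expected = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
assert np.allclose(rotation, expected)
```

Each shear moves pixels only along one axis, which is why shear-based rotation can be implemented with simple row/column shifts at high speed.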

Keywords: high-speed rotation operation, image rotation, transform matrix, image processing, pattern recognition

Procedia PDF Downloads 272
7814 Sampled-Data Control for Fuel Cell Systems

Authors: H. Y. Jung, Ju H. Park, S. M. Lee

Abstract:

A sampled-data controller is presented for solid oxide fuel cell systems, which are expressed by a sector-bounded nonlinear model. Sector-bounded nonlinear systems consist of a feedback connection between a linear dynamical system and a nonlinearity satisfying certain sector-type constraints. The sampled-data control scheme is also practically useful, since it accommodates digital controllers, and with the development of modern high-speed computers, increasing research effort has been devoted to sampled-data control systems. The proposed control law is obtained by solving a convex problem subject to several linear matrix inequalities. Simulation results are given to show the effectiveness of the proposed design method.

Keywords: sampled-data control, fuel cell, linear matrix inequalities, nonlinear control

Procedia PDF Downloads 562
7813 Construction of Graph Signal Modulations via Graph Fourier Transform and Its Applications

Authors: Xianwei Zheng, Yuan Yan Tang

Abstract:

The classical windowed Fourier transform has been widely used in signal processing, image processing, machine learning and pattern recognition, and the related Gabor transform is powerful enough to capture the texture information of a given dataset. Recently, in the emerging field of graph signal processing, researchers have been developing a theory to handle so-called graph signals. Within this theory, the windowed graph Fourier transform has been constructed to establish a time-frequency analysis framework for graph signals. It is defined through translation and modulation operators for graph signals, following calculations similar to those of the classical windowed Fourier transform, with the Laplacian eigenvectors playing the role of the Fourier atoms: graph signal translation is defined analogously to classical translation, with the Fourier atoms replaced by Laplacian eigenvectors, and the existing graph modulation is likewise built from the Laplacian eigenvectors by multiplying a graph signal entry-wise by an eigenvector, mimicking classical modulation by a Fourier atom. However, a single Laplacian eigenvector entry cannot play the same role as a Fourier atom, and this definition ignores the relationship between the translation and modulation operators. In this paper, a new definition of the modulation operator is proposed, and thus another time-frequency framework for graph signals is constructed.
The relationship between translation and modulation can be established through the Fourier transform: for any signal, the Fourier transform of its translation is the modulation of its Fourier transform, so the modulation of a signal can be defined as the inverse Fourier transform of the translation of its Fourier transform. Analogously, the graph modulation of a graph signal is defined here as the inverse graph Fourier transform of the translation of its graph Fourier transform. This novel definition establishes a relationship between the translation and modulation operations. The new modulation operation, together with the original translation operation, is applied to construct a new framework for graph signal time-frequency analysis. Furthermore, a windowed graph Fourier frame theory is developed: necessary and sufficient conditions for constructing windowed graph Fourier frames, tight frames and dual frames are presented. The novel time-frequency analysis framework is applied to signals defined on well-known graphs, e.g. the Minnesota road graph and random graphs. Experimental results show that the novel framework captures new features of graph signals.
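The building block underneath all of these operators is the graph Fourier transform itself: expand a signal in the Laplacian eigenbasis. A minimal sketch on a toy path graph (standard GFT definitions only; the paper's new modulation operator is not reproduced here):

```python
import numpy as np

# Toy 4-node path graph; combinatorial Laplacian L = D - W.
W = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(W.sum(axis=1)) - W

lam, U = np.linalg.eigh(L)       # Laplacian eigenvalues / eigenvectors

f = np.array([1.0, 0.0, -1.0, 2.0])  # a graph signal, one value per vertex
f_hat = U.T @ f                       # graph Fourier transform
f_rec = U @ f_hat                     # inverse GFT

print(np.allclose(f_rec, f))  # True: U is orthonormal
```

The eigenvalues `lam` play the role of frequencies; translation and modulation operators are then assembled from `U` and `f_hat` as the abstract describes.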

Keywords: graph signals, windowed graph Fourier transform, windowed graph Fourier frames, vertex frequency analysis

Procedia PDF Downloads 336
7812 Testing the Life Cycle Theory on the Capital Structure Dynamics of Trade-Off and Pecking Order Theories: A Case of Retail, Industrial and Mining Sectors

Authors: Freddy Munzhelele

Abstract:

Setting: empirical research has shown that the life cycle theory affects firms’ financing decisions, particularly dividend pay-outs. The life cycle theory posits that as a firm matures, it reaches a level and capacity at which it distributes more cash as dividends, whereas young firms prioritise investment opportunity sets and their financing and thus pay little or no dividends. Research on firms’ financing decisions has also demonstrated, among other things, the adoption of the trade-off and pecking order theories in explaining the dynamics of firms’ capital structure. The trade-off theory holds that firms weigh the costs and benefits of debt in choosing their debt structure, while the pecking order theory holds that firms follow a hierarchical order in choosing financing sources. Using the life cycle hypothesis to explain financial managers’ decisions regarding capital structure dynamics appears to be an interesting link, yet this link has been neglected in corporate finance research. If it is explored empirically, the financial decision-making alternatives will be enhanced immensely, since no conclusive evidence has yet been found on the dynamics of capital structure. Aim: the aim of this study is to examine the impact of the life cycle theory on the capital structure dynamics, under the trade-off and pecking order theories, of firms listed in the retail, industrial and mining sectors of the JSE. These sectors are among the key contributors to GDP in the South African economy. Design and methodology: following the post-positivist research paradigm, the study is quantitative in nature and utilises secondary data obtained from the financial statements of sampled firms for the period 2010 – 2022. The firms’ financial statements will be extracted from the IRESS database. Since the data will be in panel form, a combination of static and dynamic panel data estimators will be used to analyse them. The overall data analyses will be done using the STATA program. Value add: this study directly investigates the link between the life cycle theory and the dynamics of capital structure decisions, particularly under the trade-off and pecking order theories.

Keywords: life cycle theory, trade-off theory, pecking order theory, capital structure, JSE listed firms

Procedia PDF Downloads 58
7811 An Encapsulation of a Navigable Tree Position: Theory, Specification, and Verification

Authors: Nicodemus M. J. Mbwambo, Yu-Shan Sun, Murali Sitaraman, Joan Krone

Abstract:

This paper presents a generic data abstraction that captures a navigable tree position. The mathematical modeling of the abstraction encapsulates the current tree position, which can be used to navigate and modify the tree. The encapsulation of the tree position in the data abstraction specification avoids the use of explicit references and aliasing, thereby simplifying verification of (imperative) client code that uses the data abstraction. To ease the tasks of such specification and verification, a general tree theory, rich with mathematical notations and results, has been developed. The paper contains an example to illustrate automated verification ramifications. With sufficient tree theory development, automated proving seems plausible even in the absence of a special-purpose tree solver.
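The specification language used in the paper is not shown in the abstract, but the idea of an encapsulated, alias-free tree position can be sketched in any language as a "zipper": the client navigates and edits only through the position object, never through raw node references. A minimal Python sketch under that interpretation (the class and method names here are hypothetical, not the paper's interface):

```python
class TreePos:
    """Binary tree with a single encapsulated 'current position'.

    The tree is an immutable tuple ('label', left, right) or None;
    breadcrumbs record the path back to the root, so no node is ever
    referenced from two places at once.
    """

    def __init__(self, tree):
        self._tree = tree      # subtree at the current position
        self._crumbs = []      # ('L'/'R', parent label, other subtree)

    def down_left(self):
        label, left, right = self._tree
        self._crumbs.append(('L', label, right))
        self._tree = left

    def up(self):
        side, label, other = self._crumbs.pop()
        if side == 'L':
            self._tree = (label, self._tree, other)
        else:
            self._tree = (label, other, self._tree)

    def relabel(self, new_label):
        _, left, right = self._tree
        self._tree = (new_label, left, right)

    def label(self):
        return self._tree[0]

pos = TreePos(('a', ('b', None, None), ('c', None, None)))
pos.down_left()        # move to the left child
pos.relabel('B')       # modify the tree at the current position
pos.up()               # rebuild the parent on the way back up
print(pos.label())     # 'a'; the left child is now ('B', None, None)
```

Because every navigation step rebuilds the spine from immutable pieces, client code never holds two aliases to the same mutable node, which is the property that simplifies verification in the paper's setting.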

Keywords: automation, data abstraction, maps, specification, tree, verification

Procedia PDF Downloads 158
7810 On a Single Server Queue with Arrivals in Batches of Variable Size, Generalized Coxian-2 Service and Compulsory Server Vacations

Authors: Kailash C. Madan

Abstract:

We study the steady state behaviour of a batch arrival single server queue in which a first service with general service times is compulsory and a second service with general service times is optional; we term such a two-phase service generalized Coxian-2 service. Immediately after completing a service, the server must take a vacation of random length with general vacation times. We obtain the steady state probability generating function of the queue size, as well as the steady state mean queue size at a random epoch, in explicit and closed form. Some particular cases of interest, including some known results, are derived.
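The paper derives these quantities analytically; as a sanity-check companion, the same system can be simulated. A Monte Carlo sketch with assumed concrete distributions (exponential phases and vacations, uniform batch sizes), where the paper keeps everything general:

```python
import random

random.seed(1)

# Assumed illustrative parameters; the paper's distributions are general.
LAM = 0.3            # Poisson batch-arrival rate
BATCH = [1, 2, 3]    # batch size drawn uniformly (variable batch size)
MU1, MU2 = 2.0, 3.0  # rates of the compulsory and optional service phases
P2 = 0.4             # probability the optional second phase is taken
NU = 4.0             # rate of the compulsory server vacation
T_END = 50_000.0

t, q, area = 0.0, 0, 0.0           # clock, number in system, time-integral
next_arr = random.expovariate(LAM)

def pass_time(dt):
    """Advance the clock by dt, folding in any batch arrivals on the way."""
    global t, q, area, next_arr
    end = t + dt
    while next_arr < end:
        area += q * (next_arr - t)
        t = next_arr
        q += random.choice(BATCH)
        next_arr = t + random.expovariate(LAM)
    area += q * (end - t)
    t = end

while t < T_END:
    if q == 0:                     # idle: jump to the next batch arrival
        t = next_arr
        q += random.choice(BATCH)
        next_arr = t + random.expovariate(LAM)
        continue
    s = random.expovariate(MU1)            # compulsory first phase
    if random.random() < P2:
        s += random.expovariate(MU2)       # optional second phase
    pass_time(s)
    q -= 1                                 # service completion
    pass_time(random.expovariate(NU))      # compulsory vacation

print(round(area / t, 3))  # time-average number in the system
```

With these rates the offered load is roughly 0.53, so the estimate is stable; such a simulation can be compared against the closed-form mean queue size the paper derives.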

Keywords: batch arrivals, compound Poisson process, generalized Coxian-2 service, steady state

Procedia PDF Downloads 454
7809 Cognitive Theory and the Design of Integrated Curriculum

Authors: Bijan Gillani, Roya Gillani

Abstract:

The purpose of this paper is to propose a pedagogical model in which engineering provides the interconnection to integrate the other topics of science, technology, engineering, and mathematics. The authors first present a brief discussion of cognitive theory and then derive an integrated pedagogy that uses engineering and technology, such as drones, sensors, cameras, iPhones, and radio waves, as the nexus of integrated curriculum development for the other STEM topics. Based on this pedagogy, one example developed by the authors, called “Drones and Environmental Science,” is presented; it uses a drone and related technology as an instructional delivery medium, applying Piaget’s cognitive theory to create environments that promote the integration of the STEM subjects related to environmental science.

Keywords: cognitive theories, drone, environmental science, pedagogy

Procedia PDF Downloads 569
7808 Mean Square Responses of a Cantilever Beam with Various Damping Mechanisms

Authors: Yaping Zhao, Yimin Zhang

Abstract:

In the present paper, the stationary random vibration of a uniform cantilever beam is investigated. Two types of damping mechanism, i.e. external and internal viscous damping, are taken into account simultaneously. The excitation is support motion in the form of ideal white noise. Because the two damping mechanisms act concurrently, the product of the modal damping ratio and the natural frequency is no longer constant. As a result, the infinite definite integral encountered in computing the mean square response is more complex than those in the existing literature. One notable contribution of this work is the accurate evaluation of these definite integrals. The precise mean square response is thus finally obtained in infinite-series form. Numerical examples are supplied, and the numerical outcomes confirm the validity of the theoretical analyses.
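For the simplest member of this family of integrals, with a single damping ratio per mode, the mean square response under two-sided white noise of level S₀ reduces to the textbook result ∫|H(ω)|²S₀ dω = πS₀/(2ζωₙ³). A sketch verifying that numerically for one hypothetical mode (the paper's combined-damping integrals generalize this case):

```python
import numpy as np
from scipy.integrate import quad

# Hypothetical single-mode parameters; the paper treats the full modal
# series of the beam with external and internal damping combined.
wn, zeta, S0 = 10.0, 0.05, 1.0   # natural freq, damping ratio, PSD level

def H2(w):
    """Squared magnitude of the frequency response |H(w)|^2."""
    return 1.0 / ((wn**2 - w**2)**2 + (2.0 * zeta * wn * w)**2)

# The integrand is even, so integrate over [0, 200] (the tail beyond is
# ~1e-6 of the total) and double; 'points' flags the resonance at wn.
numeric = 2.0 * quad(lambda w: S0 * H2(w), 0.0, 200.0,
                     points=[wn], limit=200)[0]
closed = np.pi * S0 / (2.0 * zeta * wn**3)   # standard closed-form value

print(np.isclose(numeric, closed, rtol=1e-3))
```

When the modal damping ratio varies with frequency, as in the paper, the closed form above no longer applies, which is why the accurate evaluation of the resulting integrals is the contribution highlighted in the abstract.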

Keywords: random vibration, cantilever beam, mean square response, white noise

Procedia PDF Downloads 379
7807 A Case Study for User Rating Prediction on Automobile Recommendation System Using Mapreduce

Authors: Jiao Sun, Li Pan, Shijun Liu

Abstract:

Recommender systems are widely used in contemporary industry, and plenty of work has been done in this field to help users identify items of interest. The collaborative filtering (CF) algorithm is an important technology in recommender systems. However, little work has been done on automobile recommendation despite the sharp increase in the number of automobiles, and computational speed is a major weakness of collaborative filtering. Using the MapReduce framework to optimize the CF algorithm is therefore a viable solution to this performance problem. In this paper, we predict users’ comments on industrial automobiles with various properties, based on real-world industrial datasets of user-automobile comment data, and provide recommendations that help automobile providers predict users’ comments on automobiles with newly introduced properties. First, we address the sparseness of the matrix through prior construction of the score matrix. Second, we address data normalization by removing dimensional effects from the raw automobile data, since the differing dimensions of automobile properties introduce large errors into the CF calculation. Finally, we use the MapReduce framework to optimize the CF algorithm, substantially improving computational speed. The UV decomposition used in this paper is a common matrix factorization technique in CF that does not require calculating the interpolation weights of neighbors, which is more convenient in industry.
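UV decomposition approximates the sparse rating matrix R as a product U·Vᵀ of two low-rank factors fitted only to the observed entries; the missing cells of U·Vᵀ then serve as predictions. A single-machine sketch with gradient descent and a hypothetical toy matrix (the paper distributes this computation with MapReduce):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy ratings; 0 marks a missing user-automobile rating.
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
mask = R > 0

k, lr, reg = 2, 0.005, 0.01       # latent rank, step size, regularisation
U = 0.1 * rng.standard_normal((R.shape[0], k))
V = 0.1 * rng.standard_normal((R.shape[1], k))

for _ in range(20_000):           # gradient descent on observed entries
    E = mask * (R - U @ V.T)      # residual, masked to known ratings
    U += lr * (E @ V - reg * U)
    V += lr * (E.T @ U - reg * V)

rmse = np.sqrt((E ** 2).sum() / mask.sum())
pred = U @ V.T                    # dense prediction fills the missing cells
print(rmse)
```

In the MapReduce setting, each update step is expressed as map tasks over rating records and reduce tasks that aggregate the per-entry gradients, which is what removes the neighbor-interpolation step the abstract mentions.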

Keywords: collaborative filtering, recommendation, data normalization, mapreduce

Procedia PDF Downloads 213
7806 A Theory of Aftercare for Human Trafficking Survivors: A Grounded Theory Analysis of Survivors and Aftercare Providers in South Africa

Authors: Robyn L. Curran, Joanne R. Naidoo, Gugu Mchunu

Abstract:

Along with increasing awareness of human trafficking comes the acknowledgement that it is no longer just a social problem but also a significant public health problem, one that requires both increased knowledge and the specialist equipping of aftercare providers, such as nurses, who care for human trafficking survivors. Current discourse on the aftercare of human trafficking survivors does not clearly explain the function or content of aftercare or what aftercare entails. Although psychological and medical aftercare are emphasized as important components, little practical attention is devoted to what these components actually involve or to the effectiveness of current aftercare practice. A review of the literature on the processes leading from aftercare to empowerment revealed the need to emphasize the voices of survivors concerning their liberation from oppression. The aim of the study was to develop a theory of aftercare for human trafficking survivors by analyzing the experiences of survivors and aftercare providers in shelters in three provinces of South Africa. Using a Straussian grounded theory approach, the researcher developed a theory to inform the care of human trafficking survivors in low-resource settings, drawing on the voices of the survivors and of those experienced in their direct care. Four human trafficking survivors and three aftercare providers from three shelters in three provinces of South Africa were interviewed individually in order for the theory to emerge. The findings of the study elicited a theoretical model of the renewed self and of the conditions that facilitate this process in the care of human trafficking survivors. The process that human trafficking survivors navigate towards empowerment requires the mutual collaboration of the aftercare provider and the survivor, as the survivor awakens vision, confronts reality, re-salvages autonomy and liberates self.
Psychological resilience of the survivor facilitates the transition to the renewed self. The recommendations of this study may improve the nursing care provided to human trafficking survivors and equip professionals with the knowledge and skills to promote the process of renewing self for survivors.

Keywords: aftercare, aftercare providers, grounded theory, human trafficking survivors

Procedia PDF Downloads 271
7805 Nonparametric Copula Approximations

Authors: Serge Provost, Yishan Zang

Abstract:

Copulas are currently utilized in finance, reliability theory, machine learning, signal processing, geodesy, hydrology and biostatistics, among several other fields of scientific investigation. It follows from Sklar's theorem that the joint distribution function of a multidimensional random vector can be expressed in terms of its associated copula and marginals. Since marginal distributions can easily be determined by making use of a variety of techniques, we address the problem of securing the distribution of the copula. This will be done by using several approaches. For example, we will obtain bivariate least-squares approximations of the empirical copulas, modify the kernel density estimation technique and propose a criterion for selecting appropriate bandwidths, differentiate linearized empirical copulas, secure Bernstein polynomial approximations of suitable degrees, and apply a corollary to Sklar's result. Illustrative examples involving actual observations will be presented. The proposed methodologies will as well be applied to a sample generated from a known copula distribution in order to validate their effectiveness.
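The starting point for all of the approximations listed is the empirical copula, built from the ranks of the sample (the pseudo-observations). A minimal sketch on data simulated from a positively dependent bivariate Gaussian (a stand-in dataset; the paper works with actual observations and known copulas):

```python
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(42)

# Stand-in data: bivariate Gaussian with correlation 0.7.
z = rng.multivariate_normal([0, 0], [[1, 0.7], [0.7, 1]], size=2000)
n = len(z)

# Pseudo-observations: marginal ranks scaled into (0, 1].
u = rankdata(z[:, 0]) / n
v = rankdata(z[:, 1]) / n

def emp_copula(s, t):
    """Empirical copula C_n(s,t) = (1/n) * #{i : u_i <= s, v_i <= t}."""
    return np.mean((u <= s) & (v <= t))

# Under independence C(0.5, 0.5) = 0.25; positive dependence
# pushes the empirical value above that.
print(emp_copula(0.5, 0.5) > 0.25)  # True
```

Smoother estimates, e.g. the Bernstein polynomial and kernel approximations discussed above, are then fitted to this empirical surface.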

Keywords: copulas, Bernstein polynomial approximation, least-squares polynomial approximation, kernel density estimation, density approximation

Procedia PDF Downloads 65