Search results for: skew inverse Laurent series rings
2525 Comparative Analysis of Islamic and Conventional Banking Systems in Terms of Profitability: A Study on Emerging Market Economies
Authors: Alimshan Faizulayev, Eralp Bektas, Abdul Ghafar Ismail, Bezhan Rustamov
Abstract:
This paper performs an empirical analysis of the determinants of profitability in Islamic and Conventional Banks. The main focus of this study is to evaluate and measure the financial performance of Islamic banking firms operating in Egypt, Iran, Malaysia, Pakistan, Turkey, and the UAE in contrast to Conventional ones in those countries. To evaluate the performance of the banks empirically, various financial ratios are employed. We measure performance in terms of liquidity, profitability, solvency, and efficiency. In this work, the t-test, F-test, and OLS analysis are used to test hypotheses. Our findings reveal that there are similarities and differences in the profitability determinants of Islamic and Conventional banking firms. The cost-to-revenue ratio has an inverse relationship with profitability indicators in both banking systems. However, there are differences in financial performance between Conventional and Islamic banks, which are found in the overall picture of all banks in terms of net income margin.
Keywords: Islamic banking, conventional banking, GDP growth, emerging market economies
Procedia PDF Downloads 398
2524 Synthesis of New 2-(Methylthio) Benzo[g]-[1,2,4] Triazolo [1,5a] Quinazolines
Authors: Rashad A. Al-Salahi, Mohamed S. Marzouk
Abstract:
Aiming at the synthesis of bioactive triazoloquinazolines, a new series of 2-(methylthio)benzo[g]-[1,2,4]triazolo[1,5-a]quinazolin-5(4H)-ones was synthesized from 2-(methylthio)benzo[g]-[1,2,4]triazolo[1,5-a]quinazolin-5(4H)-one. All synthesized derivatives are based on N-alkylation and chlorination of the parent compound and its sulfonyl derivative. The success of the reactions was confirmed by NMR, IR, and HREI-MS analyses for all products.
Keywords: triazoloquinazoline, alkylation, thionation, quinazolin
Procedia PDF Downloads 359
2523 Fundamental Solutions for Discrete Dynamical Systems Involving the Fractional Laplacian
Authors: Jorge Gonzalez Camus, Valentin Keyantuo, Mahamadi Warma
Abstract:
In this work, we obtain representation results for solutions of a time-fractional differential equation involving the discrete fractional Laplace operator, in terms of generalized Wright functions. Such equations arise in the modeling of many physical systems, for example, chain processes in chemistry and radioactivity. The focus is on the linear problem of the simplified Moore-Gibson-Thompson equation, where the discrete fractional Laplacian and the Caputo fractional derivative of order in (0,2] are involved. As a particular case, we obtain the explicit solution for the discrete heat equation and the discrete wave equation. Furthermore, we show the explicit solution for the equation involving the Laplacian perturbed by the identity operator. The main tools for obtaining the explicit solution are the Laplace and discrete Fourier transforms, and Stirling's formula. The methodology is mainly to apply both transforms to the equation, to find the inverse of each transform, and to prove that this solution is well defined using Stirling's formula.
Keywords: discrete fractional Laplacian, explicit representation of solutions, fractional heat and wave equations, fundamental
Procedia PDF Downloads 209
2522 A Bayesian Population Model to Estimate Reference Points of Bombay-Duck (Harpadon nehereus) in Bay of Bengal, Bangladesh Using CMSY and BSM
Authors: Ahmad Rabby
Abstract:
The demographic trends of Bombay-duck are analysed from time series catch data using CMSY and BSM for the first time in Bangladesh. During 2000-2018, CMSY indicates the lowest average production in 2000 and the highest in 2018. This has been used in the estimation of prior biomass by the default rules. The CMSY analysis found 31030 possible viable trajectories for 3422 r-k pairs, and the final estimate of the intrinsic rate of population increase (r) was 1.19 year⁻¹ with 95% CL = 0.957-1.48 year⁻¹. The carrying capacity (k) of Bombay-duck was 283×10³ tons with 95% CL = 173×10³-464×10³ tons, and MSY was 84.3×10³ tons year⁻¹, 95% CL = 49.1×10³-145×10³ tons year⁻¹. The Bayesian state-space implementation of the Schaefer production model (BSM), using catch and CPUE data, found a catchability coefficient (q) of 1.63×10⁻⁶ (lcl = 1.27×10⁻⁶ to ucl = 2.10×10⁻⁶), r = 1.06 year⁻¹ with 95% CL = 0.727-1.55 year⁻¹, k = 226×10³ tons with 95% CL = 170×10³-301×10³ tons, and MSY = 60×10³ tons year⁻¹ with 95% CL = 49.9×10³-72.2×10³ tons year⁻¹. The results for Bombay-duck fishery management based on the BSM assessment of the time series catch data show that Fmsy = 0.531 with 95% CL = 0.364-0.775 (if B > 1/2 Bmsy then Fmsy = 0.5r); Fmsy = 0.531 with 95% CL = 0.364-0.775 (r and Fmsy are linearly reduced if B < 1/2 Bmsy). Biomass in 2018 was 110×10³ tons with 2.5th to 97.5th percentile = 82.3-155×10³ tons. Relative biomass (B/Bmsy) in the last year was 0.972, with 2.5th to 97.5th percentile = 0.728-1.37. Fishing mortality in the last year was 0.738 with 2.5th-97.5th percentile = 0.525-1.37. Exploitation F/Fmsy was 1.39, with 2.5th to 97.5th percentile = 0.988-1.86. The biological reference point B/Bmsy was smaller than 1.0, while F/Fmsy was higher than 1.0, revealing over-exploitation of the fishery and indicating that more conservative management strategies are required for the Bombay-duck fishery.
Keywords: biological reference points, catchability coefficient, carrying capacity, intrinsic rate of population increase
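The reference points quoted above follow directly from the Schaefer surplus-production model used by CMSY/BSM. Below is a minimal Python sketch of those relationships, assuming the BSM point estimates from the abstract (r = 1.06 year⁻¹, k = 226×10³ tons); it is an illustration of the formulas, not the CMSY/BSM code itself.

```python
# A minimal sketch (not the CMSY/BSM code) of Schaefer-model reference points,
# assuming the BSM point estimates r = 1.06 /year and k = 226,000 tons.

def schaefer_reference_points(r, k, biomass):
    """Return MSY, Bmsy and the (possibly reduced) Fmsy for a Schaefer model."""
    msy = r * k / 4.0          # maximum sustainable yield
    bmsy = k / 2.0             # biomass at MSY
    fmsy = 0.5 * r             # fishing mortality at MSY
    # CMSY convention: Fmsy is linearly reduced when biomass drops below Bmsy/2
    if biomass < 0.5 * bmsy:
        fmsy *= biomass / (0.5 * bmsy)
    return msy, bmsy, fmsy

if __name__ == "__main__":
    r, k = 1.06, 226e3                     # BSM estimates from the abstract
    b_2018 = 110e3                         # biomass in 2018 (tons)
    msy, bmsy, fmsy = schaefer_reference_points(r, k, b_2018)
    print(f"MSY  = {msy / 1e3:.1f} x10^3 tons/year")   # ~60 x10^3
    print(f"Bmsy = {bmsy / 1e3:.1f} x10^3 tons")
    print(f"B/Bmsy = {b_2018 / bmsy:.2f}, Fmsy = {fmsy:.3f}")
```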
Procedia PDF Downloads 124
2521 Inter-Annual Variations of Sea Surface Temperature in the Arabian Sea
Authors: K. S. Sreejith, C. Shaji
Abstract:
Though both the Arabian Sea and its counterpart, the Bay of Bengal, are forced primarily by the semi-annually reversing monsoons, the spatio-temporal variation of surface waters is much stronger in the Arabian Sea than in the Bay of Bengal. This study focuses on the inter-annual variability of Sea Surface Temperature (SST) in the Arabian Sea by analysing the ERSST dataset, which covers 152 years of SST (January 1854 to December 2002) based on the ICOADS in situ observations. To capture the dominant SST oscillations and to understand the inter-annual SST variations in various local regions of the Arabian Sea, wavelet analysis was performed on this long time-series SST dataset. This tool is advantageous over other signal analysis tools such as Fourier analysis because it unfolds a time series (signal) in both the frequency and time domains. This technique makes it easier to determine the dominant modes of variability and to explain how those modes vary in time. The analysis revealed that pentadal SST oscillations predominate in most of the analysed local regions of the Arabian Sea. From the time information of the wavelet analysis, it was interpreted that cold and warm events of large amplitude occurred during the periods 1870-1890, 1890-1910, 1930-1950, 1980-1990, and 1990-2005. SST oscillations with peaks having a period of ~2-4 years were found to be significant in the central and eastern regions of the Arabian Sea. This indicates that the inter-annual SST variation in the Indian Ocean is affected by El Niño-Southern Oscillation (ENSO) and Indian Ocean Dipole (IOD) events.
Keywords: Arabian Sea, ICOADS, inter-annual variation, pentadal oscillation, SST, wavelet analysis
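As a rough illustration of the wavelet step described above, the sketch below runs a continuous wavelet transform on a synthetic monthly SST anomaly series. It assumes the PyWavelets package is available; the synthetic series, the Morlet wavelet, and the scale range are illustrative choices, not the authors' ERSST processing chain.

```python
# A minimal sketch of a continuous wavelet transform on a synthetic SST series,
# assuming PyWavelets; this is not the authors' ERSST processing chain.
import numpy as np
import pywt

rng = np.random.default_rng(0)
months = np.arange(152 * 12)                        # monthly samples
t_years = months / 12.0
# Synthetic SST anomaly: a pentadal (~5 yr) and an ENSO-like (~3 yr) oscillation
sst = (0.4 * np.sin(2 * np.pi * t_years / 5.0)
       + 0.2 * np.sin(2 * np.pi * t_years / 3.0)
       + 0.1 * rng.standard_normal(months.size))

scales = np.arange(1, 256)
coeffs, freqs = pywt.cwt(sst, scales, "morl", sampling_period=1.0 / 12.0)
power = np.abs(coeffs) ** 2                         # wavelet power (scale x time)

# Dominant period (years) averaged over the whole record
mean_power = power.mean(axis=1)
dominant_period = 1.0 / freqs[np.argmax(mean_power)]
print(f"Dominant period ~ {dominant_period:.1f} years")
```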
Procedia PDF Downloads 276
2520 A Case Study of Business Analytic Use in European Football: Analysis and Implications
Authors: M. C. Schloesser
Abstract:
The purpose of this paper is to explore the use and impact of business analytics in European football. Despite good evidence from other major sports leagues, research on this topic in Europe is currently very scarce. This research relies on expert interviews on the use and objectives of business analytics. Along with revenue data over 16 seasons, spanning from 2004/05 to 2019/20, from Manchester City FC, we conducted a time series analysis to detect a structural breakpoint in the different revenue streams, i.e., sponsorship and ticketing, after analytical tools had been implemented. We find not only that business analytics has indeed been applied at Manchester City FC, with revenue growth as the main objective of its utilization, but also that business analytics is an effective means of increasing revenues if applied sufficiently. We can thereby support findings from other sports leagues. Consequently, professional sports organizations are advised to apply business analytics if they aim to increase revenues. This research has shown that analytical practices do, in fact, support revenue growth and help organizations work more efficiently. As knowledge of analytical practices is very confidential and not publicly available, we had to select one club as a case study, which can be considered a research limitation. Other practitioners should explore other clubs or leagues. Further, there are other factors that can lead to increased revenues that need to be considered. Additionally, sports organizations need resources to be able to apply and utilize business analytics. Consequently, the findings might only apply to the top teams of the European football leagues. Nonetheless, this paper combines insights and results on the usage, objectives, and impact of business analytics in European professional football and thereby fills a current research gap.
Keywords: business analytics, expert interviews, revenue management, time series analysis
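For readers unfamiliar with structural breakpoint detection on a short annual series, the sketch below shows one simple least-squares variant in Python. The revenue figures are purely illustrative placeholders, not Manchester City FC's actual data, and the authors' exact test may differ.

```python
# A minimal sketch of a single-breakpoint search on an annual revenue series,
# using a least-squares split of the mean; the figures are illustrative only.
import numpy as np

def find_breakpoint(y, min_seg=3):
    """Return the index that splits y into two segments with minimum total SSE."""
    y = np.asarray(y, dtype=float)
    best_idx, best_sse = None, np.inf
    for i in range(min_seg, len(y) - min_seg + 1):
        left, right = y[:i], y[i:]
        sse = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if sse < best_sse:
            best_idx, best_sse = i, sse
    return best_idx, best_sse

seasons = [f"{2004 + k}/{str(5 + k).zfill(2)}" for k in range(16)]   # 2004/05 .. 2019/20
revenue = np.array([60, 62, 65, 70, 82, 125, 150, 230, 270, 300,
                    320, 340, 360, 390, 420, 430], dtype=float)      # illustrative, M GBP
idx, _ = find_breakpoint(revenue)
print(f"Estimated structural break before season {seasons[idx]}")
```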
Procedia PDF Downloads 79
2519 An Explanatory Study Approach Using Artificial Intelligence to Forecast Solar Energy Outcome
Authors: Agada N. Ihuoma, Nagata Yasunori
Abstract:
Artificial intelligence (AI) techniques play a crucial role in predicting expected energy output and in the performance analysis, modeling, and control of renewable energy. Renewable energy is becoming more popular for economic and environmental reasons. In the face of growing global energy consumption and the increasing depletion of most fossil fuels, the world is faced with the challenge of meeting ever-increasing energy demands. Therefore, incorporating artificial intelligence to predict solar radiation outcomes from intermittent sunlight is crucial to enable a balance between the supply and demand of energy on loads, predict the performance and outcome of solar energy, enhance production planning and energy management, and ensure proper sizing of parameters when generating clean energy. However, one of the major problems in forecasting is the algorithms used to control, model, and predict the performance of energy systems, which are complicated and involve large computing power, differential equations, and time series. Also, unreliable (poor-quality) solar radiation data for a geographical location, as well as insufficiently long series, can be a bottleneck to actualization. To overcome these problems, this study employs Anaconda Navigator (Jupyter Notebook) for machine learning, which can combine large amounts of data with fast, iterative processing and intelligent algorithms, allowing the software to learn automatically from patterns or features to predict the performance and outcome of solar energy. This, in turn, enables the balance of supply and demand on loads as well as enhanced production planning and energy management.
Keywords: artificial intelligence, backward elimination, linear regression, solar energy
Procedia PDF Downloads 157
2518 The Shannon Entropy and Multifractional Markets
Authors: Massimiliano Frezza, Sergio Bianchi, Augusto Pianese
Abstract:
Introduced by Shannon in 1948 in the field of information theory as the average rate at which information is produced by a stochastic set of data, the concept of entropy has gained much attention as a measure of the uncertainty and unpredictability associated with a dynamical system, eventually depicted by a stochastic process. In particular, the Shannon entropy measures the degree of order/disorder of a given signal and provides useful information about the underlying dynamical process. It has found widespread application in a variety of fields, such as cryptography, statistical physics, and finance. In this regard, many contributions have employed different measures of entropy in an attempt to characterize financial time series in terms of market efficiency, market crashes, and/or financial crises. The Shannon entropy has also been considered as a measure of the risk of a portfolio or as a tool in asset pricing. This work investigates the theoretical link between the Shannon entropy and the multifractional Brownian motion (mBm), a stochastic process which has recently been the focus of renewed interest in finance as a driving model of stochastic volatility. In particular, after exploring the current state of research in this area and highlighting some of the key results and open questions that remain, we show a well-defined relationship between the Shannon (log)entropy and the memory function H(t) of the mBm. In detail, we allow both the length of the time series and the time scale to change over the analysis to study how the relation modifies itself. On the one hand, applications are developed after generating surrogates of mBm trajectories based on different memory functions; on the other hand, an empirical analysis of several international stock indexes, which confirms the previous results, concludes the work.
Keywords: Shannon entropy, multifractional Brownian motion, Hurst–Holder exponent, stock indexes
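As a small illustration of the basic estimator underlying the analysis above, the sketch below computes the histogram-based Shannon entropy of the increments of a simulated path. The surrogate path and bin count are illustrative assumptions; the paper's mBm generator and the log-entropy/H(t) relation are not reproduced here.

```python
# A minimal sketch of a histogram-based Shannon entropy estimator; the crude
# Brownian surrogate and the bin count are illustrative assumptions only.
import numpy as np

def shannon_entropy(x, bins=50):
    """Shannon entropy (in nats) of the empirical distribution of x."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log(p))

rng = np.random.default_rng(0)
n = 4096
# Crude surrogate with roughly Brownian scaling: cumulative sum of white noise
path = np.cumsum(rng.standard_normal(n))
increments = np.diff(path)

print(f"Entropy of increments: {shannon_entropy(increments):.3f} nats")
```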
Procedia PDF Downloads 110
2517 The Social Psychology of Illegal Game Room Addiction in the Historic Chinatown District of Honolulu, Hawaii: Illegal Compulsive Gambling, Chinese-Polynesian Organized Crime Syndicates, Police Corruption, and Loan Sharking Rings
Authors: Gordon James Knowles
Abstract:
Historically, the Chinatown district in the Sandwich Islands has been plagued with the traditional vice crimes of illegal drugs, gambling, and prostitution since the early 1800s. However, a new form of psychologically addictive arcade-style table gambling machine has become the dominant source of illegal revenue in Honolulu, Hawaii. This study attempts to document the drive, desire, or will to play and wager with arcade-style video gaming and to understand the role of illegal game rooms in facilitating pathological gambling addiction. Indicators of police corruption by Chinese organized crime syndicates related to protection rackets, bribery, and pay-offs were revealed. Information fusion from a police science and sociological intelligence perspective indicates insurgent warfare is being waged on the streets of Honolulu by the People's Republic of China. This state-sponsored communist terrorism in the Hawaiian Islands used "contactless" irregular warfare entailing: (1) the deployment of psychologically addictive gambling machines, (2) the distribution of the physically addictive fentanyl drug as a lethal chemical weapon, and (3) psychological warfare by circulating pro-China, anti-American propaganda newspapers targeted at the small island populace.
Keywords: Chinese and Polynesian organized crime, china daily newspaper, electronic arcade style table games, gaming technology addiction, illegal compulsive gambling, and police intelligence
Procedia PDF Downloads 74
2516 Enhancing Project Performance Forecasting using Machine Learning Techniques
Authors: Soheila Sadeghi
Abstract:
Accurate forecasting of project performance metrics is crucial for successfully managing and delivering urban road reconstruction projects. Traditional methods often rely on static baseline plans and fail to consider the dynamic nature of project progress and external factors. This research proposes a machine learning-based approach to forecast project performance metrics, such as cost variance and earned value, for each Work Breakdown Structure (WBS) category in an urban road reconstruction project. The proposed model utilizes time series forecasting techniques, including Autoregressive Integrated Moving Average (ARIMA) and Long Short-Term Memory (LSTM) networks, to predict future performance based on historical data and project progress. The model also incorporates external factors, such as weather patterns and resource availability, as features to enhance the accuracy of forecasts. By applying the predictive power of machine learning, the performance forecasting model enables proactive identification of potential deviations from the baseline plan, which allows project managers to take timely corrective actions. The research aims to validate the effectiveness of the proposed approach using a case study of an urban road reconstruction project, comparing the model's forecasts with actual project performance data. The findings of this research contribute to the advancement of project management practices in the construction industry, offering a data-driven solution for improving project performance monitoring and control.
Keywords: project performance forecasting, machine learning, time series forecasting, cost variance, earned value management
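A minimal sketch of the ARIMA component mentioned above is given below, assuming the statsmodels package; the cost-variance series and the (1,1,1) order are illustrative placeholders rather than the project's actual data or tuned model.

```python
# A minimal sketch of an ARIMA forecast on a cost-variance series, assuming
# statsmodels; the synthetic series and the (1,1,1) order are assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
# Monthly cost variance (in $k) for one WBS category: drifting random walk
cost_variance = np.cumsum(rng.normal(loc=-2.0, scale=5.0, size=36))

model = ARIMA(cost_variance, order=(1, 1, 1))
fitted = model.fit()
forecast = fitted.forecast(steps=6)        # next six reporting periods
print("6-period cost-variance forecast:", np.round(forecast, 1))
```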
Procedia PDF Downloads 49
2515 A Conceptual Study for Investigating the Creation of Energy and Understanding the Properties of Nothing
Authors: Mahmoud Reza Hosseini
Abstract:
The universe is in a continuous expansion process, resulting in the reduction of its density and temperature. Also, by extrapolating back from its current state, the universe at its early times can be studied; this is known as the big bang theory. According to this theory, moments after creation, the universe was an extremely hot and dense environment. However, its rapid expansion due to nuclear fusion led to a reduction in its temperature and density. This is evidenced through the cosmic microwave background and the structure of the universe at a large scale. However, extrapolating back further from this early state reaches a singularity, which cannot be explained by modern physics, and the big bang theory is no longer valid. In addition, one would expect a nonuniform energy distribution across the universe from a sudden expansion. However, highly accurate measurements reveal an equal temperature mapping across the universe, which is contradictory to big bang principles. To resolve this issue, it is believed that cosmic inflation occurred at the very early stages of the birth of the universe. According to the cosmic inflation theory, the elements which formed the universe underwent a phase of exponential growth due to the existence of a large cosmological constant. The inflation phase allows the uniform distribution of energy so that an equal maximum temperature can be achieved across the early universe. Also, the evidence of quantum fluctuations from this stage provides a means for studying the types of imperfections the universe would begin with. Although well-established theories such as cosmic inflation and the big bang together provide a comprehensive picture of the early universe and how it evolved into its current state, they are unable to address the singularity paradox at the time of the universe's creation. Therefore, a practical model capable of describing how the universe was initiated is needed. This research series aims at addressing the singularity issue by introducing a state of energy called a "neutral state," possessing an energy level that is referred to as the "base energy." The governing principles of the base energy are discussed in detail in our second paper in the series, "A Conceptual Study for Addressing the Singularity of the Emerging Universe." To establish a complete picture, the origin of the base energy should be identified and studied. In this research paper, the mechanism which led to the emergence of this natural state and its corresponding base energy is proposed. In addition, the effect of the base energy on the space-time fabric is discussed. Finally, the possible role of the base energy in quantization and energy exchange is investigated. The proposed concept in this research series therefore provides a road map for enhancing our understanding of the universe's creation from nothing and its evolution, and discusses the possibility of base energy as one of the main building blocks of this universe.
Keywords: big bang, cosmic inflation, birth of universe, energy creation, universe evolution
Procedia PDF Downloads 99
2514 A Stokes Optimal Control Model of Determining Cellular Interaction Forces during Gastrulation
Authors: Yuanhao Gao, Ping Lin, Kees Weijer
Abstract:
In this paper, an optimal control model is proposed for the cell flow in the process of chick embryo gastrulation. The target is to determine the cellular interaction forces, which are hard to measure. This paper takes the approach of investigating the forces as an inverse problem. By choosing the forces as the control variable and regarding the cell flow as a Stokes fluid, an objective functional is established to match the numerical result for cell velocity with the experimental data, so that the forces can be determined by minimizing the objective functional. The Lagrange multiplier method is utilized to derive the state and adjoint equations constituting the optimal control system, which specifies the first-order necessary conditions. The finite element method is used to discretize and approximate the equations. A conjugate gradient algorithm is given for finding the minimum of the system and determining the forces.
Keywords: optimal control model, Stokes equation, conjugate gradient method, finite element method, chick embryo gastrulation
Procedia PDF Downloads 259
2513 Teacher-Child Interactions within Learning Contexts in Prekindergarten
Authors: Angélique Laurent, Marie-Josée Letarte, Jean-Pascal Lemelin, Marie-France Morin
Abstract:
This study aims at exploring teacher-child interactions within learning contexts in public prekindergartens of the province of Québec (Canada). It is based on previous research showing that teacher-child interactions in preschools have direct and determining effects on the quality of early childhood education and could directly or indirectly influence child development. However, throughout a typical preschool day, children experience different learning contexts that promote their learning opportunities. Depending on these specific contexts, teacher-child interactions could vary, for example, between free play and shared book reading. Indeed, some studies have found that teacher-directed or child-directed contexts might lead to significant variations in teacher-child interactions. This study drew upon both the bioecological and the Teaching Through Interactions frameworks. It was conducted through a descriptive and correlational design. Fifteen teachers were recruited to participate in the study. At Time 1, in October, they completed a diary to report the learning contexts they proposed in their classroom during a typical week. At Time 2, seven months later (May), they were videotaped three times in the morning (with two weeks between each recording) during a typical morning class. The quality of teacher-child interactions was then coded with the Classroom Assessment Scoring System (CLASS) across the contexts identified. This tool measures three main domains of interactions (emotional support, classroom organization, and instructional support) and 10 dimensions scored on a scale from 1 (low quality) to 7 (high quality). Based on the teachers' reports, five learning contexts were identified: 1) shared book reading, 2) free play, 3) morning meeting, 4) teacher-directed activity (such as craft), and 5) snack. Based on preliminary statistical analyses, little variation was observed across the learning contexts for each domain of the CLASS. However, the instructional support domain showed lower scores during specific learning contexts, specifically free play and teacher-directed activity. Practical implications for how preschool teachers could foster specific domains of interactions depending on learning contexts to enhance children's social and academic development will be discussed.
Keywords: teacher practices, teacher-child interactions, preschool education, learning contexts, child development
Procedia PDF Downloads 109
2512 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation
Authors: C. Bunsanit
Abstract:
This paper presents a refinement method for two-beam formation with a wideband smart antenna. The refinement method for the weighting coefficients is based on Fully Spatial Signal Processing, taking the Inverse Discrete Fourier Transform (IDFT), and its simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal with real weights and then summing them together. These real weighting coefficients are computed by the IDFT method; however, the range of weight values is relatively wide. Therefore, to reduce this range, the refinement method is used. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the direction of the mainbeam, the beamwidth, and the maximum minor lobe level. Comparison of the simulation results obtained using the refinement method and using IDFT alone shows that the refinement method works well for wideband two-beam formation.
Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband
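The core IDFT step described above can be illustrated with a short Python/numpy sketch (the paper itself used MATLAB). The array size, the sampled two-beam pattern, and the half-wavelength spacing are assumptions, and the refinement stage that narrows the weight range is not reproduced.

```python
# A minimal numpy sketch of the IDFT step: sample a desired two-beam pattern
# at N points and take the inverse DFT to get real element weights. The array
# size, beam bins and half-wavelength spacing are assumptions; the refinement
# stage that narrows the weight range is not reproduced here.
import numpy as np

N = 16                                   # number of array elements (assumed)
k = np.arange(N)
# Desired pattern sampled in the spatial-frequency domain: two beams
u_beams = np.array([3, 12])              # DFT bins for the two beam directions (assumed)
desired = np.zeros(N)
desired[u_beams] = 1.0

weights = np.fft.ifft(desired).real      # IDFT gives the element weights; keep real part

# Evaluate the resulting array factor on a fine angular grid
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
psi = np.pi * np.sin(theta)              # half-wavelength element spacing assumed
steering = np.exp(1j * np.outer(k, psi))
array_factor = np.abs(weights @ steering)

print("Peak array factor:", array_factor.max().round(3))
print("Weight range:", weights.min().round(3), "to", weights.max().round(3))
```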
Procedia PDF Downloads 226
2511 Modeling of Transformer Winding for Transients: Frequency-Dependent Proximity and Skin Analysis
Authors: Yazid Alkraimeen
Abstract:
Precise prediction of the dielectric stresses and high voltages of power transformers requires the accurate calculation of frequency-dependent parameters. A lack of accuracy can result in severe damage to transformer windings. Transient conditions are studied by digital computers, which require the implementation of accurate models. This paper analyzes the computation of the frequency-dependent skin and proximity losses included in the transformer winding model, using analytical equations and the Finite Element Method (FEM). A modified formula to calculate the proximity and skin losses is presented. The results of the frequency-dependent parameter calculations are verified using the Finite Element Method. The time-domain transient voltages are obtained using the Numerical Inverse Laplace Transform. The results show that the classical formula for proximity losses overestimates the transient voltages when compared with the results obtained from the modified method on a simple transformer geometry.
Keywords: fast front transients, proximity losses, transformer winding modeling, skin losses
Procedia PDF Downloads 139
2510 Fuzzy-Genetic Algorithm Multi-Objective Optimization Methodology for Cylindrical Stiffened Tanks Conceptual Design
Authors: H. Naseh, M. Mirshams, M. Mirdamadian, H. R. Fazeley
Abstract:
This paper presents an extension of a fuzzy-genetic algorithm multi-objective optimization methodology that can effectively be used to find the overall satisfaction of objective functions (selecting the design variables) in the early stages of the design process. The coupling of objective functions through design variables in an engineering design process results in difficulties in design optimization problems. In many cases, decision making on design variables conflicts with more than one discipline in system design. In space launch system conceptual design, decision making on some design variables (e.g., the oxidizer-to-fuel mass flow rate, O/F) in the early stages of the design process is related to the objectives of the liquid propellant engine (specific impulse) and the tanks (structural weight). The primary application of this methodology is therefore the design of a liquid propellant engine with the maximum specific impulse and a cylindrical stiffened tank with the minimum weight. To this end, the fuzzy rule set for the design problem is established based on the designer's expert knowledge with a holistic approach. The independent design variables in this model are the oxidizer-to-fuel mass flow rate, the thickness of the stringers, the thickness of the rings, and the shell thickness. To handle the mentioned problems, a fuzzy-genetic algorithm multi-objective optimization methodology is developed based on the Pareto optimal set. Consequently, this methodology is applied to one stage of a space launch system to illustrate the accuracy and efficiency of the proposed methodology.
Keywords: cylindrical stiffened tanks, multi-objective, genetic algorithm, fuzzy approach
Procedia PDF Downloads 655
2509 Detection of Image Blur and Its Restoration for Image Enhancement
Authors: M. V. Chidananda Murthy, M. Z. Kurian, H. S. Guruprasad
Abstract:
Image restoration in the process of communication is one of the emerging fields in image processing. Motion analysis processing is the simplest case of detecting motion in an image. Applications of motion analysis are widely spread in many areas such as surveillance, remote sensing, the film industry, navigation of autonomous vehicles, etc. The scene may contain multiple moving objects; by using motion analysis techniques, the blur caused by the movement of the objects can be corrected by filling in occluded regions and reconstructing transparent objects, and the motion blurring can be removed. This paper presents the design and comparison of various motion detection and enhancement filters. The median filter, linear image deconvolution, inverse filter, pseudo-inverse filter, Wiener filter, Lucy-Richardson filter, and blind deconvolution filters are used to remove the blur. In this work, we have considered different types and different amounts of blur for the analysis. Mean Square Error (MSE) and Peak Signal to Noise Ratio (PSNR) are used to evaluate the performance of the filters. The designed system has been implemented in Matlab software and tested for synthetic and real-time images.
Keywords: image enhancement, motion analysis, motion detection, motion estimation
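As an illustration of one of the filters compared above, the sketch below applies a frequency-domain Wiener filter in Python/numpy (the study itself was implemented in Matlab). The box blur kernel and the noise-to-signal constant K are illustrative assumptions.

```python
# A minimal numpy sketch of a frequency-domain Wiener filter for deblurring;
# the horizontal motion kernel and constant K are illustrative assumptions.
import numpy as np

def wiener_deblur(blurred, psf, K=0.01):
    """Restore an image blurred (circularly) by `psf` with a simple Wiener filter."""
    psf_pad = np.zeros_like(blurred, dtype=float)
    psf_pad[:psf.shape[0], :psf.shape[1]] = psf
    H = np.fft.fft2(psf_pad)
    G = np.fft.fft2(blurred)
    F_hat = (np.conj(H) / (np.abs(H) ** 2 + K)) * G
    return np.real(np.fft.ifft2(F_hat))

# Synthetic example: blur a random "image" with a 1x9 horizontal motion kernel
rng = np.random.default_rng(0)
image = rng.random((64, 64))
psf = np.ones((1, 9)) / 9.0
kernel_pad = np.zeros_like(image)
kernel_pad[:1, :9] = psf
blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel_pad)))

restored = wiener_deblur(blurred, psf, K=0.001)
mse = np.mean((restored - image) ** 2)
print(f"MSE after restoration: {mse:.5f}")
```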
Procedia PDF Downloads 287
2508 Modelling and Detecting the Demagnetization Fault in the Permanent Magnet Synchronous Machine Using the Current Signature Analysis
Authors: Yassa Nacera, Badji Abderrezak, Saidoune Abdelmalek, Houassine Hamza
Abstract:
Several kinds of faults can occur in permanent magnet synchronous machine (PMSM) systems: bearing faults, electrical short/open faults, eccentricity faults, and demagnetization faults. A demagnetization fault means that the strength of the permanent magnets (PM) in the PMSM decreases, and it causes low output torque, which is undesirable for EVs. The fault is caused by physical damage, high-temperature stress, an inverse magnetic field, and aging. Motor current signature analysis (MCSA) is a conventional motor fault detection method based on the extraction of signal features from the stator current. A simulation model of the PMSM under partial demagnetization and uniform demagnetization faults was established, and different degrees of demagnetization fault were simulated. The harmonic analyses using the Fast Fourier Transform (FFT) show that the fault diagnosis method based on harmonic wave analysis is only suitable for partial demagnetization faults of the PMSM and does not apply to uniform demagnetization faults of the PMSM.
Keywords: permanent magnet, diagnosis, demagnetization, modelling
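A minimal sketch of the MCSA idea described above is shown below: inspect the FFT of the stator current for fault-related components. The 50 Hz supply, sampling rate, and injected fault harmonics are synthetic assumptions, not the paper's simulation model.

```python
# A minimal sketch of motor current signature analysis (MCSA) via FFT; the
# supply frequency, sampling rate and injected fault components are assumed.
import numpy as np

fs = 10_000.0                       # sampling frequency (Hz), assumed
t = np.arange(0, 1.0, 1.0 / fs)
f_supply = 50.0

healthy = np.sin(2 * np.pi * f_supply * t)
# Illustrative fault signature: small fractional harmonics of the supply
faulty = (healthy
          + 0.05 * np.sin(2 * np.pi * 0.5 * f_supply * t)
          + 0.03 * np.sin(2 * np.pi * 1.5 * f_supply * t))

spectrum = np.abs(np.fft.rfft(faulty)) / t.size
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

for f_target in (25.0, 50.0, 75.0):
    idx = np.argmin(np.abs(freqs - f_target))
    print(f"{f_target:5.1f} Hz amplitude: {spectrum[idx]:.4f}")
```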
Procedia PDF Downloads 68
2507 Product Features Extraction from Opinions According to Time
Authors: Kamal Amarouche, Houda Benbrahim, Ismail Kassou
Abstract:
Nowadays, e-commerce shopping websites have experienced noticeable growth. These websites have gained consumers' trust. After purchasing a product, many consumers share comments in which opinions about the given product are usually embedded. Research on the automatic management of opinions, which gives suggestions to potential consumers and portrays an image of the product to manufacturers, has been growing recently. Right after a product is launched in the market, the reviews generated around it do not usually contain helpful information, only generic opinions about the product (e.g., telephone: great phone...), in the sense that the product is still in its launch phase in the market. Over time, the product becomes older. Consumers then perceive the advantages/disadvantages of each specific product feature and generate comments that contain their sentiments about these features. In this paper, we present an unsupervised method to extract the different product features hidden in the opinions which influence its purchase, one that combines Time Weighting (TW), which depends on the time the opinions were expressed, with Term Frequency-Inverse Document Frequency (TF-IDF). We conduct several experiments using two different datasets about cell phones and hotels. The results show the effectiveness of our automatic feature extraction, as well as its domain-independent character.
Keywords: opinion mining, product feature extraction, sentiment analysis, SentiWordNet
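A minimal sketch of how TF-IDF can be combined with a time weight is shown below, assuming scikit-learn. The exponential decay and the toy reviews are illustrative; the paper's exact Time Weighting formula is not specified in the abstract.

```python
# A minimal sketch of TF-IDF combined with an exponential time weight, assuming
# scikit-learn; the decay constant and toy reviews are illustrative, and the
# paper's exact Time Weighting formula may differ.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

reviews = [
    "great phone overall",                          # generic, early review
    "battery life is short but the camera is sharp",
    "screen scratches easily, camera still excellent",
]
ages_in_days = np.array([300, 60, 10])              # time since each review was posted

vectorizer = TfidfVectorizer()
tfidf = vectorizer.fit_transform(reviews).toarray()

# Weight recent opinions more heavily (assumed exponential decay)
time_weight = np.exp(-ages_in_days / 180.0)
weighted = tfidf * time_weight[:, None]

# Aggregate term scores across reviews to rank candidate feature words
scores = weighted.sum(axis=0)
terms = np.array(vectorizer.get_feature_names_out())
top = terms[np.argsort(scores)[::-1][:5]]
print("Top candidate feature terms:", list(top))
```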
Procedia PDF Downloads 411
2506 Robust Method for Evaluation of Catchment Response to Rainfall Variations Using Vegetation Indices and Surface Temperature
Authors: Revalin Herdianto
Abstract:
Recent climate changes increase uncertainties in vegetation conditions, such as health and biomass, globally and locally. Detection is, however, difficult due to the spatial and temporal scale of vegetation coverage. Because of the unique response of vegetation to its environmental conditions, such as water availability, the interplay between vegetation dynamics and hydrologic conditions leaves a signature in their feedback relationship. Vegetation indices (VI) depict vegetation biomass and photosynthetic capacity, which indicate vegetation dynamics as a response to variables including hydrologic conditions and microclimate factors such as rainfall characteristics and land surface temperature (LST). It is hypothesized that this signature may be depicted by the VI in its relationship with other variables. To study this signature, several catchments in Asia, Australia, and Indonesia were analysed to assess the variations in hydrologic characteristics with vegetation types. The methods used in this study include geographic identification and pixel marking for the studied catchments, analysis of the time series of VI and LST of the marked pixels, and smoothing using the Savitzky-Golay filter, which is effective for large areas and extensive data. Time series of VI, LST, and rainfall from satellite and ground stations, coupled with digital elevation models, were analysed and presented. This study found that the hydrologic response of vegetation to rainfall variations may be shown within one hydrologic year, in which a drought event can be detected a year later as suppressed growth. However, above-average annual rainfall does not promote above-average growth, as shown by the VI. This technique is found to be a robust and tractable approach for assessing catchment dynamics in changing climates.
Keywords: vegetation indices, land surface temperature, vegetation dynamics, catchment
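The Savitzky-Golay smoothing step mentioned above can be illustrated with a short scipy sketch. The synthetic NDVI-like series, window length, and polynomial order are illustrative choices, not the study's calibrated settings.

```python
# A minimal sketch of Savitzky-Golay smoothing of a vegetation-index series,
# using scipy; the synthetic series, window and polynomial order are assumed.
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(3)
t = np.arange(0, 3 * 365, 16)                       # 16-day composites, 3 years
ndvi = 0.45 + 0.25 * np.sin(2 * np.pi * t / 365.0)  # seasonal vegetation cycle
ndvi_noisy = ndvi + 0.05 * rng.standard_normal(t.size)   # cloud/atmosphere noise

ndvi_smooth = savgol_filter(ndvi_noisy, window_length=7, polyorder=2)
rmse = np.sqrt(np.mean((ndvi_smooth - ndvi) ** 2))
print(f"RMSE of smoothed series vs. true cycle: {rmse:.3f}")
```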
Procedia PDF Downloads 287
2505 Reducing the Imbalance Penalty Through Artificial Intelligence Methods Geothermal Production Forecasting: A Case Study for Turkey
Authors: Hayriye Anıl, Görkem Kar
Abstract:
In addition to being rich in renewable energy resources, Turkey is one of the countries that promise potential in geothermal energy production, with its high installed power, low cost, and sustainability. Increasing imbalance penalties become an economic burden for organizations, since geothermal generation plants cannot maintain the balance of supply and demand due to the inadequacy of the production forecasts given in the day-ahead market. A better production forecast reduces the imbalance penalties of market participants and provides a better balance in the day-ahead market. In this study, using machine learning, deep learning, and time series methods, the total generation of the power plants belonging to Zorlu Natural Electricity Generation, which has a high installed capacity in terms of geothermal, was estimated for the first one and two weeks of March; the imbalance penalties were then calculated with these estimates and compared with the real values. These modeling operations were carried out on two datasets: the basic dataset and the dataset created by extracting new features from it with the feature engineering method. According to the results, Support Vector Regression, among the traditional machine learning models, outperformed the other models and exhibited the best performance. In addition, the estimation results on the feature engineering dataset showed lower error rates than on the basic dataset. It is concluded that the estimated imbalance penalty calculated for the selected organization is lower than the actual imbalance penalty, which is the optimal and profitable outcome.
Keywords: machine learning, deep learning, time series models, feature engineering, geothermal energy production forecasting
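As an illustration of the best-performing model reported above, the sketch below fits a Support Vector Regression on lagged generation values, assuming scikit-learn. The lag features, kernel settings, and synthetic generation series are assumptions, not Zorlu's actual data or the study's feature-engineered dataset.

```python
# A minimal sketch of Support Vector Regression on lagged generation values,
# assuming scikit-learn; the synthetic series and hyperparameters are assumed.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(4)
hours = np.arange(24 * 90)                                   # ~3 months of hourly data
gen = 40 + 3 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 0.8, hours.size)  # MWh

def make_lag_features(series, n_lags=24):
    """Build a supervised dataset where each target uses the previous n_lags hours."""
    X = np.column_stack([series[i:len(series) - n_lags + i] for i in range(n_lags)])
    y = series[n_lags:]
    return X, y

X, y = make_lag_features(gen, n_lags=24)
split = len(y) - 24 * 7                                      # keep the last week for testing
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X[:split], y[:split])

pred = model.predict(X[split:])
mae = np.mean(np.abs(pred - y[split:]))
print(f"MAE on the held-out week: {mae:.2f} MWh")
```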
Procedia PDF Downloads 110
2504 A Sparse Representation Speech Denoising Method Based on Adapted Stopping Residue Error
Authors: Qianhua He, Weili Zhou, Aiwu Chen
Abstract:
A sparse representation speech denoising method based on an adapted stopping residue error is presented in this paper. Firstly, the cross-correlation between the clean speech spectrum and the noise spectrum was analyzed, and an estimation method was proposed. In the denoising method, an over-complete dictionary of the clean speech power spectrum was learned with the K-singular value decomposition (K-SVD) algorithm. In the sparse representation stage, the stopping residue error was adaptively set according to the estimated cross-correlation and the adjusted noise spectrum, and the orthogonal matching pursuit (OMP) approach was applied to reconstruct the clean speech spectrum from the noisy speech. Finally, the clean speech was re-synthesised via the inverse Fourier transform using the reconstructed speech spectrum and the noisy speech phase. The experimental results show that the proposed method outperforms the conventional methods in terms of subjective and objective measures.
Keywords: speech denoising, sparse representation, k-singular value decomposition, orthogonal matching pursuit
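The sparse-coding stage described above can be sketched with scikit-learn's Orthogonal Matching Pursuit, as shown below. The random dictionary stands in for the K-SVD-trained spectral dictionary, and the fixed residual tolerance is an illustrative stand-in for the adapted stopping residue error.

```python
# A minimal sketch of the OMP sparse-coding step, assuming scikit-learn; the
# random dictionary replaces the K-SVD-trained one, and the fixed tolerance
# stands in for the paper's adapted stopping residue error.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(5)
n_features, n_atoms = 64, 256

# Stand-in for a learned over-complete dictionary of clean-speech power spectra
D = rng.standard_normal((n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)

# Synthetic "noisy spectrum": a sparse combination of atoms plus noise
true_coef = np.zeros(n_atoms)
true_coef[rng.choice(n_atoms, size=5, replace=False)] = rng.standard_normal(5)
noisy_spectrum = D @ true_coef + 0.05 * rng.standard_normal(n_features)

# In the actual method the stopping residue error would be set adaptively from
# the estimated cross-correlation; here it is a fixed assumed value.
omp = OrthogonalMatchingPursuit(tol=0.2, fit_intercept=False)
omp.fit(D, noisy_spectrum)
reconstructed = D @ omp.coef_

print("Non-zero atoms used:", np.count_nonzero(omp.coef_))
print("Residual norm:", np.linalg.norm(noisy_spectrum - reconstructed).round(3))
```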
Procedia PDF Downloads 499
2503 Managing Pseudoangiomatous Stromal Hyperplasia Appropriately and Safely: A Retrospective Case Series Review
Authors: C. M. Williams, R. English, P. King, I. M. Brown
Abstract:
Introduction: Pseudoangiomatous Stromal Hyperplasia (PASH) is a benign fibrous proliferation of breast stroma affecting predominantly premenopausal women, with no significant increased risk of breast cancer. Informal recommendations for management have continued to evolve over recent years from surgical excision to observation, although there are no specific national guidelines. This study assesses the safety of a non-surgical approach to PASH management by review of cases at a single centre. Methods: A retrospective case series review (January 2011 - August 2016) was conducted on consecutive PASH cases. Diagnostic classification (clinical, radiological, and histological), management outcomes, and breast cancer incidence were recorded. Results: 43 patients were followed up for a median of 25 months (3-64), with 75% symptomatic at presentation. 12% of cases (n=5) had a radiological score (BIRADS MMG or US) ≥ 4, of which 3 were confirmed malignant. One further malignancy was detected and proven to be radiologically occult and contralateral. No patients were diagnosed with a malignancy during follow-up. Treatment evolved from 67% surgical in 2011 to 33% in 2016. Conclusions: The management of PASH has transitioned in line with other published experience. The preliminary findings suggest this appears safe, with no evidence of missed malignancies; however, longer follow-up is required to confirm long-term safety. Recommendations: PASH with suspicious radiological findings (≥ U4/R4) warrants multidisciplinary discussion for excision. In the absence of histological or radiological suspicion of malignancy, PASH can be safely managed without surgery.
Keywords: benign breast disease, conservative management, malignancy, pseudoangiomatous stromal hyperplasia, surgical excision
Procedia PDF Downloads 132
2502 A Hybrid Multi-Pole Fe₇₈Si₁₃B₉+FeSi₃ Soft Magnetic Core for Application in the Stators of the Low-Power Permanent Magnet Brushless Direct Current Motors
Authors: P. Zackiewicz, M. Hreczka, R. Kolano, A. Kolano-Burian
Abstract:
New types of materials applied as the stators in the Permanent Magnet Brushless Direct Current motors used in heart supporting pumps are presented. The main focus of this work is research on the fabrication of a hybrid nine-pole soft magnetic core consisting of a soft magnetic carrier ring with rectangular notches, made from FeSi₃ strip, and nine soft magnetic poles. This soft magnetic core is made in three stages: (a) preparation of the carrier rings from a soft magnetic material with the lowest possible power losses and suitable stiffness, (b) preparation of trapezoidal soft magnetic poles from Metglas 2605 SA1 type ribbons, and (c) making a durable connection between the poles and the carrier ring, capable of withstanding a four-times greater tearing force than that present during normal operation of the motor pump. All magnetic property measurements were made using a Remacomp C-1200 (Magnet Physik, Germany) and a 450 Gaussmeter (Lake Shore, USA), and the electrical characteristics were measured using a DF1723009TC laboratory generator (NDN, Poland). Specific measurement techniques used to determine the properties of the hybrid cores are presented. The obtained results allow the fabrication technology to be developed, taking into account the intended application of these cores in the stators of the low-power PMBLDC motors used in implanted heart operation supporting pumps. The proposed measurement methodology is appropriate for assessing the quality of the stators.
Keywords: amorphous materials, heart supporting pump, PMBLDC motor, soft magnetic materials
Procedia PDF Downloads 213
2501 Multimedia Data Fusion for Event Detection in Twitter by Using Dempster-Shafer Evidence Theory
Authors: Samar M. Alqhtani, Suhuai Luo, Brian Regan
Abstract:
Data fusion technology can be the best way to extract useful information from multiple sources of data. It has been widely applied in various applications. This paper presents a data fusion approach for multimedia data for event detection in Twitter using Dempster-Shafer evidence theory. The methodology applies a mining algorithm to detect the event. There are two types of data in the fusion. The first is features extracted from text by using the bag-of-words method, which are calculated using the term frequency-inverse document frequency (TF-IDF). The second is the visual features extracted by applying the scale-invariant feature transform (SIFT). The Dempster-Shafer theory of evidence is applied in order to fuse the information from these two sources. Our experiments have indicated that, compared to approaches using an individual data source, the proposed data fusion approach can increase the prediction accuracy for event detection. The experimental results showed that the proposed method achieved a high accuracy of 0.97, compared with 0.93 with text only and 0.86 with images only.
Keywords: data fusion, Dempster-Shafer theory, data mining, event detection
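The fusion step described above relies on Dempster's rule of combination; a minimal Python sketch is given below. The two-element frame of discernment and the mass values attributed to the text (TF-IDF) and image (SIFT) classifiers are illustrative assumptions.

```python
# A minimal sketch of Dempster's rule of combination; the frame {event,
# no_event} and the mass values from the two sources are illustrative.
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions defined on frozensets over the same frame."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

E, N = frozenset({"event"}), frozenset({"no_event"})
THETA = E | N                                     # the whole frame (ignorance)

mass_text = {E: 0.70, N: 0.10, THETA: 0.20}       # from the TF-IDF text features (assumed)
mass_image = {E: 0.60, N: 0.25, THETA: 0.15}      # from the SIFT visual features (assumed)

fused = dempster_combine(mass_text, mass_image)
print({tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```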
Procedia PDF Downloads 410
2500 Enhancing Transfer Path Analysis with In-Situ Component Transfer Path Analysis for Interface Forces Identification
Authors: Raef Cherif, Houssine Bakkali, Wafaa El Khatiri, Yacine Yaddaden
Abstract:
The analysis of how vibrations are transmitted between components is required in many engineering applications. Transfer path analysis (TPA) has been a valuable engineering tool for solving Noise, Vibration, and Harshness (NVH) problems using sub-structuring approaches. The most challenging part of a TPA analysis is estimating the equivalent forces at the contact points between the active and the passive side. The in-situ component TPA method calculates these forces by inverting the frequency response functions (FRFs) measured on the passive subsystem, relating the motion at indicator points to the forces at the interface. However, the matrix inversion can pose problems due to the ill-conditioning of the matrices, leading to inaccurate results. This paper establishes a TPA model for an academic system consisting of two plates linked by four springs. A numerical study has been performed to improve the identification of the interface forces. Several parameters are studied and discussed, such as the singular value rejection and the number and position of the indicator points chosen and used in the matrix inversion.
Keywords: transfer path analysis, matrix inverse method, indicator points, SVD decomposition
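The matrix-inversion step discussed above, including singular value rejection, can be sketched in a few lines of numpy. The FRF matrix, response vector, and rejection threshold below are synthetic assumptions, not the two-plate test case of the paper.

```python
# A minimal numpy sketch of interface-force identification by inverting an FRF
# matrix with singular value rejection; all data and thresholds are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_indicators, n_interface = 12, 4                  # over-determined system (assumed)

# Synthetic FRF matrix at one frequency line (indicator motions per unit force)
Y = (rng.standard_normal((n_indicators, n_interface))
     + 1j * rng.standard_normal((n_indicators, n_interface)))
f_true = np.array([1.0, -0.5, 0.2, 0.8]) + 1j * np.array([0.1, 1.0, -0.4, 0.0])
u = Y @ f_true + 0.01 * (rng.standard_normal(n_indicators)
                         + 1j * rng.standard_normal(n_indicators))

def truncated_pinv(A, rel_threshold=1e-2):
    """Pseudo-inverse with singular values below rel_threshold * s_max rejected."""
    U, s, Vh = np.linalg.svd(A, full_matrices=False)
    keep = s > rel_threshold * s.max()
    s_inv = np.where(keep, 1.0 / s, 0.0)
    return (Vh.conj().T * s_inv) @ U.conj().T

f_est = truncated_pinv(Y) @ u
print("Force estimation error:", np.linalg.norm(f_est - f_true).round(4))
```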
Procedia PDF Downloads 85
2499 A Self Organized Map Method to Classify Auditory-Color Synesthesia from Frontal Lobe Brain Blood Volume
Authors: Takashi Kaburagi, Takamasa Komura, Yosuke Kurihara
Abstract:
Absolute pitch is the ability to identify a musical note without a reference tone. Training for absolute pitch often occurs in preschool education. It is necessary to clarify how well the trainee can make use of synesthesia in order to evaluate the effect of the training. To the best of our knowledge, there are no existing methods for objectively confirming whether the subject is using synesthesia. Therefore, in this study, we present a method to distinguish the use of color-auditory synesthesia from the separate use of color and audition during absolute pitch training. This method measures blood volume in the prefrontal cortex using functional near-infrared spectroscopy (fNIRS) and assumes that the cognitive step has two parts: a non-linear step and a linear step. For the linear step, we assume a second-order ordinary differential equation. For the non-linear part, it is extremely difficult, if not impossible, to create an inverse filter of such a complex system as the brain. Therefore, we apply a method based on a self-organizing map (SOM), guided by the available data. The presented method was tested on 15 subjects, and the estimation accuracy is reported.
Keywords: absolute pitch, functional near-infrared spectroscopy, prefrontal cortex, synesthesia
Procedia PDF Downloads 263
2498 Perceived Social Support, Resilience and Relapse Risk in Recovered Addicts
Authors: Islah Ud Din, Amna Bibi
Abstract:
The current study was carried out to examine perceived social support, resilience, and relapse risk in recovered addicts. A purposive sampling technique was used to collect data from recovered addicts. The Multidimensional Scale of Perceived Social Support was used to measure perceived social support. The Brief Resilience Scale (BRS) was used to assess resilience. The Stimulant Relapse Risk Scale (SRRS) was used to examine relapse risk. Resilience and perceived social support have substantial positive correlations, whereas relapse risk and perceived social support have significant negative associations. Relapse risk and resilience have a strong inverse connection. Regression analysis was used to check the mediating effect of resilience between perceived social support and relapse risk. The findings revealed that perceived social support negatively predicted relapse risk. Results showed that resilience acts as a partial mediator between perceived social support and relapse risk. This research will allow us to explore and understand the relapse risk factor and the role of perceived social support and resilience in recovered addicts. The study's findings have immediate consequences for the prevention of relapse. The study will play a significant part in drug rehabilitation centers, clinical settings, and further research.
Keywords: perceived social support, resilience, relapse risk, recovered addicts, drugs addiction
Procedia PDF Downloads 35
2497 Geoinformation Technology of Agricultural Monitoring Using Multi-Temporal Satellite Imagery
Authors: Olena Kavats, Dmitry Khramov, Kateryna Sergieieva, Vladimir Vasyliev, Iurii Kavats
Abstract:
Geoinformation technologies for space agromonitoring are a means of operational decision-making support in the tasks of managing the agricultural sector of the economy. Existing technologies use satellite images in the optical range of the electromagnetic spectrum. Time series of optical images often contain gaps due to the presence of clouds and haze. A geoinformation technology has been created that allows gaps in time series of optical images (Sentinel-2, Landsat-8, PROBA-V, MODIS) to be filled with radar survey data (Sentinel-1) and uses information about the agrometeorological conditions of the growing season for individual monitoring years. The technology allows crop classification and mapping to be performed for the spring-summer (winter and spring crops) and autumn-winter (winter crops) periods of vegetation, the dynamics of seasonal changes in crop state to be monitored, and crop yield to be forecast. Crop classification is based on supervised classification algorithms and takes into account the peculiarities of crop growth at different vegetation stages (dates of sowing, emergence, active vegetation, and harvesting) and agricultural land state characteristics (row spacing, seedling density, etc.). A catalog of samples of the main agricultural crops (Ukraine) has been created, and crop spectral signatures are calculated with the preliminary removal of row spacing, cloud cover, and cloud shadows in order to construct time series of crop growth characteristics. The obtained data are used in tracking grain crop growth and in the timely detection of deviations of growth trends from reference samples of a given crop for a selected date. Statistical models of crop yield forecasting are created in the form of linear and nonlinear relationships between crop yield indicators and crop state characteristics (temperature, precipitation, vegetation indices, etc.). Predicted values of grain crop yield are estimated with an accuracy of up to 95%. The developed technology was used for monitoring agricultural areas in a number of regions of Great Britain and Ukraine using the EOS Crop Monitoring Platform (https://crop-monitoring.eos.com). The obtained results lead to the conclusion that the joint use of Sentinel-1 and Sentinel-2 images improves the separation of winter crops (rapeseed, wheat, barley) in the early stages of vegetation (October-December). It also allows the soybean, corn, and sunflower sowing areas, which are quite similar in their spectral characteristics, to be separated successfully.
Keywords: geoinformation technology, crop classification, crop yield prediction, agricultural monitoring, EOS Crop Monitoring Platform
Procedia PDF Downloads 456
2496 Modelling of Damage as Hinges in Segmented Tunnels
Authors: Gelacio JuáRez-Luna, Daniel Enrique GonzáLez-RamíRez, Enrique Tenorio-Montero
Abstract:
Frame elements coupled with spring elements are used for modelling the development of hinges in segmented tunnels; the spring elements model the rotational, transversal, and axial failure. These spring elements are equipped with constitutive models to include the moment, shear force, and axial force independently. These constitutive models are formulated based on damage mechanics and experimental tests reported in the literature. The mesh of the segmented tunnels was discretized in the software GID, and the nonlinear analyses were carried out in the finite element software ANSYS. These analyses provide the capacity curve of the primary and secondary lining of a segmented tunnel. Two numerical examples of segmented tunnels show the capability of the spring elements to release energy through the development of hinges. The first example is a segmental concrete lining discretized with frame elements and loaded until hinges occurred in the lining. The second example is a tunnel with primary and secondary lining, discretized with a double-ring frame model. The outer ring simulates the segmental concrete lining, and the inner ring simulates the secondary cast-in-place concrete lining. Spring elements also modelled the joints between the segments in the circumferential direction and the ring joints, which connect parallel adjacent rings. The computed load vs. displacement curves are congruent with numerical and experimental results reported in the literature. It is shown that modelling a tunnel with primary and secondary lining with frame elements and springs provides reasonable results and saves computational cost, compared with 2D or 3D models equipped with smeared crack models.
Keywords: damage, hinges, lining, tunnel
Procedia PDF Downloads 390