Search results for: marking vector
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1169

569 Using Photogrammetry to Survey the Côa Valley Iron Age Rock Art Motifs: Vermelhosa Panel 3 Case Study

Authors: Natália Botica, Luís Luís, Paulo Bernardes

Abstract:

The Côa Valley, listed as a World Heritage site since 1998, contains more than 1,300 open-air engraved rock panels. The Archaeological Park of the Côa Valley recorded the rock art motifs, testing various techniques based on direct tracing processes on the rock, using natural and artificial lighting. In this work, integrated into the "Open Access Rock Art Repository" (RARAA) project, we present the methodology adopted for the vectorial drawing of the rock art motifs based on orthophotos taken from the photogrammetric survey and 3D models of the rocks. We also present the information system designed to integrate the vector drawings and the characterization data of the motifs, as well as the open access sharing, in order to promote their reuse in multiple areas. The 3D models themselves constitute a very detailed record, ensuring the digital preservation of the rock and its iconography. Thus, even if a rock or motif disappears, it can continue to be studied and even recreated.

Keywords: rock art, archaeology, iron age, 3D models

Procedia PDF Downloads 73
568 Chemical and Vibrational Nonequilibrium Hypersonic Viscous Flow around an Axisymmetric Blunt Body

Authors: Rabah Haoui

Abstract:

Hypersonic flows around space vehicles during their reentry phase in planetary atmospheres are characterized by intense aerothermodynamic phenomena. The aim of this work is to analyze high-temperature flows around an axisymmetric blunt body, taking into account chemical and vibrational non-equilibrium for the air mixture species and the no-slip condition at the wall. For this purpose, the Navier-Stokes equation system is solved using a finite volume method to determine the flow parameters around the axisymmetric blunt body, especially at the stagnation point and in the boundary layer along the wall. The code captures the shock wave ahead of a blunt body placed in a hypersonic free stream. The numerical technique uses the flux vector splitting method of Van Leer. The CFL coefficient and mesh size are selected to ensure numerical convergence.
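
To make the flux treatment concrete, the following minimal Python sketch implements Van Leer's flux vector splitting for the 1D Euler equations of a perfect gas (constant γ, no chemistry or vibrational relaxation). It is only a simplified, hypothetical illustration of the upwinding used in this family of finite-volume solvers, not the authors' axisymmetric non-equilibrium code.

```python
import numpy as np

GAMMA = 1.4  # perfect-gas ratio of specific heats (frozen-flow assumption)

def van_leer_split(rho, u, p):
    """Van Leer flux-vector splitting for the 1D Euler equations.

    Returns (F_plus, F_minus) so that the interface flux between cells L and R
    can be formed as F = F_plus(L) + F_minus(R), the upwinding used in
    FVS-based finite-volume solvers.
    """
    a = np.sqrt(GAMMA * p / rho)                 # speed of sound
    M = u / a                                    # Mach number
    E = p / (GAMMA - 1.0) + 0.5 * rho * u**2     # total energy per unit volume
    F = np.array([rho * u,
                  rho * u**2 + p,
                  (E + p) * u])                  # physical flux vector

    if M >= 1.0:                                 # fully supersonic to the right
        return F, np.zeros(3)
    if M <= -1.0:                                # fully supersonic to the left
        return np.zeros(3), F

    # subsonic case: split using Van Leer's polynomials in the Mach number
    f_plus = 0.25 * rho * a * (M + 1.0)**2
    f_minus = -0.25 * rho * a * (M - 1.0)**2
    F_plus = f_plus * np.array([
        1.0,
        ((GAMMA - 1.0) * u + 2.0 * a) / GAMMA,
        ((GAMMA - 1.0) * u + 2.0 * a)**2 / (2.0 * (GAMMA**2 - 1.0)),
    ])
    F_minus = f_minus * np.array([
        1.0,
        ((GAMMA - 1.0) * u - 2.0 * a) / GAMMA,
        ((GAMMA - 1.0) * u - 2.0 * a)**2 / (2.0 * (GAMMA**2 - 1.0)),
    ])
    return F_plus, F_minus
```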

Keywords: hypersonic flow, viscous flow, chemical kinetic, dissociation, finite volumes, frozen and non-equilibrium flow

Procedia PDF Downloads 447
567 Vectorial Capacity and Age Determination of Anopheles Maculipinnis S. L. (Diptera: Culicidae), in Esfahan and Chahar Mahal and Bakhtiari Provinces, Central Iran

Authors: Fariba Sepahvand, Seyed Hassan Moosa-kazemi

Abstract:

The objective was to determine the population dynamics of Anopheles maculipinnis s.l. in relation to probable malaria transmission. The study was carried out in three villages in the Isfahan and Chahar Mahal and Bakhtiari provinces of Iran, from April to March 2014. Mosquitoes were collected by total catch and by human and animal bait collection. An. maculipinnis acts as the dominant vector, with exophagic and endophilic behavior. Ovary dissection revealed four dilatations, indicating that at least 9% of the population reaches the epidemiologically dangerous age for potential malaria transmission. Two peaks of blood feeding were observed, at 9:00-10:00 P.M. and at 12:00-00:01 A.M. The gonotrophic cycle, daily survival rate, and life expectancy of the species were 4 days, 0.82, and 5 days, respectively. The vectorial capacity was measured as 0.028. In conclusion, moderate climatic conditions support the persistence, density and longevity of An. maculipinnis s.l., which could result in more significant malaria transmission.
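
As a reading aid, the classical Macdonald-style expression behind a vectorial capacity estimate can be sketched as follows. Only the 0.82 daily survival and the biting rate implied by the 4-day gonotrophic cycle come from the abstract; the vector density m, incubation period n and competence b below are assumed values for illustration, not the study's field estimates.

```python
import math

def vectorial_capacity(m, a, p, n, b=1.0):
    """Macdonald-style vectorial capacity:
        C = m * a**2 * p**n * b / (-ln p)
    m: vector density per human, a: human-biting rate per vector per day,
    p: daily survival probability, n: extrinsic incubation period (days),
    b: vector competence (fraction of infective bites producing infection).
    """
    return m * a**2 * p**n * b / (-math.log(p))

# Illustration only: a = 1/4 bites per day from the 4-day gonotrophic cycle,
# p = 0.82 daily survival; m, n, b are assumptions.
C = vectorial_capacity(m=1.0, a=1.0 / 4.0, p=0.82, n=12, b=1.0)
print(f"vectorial capacity ≈ {C:.3f}")   # ≈ 0.029 with these assumed values
```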

Keywords: age determination, Anopheles maculipinnis, center of Iran, Malaria

Procedia PDF Downloads 229
566 A Nonlinear Feature Selection Method for Hyperspectral Image Classification

Authors: Pei-Jyun Hsieh, Cheng-Hsuan Li, Bor-Chen Kuo

Abstract:

For hyperspectral image classification, feature reduction is an important pre-processing step for avoiding the Hughes phenomenon, given the difficulty of collecting training samples. Hence, many studies have developed feature selection methods, such as the F-score and the Hilbert-Schmidt Independence Criterion (HSIC), to improve hyperspectral image classification. However, most of them only consider the class separability in the original space, i.e., a linear class separability. In this study, we propose a nonlinear class separability measure based on the kernel trick for selecting an appropriate feature subset. The proposed nonlinear class separability is formed by a generalized RBF kernel with a different bandwidth for each feature, and it considers both the within-class separability and the between-class separability. A genetic algorithm is applied to tune these bandwidths such that the within-class separability is minimized and the between-class separability is maximized simultaneously. This indicates that the corresponding feature space is more suitable for classification and that the corresponding nonlinear classification boundary can separate classes very well. These optimal bandwidths also show the importance of bands for hyperspectral image classification: the reciprocals of the bandwidths can be viewed as band weights. The smaller the bandwidth, the larger the weight of the band and the more important it is for classification. Hence, sorting the reciprocals of the bandwidths in descending order gives an order for selecting appropriate feature subsets. In the experiments, three hyperspectral image data sets, the Indian Pine Site data set, the PAVIA data set, and the Salinas A data set, were used to demonstrate that the feature subsets selected by the proposed nonlinear feature selection method are more appropriate for hyperspectral image classification. Only ten percent of the samples were randomly selected to form the training data set; all non-background samples were used to form the testing data set. A support vector machine was applied to classify the testing samples based on the selected feature subsets. According to the experiments on the Indian Pine Site data set with 220 bands, the highest accuracies obtained by applying the proposed method, F-score, and HSIC are 0.8795, 0.8795, and 0.87404, respectively. However, the proposed method selects 158 features, whereas F-score and HSIC select 168 features and 217 features, respectively. Moreover, the classification accuracies increase dramatically using only the first few features: the classification accuracies for feature subsets of 10, 20, 50, and 110 features are 0.69587, 0.7348, 0.79217, and 0.84164, respectively. Furthermore, using only half of the features selected by the proposed method (110 features), the corresponding classification accuracy (0.84168) approximates the highest classification accuracy, 0.8795. For the other two hyperspectral image data sets, the PAVIA data set and the Salinas A data set, similar results are obtained. These results illustrate that the proposed method can efficiently find feature subsets to improve hyperspectral image classification. One can apply the proposed method to determine a suitable feature subset first, according to specific purposes; researchers can then use only the corresponding sensors to obtain the hyperspectral image and classify the samples. This can not only improve the classification performance but also reduce the cost of obtaining hyperspectral images.
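
A minimal Python sketch of the per-feature-bandwidth kernel and the band-ranking idea is shown below. The genetic-algorithm tuning and the within/between-class separability criterion are omitted, the data are random toy values (not a real hyperspectral scene), and the bandwidths are hypothetical.

```python
import numpy as np

def generalized_rbf(X, Y, bandwidths):
    """Generalized RBF kernel with a separate bandwidth per feature:
        k(x, y) = exp(-sum_d (x_d - y_d)^2 / (2 * sigma_d^2))
    A small sigma_d means feature d strongly influences the kernel value,
    so 1/sigma_d can be read as the weight of band d.
    """
    sig2 = 2.0 * np.asarray(bandwidths) ** 2
    # per-feature squared differences, scaled by the per-feature bandwidth
    diff = (X[:, None, :] - Y[None, :, :]) ** 2 / sig2
    return np.exp(-diff.sum(axis=-1))

def rank_features(bandwidths):
    """Band indices sorted by decreasing 1/sigma_d (most important first)."""
    weights = 1.0 / np.asarray(bandwidths)
    return np.argsort(-weights)

# toy illustration with 4 hypothetical bands
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 4))
sigmas = np.array([0.3, 2.0, 0.8, 5.0])    # e.g. tuned by a genetic algorithm
K = generalized_rbf(X, X, sigmas)
print(rank_features(sigmas))                # -> [0 2 1 3]
```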

Keywords: hyperspectral image classification, nonlinear feature selection, kernel trick, support vector machine

Procedia PDF Downloads 254
565 Investigation of New Gait Representations for Improving Gait Recognition

Authors: Chirawat Wattanapanich, Hong Wei

Abstract:

This study presents new gait representations for improving gait recognition accuracy across gait appearances, such as normal walking, wearing a coat and carrying a bag. Based on the Gait Energy Image (GEI), two ideas are implemented to generate new gait representations: one is to append lower-knee regions to the original GEI, and the other is to apply convolutional operations to the GEI and its variants. A set of new gait representations is created and used for training multi-class Support Vector Machines (SVMs). Tests are conducted on CASIA Dataset B. Various combinations of the gait representations with different convolutional kernel sizes and different numbers of kernels used in the convolutional processes are examined. Both the entire images as features and features of reduced dimensionality obtained by Principal Component Analysis (PCA) are tested in gait recognition. Interestingly, both new techniques, appending the lower-knee regions to the original GEI and the convolutional GEI, contribute significantly to the performance improvement in gait recognition. The experimental results show that the average recognition rate can be improved from 75.65% to 87.50%.
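
The "convolutional GEI plus multi-class SVM" idea can be sketched as follows. The GEIs here are random toy arrays rather than CASIA-B silhouettes, the two kernels are hypothetical examples, and the lower-knee appending and PCA steps are omitted.

```python
import numpy as np
from scipy.ndimage import convolve
from sklearn.svm import SVC

def convolutional_gei(gei, kernels):
    """Apply a bank of convolution kernels to a Gait Energy Image (GEI)
    and concatenate the filtered images into one feature vector."""
    return np.concatenate([convolve(gei, k, mode="nearest").ravel()
                           for k in kernels])

# toy data: 20 hypothetical 64x44 GEIs for 4 subjects
rng = np.random.default_rng(1)
geis = rng.random((20, 64, 44))
labels = np.repeat(np.arange(4), 5)

kernels = [np.ones((3, 3)) / 9.0,                            # smoothing kernel
           np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]])]   # vertical-edge kernel

X = np.array([convolutional_gei(g, kernels) for g in geis])
clf = SVC(kernel="linear", decision_function_shape="ovr").fit(X, labels)
print(clf.predict(X[:3]))
```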

Keywords: convolutional image, lower knee, gait

Procedia PDF Downloads 193
564 The Markers -mm and dämmo in Amharic: Developmental Approach

Authors: Hayat Omar

Abstract:

Languages provide speakers with a wide range of linguistic units to organize and deliver information. There are several ways to verbally express the mental representations of events. According to the linguistic tools they have acquired, speakers select the one that brings out the most communicative effect to convey their message. Our study focuses on two markers, -mm and dämmo, in Amharic (Ethiopian Semitic language). Our aim is to examine, from a developmental perspective, how they are used by speakers. We seek to distinguish the communicative and pragmatic functions indicated by means of these markers. To do so, we created a corpus of sixty narrative productions of children from 5-6, 7-8 to 10-12 years old and adult Amharic speakers. The experimental material we used to collect our data is a series of pictures without text 'Frog, Where are you?'. Although -mm and dämmo are each used in specific contexts, they are sometimes analyzed as being interchangeable. The suffix -mm is complex and multifunctional. It marks the end of the negative verbal structure, it is found in the relative structure of the imperfect, it creates new words such as adverbials or pronouns, it also serves to coordinate words, sentences and to mark the link between macro-propositions within a larger textual unit. -mm was analyzed as marker of insistence, topic shift marker, element of concatenation, contrastive focus marker, 'bisyndetic' coordinator. On the other hand, dämmo has limited function and did not attract the attention of many authors. The only approach we could find analyzes it in terms of 'monosyndetic' coordinator. The paralleling of these two elements made it possible to understand their distinctive functions and refine their description. When it comes to marking a referent, the choice of -mm or dämmo is not neutral, depending on whether the tagged argument is newly introduced, maintained, promoted or reintroduced. The presence of these morphemes explains the inter-phrastic link. The information is seized by anaphora or presupposition: -mm goes upstream while dämmo arrows downstream, the latter requires new information. The speaker uses -mm or dämmo according to what he assumes to be known to his interlocutors. The results show that -mm and dämmo, although all the speakers use them both, do not always have the same scope according to the speaker and vary according to the age. dämmo is mainly used to mark a contrastive topic to signal the concomitance of events. It is more commonly used in young children’s narratives (F(3,56) = 3,82, p < .01). Some values of -mm (additive) are acquired very early while others are rather late and increase with age (F(3,56) = 3,2, p < .03). The difficulty is due not only because of its synthetic structure but primarily because it is multi-purpose and requires a memory work. It highlights the constituent on which it operates to clarify how the message should be interpreted.

Keywords: acquisition, cohesion, connection, contrastive topic, contrastive focus, discourse marker, pragmatics

Procedia PDF Downloads 127
563 Effects of Financial Development on Economic Growth in South Asia

Authors: Anupam Das

Abstract:

Although financial liberalization has been one of the most important policy prescriptions of international organizations like the World Bank and the IMF, the effect of financial liberalization on economic growth in developing countries is far from unanimous. Since the 1980s, South Asian countries have made significant progress in liberalizing the financial sector. However, due to the unavailability of a sufficient number of time series observations, the relationship between economic growth and financial development has not been investigated adequately. We aim to fill this gap by examining time series data for five developing countries from the South Asian region: Bangladesh, India, Pakistan, Sri Lanka, and Nepal. Applying cointegration tests and Granger causality within a vector error correction model (VECM), we do not find unanimous evidence that financial development has a positive effect on economic growth. These results are helpful for developing countries that have been trying to liberalize the financial sector in recent decades.
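
A minimal sketch of the VECM/Granger-causality workflow, using statsmodels on placeholder data, is shown below. The variable names, lag order, cointegration rank and deterministic terms are assumptions for illustration; the study's actual specification and data may differ.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.vector_ar.vecm import VECM

# Hypothetical annual series (placeholders, not the actual South Asian data):
# growth = real GDP growth proxy, findev = a financial development indicator.
rng = np.random.default_rng(42)
df = pd.DataFrame({
    "growth": np.cumsum(rng.normal(size=40)),
    "findev": np.cumsum(rng.normal(size=40)),
})

# VECM with one cointegrating relation and one lagged difference
model = VECM(df, k_ar_diff=1, coint_rank=1, deterministic="ci")
res = model.fit()

# Granger-causality test within the VECM: does findev Granger-cause growth?
gc = res.test_granger_causality(caused="growth", causing="findev")
print(gc.summary())
```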

Keywords: economic growth, financial development, Granger causality, South Asia

Procedia PDF Downloads 363
562 Neutral Heavy Scalar Searches via Standard Model Gauge Boson Decays at the Large Hadron Electron Collider with Multivariate Techniques

Authors: Luigi Delle Rose, Oliver Fischer, Ahmed Hammad

Abstract:

In this article, we study the prospects of the proposed Large Hadron electron Collider (LHeC) in the search for heavy neutral scalar particles. We consider a minimal model with one additional complex scalar singlet that interacts with the Standard Model (SM) via mixing with the Higgs doublet, giving rise to an SM-like Higgs boson and a heavy scalar particle. Both scalar particles are produced via vector boson fusion and can be tested via their decays into pairs of SM particles, analogously to the SM Higgs boson. Using multivariate techniques, we show that the LHeC is sensitive to heavy scalars with masses between 200 and 800 GeV down to scalar mixing of order 0.01.

Keywords: beyond the standard model, large hadron electron collider, multivariate analysis, scalar singlet

Procedia PDF Downloads 126
561 A GIS-Based Study on Geographical Divisions of Sustainable Human Settlements in China

Authors: Wu Yiqun, Weng Jiantao

Abstract:

The human settlements of China were extracted from the land-use vector map by interpreting the 2014 Thematic Map. This paper establishes a sustainable human settlements geographical division evaluation system and division model using GIS. The results show that the density of human residential areas in China is uneven, and the density of sustainable settlement areas in the west is lower than that in the east. The regional differences in sustainable human settlements are obvious: the north is larger than the south, the plain regions are larger than the hilly regions, and the economically developed regions are larger than the less developed regions. The geographical distribution of the sustainable human settlements is measured by the degree of porosity, which correlates with sustainable human settlement density: where the density is high, the porosity is low, the distribution is even, and the gaps between the settlements are small.

Keywords: GIS, geographical division, sustainable human settlements, China

Procedia PDF Downloads 579
560 Morphological Features Fusion for Identifying INBREAST-Database Masses Using Neural Networks and Support Vector Machines

Authors: Nadia el Atlas, Mohammed el Aroussi, Mohammed Wahbi

Abstract:

In this paper, a novel technique for mass characterization based on robust feature fusion is presented. The proposed method consists of three main stages: (a) the first phase involves segmenting the masses using edge information; (b) the second phase calculates and fuses the most relevant morphological features; (c) the last phase is the classification step, which allows us to classify the images into benign and malignant masses. In this step we implemented Support Vector Machines (SVM) and Artificial Neural Networks (ANN), which were evaluated with the following performance criteria: confusion matrix, accuracy, sensitivity, specificity, receiver operating characteristic (ROC) curve, and error histogram. The effectiveness of this new approach was evaluated on the recently developed INbreast database. The fusion of the most appropriate morphological features provided very good results: the SVM gives an accuracy of 64.3%, whereas the ANN classifier gives better results, with an accuracy of 97.5%.

Keywords: breast cancer, mammography, CAD system, features, fusion

Procedia PDF Downloads 583
559 Multimodal Biometric Cryptography Based Authentication in Cloud Environment to Enhance Information Security

Authors: D. Pugazhenthi, B. Sree Vidya

Abstract:

Cloud computing is one of the emerging technologies that enables end users to use cloud services on a 'pay per usage' basis. This technology is growing at a fast pace, and so are its security threats. Among the various services provided by the cloud is storage, in which security is a vital factor both for authenticating legitimate users and for protecting information. This paper brings in efficient ways of authenticating users as well as securing information on the cloud. The initial phase proposed in this paper deals with an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. Unique identification and low intrusiveness give user-behaviour-based biometrics an advanced reliability over conventional password authentication. With biometric systems, accounts are accessed only by legitimate users and not by impostors. The biometric templates employed here do not rely on a single trait but on multiple traits, viz., iris and fingerprints. The coordinating stage of the authentication system is based on an ensemble Support Vector Machine (SVM), with optimization performed by assembling the weights of the base SVMs for the ensemble after each individual SVM is trained by the Artificial Fish Swarm Algorithm (AFSA). This helps in generating a user-specific secure cryptographic key from the multimodal biometric template through a fusion process. The data security problem is averted, and an enhanced security architecture is proposed using an encryption and decryption system with double-key cryptography based on a Fuzzy Neural Network (FNN) for data storage and retrieval in cloud computing. The proposed scheme aims to protect records from hackers by preventing the ciphertext from being broken back into the original text. The proposed double cryptographic key scheme is thus capable of providing better user authentication and better security, distinguishing between genuine and fake users. There are three important modules in this work: 1) feature extraction, 2) multimodal biometric template generation, and 3) cryptographic key generation. The extraction of the feature and texture properties from the respective fingerprint and iris images is done initially. Finally, with the help of the fuzzy neural network and a symmetric cryptography algorithm, the double-key encryption technique has been developed. As the proposed approach is based on neural networks, it has the advantage that the data cannot be decrypted by a hacker even if they have already been stolen. The results show that the authentication process is effective and the stored information is secured.

Keywords: artificial fish swarm algorithm (AFSA), biometric authentication, decryption, encryption, fingerprint, fusion, fuzzy neural network (FNN), iris, multi-modal, support vector machine classification

Procedia PDF Downloads 245
558 Energy Efficient Routing Protocol with Ad Hoc On-Demand Distance Vector for MANET

Authors: K. Thamizhmaran, Akshaya Devi Arivazhagan, M. Anitha

Abstract:

One of the most important issues to be solved when implementing a data transmission algorithm in Mobile Ad hoc Networks (MANETs) is how to save the energy of mobile nodes while meeting the requirements of applications or users, since the mobile nodes are battery limited. While satisfying the energy-saving requirement, it is also necessary to achieve quality of service; in emergency scenarios, data must be delivered on time. To meet these requirements, we implement an Energy-Aware routing protocol for MANETs that saves energy at every node by efficiently selecting an energy-efficient path in the routing process, by means of an Enhanced AODV (EAODV) routing protocol.
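
One plausible energy-aware route-selection rule is sketched below: among the routes returned by the AODV route-discovery flood, pick the one whose weakest node has the most residual energy (a max-min criterion). This is an assumed metric for illustration only; it is not necessarily the exact cost function of the proposed EAODV protocol.

```python
def residual_energy_of_path(path, node_energy):
    """Bottleneck (minimum residual) energy along a candidate route."""
    return min(node_energy[n] for n in path)

def select_energy_efficient_route(candidate_routes, node_energy):
    """Among routes discovered by the AODV route-request flood, prefer the
    route whose weakest node has the most remaining battery energy; break
    ties with the shorter hop count."""
    return max(candidate_routes,
               key=lambda p: (residual_energy_of_path(p, node_energy), -len(p)))

# toy example with hypothetical node energies (in joules)
energy = {"S": 90, "A": 40, "B": 75, "C": 60, "D": 85, "T": 95}
routes = [["S", "A", "T"], ["S", "B", "C", "T"], ["S", "D", "T"]]
print(select_energy_efficient_route(routes, energy))   # -> ['S', 'D', 'T']
```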

Keywords: Ad-Hoc networks, MANET, routing, AODV, EAODV

Procedia PDF Downloads 355
557 Vibration Propagation in Body-in-White Structures Through Structural Intensity Analysis

Authors: Jamal Takhchi

Abstract:

The understanding of vibration propagation in complex structures such as an automotive body in white remains a challenging issue in car design with regard to NVH performance. Current analysis is limited to the low-frequency range, where modal concepts are dominant. Higher frequencies, between 200 and 1000 Hz, will become critical with the rise of electrification: the annoying sounds of EVs are mostly whines created by either gears or e-motors between 300 Hz and 2 kHz. Structural intensity analysis was experimented with a few years ago on finite element models. The application was promising but limited by the fact that the propagating 3D intensity vector field is masked by a rotational intensity field. This rotational field should be filtered out using a differential operator, but the expression of this operator in the framework of finite element modeling is not yet known. The aim of the proposed work is to implement this operator in the current dynamic solver (NASTRAN) used at Stellantis and to develop the expected methodology for the mid-frequency structural analysis of electrified vehicles.

Keywords: structural intensity, NVH, body in white, irrotational intensity

Procedia PDF Downloads 142
556 Prediction of Dubai Financial Market Stocks Movement Using K-Nearest Neighbor and Support Vector Regression

Authors: Abdulla D. Alblooshi

Abstract:

The stock market is a representation of human behavior and psychology, such as fear, greed, and discipline, which are manifested in the form of price movements during the trading sessions. Therefore, predicting stock movements and prices is a challenging effort. However, those trading sessions produce a large amount of data that can be utilized to train an AI agent for the purpose of predicting the stock movement, and predicting the stock market price action would be advantageous. In this paper, the movement data of three DFM-listed stocks are studied using historical price movements and technical indicator values and used to train an agent with KNN and SVM methods to predict future price movements. A MATLAB toolbox and a simple script are used to process and classify the information and output the prediction. The different learning methods and parameters are also compared using metrics like RMSE, MAE, and R².
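
A minimal sketch of the workflow (here in Python with scikit-learn rather than the MATLAB toolbox mentioned above) is shown below: indicator-style features feed a KNN regressor and an SVR, and the predictions are scored with RMSE, MAE and R². The feature matrix and target are synthetic toy data, not DFM prices.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, r2_score

# Hypothetical feature matrix: each row = [RSI, MACD, 10-day MA, 50-day MA]
# for one trading day; target = next-day closing price (toy data).
rng = np.random.default_rng(7)
X = rng.normal(size=(300, 4))
y = X @ np.array([0.5, 1.2, -0.3, 0.8]) + rng.normal(scale=0.1, size=300)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, shuffle=False)

for name, model in [("KNN", KNeighborsRegressor(n_neighbors=5)),
                    ("SVR", SVR(kernel="rbf", C=10.0))]:
    pred = model.fit(X_tr, y_tr).predict(X_te)
    rmse = mean_squared_error(y_te, pred) ** 0.5
    print(name,
          f"RMSE={rmse:.3f}",
          f"MAE={mean_absolute_error(y_te, pred):.3f}",
          f"R2={r2_score(y_te, pred):.3f}")
```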

Keywords: KNN, ANN, style, SVM, stocks, technical indicators, RSI, MACD, moving averages, RMSE, MAE

Procedia PDF Downloads 158
555 Artificial Neural Networks Based Calibration Approach for Six-Port Receiver

Authors: Nadia Chagtmi, Nejla Rejab, Noureddine Boulejfen

Abstract:

This paper presents a calibration approach based on artificial neural networks (ANN) to determine the envelope signal (I+jQ) of a six-port based receiver (SPR). The memory effects (also called dynamic behavior) and the nonlinearity introduced by the diode-based power detectors are taken into consideration by the ANN. An experimental set-up was built to validate the efficiency of this method, and the efficiency of the approach has been confirmed by the obtained results in terms of waveforms. Moreover, the error vector magnitude (EVM) and the mean absolute error (MAE) have been calculated in order to test the ANN's performance in achieving I/Q recovery from the output voltages detected by the power detectors. The baseband signal has been recovered using the ANN with EVMs no higher than 1% and an MAE no higher than 17.26 for the SPR excited by different types of signals, such as QAM (quadrature amplitude modulation) and LTE (Long Term Evolution).
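
For reference, one common definition of the RMS error vector magnitude used for this kind of I/Q-recovery assessment can be computed as below. The complex samples here are a synthetic QAM-like toy signal, not the measured SPR data, and other EVM normalizations (e.g. peak constellation power) exist.

```python
import numpy as np

def evm_percent(measured_iq, reference_iq):
    """RMS error vector magnitude as a percentage of the reference RMS power:
        EVM = sqrt( mean|measured - reference|^2 / mean|reference|^2 ) * 100
    Both inputs are complex I+jQ sample vectors."""
    err = np.abs(measured_iq - reference_iq) ** 2
    ref = np.abs(reference_iq) ** 2
    return 100.0 * np.sqrt(err.mean() / ref.mean())

# toy QAM-like illustration (hypothetical samples)
rng = np.random.default_rng(3)
ref = (rng.choice([-1, 1], 1000) + 1j * rng.choice([-1, 1], 1000)) / np.sqrt(2)
meas = ref + (rng.normal(scale=0.005, size=1000)
              + 1j * rng.normal(scale=0.005, size=1000))
print(f"EVM = {evm_percent(meas, ref):.2f} %")
```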

Keywords: six-port based receiver, calibration, nonlinearity, memory effect, artificial neural network

Procedia PDF Downloads 58
554 Recognition of Grocery Products in Images Captured by Cellular Phones

Authors: Farshideh Einsele, Hassan Foroosh

Abstract:

In this paper, we present a robust algorithm to recognize text extracted from grocery product images captured by mobile phone cameras. Recognition of such text is challenging since text in grocery product images varies in size, orientation, style, and illumination, and can suffer from perspective distortion. Pre-processing is performed to make the characters scale and rotation invariant. Since text degradations cannot be appropriately described using well-known geometric transformations such as translation, rotation, affine transformation and shearing, we use all of the character's black pixels as our feature vector. Classification is performed with a minimum distance classifier using the maximum likelihood criterion, which delivers a very promising Character Recognition Rate (CRR) of 89%. We achieve a considerably higher Word Recognition Rate (WRR) of 99% when using lower-level linguistic knowledge about product words during the recognition process.
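
The core classification rule can be illustrated with a minimal sketch: each class is represented by the mean of its black-pixel vectors, and a test glyph is assigned to the nearest prototype, which coincides with the maximum-likelihood choice under equal, spherical Gaussian class models. The training glyphs below are random toy vectors, not the paper's character data.

```python
import numpy as np

def train_prototypes(X, y):
    """Mean black-pixel vector (class prototype) for every character class."""
    return {label: X[y == label].mean(axis=0) for label in np.unique(y)}

def min_distance_classify(x, prototypes):
    """Assign the class whose prototype is closest in Euclidean distance,
    i.e. the maximum-likelihood choice under equal-covariance Gaussians."""
    return min(prototypes, key=lambda c: np.linalg.norm(x - prototypes[c]))

# toy illustration: 12x12 binarised character images flattened to vectors
rng = np.random.default_rng(5)
X = (rng.random((60, 144)) > 0.5).astype(float)   # hypothetical training glyphs
y = np.repeat(np.arange(6), 10)                   # six character classes
protos = train_prototypes(X, y)
print(min_distance_classify(X[0], protos))
```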

Keywords: camera-based OCR, feature extraction, document, image processing, grocery products

Procedia PDF Downloads 397
553 Stream Extraction from 1m-DTM Using ArcGIS

Authors: Jerald Ruta, Ricardo Villar, Jojemar Bantugan, Nycel Barbadillo, Jigg Pelayo

Abstract:

Streams are important in providing water supply for industrial, agricultural and human consumption; in short, where there are streams, there is life. Identifying streams is essential since many developed cities are situated in the vicinity of these bodies of water, and in flood management they serve as basins for surface runoff within the area. This study aims to process and generate stream features from a high-resolution digital terrain model (DTM) with 1-meter resolution using the Hydrology tools of ArcGIS. The raster was filled, flow direction and flow accumulation were processed, the raster calculator was used to derive stream order, the result was converted to vector, and undesirable features were cleaned using ancillary data or Google Earth. In field validation, streams were classified as perennial, intermittent or ephemeral. Results show that more than 90% of the extracted features were found to be accurate through field validation.

Keywords: digital terrain models, hydrology tools, strahler method, stream classification

Procedia PDF Downloads 259
552 The Contribution of Edgeworth, Bootstrap and Monte Carlo Methods in Financial Data

Authors: Edlira Donefski, Tina Donefski, Lorenc Ekonomi

Abstract:

Edgeworth approximation, bootstrap, and Monte Carlo simulation have a considerable impact on the results obtained for various problems under study. In our paper, we treat a financial case related to the effect that the cash-flow components of one of the most successful businesses in the world (financing activity, operating activity, and investing activity) have on the cash and cash equivalents at the end of the three-month period. To get a better view of this case, we created a vector autoregression model and then generated the impulse responses in terms of asymptotic analysis (Edgeworth approximation), Monte Carlo simulation, and residual bootstrap, based on the standard errors of every series created. The generated results show common tendencies for the three methods applied, which consequently verifies the advantage of the three methods in the optimization of a model that contains many variables.

Keywords: autoregression, bootstrap, edgeworth expansion, Monte Carlo method

Procedia PDF Downloads 139
551 Neural Nets Based Approach for 2-Cells Power Converter Control

Authors: Kamel Laidi, Khelifa Benmansour, Ouahid Bouchhida

Abstract:

A neural-network-based approach for a two-cell series converter has been developed and implemented. The approach is based on a behavioural description of the different operating modes of the converter. Each operating mode represents a well-defined configuration, to which is matched an operating zone satisfying given invariance conditions that depend on the capacitor voltages and the load current of the converter. A control vector, whose components are the control signals to be applied to the converter switches, is associated with each mode. Therefore, the problem is reduced to a classification task over the different operating modes of the converter. The artificial-neural-network-based approach, which constitutes a powerful tool for this kind of task, has been adopted and implemented. The application to a two-cell chopper ensures efficient and robust control of the load current and good balancing of the capacitor voltages.
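
A minimal sketch of the mode-classification idea is given below: a small neural network maps the measured state (capacitor voltages and load current) to an operating-mode label, which in turn indexes a table of switch commands. The training labels and the mode-to-control mapping here are placeholders; in the actual approach they would come from the behavioural description and invariance zones of the converter.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical training set: each sample is [vc1, vc2, i_load] measured on a
# two-cell converter, labelled with the operating mode (0..3) to be applied.
rng = np.random.default_rng(11)
X = rng.uniform(low=[0.0, 0.0, -10.0], high=[400.0, 400.0, 10.0], size=(500, 3))
y = rng.integers(0, 4, size=500)          # placeholder mode labels

CONTROL_VECTORS = {0: (0, 0), 1: (0, 1), 2: (1, 0), 3: (1, 1)}  # (u1, u2)

net = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X, y)

state = np.array([[310.0, 295.0, 4.2]])    # capacitor voltages and load current
mode = int(net.predict(state)[0])
print("mode:", mode, "switch commands:", CONTROL_VECTORS[mode])
```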

Keywords: neural nets, control, multicellular converters, 2-cells chopper

Procedia PDF Downloads 818
550 Agriculture and Global Economy vis-à-vis the Climate Change

Authors: Assaad Ghazouani, Ati Abdessatar

Abstract:

Agriculture maintains a social and economic importance in national economies around the world. Its importance is distinguished by its ripple effects not only downstream but also upstream vis-à-vis the non-agricultural sector. However, the situation is relatively fragile because of weather conditions. In this work, we propose a model to highlight the impacts of climate change (CC) on economic growth in a world where agriculture is considered a strategic sector. CC is assumed to affect economic growth directly and indirectly by reducing the performance of the agricultural sector. The model is tested for Tunisia. The results validate the hypothesis that the potential economic damage of CC is important. Indeed, an increase in CO2 concentration (rising temperatures and disruption of rainfall patterns) will have an impact on global economic growth, particularly by reducing the performance of the agricultural sector. Analysis from a vector error correction model also highlights the magnitude of the climate impact on the performance of the agricultural sector and its repercussions on economic growth.

Keywords: climate change, agriculture, economic growth, world, VECM, cointegration

Procedia PDF Downloads 608
549 Usage of Military Spending, Debt Servicing and Growth for Dealing with Emergency Plan of Indian External Debt

Authors: Sahbi Farhani

Abstract:

This study investigates the relationship between external debt and military spending in the case of India over the period 1970–2012. In doing so, we apply structural-break unit root tests to examine the stationarity properties of the variables. The Auto-Regressive Distributed Lag (ARDL) bounds testing approach is used to test whether cointegration exists in the presence of structural breaks in the series. Our results indicate cointegration among external debt, military spending, debt servicing, and economic growth. Moreover, military spending and debt servicing add to external debt, while economic growth helps in lowering external debt. The Vector Error Correction Model (VECM) analysis and Granger causality test reveal that military spending and economic growth cause external debt. A feedback effect also exists between external debt and debt servicing in the case of India.

Keywords: external debt, military spending, ARDL approach, India

Procedia PDF Downloads 281
548 Development of a Decision-Making Method by Using Machine Learning Algorithms in the Early Stage of School Building Design

Authors: Pegah Eshraghi, Zahra Sadat Zomorodian, Mohammad Tahsildoost

Abstract:

Over the past decade, energy consumption in educational buildings has steadily increased. The purpose of this research is to provide a method to quickly predict the energy consumption of buildings at the early design stage by evaluating zones separately and decomposing the building to eliminate the complexity of geometry. To produce this framework, machine learning algorithms such as Support Vector Regression (SVR) and Artificial Neural Networks (ANN) are used to predict energy consumption and thermal comfort metrics in a school as a case study. The database consists of more than 55,000 samples in three climates of Iran. Cross-validation and unseen data have been used for validation. For a specific label, cooling energy, the prediction accuracy is at least 84% and 89% for SVR and ANN, respectively. The results show that the SVR performed much better than the ANN.
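
A minimal sketch of the SVR-versus-ANN comparison with cross-validation is shown below. The zone-level features and the cooling-energy target are synthetic placeholders (the real database of 55,000 samples is not reproduced), and the hyperparameters are assumptions.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Hypothetical zone-level samples: [window-to-wall ratio, orientation, U-value,
# occupancy density, climate index]; target = annual cooling energy (toy data).
rng = np.random.default_rng(2)
X = rng.random((1000, 5))
y = 50 + 120 * X[:, 0] + 30 * X[:, 2] + rng.normal(scale=5, size=1000)

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(32, 32),
                                      max_iter=3000, random_state=0)),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"{name}: mean CV R2 = {scores.mean():.3f}")
```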

Keywords: early stage of design, energy, thermal comfort, validation, machine learning

Procedia PDF Downloads 72
547 Cooperative Spectrum Sensing Using Hybrid IWO/PSO Algorithm in Cognitive Radio Networks

Authors: Deepa Das, Susmita Das

Abstract:

Cognitive Radio (CR) is an emerging technology to combat the spectrum scarcity issues. This is achieved by consistently sensing the spectrum, and detecting the under-utilized frequency bands without causing undue interference to the primary user (PU). In soft decision fusion (SDF) based cooperative spectrum sensing, various evolutionary algorithms have been discussed, which optimize the weight coefficient vector for maximizing the detection performance. In this paper, we propose the hybrid invasive weed optimization and particle swarm optimization (IWO/PSO) algorithm as a fast and global optimization method, which improves the detection probability with a lesser sensing time. Then, the efficiency of this algorithm is compared with the standard invasive weed optimization (IWO), particle swarm optimization (PSO), genetic algorithm (GA) and other conventional SDF based methods on the basis of convergence and detection probability.

Keywords: cognitive radio, spectrum sensing, soft decision fusion, GA, PSO, IWO, hybrid IWO/PSO

Procedia PDF Downloads 453
546 A Data-Driven Optimal Control Model for the Dynamics of Monkeypox in a Variable Population with a Comprehensive Cost-Effectiveness Analysis

Authors: Martins Onyekwelu Onuorah, Jnr Dahiru Usman

Abstract:

Introduction: In the realm of public health, the threat posed by Monkeypox continues to elicit concern, prompting rigorous studies to understand its dynamics and devise effective containment strategies. Particularly significant is its recurrence in variable populations, such as the observed outbreak in Nigeria in 2022. In light of this, our study undertakes a meticulous analysis, employing a data-driven approach to explore, validate, and propose optimized intervention strategies tailored to the distinct dynamics of Monkeypox within varying demographic structures. Utilizing a deterministic mathematical model, we delved into the intricate dynamics of Monkeypox, with a particular focus on a variable population context. Our qualitative analysis provided insights into the disease-free equilibrium, revealing its stability when R0 is less than one and discounting the possibility of backward bifurcation, as substantiated by the presence of a single stable endemic equilibrium. The model was rigorously validated using real-time data from the Nigerian 2022 recorded cases for Epi weeks 1 – 52. Transitioning from qualitative to quantitative, we augmented our deterministic model with optimal control, introducing three time-dependent interventions to scrutinize their efficacy and influence on the epidemic's trajectory. Numerical simulations unveiled a pronounced impact of the interventions, offering a data-supported blueprint for informed decision-making in containing the disease. A comprehensive cost-effectiveness analysis employing the Infection Averted Ratio (IAR), Average Cost-Effectiveness Ratio (ACER), and Incremental Cost-Effectiveness Ratio (ICER) facilitated a balanced evaluation of the interventions’ economic and health impacts. In essence, our study epitomizes a holistic approach to understanding and mitigating Monkeypox, intertwining rigorous mathematical modeling, empirical validation, and economic evaluation. The insights derived not only bolster our comprehension of Monkeypox's intricate dynamics but also unveil optimized, cost-effective interventions. This integration of methodologies and findings underscores a pivotal stride towards aligning public health imperatives with economic sustainability, marking a significant contribution to global efforts in combating infectious diseases.
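
For readers unfamiliar with the three cost-effectiveness measures named above, the standard definitions can be written as small functions; the numbers below are purely illustrative and are not the study's fitted values.

```python
def iar(infections_averted, infections_recovered):
    """Infection Averted Ratio: infections averted per recovered individual."""
    return infections_averted / infections_recovered

def acer(total_cost, infections_averted):
    """Average Cost-Effectiveness Ratio of a single strategy."""
    return total_cost / infections_averted

def icer(cost_a, effect_a, cost_b, effect_b):
    """Incremental Cost-Effectiveness Ratio of strategy A relative to B:
    extra cost paid per extra infection averted."""
    return (cost_a - cost_b) / (effect_a - effect_b)

# purely illustrative numbers
print(acer(total_cost=12000.0, infections_averted=800.0))   # cost per case averted
print(icer(cost_a=18000.0, effect_a=1100.0, cost_b=12000.0, effect_b=800.0))
```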

Keywords: monkeypox, equilibrium states, stability, bifurcation, optimal control, cost-effectiveness

Procedia PDF Downloads 61
545 Ensemble-Based SVM Classification Approach for miRNA Prediction

Authors: Sondos M. Hammad, Sherin M. ElGokhy, Mahmoud M. Fahmy, Elsayed A. Sallam

Abstract:

In this paper, an ensemble-based Support Vector Machine (SVM) classification approach is proposed and used for miRNA prediction. Three problems commonly associated with previous approaches are alleviated. These problems arise from imposing assumptions on the secondary structure of pre-miRNA, from the imbalance between the numbers of laboratory-verified miRNAs and pseudo-hairpins, and finally from using a training data set that does not consider all the varieties of samples in different species. We aggregate the predicted outputs of three well-known SVM classifiers, namely Triplet-SVM, Virgo and Mirident, weighted by their variant features without any structural assumptions, and an additional SVM layer is used to aggregate the final output. The proposed approach is trained and then tested with balanced data sets. The results of the proposed approach outperform the three base classifiers, achieving improved metric values of 88.88% F-score, 92.73% accuracy, 90.64% precision, 96.64% specificity, 87.2% sensitivity, and an area under the ROC curve of 0.91.

Keywords: MiRNAs, SVM classification, ensemble algorithm, assumption problem, imbalance data

Procedia PDF Downloads 332
544 Sentiment Analysis on the East Timor Accession Process to the ASEAN

Authors: Marcelino Caetano Noronha, Vosco Pereira, Jose Soares Pinto, Ferdinando Da C. Saores

Abstract:

One particularly popular social media platform is YouTube, a video-sharing platform where users can submit videos and other users can like, dislike or comment on them. In this study, we conduct a binary classification task on YouTube video comments and reviews from users regarding the accession process of Timor-Leste to become the eleventh member of the Association of Southeast Asian Nations (ASEAN). We scrape the data directly from the public YouTube videos and apply several pre-processing and weighting techniques. Before conducting the classification, we categorized the data into two classes, namely positive and negative. In the classification part, we apply the Support Vector Machine (SVM) algorithm. Compared with the Naïve Bayes algorithm, the experiment showed that SVM achieved 84.1% accuracy, 94.5% precision, and 73.8% recall.
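
A minimal sketch of an SVM-versus-Naïve-Bayes text pipeline of this kind is shown below. The four comments are invented placeholders (the real corpus is scraped from YouTube), TF-IDF is assumed as the weighting technique, and the tiny sample size is for illustration only.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

# tiny hypothetical comments about the accession process
comments = [
    "great step for timor leste, proud moment",
    "asean membership will boost the economy",
    "this accession is a mistake, too early",
    "the process is too slow and disappointing",
]
labels = [1, 1, 0, 0]   # 1 = positive, 0 = negative

for name, clf in [("SVM", LinearSVC()), ("NB", MultinomialNB())]:
    pipe = make_pipeline(TfidfVectorizer(lowercase=True, stop_words="english"), clf)
    scores = cross_val_score(pipe, comments, labels, cv=2, scoring="accuracy")
    print(name, scores.mean())
```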

Keywords: classification, YouTube, sentiment analysis, support vector machine

Procedia PDF Downloads 90
543 Resume Ranking Using Custom Word2vec and Rule-Based Natural Language Processing Techniques

Authors: Subodh Chandra Shakya, Rajendra Sapkota, Aakash Tamang, Shushant Pudasaini, Sujan Adhikari, Sajjan Adhikari

Abstract:

Many efforts have been made to measure the semantic similarity between text corpora in documents, and techniques have evolved to measure the similarity of two documents. One such state-of-the-art technique in the field of Natural Language Processing (NLP) is the word-to-vector model, which converts words into their word embeddings and measures the similarity between the vectors. We found this to be quite useful for the task of resume ranking. This research paper therefore implements the word2vec model along with other Natural Language Processing techniques in order to rank resumes for a particular job description, so as to automate the hiring process. The paper describes the system and the findings made during the process of building it.
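
A minimal sketch of the word2vec-based ranking idea, using gensim on a toy corpus, is shown below. The tokenised resumes and job description are invented placeholders, document vectors are formed by simple averaging, and the chunking and rule-based extraction steps of the full system are omitted.

```python
import numpy as np
from gensim.models import Word2Vec

def doc_vector(tokens, model):
    """Average the word embeddings of the tokens present in the vocabulary."""
    vecs = [model.wv[t] for t in tokens if t in model.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(model.vector_size)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))

# tiny hypothetical corpus standing in for real resumes and a job description
resumes = [["python", "machine", "learning", "nlp"],
           ["accounting", "audit", "excel"],
           ["java", "spring", "backend", "api"]]
job_desc = ["nlp", "python", "deep", "learning"]

model = Word2Vec(sentences=resumes + [job_desc], vector_size=50,
                 window=3, min_count=1, epochs=50, seed=1)

jd_vec = doc_vector(job_desc, model)
ranking = sorted(range(len(resumes)),
                 key=lambda i: cosine(doc_vector(resumes[i], model), jd_vec),
                 reverse=True)
print("resume ranking (best first):", ranking)
```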

Keywords: chunking, document similarity, information extraction, natural language processing, word2vec, word embedding

Procedia PDF Downloads 141
542 Comparative Analysis of DTC Based Switched Reluctance Motor Drive Using Torque Equation and FEA Models

Authors: P. Srinivas, P. V. N. Prasad

Abstract:

Since torque ripple is the main cause of noise and vibrations, the performance of a Switched Reluctance Motor (SRM) can be improved by minimizing its torque ripple using a novel control technique called Direct Torque Control (DTC). In the DTC technique, torque is controlled directly by controlling the magnitude of the flux and the speed of rotation of the stator flux vector, and the flux and torque are maintained within set hysteresis bands. The DTC of the SRM is analysed by two methods. In one method, the actual torque is computed by conducting Finite Element Analysis (FEA) on the design specifications of the motor; in the other, the torque is computed by a simplified torque equation. The variation of peak current, average current, torque ripple and speed settling time with the simplified torque equation model is compared with that of the FEA-based model.
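
The hysteresis-band logic at the heart of DTC can be sketched as below: two comparators decide whether torque and flux should be increased or decreased, and the resulting pair (together with the stator-flux sector) indexes a switching table. This is generic DTC comparator logic for illustration, with hypothetical band widths, not the paper's specific SRM switching table.

```python
def dtc_switching_decision(torque_error, flux_error, torque_band, flux_band):
    """Hysteresis comparators used in Direct Torque Control: decide whether
    torque and flux should be increased, decreased, or left unchanged. The
    resulting pair (plus the stator-flux sector) indexes a switching table
    that picks the voltage vector to apply."""
    if torque_error > torque_band:
        torque_cmd = +1          # increase torque
    elif torque_error < -torque_band:
        torque_cmd = -1          # decrease torque
    else:
        torque_cmd = 0           # keep torque within the band

    flux_cmd = +1 if flux_error > flux_band else -1   # two-level flux comparator
    return torque_cmd, flux_cmd

# toy illustration with hypothetical band widths
print(dtc_switching_decision(torque_error=0.8, flux_error=-0.01,
                             torque_band=0.5, flux_band=0.02))   # -> (1, -1)
```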

Keywords: direct torque control, simplified torque equation, finite element analysis, torque ripple

Procedia PDF Downloads 467
541 Dangerous Words: A Moral Economy of HIV/AIDS in Swaziland

Authors: Robin Root

Abstract:

A fundamental premise of medical anthropology is that clinical phenomena are simultaneously cultural, political, and economic: none more so than the linked acronyms HIV/AIDS. For the medical researcher, HIV/AIDS signals an epidemiological pandemic and a pathophysiology. For persons diagnosed with an HIV-related condition, the acronym often conjures dread, too often marking and marginalizing the afflicted irretrievably. Critical medical anthropology is uniquely equipped to theorize the linkages that bind individual and social wellbeing to global structural and culture-specific phenomena. This paper reports findings from an anthropological study of HIV/AIDS in Swaziland, site of the highest HIV prevalence in the world. The project, initiated in 2005, has documented experiences of HIV/AIDS, religiosity, and treatment and care as well as drought and famine. Drawing on interviews with Swazi religious and traditional leaders about their experiences of leadership amidst worsening economic conditions, environmental degradation, and an ongoing global health crisis, the paper provides uncommon insights for global health practitioners whose singular paradigm for designing and delivering interventions is biomedically-based. In contrast, this paper details the role of local leaders in mediating extreme social suffering and resilience in ways that medical science cannot model but which radically impact how sickness is experienced and health services are delivered and accessed. Two concepts help to organize the paper’s argument. First, a ‘moral economy of language’ is central to showing up the implicit ‘technologies of knowledge’ that inhere in scientific and religious discourses of HIV/AIDS; people draw upon these discourses strategically to navigate highly vulnerable conditions. Second, Paulo Freire’s ethnographic focus on a culture’s 'dangerous words' opens up for examination how ‘sex’ is dangerous for religion and ‘god’ is dangerous for science. The paper interrogates hegemonic and ‘lived’ discourses, both biomedical and religious, and contributes to an important literature on the moral economies of health, a framework of explication and, importantly, action appropriate to a wide-range of contemporary global health phenomena. The paper concludes by asserting that it is imperative that global health planners reflect upon and ‘check’ their hegemonic policy platforms by, one, collaborating with local authoritative agents of ‘what sickness means and how it is best treated,’ and, two, taking account of the structural barriers to achieving good health.

Keywords: Africa, biomedicine, HIV/AIDS, qualitative research, religion

Procedia PDF Downloads 99
540 Incorporating Information Gain in Regular Expressions Based Classifiers

Authors: Rosa L. Figueroa, Christopher A. Flores, Qing Zeng-Treitler

Abstract:

A regular expression consists of a sequence of characters that describes a text pattern. Usually, in clinical research, regular expressions are manually created by programmers together with domain experts. Lately, there have been several efforts to investigate how to generate them automatically. This article presents a text classification algorithm based on regexes. The algorithm, named REX, was designed and then implemented as a simplified method to create regexes to classify Spanish text automatically. In order to classify ambiguous cases, such as when multiple labels are assigned to a testing example, REX includes an information gain method. Two sets of data were used to evaluate the algorithm's effectiveness in clinical text classification tasks. The results indicate that the regular-expression-based classifier proposed in this work performs statistically better regarding accuracy and F-measure than Support Vector Machine and Naïve Bayes for both datasets.
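
The way information gain can score a regex as a classification feature is sketched below: the gain is the drop in label entropy produced by splitting the examples into match and no-match groups. The Spanish snippets and the candidate pattern are invented placeholders; REX's actual regex construction is not reproduced here.

```python
import math
import re
from collections import Counter

def entropy(labels):
    total = len(labels)
    return -sum((c / total) * math.log2(c / total)
                for c in Counter(labels).values())

def information_gain(texts, labels, pattern):
    """Information gain of the binary feature 'regex matches the text':
    H(labels) minus the weighted entropy of the match / no-match partitions."""
    regex = re.compile(pattern)
    matched = [l for t, l in zip(texts, labels) if regex.search(t)]
    unmatched = [l for t, l in zip(texts, labels) if not regex.search(t)]
    h_after = sum(len(part) / len(labels) * entropy(part)
                  for part in (matched, unmatched) if part)
    return entropy(labels) - h_after

# hypothetical Spanish clinical snippets: 1 = positive finding, 0 = negated
texts = ["se observa fractura", "no se observa fractura",
         "fractura evidente", "sin evidencia de fractura"]
labels = [1, 0, 1, 0]
print(information_gain(texts, labels, r"\b(no|sin)\b"))   # high gain -> useful regex
```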

Keywords: information gain, regular expressions, smith-waterman algorithm, text classification

Procedia PDF Downloads 305