Search results for: imperialist competition algorithm
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4466

2396 A Multi-Modal Virtual Walkthrough of the Virtual Past and Present Based on Panoramic View, Crowd Simulation and Acoustic Heritage on Mobile Platform

Authors: Lim Chen Kim, Tan Kian Lam, Chan Yi Chee

Abstract:

This research presents a multi-modal simulation for reconstructing the past and constructing the present in digital cultural heritage on a mobile platform. To depict present life, the virtual environment is generated through a proposed scheme for rapid and efficient construction of a 360° panoramic view, into which an acoustic heritage model and a crowd model are then incorporated. For the reconstruction of past life, the crowd is simulated and rendered in an old trading port. The keystone of this research, however, is a virtual walkthrough that shows virtual present life in 2D and virtual past life in 3D, both set in the virtual heritage sites of George Town and delivered on a mobile device. Firstly, the 2D crowd is modelled and simulated using OpenGL ES 1.1 on the mobile platform. The 2D crowd portrays present life in the 360° panoramic view of a virtual heritage environment, based on an extension of Newtonian laws. Secondly, the 2D crowd is animated and rendered in 3D with improved variety and incorporated into the virtual past life using the Unity3D game engine. The behaviours of the 3D models are then simulated based on an enhancement of the classical Boid algorithm. Finally, a demonstration system is developed and integrated with the models, techniques and algorithms of this research. The virtual walkthrough is demonstrated to a group of respondents and assessed through a user-centred evaluation in which they navigate the demonstration system. The questionnaire results show that the presented virtual walkthrough was successfully deployed through a multi-modal simulation and that such a walkthrough would be particularly useful in virtual tour and virtual museum applications.
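The abstract builds on the classical Boid algorithm; the enhanced variant is not specified, so the following is only a minimal stdlib sketch of the three classical rules (separation, alignment, cohesion), with illustrative weights that are not taken from the paper:

```python
import math

def step_boids(boids, sep_r=1.0, align_w=0.05, coh_w=0.01, sep_w=0.1):
    """One update step of the classical Boid rules.
    `boids` is a list of [x, y, vx, vy]; weights are illustrative."""
    new = []
    for i, (x, y, vx, vy) in enumerate(boids):
        ax = ay = 0.0
        cx = cy = avx = avy = 0.0
        n = 0
        for j, (ox, oy, ovx, ovy) in enumerate(boids):
            if i == j:
                continue
            dx, dy = ox - x, oy - y
            d = math.hypot(dx, dy)
            n += 1
            cx += ox; cy += oy          # cohesion: average neighbour position
            avx += ovx; avy += ovy      # alignment: average neighbour velocity
            if 0 < d < sep_r:           # separation: push away from close neighbours
                ax -= sep_w * dx / d
                ay -= sep_w * dy / d
        if n:
            ax += coh_w * (cx / n - x) + align_w * (avx / n - vx)
            ay += coh_w * (cy / n - y) + align_w * (avy / n - vy)
        new.append([x + vx, y + vy, vx + ax, vy + ay])
    return new
```

Each boid moves by its current velocity and steers by the summed rule accelerations, which is the behaviour the paper's enhanced model refines.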

Keywords: Boid Algorithm, Crowd Simulation, Mobile Platform, Newtonian Laws, Virtual Heritage

Procedia PDF Downloads 277
2395 Heliport Remote Safeguard System Based on Real-Time Stereovision 3D Reconstruction Algorithm

Authors: Ł. Morawiński, C. Jasiński, M. Jurkiewicz, S. Bou Habib, M. Bondyra

Abstract:

With the development of optics, electronics, and computers, vision systems are increasingly used in many areas of life, science, and industry. They have a huge number of applications: quality control, object detection, data reading (e.g., QR codes), and more. A large share of them serve measurement purposes, and some make it possible to obtain a 3D reconstruction of the tested objects or measurement areas. 3D reconstruction algorithms are mostly based on creating depth maps from data acquired by active or passive methods. Owing to the specific application in airfield technology, only passive methods are applicable, because other systems operating on site can be blinded across most spectral bands. Furthermore, the reconstruction is required to work at long distances, from hundreds of meters to tens of kilometers, with low loss of accuracy even in harsh conditions such as fog, rain, or snow. In response to these requirements, HRESS (Heliport REmote Safeguard System) was developed, whose main part is a rotational head with a two-camera stereovision rig gathering images around the head through 360 degrees, together with stereovision 3D reconstruction and point cloud combination. The sub-pixel analysis introduced in the HRESS system increases the distance measurement resolution and yields an accuracy of about 3% for distances over one kilometer. Ultimately, this leads to more accurate and reliable measurement data in the form of a point cloud. Moreover, the algorithm includes operations for filtering erroneously collected data out of the point cloud. All efforts on the programming, mechanical, and optical sides are aimed at obtaining the most accurate 3D reconstruction of the environment in the measurement area.
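The abstract does not detail its sub-pixel method; a common choice, sketched here under that assumption, is a parabola fit through three neighbouring matching costs, followed by the standard stereo triangulation Z = f·B/d:

```python
def subpixel_disparity(costs, d):
    """Refine an integer disparity `d` by fitting a parabola through the
    matching costs at d-1, d, d+1 (a common sub-pixel interpolation)."""
    c0, c1, c2 = costs[d - 1], costs[d], costs[d + 1]
    denom = c0 - 2.0 * c1 + c2
    if denom == 0:
        return float(d)          # flat cost curve: no refinement possible
    return d + 0.5 * (c0 - c2) / denom

def depth_from_disparity(d, focal_px, baseline_m):
    """Triangulate range from disparity: Z = f * B / d."""
    return focal_px * baseline_m / d
```

The refined disparity is what makes the ~3% long-range accuracy plausible: at kilometre scales, a fraction of a pixel of disparity corresponds to a large change in range.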

Keywords: airfield monitoring, artificial intelligence, stereovision, 3D reconstruction

Procedia PDF Downloads 124
2394 Exploring the Effect of Environmental Cues of Food Festival on Visitor Satisfaction

Authors: Tao Zhang

Abstract:

As competition among festival events becomes increasingly fierce, event organizers try to design a blended festivalscape by integrating multifaceted environmental cues in order to raise the service quality of their events and, in turn, visitor satisfaction. Food festivals, the main type of festival event, are popular all over the world, and their organizers likewise mix food with multifaceted environmental cues (e.g., music, stage, light, dance) to design a blended festivalscape. However, few studies have explored the environmental cues of food festivals and their relationship with visitor satisfaction. The aim of this study is therefore to ascertain the environmental cues of food festivals and their relationship with visitor satisfaction using blended festivalscape theory. Using a convenience sampling method, this study surveyed 1,000 food festival visitors in Macau. Factor analysis identified six main environmental cues (i.e., food, atmosphere, program, staff, facility, and information). All six environmental cues are positively related to visitor satisfaction, the most influential being food, atmosphere, and program. This study shows that festival event organizers should focus on the theme of their event, build the festival atmosphere, and create interesting programs in order to design a blended festivalscape and thereby raise visitor satisfaction.

Keywords: environmental cue, event, festival, satisfaction

Procedia PDF Downloads 366
2393 Local Directional Encoded Derivative Binary Pattern Based Coral Image Classification Using Weighted Distance Gray Wolf Optimization Algorithm

Authors: Annalakshmi G., Sakthivel Murugan S.

Abstract:

This paper presents a local directional encoded derivative binary pattern (LDEDBP) feature extraction method that can be applied to the classification of submarine coral reef images. Classifying coral reef images by texture features is difficult due to the dissimilarities among class samples. In coral reef image classification, texture features are extracted using the proposed LDEDBP method. The approach extracts the complete structural arrangement of the local region using the local binary pattern (LBP) and also extracts edge information using the local directional pattern (LDP) from the edge responses of a particular region, thereby achieving extra discriminative feature value. Typically, LDP extracts the edge details in all eight directions, and integrating the edge responses with the local binary pattern yields a more robust texture descriptor than others used in texture feature extraction. Finally, the proposed features are fed to an extreme learning machine (ELM) whose input weights and biases of the single-hidden-layer feed-forward network (SLFN) are optimized by a meta-heuristic, the weighted distance grey wolf optimizer (WDGWO). In the empirical results, ELM-WDGWO demonstrated better accuracy on all coral datasets, namely RSMAS, EILAT, EILAT2, and MLC, compared with other state-of-the-art algorithms, achieving the highest overall classification accuracy of 94%.
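As an illustration of the LBP building block the descriptor starts from (the directional derivative encoding itself is the paper's contribution and is not reproduced here), a basic 3x3 local binary pattern can be sketched as:

```python
def lbp_code(patch):
    """Local binary pattern of the centre pixel of a 3x3 patch:
    threshold the 8 neighbours against the centre and read them
    clockwise from the top-left as an 8-bit code."""
    c = patch[1][1]
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for bit, p in enumerate(neighbours):
        if p >= c:               # neighbour at least as bright as the centre
            code |= 1 << bit
    return code
```

A histogram of these codes over a region is the usual texture feature; LDEDBP augments it with the eight-direction edge responses of LDP.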

Keywords: feature extraction, local directional pattern, ELM classifier, GWO optimization

Procedia PDF Downloads 163
2392 The Design and Implementation of an Enhanced 2D Mesh Switch

Authors: Manel Langar, Riad Bourguiba, Jaouhar Mouine

Abstract:

In this paper, we propose the design and implementation of an enhanced wormhole virtual-channel on-chip router. It is the heart of a mesh NoC using the XY deterministic routing algorithm and is characterized by a simple virtual channel allocation strategy that reduces area and connection complexity without affecting performance. We implemented our router on a Tezzaron process to validate its performance. This router is a basic element that will later be used to design a 3D mesh NoC.
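XY deterministic routing, which the router implements, first exhausts the X offset and then the Y offset; a minimal sketch of the hop sequence it produces:

```python
def xy_route(src, dst):
    """Deterministic XY routing on a 2D mesh: travel along X until the
    destination column is reached, then along Y.  Returns the hop list."""
    (x, y), (dx_, dy_) = src, dst
    path = [(x, y)]
    while x != dx_:
        x += 1 if dx_ > x else -1
        path.append((x, y))
    while y != dy_:
        y += 1 if dy_ > y else -1
        path.append((x, y))
    return path
```

Because the turn order is fixed (X before Y), the algorithm is deadlock-free on a mesh without extra virtual channels for routing, which is what keeps the allocation strategy simple.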

Keywords: NoC, mesh, router, 3D NoC

Procedia PDF Downloads 568
2391 Investigating the Algorithm to Maintain a Constant Speed in the Wankel Engine

Authors: Adam Majczak, Michał Bialy, Zbigniew Czyż, Zdzislaw Kaminski

Abstract:

Increasingly stringent emission standards for passenger cars require alternative drives. The share of electric vehicles in new car sales increases every year, yet their performance and, above all, range still cannot be compared favourably to those of cars with a traditional internal combustion engine. Recharging a battery takes hours, which is hard to accept given the time needed to refill a fuel tank. Ways to reduce the drawbacks of purely electric cars are therefore being sought. One method is to combine an electric motor as the main source of power with a small internal combustion engine acting as an electricity generator. This type of drive enables an electric vehicle to achieve a radically increased range with low emissions of toxic substances. For several years, leading automotive manufacturers such as Mazda and Audi, together with top companies in the automotive industry, e.g., AVL, have developed electric drive systems capable of recharging themselves while driving, known as range extenders. Here the electricity generator is powered by a Wankel engine, which had seemed to pass into history. This lightweight, compact engine with a rotating piston and a very low vibration level has turned out to be an excellent source in such applications. Operating it as an energy source for a generator almost entirely eliminates its disadvantages, such as high fuel consumption, high emission of toxic substances, and the short lifetime typical of its traditional application. Running the engine at a constant rotational speed significantly increases its lifetime, and its small external dimensions allow compact modules to drive even small urban cars like the Audi A1 or the Mazda 2. The algorithm to maintain a constant speed was investigated on an engine dynamometer with an eddy current brake and the necessary measuring apparatus.
The research object was the Aixro XR50 rotary engine with the electronic power supply developed at the Lublin University of Technology. The load torque of the engine was altered during the research by means of the eddy current brake, capable of applying any number of load cycles. The recorded parameters included speed and torque as well as the position of the throttle in the inlet system. Increasing and decreasing the load did not significantly change engine speed, which means that the control algorithm parameters are correctly selected. This work has been financed by the Polish Ministry of Science and Higher Education.
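The paper's control algorithm is not disclosed; as a hedged illustration of the general idea, a simple PI speed governor run against a toy engine/brake model (all gains and plant constants are invented for the sketch) might look like:

```python
class SpeedController:
    """PI governor: trims throttle to hold a target engine speed.
    Gains are illustrative, not the values used in the paper."""
    def __init__(self, target_rpm, kp=0.01, ki=0.001):
        self.target, self.kp, self.ki = target_rpm, kp, ki
        self.integral = 0.0

    def throttle(self, rpm):
        err = self.target - rpm
        self.integral += err
        # clamp to the physical throttle range [0, 1]
        return min(max(self.kp * err + self.ki * self.integral, 0.0), 1.0)

# toy plant: speed gain proportional to throttle, minus eddy-current load
ctrl = SpeedController(3000.0)
rpm = 3000.0
for step in range(2000):
    load = 0.6 if step >= 1000 else 0.4   # load step, as applied by the brake
    rpm += 60.0 * ctrl.throttle(rpm) - 40.0 * load
```

The integral term is what removes the steady-state speed offset after the load step, mirroring the dynamometer observation that load changes did not significantly shift engine speed.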

Keywords: electric vehicle, power generator, range extender, Wankel engine

Procedia PDF Downloads 157
2390 Ensemble Machine Learning Approach for Estimating Missing Data from CO₂ Time Series

Authors: Atbin Mahabbati, Jason Beringer, Matthias Leopold

Abstract:

To address the global challenges of climate and environmental change, there is a need to quantify and reduce uncertainties in environmental data, including observations of carbon, water, and energy. Global eddy covariance flux tower networks (FLUXNET) and their regional counterparts (i.e., OzFlux, AmeriFlux, ChinaFLUX, etc.) were established in the late 1990s and early 2000s to address this demand. Despite the capability of eddy covariance to validate process modelling analyses, field surveys, and remote sensing assessments, the technique faces serious challenges, e.g., data gaps and uncertainties. To address these concerns, this research developed an ensemble model to fill the data gaps in CO₂ flux, avoiding the limitations of a single algorithm and therefore yielding lower error and reduced uncertainty in the gap-filling process. In this study, 2013 data from five towers in the OzFlux network (Alice Springs Mulga, Calperum, Gingin, Howard Springs, and Tumbarumba) were used to develop an ensemble machine learning model combining five feedforward neural networks (FFNN) of different structures with an eXtreme Gradient Boosting (XGB) algorithm. The former, the FFNNs, provided the primary estimations in the first layer, while the latter, XGB, took the outputs of the first layer as its input to provide the final estimations of CO₂ flux. The introduced model showed slight superiority over each single FFNN and over XGB used individually, with overall RMSEs of 2.64, 2.91, and 3.54 g C m⁻² yr⁻¹ for the ensemble, XGB, and the best FFNN, respectively. The most significant improvement occurred in the estimation of extreme diurnal values (around midday and sunrise), as well as nocturnal estimations, generally considered among the most challenging parts of CO₂ flux gap-filling.
The towers, as well as the seasons, showed different levels of sensitivity to the improvements provided by the ensemble model. For instance, Tumbarumba was more sensitive than Calperum, where the differences between the ensemble model on the one hand and the FFNNs and XGB on the other were the smallest of all five sites. In addition, the performance difference between the ensemble model and its individual components was more significant during the warm season (Jan, Feb, Mar, Oct, Nov, and Dec) than the cold season (Apr, May, Jun, Jul, Aug, and Sep), due to the higher photosynthetic activity of plants, which leads to a larger range of CO₂ exchange. In conclusion, the introduced ensemble model slightly improved the accuracy and robustness of CO₂ flux gap-filling. Ensemble machine learning models are therefore potentially capable of improving estimation and regression outcomes when a single algorithm seems to leave no more room for improvement.
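The paper stacks five FFNNs under an XGB meta-learner; neither can be reproduced in a stdlib sketch, so the two-layer stacking idea is illustrated here with a closed-form least-squares blend of two hypothetical base model prediction vectors:

```python
def fit_meta_weights(p1, p2, y):
    """Second layer of a two-layer stack: least-squares weights blending
    two base model prediction vectors (2x2 normal equations, no intercept)."""
    a = sum(v * v for v in p1)
    b = sum(u * v for u, v in zip(p1, p2))
    d = sum(v * v for v in p2)
    e = sum(u * t for u, t in zip(p1, y))
    f = sum(v * t for v, t in zip(p2, y))
    det = a * d - b * b
    return (e * d - b * f) / det, (a * f - b * e) / det

def blend(p1, p2, w1, w2):
    """Final ensemble estimate from the first-layer outputs."""
    return [w1 * u + w2 * v for u, v in zip(p1, p2)]
```

The meta-learner automatically down-weights a base model that adds no information, which is the mechanism behind the ensemble's small but consistent RMSE gain.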

Keywords: carbon flux, eddy covariance, extreme gradient boosting, gap-filling comparison, hybrid model, OzFlux network

Procedia PDF Downloads 139
2389 Feature Evaluation Based on Random Subspace and Multiple-K Ensemble

Authors: Jaehong Yu, Seoung Bum Kim

Abstract:

Clustering analysis can facilitate the extraction of intrinsic patterns in a dataset and reveal its natural groupings without requiring class information. For effective clustering analysis in high-dimensional datasets, unsupervised dimensionality reduction is an important task. It can generally be achieved by feature extraction or feature selection; in many situations, feature selection methods are more appropriate because of their clear interpretation with respect to the original features. Unsupervised feature selection can be categorized into feature subset selection and feature ranking methods; we focus on unsupervised feature ranking methods, which evaluate features based on importance scores. Recently, several unsupervised feature ranking methods have been developed based on ensemble approaches to achieve higher accuracy and stability. However, most ensemble-based feature ranking methods require the true number of clusters. Furthermore, these algorithms evaluate feature importance based on an ensemble clustering solution and produce undesirable results if that solution is inaccurate. To address these limitations, we propose an ensemble-based feature ranking method with random subspace and multiple-k ensemble (FRRM). The proposed FRRM algorithm evaluates the importance of each feature with the random subspace ensemble and combines all evaluation results into ensemble importance scores. Moreover, through the multiple-k ensemble idea, FRRM does not require the true number of clusters to be determined in advance. Experiments on various benchmark datasets were conducted to examine the properties of the proposed FRRM algorithm and to compare its performance with that of existing feature ranking methods. The experimental results demonstrate that FRRM outperforms its competitors.

Keywords: clustering analysis, multiple-k ensemble, random subspace-based feature evaluation, unsupervised feature ranking

Procedia PDF Downloads 339
2388 Modified Weibull Approach for Bridge Deterioration Modelling

Authors: Niroshan K. Walgama Wellalage, Tieling Zhang, Richard Dwight

Abstract:

State-based Markov deterioration models (SMDM) sometimes fail to find accurate transition probability matrix (TPM) values and hence lead to invalid future condition predictions or incorrect average deterioration rates, mainly due to drawbacks of existing nonlinear optimization-based algorithms and/or the subjective function types used for regression analysis. Furthermore, a set of separate functions of age for each condition state cannot be derived directly from a Markov model for a given bridge element group, although this is of interest to industrial partners. This paper presents a new approach for generating homogeneous SMDM output, the Modified Weibull approach, which consists of a set of appropriate functions describing the percentage condition prediction of bridge elements in each state. These functions are combined with a Bayesian approach and a Metropolis-Hastings algorithm (MHA) based Markov chain Monte Carlo (MCMC) simulation technique to quantify the uncertainty in the model parameter estimates. In this study, factors contributing to rail bridge deterioration were identified. Inspection data for 1,000 Australian railway bridges over 15 years were reviewed and filtered accordingly based on real operational experience. A network-level deterioration model for a typical bridge element group was developed using the proposed Modified Weibull approach. The condition state predictions obtained from this method were validated against a test data set using statistical hypothesis tests. The results show that the proposed model not only predicts network-level conditions accurately but also captures the model uncertainties within a given confidence interval.
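The MHA/MCMC machinery can be sketched generically; below, a random-walk Metropolis-Hastings sampler draws from a toy target (a standard normal log-density stands in for the actual Modified Weibull posterior, which the abstract does not specify):

```python
import math
import random

def metropolis_hastings(log_post, x0, n, step=0.5, seed=42):
    """Random-walk Metropolis-Hastings: propose a Gaussian step and
    accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        delta = log_post(prop) - log_post(x)
        if rng.random() < math.exp(min(0.0, delta)):
            x = prop             # accept the proposal
        samples.append(x)        # otherwise keep the current state
    return samples

# toy target: standard normal log-density (up to an additive constant)
samples = metropolis_hastings(lambda v: -0.5 * v * v, 0.0, 20000)
```

For the paper's use case, `log_post` would be the Bayesian posterior of the Modified Weibull parameters given the inspection data, and the sample spread yields the confidence intervals on the condition predictions.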

Keywords: bridge deterioration modelling, Modified Weibull approach, MCMC, Metropolis-Hastings algorithm, Bayesian approach, Markov deterioration models

Procedia PDF Downloads 727
2387 Diversity and Intensity of International Technology Transfer and their Impacts on Organizational Performance

Authors: Seongryong Kang, Woonjin Kim, Sungjoo Lee

Abstract:

In an environment of fierce competition and a globalized economy, international technology collaboration has gained increasing attention as a way to improve innovation efficiency. While international technology transfer helps a firm acquire necessary technology in a short period of time, it also carries a risk: embedding external technology from overseas partners may incur transaction costs due to regional, cultural, and language barriers, which tend to offset the benefits of such transfer. Although a number of previous studies have focused on the effects of technology in-transfer on firm performance, few have been conducted in the context of international technology transfer. To fill this gap, this study investigates the impact of international technology in-transfer on firm performance, both innovation and financial, with particular emphasis on the diversity and intensity of such transfer. To do this, we adopted technology balance of payments (TBP) data of Korean firms from 2010 to 2011, where a mediated regression analysis was used to identify the mediating effects of absorptive capacity. The results indicate that (i) the diversity and intensity of international technology transfer positively influence innovation performance by improving R&D capability; and (ii) diversity has a positive impact, but intensity a negative impact, on financial performance through the mediation of R&D intensity. The findings are expected to provide meaningful implications for establishing global technology strategies and developing policy programs to facilitate technology transfer.

Keywords: diversity, intensity, international technology acquisition, performance, technology transfer

Procedia PDF Downloads 361
2386 Graphical Theoretical Construction of Discrete time Share Price Paths from Matroid

Authors: Min Wang, Sergey Utev

Abstract:

The lessons of the 2007-09 global financial crisis have driven scientific research toward new methodologies and financial models for the global market. A quantum mechanics approach has been introduced into the modelling of the unpredictable stock market; one famous quantum tool is the Feynman path integral method, which was used to model insurance risk by Tamturk and Utev and adapted to formalize path-dependent option pricing by Hao and Utev. This research is based on a path-dependent calculation method motivated by the Feynman path integral method. Path calculation can be studied in two ways: by labelling and computationally. Labelling is part of the representation of objects, and generating functions can provide many different ways of representing share price paths. In this paper, recent work on the graph-theoretical construction of individual share price paths via matroids is presented. Firstly, the theory of matroids is reviewed, the relationship between lattice path matroids and Tutte polynomials is studied, and ways to connect points in lattice path matroids and Tutte polynomials are suggested. Secondly, it is found that a general binary tree can be validly constructed from a connected lattice path matroid, rather than from a general lattice path matroid. Lastly, a way to represent share price paths via a general binary tree is suggested, and an algorithm is developed to construct share price paths from general binary trees. A relationship is also provided between lattice integer points and the Tutte polynomial of a transversal matroid. Using this connection together with the algorithm, a share price path can be constructed from a given connected lattice path matroid.
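The final step of the construction, reading a share price path off a binary tree of up/down moves, can be sketched as follows (the bit encoding and tick sizes are illustrative assumptions; the matroid-to-tree construction itself is not reproduced):

```python
def price_path(s0, moves, up=1.01, down=0.99):
    """Follow one root-to-leaf branch of a binary price tree: each bit
    picks the up or down child and scales the price multiplicatively."""
    path = [s0]
    for bit in moves:
        path.append(path[-1] * (up if bit else down))
    return path

def all_paths(s0, depth, up=1.01, down=0.99):
    """Enumerate every branch of the tree, i.e. all 2**depth price paths."""
    return [price_path(s0, [(k >> i) & 1 for i in range(depth)], up, down)
            for k in range(2 ** depth)]
```

In the path-integral spirit of the paper, path-dependent quantities are then computed by summing a payoff over the enumerated branches.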

Keywords: combinatorial construction, graphical representation, matroid, path calculation, share price, Tutte polynomial

Procedia PDF Downloads 137
2385 Impact of Dynamic Capabilities on Knowledge Management Processes

Authors: Farzad Yavari, Fereydoun Ohadi

Abstract:

Today, with the development and growth of technology and extreme environmental change, organizations need to identify opportunities and foster creativity and innovation in order to maintain or improve their competitive position. In this regard, the resources and assets of the organization must be coordinated and reviewed in line with its strategic orientation. One of the competitive advantages of the present age is knowledge management: equipping the organization with up-to-date knowledge, disseminating it among employees, and using it in the development of products and services. This research investigates the impact of the components of dynamic capabilities (sensing, seizing, and reconfiguration) on knowledge management processes (knowledge acquisition, integration, and utilization) at the MAPNA Engineering and Construction Company, using a field survey and an applied research method. For this purpose, a questionnaire comprising 15 questions on dynamic capability components and 15 on knowledge management components was distributed among 46 employees involved in knowledge management in the organization. The validity of the questionnaire was evaluated through content validity and its reliability with Cronbach's alpha. Pearson correlation tests and structural equation modelling were used to analyze the data. The results indicate a significant positive correlation between the components of dynamic capabilities and knowledge management.

Keywords: dynamic capabilities, knowledge management, sense capability, seize capability, reconfiguration capability, knowledge acquisition, knowledge integration, knowledge utilization

Procedia PDF Downloads 119
2384 Population Dynamics of Juvenile Dusky Groupers, Epinephelus marginatus (Lowe, 1834), from Two Sites in Terceira Island, Azores, Portugal

Authors: Regina Streltsov

Abstract:

The Archipelago of the Azores in the NE Atlantic is a hotspot of marine biodiversity, both pelagic and demersal. Epinephelus marginatus is a solitary species commonly observed in these waters, with distinct territorial/residential behaviours from the post-larval and juvenile stages to the adult phase. As commercially highly valued species, about 13% of all groupers (family Epinephelidae) face increasing pressure that has produced known impacts on both the abundance and distribution of this group of fishes, and Epinephelus marginatus is currently assessed by the IUCN as vulnerable. Dusky groupers inhabit rocky bottoms from shallow waters down to 200 m; juveniles are usually found in shallow shoreline waters. The population dynamics of juveniles can lead to a better understanding of competition for resources and predation, and of further conservation measures that must be taken for dusky groupers. This study is carried out on rocky reefs in two sheltered bays on the south and north coasts of the island, at two different spots with four sampling sites in total. Using transects, individuals are counted at the peak of high tide and all abiotic factors are recorded. Our goal is to complete a statistically significant number of observations in order to describe these populations and better understand their dynamics and size.

Keywords: Azores, dusky groupers, Epinephelus marginatus, population dynamics

Procedia PDF Downloads 157
2383 Web Data Scraping Technology Using Term Frequency Inverse Document Frequency to Enhance the Big Data Quality on Sentiment Analysis

Authors: Sangita Pokhrel, Nalinda Somasiri, Rebecca Jeyavadhanam, Swathi Ganesan

Abstract:

Tourism is a booming industry with huge future potential for global wealth and employment. Countless data are generated on social media sites every day, creating numerous opportunities to bring more insight to decision-makers. Integrating big data technology into the tourism industry allows companies to determine where their customers have been and what they liked; this information can then be used by businesses such as those managing visitor centers or hotels, while tourists can get a clear idea of places before visiting. On the natural language processing side, we analyse the sentiment features of online reviews from tourists and supply an enhanced long short-term memory (LSTM) framework for sentiment feature extraction from travel reviews. For experimental validation, we constructed a web review database using a crawler and web scraping techniques to evaluate the effectiveness of our methodology. The sentences were first classified with the VADER and RoBERTa models to obtain the polarity of the reviews. In this paper, we studied feature extraction methods such as count vectorization and TF-IDF vectorization, and implemented a convolutional neural network (CNN) classifier for sentiment analysis to decide whether a tourist's attitude towards a destination is positive, negative, or simply neutral, based on the review text posted online. The results demonstrate that, after pre-processing and cleaning the dataset, the CNN algorithm achieved an accuracy of 96.12% for positive and negative sentiment analysis.
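The TF-IDF weighting named in the title can be sketched in a few lines of stdlib Python (a raw-frequency tf and a plain log idf; library implementations such as scikit-learn's differ in smoothing and normalization):

```python
import math

def tfidf(docs):
    """Term frequency-inverse document frequency for a list of
    tokenised documents; returns one {term: weight} dict per document."""
    n = len(docs)
    df = {}                                   # document frequency per term
    for doc in docs:
        for term in set(doc):
            df[term] = df.get(term, 0) + 1
    out = []
    for doc in docs:
        weights = {}
        for term in doc:
            tf = doc.count(term) / len(doc)   # within-document frequency
            idf = math.log(n / df[term])      # rarity across the corpus
            weights[term] = tf * idf
        out.append(weights)
    return out
```

Terms that occur in every review (e.g. "hotel") are weighted to zero, so the classifier sees only the discriminative vocabulary.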

Keywords: count vectorization, convolutional neural network, crawler, data technology, long short-term memory, web scraping, sentiment analysis

Procedia PDF Downloads 88
2382 Probabilistic Graphical Model for the Web

Authors: M. Nekri, A. Khelladi

Abstract:

The World Wide Web is a network with a complex topology, the main properties of which are a power-law degree distribution, a low clustering coefficient, and a short average distance. Modelling the web as a graph allows information to be located quickly and consequently helps in the construction of search engines. Here, we present a model based on existing probabilistic graphs with all the aforesaid characteristics. This work consists of studying the web in order to understand its structure, which will enable us to model it more easily and to propose a possible algorithm for its exploration.
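The power-law degree distribution mentioned above is classically reproduced by preferential attachment; a minimal Barabási-Albert-style growth sketch (illustrative only, not the authors' model) is:

```python
import random

def preferential_attachment(n, m=2, seed=7):
    """Grow a graph where each new node attaches to `m` distinct existing
    nodes with probability proportional to their degree."""
    rng = random.Random(seed)
    edges = []
    attachment = list(range(m))        # seed nodes, each with unit weight
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(attachment))
        for t in chosen:
            edges.append((new, t))
            attachment += [new, t]     # both endpoints gain one degree unit
    return edges
```

The `attachment` list holds each node id once per unit of degree, so a uniform draw from it implements the "rich get richer" rule that yields hubs and the power-law tail.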

Keywords: clustering coefficient, preferential attachment, small world, web community

Procedia PDF Downloads 272
2381 Electrohydrodynamic Instability and Enhanced Mixing with Thermal Field and Polymer Addition Modulation

Authors: Dilin Chen, Kang Luo, Jian Wu, Chun Yang, Hongliang Yi

Abstract:

Electrically driven flow (EDF) systems play an important role in fuel cells, electrochemistry, bioseparation technology, fluid pumping, and microswimmers. The core scientific problem is multifield coupling, whose further development depends on the exploration of nonlinear instabilities, competing-force mechanisms, and energy budgets. In our study, two categories of electrostatic-force-dominated phenomena, induced-charge electroosmosis (ICEO) and ion conduction pumping, are investigated while considering polymer rheology and thermal gradients. Using finite volume methods, the thermal modulation strategy of ICEO under thermal buoyancy is analyzed numerically, and the electroelastic instability transition associated with polymer addition is explored. The results reveal that thermal buoyancy forces are sufficient to create typical thermogravitational convection in competition with electroconvective modes. Electroelastic instability tends to be promoted by weak electrical forces, and polymers effectively alter the unstable transition routes. Our letter paves the way for improved mixing and heat transmission in microdevices, as well as for insights into the non-Newtonian nature of electrohydrodynamics.

Keywords: non-Newtonian fluid, electroosmotic flow, electrohydrodynamic, viscoelastic liquids, heat transfer

Procedia PDF Downloads 68
2380 Neural Network Based Control Algorithm for Inhabitable Spaces Applying Emotional Domotics

Authors: Sergio A. Navarro Tuch, Martin Rogelio Bustamante Bello, Leopoldo Julian Lechuga Lopez

Abstract:

In recent years, Mexico’s population has seen a rise in negative physiological and mental states. Two main consequences of this problem are deficient work performance and high levels of stress, which have an important impact on a person’s physical, mental and emotional health. Several approaches, such as the use of audiovisual stimuli to induce emotions and modify a person’s emotional state, can be applied in an effort to decrease these negative effects. Using different non-invasive sensors such as EEG, luminosity and face recognition, we gather information on the subject’s current emotional state. In a controlled environment, a subject is shown a series of selected images from the International Affective Picture System (IAPS) in order to induce a specific set of emotions and obtain information from the sensors. The raw data obtained are statistically analyzed in order to filter only the specific groups of information that relate to the subject’s emotions and the current values of the physical variables in the controlled environment, such as luminosity, RGB light color, temperature, oxygen level and noise. Finally, a neural network based control algorithm is given the data obtained in order to feed back into the system and automate the modification of the environment variables and the audiovisual content shown, in an effort to positively alter the subject’s emotional state. During the research, it was found that the light color was directly related to the type of impact generated by the audiovisual content on the subject’s emotional state. Red illumination increased the impact of violent images, while green illumination along with relaxing images decreased the subject’s levels of anxiety. Specific differences between men and women were found as to which types of images generated a greater impact in either gender.
The population sample was mainly constituted of college students, whose data analysis showed a decreased sensibility to violence towards humans. Despite the early stage of the control algorithm, the results obtained from the population sample give a better insight into the possibilities of emotional domotics and the applications that can be created towards the improvement of performance in people’s lives. The objective of this research is to create a positive impact with the application of technology to everyday activities; nonetheless, an ethical problem arises, since this technology can also be applied to control a person’s emotions and shift their decision making.
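As an illustration of the feedback stage described above, the sketch below shows a minimal one-hidden-layer network mapping a vector of filtered sensor features to normalized environment set-points. The feature layout, layer sizes and random weights are assumptions for illustration only, not the trained controller used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

class TinyController:
    """Minimal one-hidden-layer network mapping filtered sensor features
    (e.g. an EEG band-power summary, luminosity, a face-expression score)
    to environment set-points (e.g. R, G, B light levels, temperature).
    Weights here are random placeholders; a real controller would be
    trained on the labeled sensor data described in the abstract."""

    def __init__(self, n_in=3, n_hidden=8, n_out=4):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    def __call__(self, features):
        h = np.tanh(features @ self.W1 + self.b1)
        # sigmoid squashes outputs to (0, 1); scale to device ranges later
        return 1.0 / (1.0 + np.exp(-(h @ self.W2 + self.b2)))
```

In a closed loop, the outputs would be rescaled to actuator ranges (RGB duty cycles, thermostat degrees) and the new sensor readings fed back as the next input vector.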

Keywords: data analysis, emotional domotics, performance improvement, neural network

Procedia PDF Downloads 140
2379 Separating Landform from Noise in High-Resolution Digital Elevation Models through Scale-Adaptive Window-Based Regression

Authors: Anne M. Denton, Rahul Gomes, David W. Franzen

Abstract:

High-resolution elevation data are becoming increasingly available, but typical approaches for computing topographic features, like slope and curvature, still assume small sliding windows, for example, of size 3x3. That means that the digital elevation model (DEM) has to be resampled to the scale of the landform features that are of interest. Any higher resolution is lost in this resampling. When the topographic features are computed through regression that is performed at the resolution of the original data, the accuracy can be much higher, and the reported result can be adjusted to the length scale that is relevant locally. Slope and variance are calculated for overlapping windows, meaning that one regression result is computed per raster point. The number of window centers per area is the same for the output as for the original DEM. Slope and variance are computed by performing regression on the points in the surrounding window. Such an approach is computationally feasible because of the additive nature of regression parameters and variance. Any doubling of window size in each direction only takes a single pass over the data, corresponding to a logarithmic scaling of the resulting algorithm as a function of the window size. Slope and variance are stored for each aggregation step, allowing the reported slope to be selected to minimize variance. The approach thereby adjusts the effective window size to the landform features that are characteristic to the area within the DEM. Starting with a window size of 2x2, each iteration aggregates 2x2 non-overlapping windows from the previous iteration. Regression results are stored for each iteration, and the slope at minimal variance is reported in the final result. As such, the reported slope is adjusted to the length scale that is characteristic of the landform locally. The length scale itself and the variance at that length scale are also visualized to aid in interpreting the results for slope. 
The relevant length scale is taken to be half of the window size over which the minimum variance was achieved. The resulting process was evaluated on 1-meter DEM data and on artificial data constructed to have defined length scales with added noise. A comparison with ESRI ArcMap was performed and showed the potential of the proposed algorithm. The resolution of the resulting output is much higher, and the slope and aspect are much less affected by noise. Additionally, the algorithm adjusts to the scale of interest within each region of the image. These benefits are gained without additional computational cost in comparison with resampling the DEM and computing the slope over 3x3 windows in ESRI ArcMap for each resolution. In summary, the proposed approach extracts slope and aspect of DEMs at the length scales that are characteristic locally. The result is of higher resolution and less affected by noise than with existing techniques.
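The aggregation scheme described above can be sketched as follows: the sums needed for plane regression are additive, so each scale doubling is a single pass over 2x2 blocks, and the slope at minimal residual variance is kept per cell. This is a simplified illustration assuming a square DEM with sides divisible by 2^levels and non-overlapping windows; the published method also maintains one overlapping regression result per raster point.

```python
import numpy as np

def block_sum(a):
    # one pass: sum each 2x2 non-overlapping block
    return a[0::2, 0::2] + a[1::2, 0::2] + a[0::2, 1::2] + a[1::2, 1::2]

def multiscale_slope(dem, cell=1.0, levels=3):
    """Fit z = a*x + b*y + c per window at doubling scales and return,
    per cell, the slope magnitude at the scale of minimal residual variance."""
    dem = np.asarray(dem, dtype=float)
    ny, nx = dem.shape
    y, x = np.meshgrid(np.arange(ny) * cell, np.arange(nx) * cell, indexing="ij")
    # running sums for the normal equations -- all purely additive
    s = {"n": np.ones_like(dem), "x": x, "y": y, "z": dem,
         "xx": x * x, "yy": y * y, "xy": x * y,
         "xz": x * dem, "yz": y * dem, "zz": dem * dem}
    best_slope = np.zeros_like(dem)
    best_var = np.full_like(dem, np.inf)
    for level in range(levels):
        s = {k: block_sum(v) for k, v in s.items()}
        n = s["n"]
        mx, my, mz = s["x"] / n, s["y"] / n, s["z"] / n
        cxx, cyy = s["xx"] / n - mx * mx, s["yy"] / n - my * my
        cxy = s["xy"] / n - mx * my
        cxz, cyz = s["xz"] / n - mx * mz, s["yz"] / n - my * mz
        det = cxx * cyy - cxy * cxy
        a = (cxz * cyy - cyz * cxy) / det   # plane coefficients
        b = (cyz * cxx - cxz * cxy) / det
        var = s["zz"] / n - mz * mz - a * cxz - b * cyz  # residual variance
        slope = np.sqrt(a * a + b * b)
        # broadcast window results back to the full grid for comparison
        f = 2 ** (level + 1)
        slope_f = np.kron(slope, np.ones((f, f)))
        var_f = np.kron(var, np.ones((f, f)))
        better = var_f < best_var
        best_slope[better] = slope_f[better]
        best_var[better] = var_f[better]
    return best_slope
```

For an exact plane the recovered slope is the same at every scale, which makes a convenient sanity check.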

Keywords: high resolution digital elevation models, multi-scale analysis, slope calculation, window-based regression

Procedia PDF Downloads 129
2378 Banking Performance and Political Economy: Using an ARDL Model

Authors: Marwen Ghouil, Jamel Eddine Mkadmi

Abstract:

Banking performance is the pillar and goal of all banking activity, and it has an impact on economic policy. Researchers first defined the principles for assessing and modeling bank performance, and then theories and models explaining bank performance were developed. The importance of credit as a means of financing businesses in most developing countries has led to questions about the effects of financial liberalization on increased banking competition. In Tunisia, as in many other countries, the liberalization of financial services in general, and of banks' activities in particular, has not ceased to evolve. The objective of this paper is to examine the determinants of banking performance for 8 Tunisian banks and their impact on economic policy during the Arab Spring. We used cointegration analysis and a panel ARDL model, with total assets, bank credits, guarantees, and bank size as performance drivers. The correlation analysis shows a positive correlation between total assets, bank credits, guarantees, bank size and bank performance. Long-term empirical results show that bank loans, guarantees, bank size, and total assets have a positive and significant impact on bank performance. This means that bank credits, guarantees, bank size, and total assets are very important determinants of bank performance in Tunisia.

Keywords: bank performance, economic policy, finance, economic

Procedia PDF Downloads 134
2377 Development of a Technology Assessment Model by Patents and Customers' Review Data

Authors: Kisik Song, Sungjoo Lee

Abstract:

Recent years have seen an increasing number of patent disputes due to excessive competition in the global market and a reduced technology life-cycle; this has increased the risk of investment in technology development. While many global companies have started developing methodologies to identify promising technologies and assess them for investment decisions, the existing methodology still has some limitations. Post hoc assessments of new technologies are not being performed, especially to determine whether the suggested technologies turned out to be promising. For example, in existing quantitative patent analysis, a patent’s citation information has served as an important metric for quality assessment, but this analysis cannot be applied to recently registered patents because such information accumulates over time. Therefore, we propose a new technology assessment model that can replace citation information and positively affect technological development based on post hoc analysis of the patents for promising technologies. Additionally, we collect customer reviews on a target technology to extract keywords that reflect the customers’ needs, and we determine how many of these keywords are covered by the new technology. Finally, we construct a portfolio (based on a technology assessment from patent information) and a customer-based marketability assessment (based on review data), and we use them to visualize the characteristics of the new technologies.
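The review-based coverage idea can be illustrated with a minimal sketch: extract the most frequent keywords from customer reviews and measure what fraction of them appear in a patent text. The tokenizer, stopword list and frequency-based extraction are simplifying assumptions for illustration; the authors' opinion-mining pipeline is more elaborate.

```python
import re
from collections import Counter

# tiny illustrative stopword list (a real pipeline would use a full one)
STOPWORDS = {"the", "a", "an", "is", "it", "and", "to", "of", "this", "i", "very"}

def top_keywords(reviews, k=5):
    """Naive keyword extraction: the k most frequent non-stopword tokens."""
    tokens = re.findall(r"[a-z]+", " ".join(reviews).lower())
    counts = Counter(t for t in tokens if t not in STOPWORDS)
    return [word for word, _ in counts.most_common(k)]

def coverage(patent_text, keywords):
    """Share of customer-need keywords mentioned in the patent text.
    Note: plain substring matching, so 'life' would also match 'lifetime'."""
    text = patent_text.lower()
    hits = sum(1 for w in keywords if w in text)
    return hits / len(keywords)
```

A technology whose patent text covers a larger share of review keywords would score higher on the customer-based marketability axis of the proposed portfolio.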

Keywords: technology assessment, patents, citation information, opinion mining

Procedia PDF Downloads 466
2376 Applying Multiplicative Weight Update to Skin Cancer Classifiers

Authors: Animish Jain

Abstract:

This study deals with using Multiplicative Weight Update within artificial intelligence and machine learning to create models that can diagnose skin cancer from microscopic images of cancer samples. The multiplicative weight update method is used to combine the predictions of multiple models to try to obtain more accurate results. Logistic Regression, Convolutional Neural Network (CNN), and Support Vector Machine Classifier (SVMC) models are employed within the Multiplicative Weight Update system. These models are trained on pictures of skin cancer from the ISIC Archive to look for patterns and label unseen scans as either benign or malignant. These models are then combined by a multiplicative weight update algorithm, which weights each model according to the precision and accuracy of its successive guesses. The weighted guesses are then analyzed together to obtain the final predictions. The research hypothesis for this study stated that there would be a significant difference in the accuracy of the three models and the Multiplicative Weight Update system. The SVMC model had an accuracy of 77.88%. The CNN model had an accuracy of 85.30%. The Logistic Regression model had an accuracy of 79.09%. Using Multiplicative Weight Update, the algorithm achieved an accuracy of 72.27%. The final conclusion drawn was that there was indeed a significant difference in the accuracy of the three models and the Multiplicative Weight Update system, and that using a CNN model would be the best option for this problem rather than a Multiplicative Weight Update system. This may be because Multiplicative Weight Update is not effective in a binary setting where there are only two possible classifications.
In a categorical setting with multiple classes and groupings, a Multiplicative Weight Update system might become more proficient as it takes into account the strengths of multiple different models to classify images into multiple categories rather than only two categories, as shown in this study. This experimentation and computer science project can help to create better algorithms and models for the future of artificial intelligence in the medical imaging field.
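A minimal sketch of the weighted-majority variant of Multiplicative Weight Update for binary labels is shown below. The multiplicative penalty and the sequential update are standard MWU ingredients, but the exact weighting scheme used in the study (which also accounts for each model's precision) is not specified in the abstract, so this is illustrative only.

```python
import numpy as np

def mwu_ensemble(predictions, labels, eta=0.3):
    """Weighted-majority voting with multiplicative weight updates.
    predictions: (n_models, n_samples) array of 0/1 class guesses
    labels:      (n_samples,) true 0/1 labels, revealed sequentially
    Returns the combined guesses and the final model weights."""
    n_models, n_samples = predictions.shape
    w = np.ones(n_models)
    combined = np.empty(n_samples, dtype=int)
    for t in range(n_samples):
        votes = predictions[:, t]
        # weighted majority vote of the models for this sample
        combined[t] = int(w @ votes >= w.sum() / 2)
        # penalize every model that guessed wrong on this sample
        w *= np.where(votes != labels[t], 1.0 - eta, 1.0)
    return combined, w
```

Over time the weights concentrate on the most reliable model, so with only two classes the ensemble tends to track its best member rather than improve on it, consistent with the finding above.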

Keywords: artificial intelligence, machine learning, multiplicative weight update, skin cancer

Procedia PDF Downloads 79
2375 Cyclic Plastic Deformation of 20MnMoNi 55 Steel in Dynamic Strain Ageing Regime

Authors: Ashok Kumar, Sarita Sahu, H. N. Bar

Abstract:

The low cycle fatigue behavior of a ferritic-martensitic pressure vessel steel in the dynamic strain ageing regime of 250°C to 280°C has been investigated. Dynamic strain ageing is a mechanism whose fascinating, inexplicably repetitive nature has attracted the interest of researchers for quite a long time. The interaction of dynamic strain ageing and cyclic plasticity has been studied from a mechanistic point of view. Dynamic strain ageing gives rise to identical serrated flow behavior in the tensile and compressive halves of the hysteresis loops, and this has been found to give rise to initial cyclic hardening followed by softening, whereas in the non-DSA regime continuous cyclic softening has been found to be the dominant mechanism. An appreciable sensitivity of the degree of hardening of the stable loop towards the nature of the serrations has been observed. The degree of hardening increases with strain amplitude in the regime where only type A serrations are present and decreases with strain amplitude where type A+B serrations are present. Masing-type behavior has been found in the metal at 280°C. The cyclic stress-strain curve and master curve have been constructed to determine the fatigue strength and ductility coefficients. Fractographic examinations have also shown a competition between the progression of striations and secondary cracking.

Keywords: dynamic strain ageing, hardening, low cycle fatigue, softening

Procedia PDF Downloads 301
2374 Importance of Innovation for Entrepreneurs

Authors: Eetedal Alanjem, Majedah Alnajem

Abstract:

The importance of innovation in entrepreneurship can be seen in the invention of new ways to produce products or improved solutions. A service industry can expand with new or improved types of services to fulfill the ever-changing needs of its clients. Manufacturers can come up with new products from raw materials and by-products. Innovation is vital for the durability of any business. Innovation usually begins with a need. Small businesses are generally directly involved in their communities; they know exactly what the communities need and strive to come up with solutions to fulfill those needs. They seize the opportunity to innovate to ease communal problems and make lives more comfortable. These solutions then keep getting better, easier and more useful as entrepreneurs and their small businesses come up with improved formulas and solutions. Keeping abreast of current trends and demands is an important factor for entrepreneurs to fuel their creativity and innovation. Manufacturers are constantly innovating to produce more without sacrificing quality. Small businesses should make innovation a fundamental part of their organisational development, since innovation creates business success. Entrepreneurs must not see just one solution to a need; they should come up with ideas for multiple solutions. It is imperative for small businesses to encourage the growth of innovation among their employees. Competition is another factor that elevates the importance of innovation in entrepreneurship. It motivates entrepreneurs to come up with better, improved products and services than their competitors for a higher share of the market. This paper goes in depth on each of these factors and discusses several case studies to show, through facts and lessons learned, why innovation is important for entrepreneurs.

Keywords: innovation, entrepreneurship, creativity, organisational development

Procedia PDF Downloads 421
2373 Small and Medium Enterprises Owner-Managers/Entrepreneurs and Their Risk Perception in Songkhla Province, Thailand

Authors: Patraporn Kaewkhanitarak, Weerawan Marangkun

Abstract:

The objective of this study was to explore the establishment of SMEs and to investigate the relationship between the gender (male or female) of SME owner-managers/entrepreneurs and their risk perception in business activity. The study examines data from interviews with 76 SME owner-managers/entrepreneurs (37 males, 39 females) in the manufacturing, finance, human resources and marketing sectors in the economic regions of Songkhla province, Thailand. This study found that four risk factors, namely operations, cash flow, staff, and new markets, were perceived by the SME owner-managers/entrepreneurs at a high level. Male and female SME owner-managers/entrepreneurs perceived some factors, such as the age of the owner-manager/entrepreneur, the duration of firm operation, the type of firm, and the type of business, without significant differences. In contrast, gender affected the risk perception of increasing costs, fierce competition, leapfrog development of the firm, and substandard staff; male and female respondents perceived these factors with significant differences. According to the research, SME owner-managers/entrepreneurs should develop their risk management competency to deal with risk efficiently, and SME firms should gather into groups. Furthermore, it was shown that the key tools used to manage these risky situations were the use of managerial competencies and clustering.

Keywords: risk perception, owner-managers/entrepreneurs, SME, Songkhla, Thailand

Procedia PDF Downloads 435
2372 Development of Wave-Dissipating Block Installation Simulation for Inexperienced Worker Training

Authors: Hao Min Chuah, Tatsuya Yamazaki, Ryosui Iwasawa, Tatsumi Suto

Abstract:

In recent years, with the advancement of digital technology, the movement to introduce so-called ICT (Information and Communication Technology), such as computer technology and network technology, to civil engineering and construction sites is accelerating. As part of this movement, attempts are being made in various situations to reproduce actual sites inside computers and use them for design and construction planning, as well as for training inexperienced engineers. The installation of wave-dissipating blocks on coasts is a type of work that has been carried out by skilled workers based on their years of experience, and it is one of the tasks that is difficult for inexperienced workers to carry out on site. Wave-dissipating blocks are structures designed to protect coasts, beaches, and so on from erosion by reducing the energy of ocean waves. Wave-dissipating blocks usually weigh more than 1 t and are installed by being suspended from a crane, so it would be time-consuming and costly to train inexperienced workers on site. In this paper, therefore, a block installation simulator is developed based on Unity 3D, a game development engine. The simulator computes porosity, defined here as the ratio of the total volume of the wave-dissipating blocks inside the structure to the volume of the final shape of the ideal structure. Using this porosity evaluation, the simulator can determine how well the user is able to install the blocks. A voxelization technique is used to calculate the porosity of the structure, simplifying the calculations. Other techniques, such as raycasting and box overlapping, are employed for accurate simulation. In the near future, the simulator will incorporate an automatic block installation algorithm based on combinatorial optimization and compare the user-demonstrated block installation with the appropriate installation found by the algorithm.
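The voxel-based volume-ratio computation can be sketched as follows, assuming (as a simplification) that both the blocks and the ideal structure are axis-aligned boxes and that a voxel counts as solid when its centre lies inside a block; the actual simulator works on Unity 3D meshes rather than boxes.

```python
import numpy as np

def voxel_fill_ratio(blocks, envelope, voxel=0.5):
    """Approximate (block volume inside the structure) / (envelope volume)
    on a regular voxel grid.
    blocks:   list of axis-aligned boxes (xmin, ymin, zmin, xmax, ymax, zmax)
    envelope: one such box describing the ideal final structure
    voxel:    edge length of a cubic voxel"""
    x0, y0, z0, x1, y1, z1 = envelope
    # voxel centres spanning the envelope
    xs = np.arange(x0 + voxel / 2, x1, voxel)
    ys = np.arange(y0 + voxel / 2, y1, voxel)
    zs = np.arange(z0 + voxel / 2, z1, voxel)
    X, Y, Z = np.meshgrid(xs, ys, zs, indexing="ij")
    filled = np.zeros(X.shape, dtype=bool)
    for bx0, by0, bz0, bx1, by1, bz1 in blocks:
        # a voxel whose centre falls inside this block counts as solid
        filled |= ((X >= bx0) & (X < bx1) &
                   (Y >= by0) & (Y < by1) &
                   (Z >= bz0) & (Z < bz1))
    return filled.mean()
```

Shrinking the voxel size trades computation time for accuracy, which is the usual reason voxelization is chosen over exact mesh-intersection volume computations.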

Keywords: 3D simulator, porosity, user interface, voxelization, wave-dissipating blocks

Procedia PDF Downloads 103
2371 Customers' Prescription of Foreign versus Local Brands in the Pharmaceutical Industry of Peshawar, Pakistan

Authors: Saira Tajdar, Sajad Ahmad

Abstract:

The pharmaceutical market of Pakistan has shown a mixed trend since 1947. Over these six decades, various local and foreign pharmaceutical companies entered the market with highly researched formulas and brands for various diseases. This created a very competitive market between local and foreign companies and brands. However, this intense competition does not clarify whether the customers (doctors) prefer and prescribe foreign or local brands more frequently. Previous research has examined, in various markets and for different brands, whether the customers in a given industry prefer foreign or local brands; the pharmaceutical industry, however, has been ignored by researchers in this regard. Generally, people do not know what the preferences of customers (doctors) are for prescription brands of medicines. Therefore, this study is conducted in two departments of the pharmaceutical industry by selecting the top recommended formulas in those departments and examining whether, for those formulas, the customers (doctors) prescribe foreign brands or local brands. Secondary data have been collected from authentic sources in previous studies on country of origin (COO), ethnocentrism and factors influencing brand preferences. Primary data were also collected through 100 self-administered questionnaires from the top five hospitals of Peshawar. The results of the study were analyzed through SPSS, which shows that in some categories of pharmaceutical products the COO is very important, but not in all.

Keywords: customer prescription, country of origin, empirical study, foreign versus local brands, pharmaceutical industry, Pakistan

Procedia PDF Downloads 394
2370 NENU2PHAR: PHA-Based Materials from Micro-Algae for High-Volume Consumer Products

Authors: Enrique Moliner, Alba Lafarga, Isaac Herraiz, Evelina Castellana, Mihaela Mirea

Abstract:

NENU2PHAR (GA 887474) is an EU-funded project aimed at the development of polyhydroxyalkanoates (PHAs) from micro-algae. These biobased and biodegradable polymers are being tested and validated in different high-volume market applications, including food packaging, cosmetic packaging, 3D printing filaments, agro-textiles and medical devices, counting on the support of key players like Danone, BEL Group, Sofradim and IFG. So far, the project has succeeded in producing PHAs from micro-algae with a cumulative yield of around 17%, i.e., 1 kg of PHA produced from 5.8 kg of micro-algae biomass, which in turn captures 11 kg of CO₂ while growing. These algae-based plastics can therefore offer the same environmental benefits as current bio-based plastics (reduction of greenhouse gas emissions and fossil resource depletion), while using a 3rd-generation biomass feedstock that avoids competition with food and the environmental impacts of agricultural practices. The project also addresses other sustainability aspects, such as the ecodesign and life cycle assessment of the targeted plastic products, considering not only the use of biobased plastics but also many other ecodesign strategies. This paper presents the main progress and results achieved to date in the project.

Keywords: NENU2PHAR, polyhydroxyalkanoates, micro-algae, biopolymer, ecodesign, life cycle assessment

Procedia PDF Downloads 90
2369 Sustainable Development, China’s Emerging Role via One Belt, One Road

Authors: Saeid Rabiei Majd, Motahareh Alvandi, Mehrad Rabiei

Abstract:

The rapid economic and technological development of any country depends on access to cheap sources of energy. Competition for access to petroleum resources is always accompanied by numerous environmental risks. These factors have drawn more attention to environmental issues and sustainable development in petroleum contracts and activities. Nowadays, a sign of a developed country is adherence to the principles and rules of international environmental law and sustainable development in commercial contracts. China has entered the field through the massive One Belt, One Road initiative and is becoming a new emerging power in the world. China's bilateral investment treaties have an impact on environmental rights and sustainable development through regional and international foreign direct investment. The aim of this research is to examine China's key position in promoting and improving environmental principles, international law and sustainable development in the energy sector worldwide through the One Belt, One Road initiative. Based on this hypothesis, it seems that in the near future China's bilateral investment treaties will become a popular investment model in global trade, especially in the field of energy and sustainable development, replacing the European and American models. The research method includes a literature review as well as analytical and descriptive methods.

Keywords: principles of sustainable development, oil and gas law, China's BITs, One Belt One Road, environmental rights

Procedia PDF Downloads 306
2368 Factors Affecting the Profitability of Commercial Banks: An Empirical Study of Indian Banking Sector

Authors: Neeraj Gupta, Jitendra Mahakud

Abstract:

The banking system plays a major role in the Indian economy and is the payment gateway for most financial transactions. Banking has undergone a major transition that is still in progress. Banking reforms after liberalization in 1991 led to the establishment of foreign banks in the country. The foreign banks are not listed on the Indian stock markets and have increased competition, capturing a significant share of revenue from the public sector banks, which are still the major players in the Indian banking sector. The performance of the banking sector depends on internal (bank-specific) as well as external (market-specific and macroeconomic) factors. Profitability in the banking sector is affected by numerous factors, which can be internal or external. The present study examines the internal and external factors that are likely to affect the profitability of Indian banks. The sample consists of a panel dataset of 64 commercial banks in India, comprising 1088 observations over the years 1998 to 2016. The dynamic panel GMM estimation of Arellano and Bond has been used. The study revealed that capital adequacy ratio, deposits, age, labour productivity, non-performing assets, inflation and concentration have a significant effect on the performance measures.

Keywords: banks in India, bank performance, bank productivity, banking management

Procedia PDF Downloads 272
2367 Analysis of an IncResU-Net Model for R-Peak Detection in ECG Signals

Authors: Beatriz Lafuente Alcázar, Yash Wani, Amit J. Nimunkar

Abstract:

Cardiovascular diseases (CVDs) are the leading cause of death globally, and around 80% of sudden cardiac deaths are due to arrhythmias, or irregular heartbeats. The majority of these pathologies are revealed by either short-term or long-term alterations in the electrocardiogram (ECG) morphology. The ECG is the main diagnostic tool in cardiology. It is a non-invasive, pain-free procedure that measures the heart’s electrical activity and allows the detection of abnormal rhythms and underlying conditions. A cardiologist can diagnose a wide range of pathologies based on alterations in the ECG's form, but human interpretation is subjective and prone to error. Moreover, ECG records can be quite prolonged in time, which can further complicate visual diagnosis and significantly delay disease detection. In this context, deep learning methods have risen as a promising strategy to extract relevant features and eliminate individual subjectivity in ECG analysis. They facilitate the computation of large sets of data and can provide early and precise diagnoses. Therefore, the cardiology field is one of the areas that can most benefit from the implementation of deep learning algorithms. In the present study, a deep learning algorithm is trained following a novel approach, using a combination of different databases as the training set. The goal of the algorithm is to detect R-peaks in ECG signals. Its performance is further evaluated on ECG signals with different origins and features to test the model’s ability to generalize its outcomes. The performance of the model for detection of R-peaks in clean and noisy ECGs is presented. The model is able to detect R-peaks in the presence of various types of noise and when presented with data it has not been trained on. It is expected that this approach will increase the effectiveness and capacity of cardiologists to detect divergences in the normal cardiac activity of their patients.
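For context, a naive amplitude-threshold R-peak detector, of the kind a learned model would typically be compared against, can be sketched as follows. The threshold rule, sensitivity parameter and refractory period are illustrative assumptions and not the IncResU-Net model evaluated in the study.

```python
import numpy as np

def detect_r_peaks(ecg, fs, refractory=0.25, k=2.0):
    """Naive R-peak detector: local maxima above mean + k*std of the
    signal, separated by at least `refractory` seconds (the minimum
    physiological gap between consecutive heartbeats)."""
    thresh = ecg.mean() + k * ecg.std()
    min_gap = int(refractory * fs)
    peaks = []
    for i in range(1, len(ecg) - 1):
        # a local maximum above threshold, outside the refractory window
        if ecg[i] >= thresh and ecg[i] > ecg[i - 1] and ecg[i] >= ecg[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return np.array(peaks)
```

Such a baseline works on clean signals but degrades quickly with baseline wander and muscle noise, which is precisely where learned detectors are expected to generalize better.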

Keywords: arrhythmia, deep learning, electrocardiogram, machine learning, R-peaks

Procedia PDF Downloads 186