Search results for: market identification
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1861


1291 Graphic Analysis of Genotype by Environment Interaction for Maize Hybrid Yield Using Site Regression Stability Model

Authors: Saeed Safari Dolatabad, Rajab Choukan

Abstract:

Selection of maize (Zea mays) hybrids with wide adaptability across diverse farming environments is important prior to recommending them, in order to achieve a high rate of hybrid adoption. Grain yield of 14 maize hybrids, tested in a randomized complete block design with four replicates across 22 environments in Iran, was analyzed using the site regression (SREG) stability model. The biplot technique facilitates a visual evaluation of superior genotypes, which is useful for cultivar recommendation and mega-environment identification. The objectives of this study were (i) to identify suitable hybrids with both high mean performance and high stability, and (ii) to determine mega-environments for maize production in Iran. Biplot analysis identified two mega-environments in this study. The first mega-environment included KRM, KSH, MGN, DZF A, KRJ, DRB, DZF B, SHZ B, and KHM, where the G10 hybrid was the best performing hybrid. The second mega-environment included ESF B, ESF A, and SHZ A, where the G4 hybrid was the best hybrid. According to the ideal-hybrid biplot, the G10 hybrid was better than all other hybrids, followed by the G1 and G3 hybrids. These hybrids were identified as the best hybrids, combining high grain yield with high yield stability. GGE biplot analysis provided a framework for identifying target testing locations that discriminate among genotypes and for selecting hybrids that are high yielding and stable.

Keywords: Zea mays L, GGE biplot, Multi-environment trials, Yield stability.
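
A minimal sketch of how a GGE (SREG) biplot can be constructed, assuming a hypothetical genotype-by-environment yield table (the hybrid names, site abbreviations and yield values below are illustrative, not the trial data): column-centering by environment means removes the environment main effect, so the singular value decomposition retains only G + GE, which is what the site regression model visualizes.

```python
# GGE biplot sketch: environment-centered yield matrix decomposed by SVD.
import numpy as np
import matplotlib.pyplot as plt

genotypes = [f"G{i}" for i in range(1, 15)]           # 14 hybrids (names assumed)
environments = ["KRM", "KSH", "MGN", "DZF A", "KRJ"]   # subset of the 22 sites
rng = np.random.default_rng(0)
yields = rng.normal(8.0, 1.5, size=(len(genotypes), len(environments)))  # synthetic t/ha

centered = yields - yields.mean(axis=0, keepdims=True)  # remove environment means
u, s, vt = np.linalg.svd(centered, full_matrices=False)

# Symmetric scaling of genotype and environment scores on the first two axes
gen_scores = u[:, :2] * np.sqrt(s[:2])
env_scores = vt[:2, :].T * np.sqrt(s[:2])

fig, ax = plt.subplots()
ax.scatter(gen_scores[:, 0], gen_scores[:, 1], label="genotypes")
for name, (x, y) in zip(environments, env_scores):
    ax.arrow(0, 0, x, y, head_width=0.05, color="gray")
    ax.annotate(name, (x, y))
ax.set_xlabel("PC1"); ax.set_ylabel("PC2"); ax.legend()
plt.show()
```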

1290 Identification of Most Frequently Occurring Lexis in Winnings-announcing Unsolicited Bulk e-mails

Authors: Jatinderkumar R. Saini, Apurva A. Desai

Abstract:

e-mail has become an important means of electronic communication, but the viability of its usage is marred by Unsolicited Bulk e-mail (UBE) messages. UBE consists of many types, such as pornographic, virus-infected and 'cry-for-help' messages, as well as fake and fraudulent offers for jobs, winnings and medicines. UBE poses technical and socio-economic challenges to the usage of e-mail. To meet this challenge and combat this menace, we need to understand UBE. Towards this end, the current paper presents a content-based textual analysis of nearly 3000 winnings-announcing UBE. Technically, this is an application of Text Parsing and Tokenization for an unstructured textual document, and we approach it using Bag Of Words (BOW) and Vector Space Document Model techniques. We have attempted to identify the most frequently occurring lexis in the winnings-announcing UBE documents. An analysis of the top 100 such lexis is also presented. We exhibit the relationship between the occurrence of a word from the identified lexis-set in a given UBE and the probability that the given UBE will be one announcing fake winnings. To the best of our knowledge and survey of related literature, this is the first formal attempt at identification of the most frequently occurring lexis in winnings-announcing UBE by textual analysis. Finally, this is a sincere attempt to bring about alertness against, and mitigate the threat of, such luring but fake UBE.

Keywords: Lexis, Unsolicited Bulk e-mail (UBE), Vector Space Document Model, Winnings, Lottery
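
A minimal bag-of-words sketch of the kind of frequency analysis described above: tokenize a corpus of UBE messages and rank the most frequent lexis. The two sample messages and the stop-word list are illustrative assumptions, not the authors' actual corpus or pre-processing pipeline.

```python
# Tokenize UBE messages and count the most frequent lexis (bag-of-words view).
import re
from collections import Counter

ube_messages = [
    "Congratulations! You have won the international lottery award of 1,000,000 USD.",
    "Your email address has won a cash prize in our annual promotion draw.",
]  # in the study, nearly 3000 winnings-announcing UBE documents were parsed

stop_words = {"the", "a", "of", "in", "has", "have", "you", "your", "our"}

tokens = []
for message in ube_messages:
    words = re.findall(r"[a-z']+", message.lower())    # simple tokenizer
    tokens.extend(w for w in words if w not in stop_words)

top_lexis = Counter(tokens).most_common(100)           # top-100 lexis by frequency
for word, count in top_lexis[:10]:
    print(f"{word:15s} {count}")
```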

1289 An Identification Method of Geological Boundary Using Elastic Waves

Authors: Masamitsu Chikaraishi, Mutsuto Kawahara

Abstract:

This paper focuses on a technique for identifying the geological boundary of the ground strata in front of a tunnel excavation site using the first-order adjoint method based on optimal control theory. The geological boundary is defined as the interface between layers with different elastic moduli. At tunnel excavations, it is important to estimate the ground conditions ahead of the cutting face in advance. Excavating into weak strata or fault fracture zones may prolong the construction work and cause human suffering. A theory for determining the geological boundary of the ground in a numerical manner is investigated, employing excavation blasts and their vibration waves as the observation references. According to optimal control theory, the performance function, described by the square sum of the residuals between computed and observed velocities, is minimized. The boundary layer is determined by minimizing the performance function. The elastic analysis governed by the Navier equation is carried out, assuming the ground to be an elastic body with linear viscous damping. To identify the boundary, the gradient of the performance function with respect to the geological boundary is calculated using the adjoint equation. The weighted gradient method is effectively applied as the minimization algorithm. To solve the governing and adjoint equations, the Galerkin finite element method and the average acceleration method are employed for the spatial and temporal discretizations, respectively. Based on the method presented in this paper, the boundaries between three different strata can be identified. For the numerical studies, the Suemune tunnel excavation site is employed. First, the blasting force is identified in order to improve the accuracy of the analysis. We then identify the geological boundary after the estimation of the blasting force. With this identification procedure, numerical analysis results that closely correspond with the observation data were obtained.

Keywords: Parameter identification, finite element method, average acceleration method, first order adjoint equation method, weighted gradient method, geological boundary, Navier equation, optimal control theory.

1288 Flood Predicting in Karkheh River Basin Using Stochastic ARIMA Model

Authors: Karim Hamidi Machekposhti, Hossein Sedghi, Abdolrasoul Telvari, Hossein Babazadeh

Abstract:

Floods have a huge environmental and economic impact. Therefore, flood prediction receives a lot of attention due to its importance. This study analysed the annual maximum streamflow (discharge) (AMS or AMD) of the Karkheh River in the Karkheh River Basin for flood prediction using an ARIMA model. For this purpose, we use the Box-Jenkins approach, which comprises a four-stage method: model identification, parameter estimation, diagnostic checking and forecasting (prediction). The main tools used in ARIMA modelling were the SAS and SPSS software packages. Model identification was done by visual inspection of the ACF and PACF. The SAS software computed the model parameters using the ML, CLS and ULS methods. The diagnostic checking tests, the AIC criterion and the RACF and RPACF graphs, were used for verification of the selected model. In this study, the best ARIMA model for the Annual Maximum Discharge (AMD) time series was (4,1,1), with an AIC value of 88.87. The RACF and RPACF showed the residuals' independence. The model was then used to forecast AMD for 10 future years, demonstrating its ability to predict floods of the river under study in the Karkheh River Basin. Model accuracy was checked by comparing the predicted and observed series using the coefficient of determination (R2).

Keywords: Time series modelling, stochastic processes, ARIMA model, Karkheh River.
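
A hedged sketch of the Box-Jenkins workflow described above, using Python's statsmodels instead of SAS/SPSS: inspect the ACF/PACF for identification, fit an ARIMA(4,1,1) to an annual-maximum-discharge series, read the AIC from the summary, and forecast 10 years ahead. The discharge values below are synthetic placeholders, not the Karkheh River record.

```python
# Box-Jenkins sketch with statsmodels: identify, estimate, check, forecast.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(1)
amd = pd.Series(1500 + rng.normal(0, 300, 50).cumsum() * 0.1 + rng.normal(0, 200, 50),
                name="AMD (m3/s)")          # one value per year; 50 years assumed

plot_acf(amd.diff().dropna())               # model identification on differenced series
plot_pacf(amd.diff().dropna())

model = ARIMA(amd, order=(4, 1, 1)).fit()   # parameter estimation (ML by default)
print(model.summary())                      # AIC reported here (88.87 in the paper)
print(model.forecast(steps=10))             # predict AMD for 10 future years
```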

1287 Investigating the UAE Residential Valuation System: A Framework for Analysis

Authors: Simon Huston, Ebraheim Lahbash, Ali Parsa

Abstract:

The development of the United Arab Emirates (UAE) into a regional trade, tourism, finance and logistics hub has transformed its real estate markets. However, speculative activity and price volatility remain concerns. UAE residential market values (MV) are exposed to fluctuations in capital flows and migration which, in turn, are affected by geopolitical uncertainty, oil price volatility and global investment market sentiment. Internally, a complex interplay between administrative boundaries, land tenure, building quality and evolving location characteristics fragments UAE residential property markets. In short, the UAE Residential Valuation System (UAE-RVS) confronts multiple challenges to collect, filter and analyze relevant information in complex and dynamic spatial and capital markets. A robust RVS can mitigate the risk of unhelpful volatility, speculative excess or investment mistakes. The research outlines the institutional, ontological, dynamic and epistemological issues at play. We highlight the importance of system capabilities, valuation standard salience and stakeholder trust.

Keywords: Valuation, property rights, information, institutions, trust, salience.

1286 Development of a Tilt-Rotor Aircraft Model Using System Identification Technique

Authors: Antonio Vitale, Nicola Genito, Giovanni Cuciniello, Ferdinando Montemari

Abstract:

The introduction of tilt-rotor aircraft into the existing civilian air transportation system will provide beneficial effects due to the tilt-rotor's capability to combine the characteristics of a helicopter and a fixed-wing aircraft into one vehicle. The availability of reliable tilt-rotor simulation models supports the development of such vehicles. Indeed, simulation models are required to design automatic control systems that increase safety, reduce the pilot's workload and stress, and ensure the optimal aircraft configuration with respect to flight envelope limits, especially during the most critical flight phases such as conversion from helicopter to aircraft mode and vice versa. This article presents a process to build a simplified tilt-rotor simulation model, derived from the analysis of flight data. The model aims to reproduce the complex dynamics of the tilt-rotor during the in-flight conversion phase. It uses a set of scheduled linear transfer functions to relate the autopilot reference inputs to the most relevant rigid-body state variables. The model also computes information about the rotor flapping dynamics, which is useful to evaluate the aircraft control margin in terms of rotor collective and cyclic commands. The rotor flapping model is derived through a mixed theoretical-empirical approach, which includes physical analytical equations (applicable to the helicopter configuration) and parametric corrective functions. The latter are introduced to best fit the actual rotor behavior and balance the differences existing between helicopter and tilt-rotor during flight. Time-domain system identification from flight data is exploited to optimize the model structure and to estimate the model parameters. The presented model-building process was applied to simulated flight data of the ERICA Tilt-Rotor, generated by using a high-fidelity simulation model implemented in the FlightLab environment. The validation of the obtained model was very satisfactory, confirming the validity of the proposed approach.

Keywords: Flapping Dynamics, Flight Dynamics, System Identification, Tilt-Rotor Modeling and Simulation.
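
A minimal time-domain system-identification sketch, assuming a generic input-output record: a discrete ARX transfer-function model relating an autopilot reference input u to a rigid-body state y is fitted by ordinary least squares. The signals and plant are synthetic; the article's actual model is a set of scheduled transfer functions identified from ERICA tilt-rotor flight data.

```python
# Least-squares ARX fit: y[k] = a1*y[k-1] + a2*y[k-2] + b1*u[k-1] + b2*u[k-2]
import numpy as np

rng = np.random.default_rng(2)
n = 500
u = rng.normal(size=n)                       # reference input (e.g. longitudinal stick)
y = np.zeros(n)
for k in range(2, n):                        # "true" second-order plant generating data
    y[k] = 1.5 * y[k-1] - 0.7 * y[k-2] + 0.5 * u[k-1] + 0.2 * u[k-2] + 0.01 * rng.normal()

# Build the regression matrix from lagged outputs and inputs, then solve.
Phi = np.column_stack([y[1:-1], y[:-2], u[1:-1], u[:-2]])
theta, *_ = np.linalg.lstsq(Phi, y[2:], rcond=None)
print("identified [a1, a2, b1, b2] =", np.round(theta, 3))
```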

1285 On-Line Geometrical Identification of Reconfigurable Machine Tool using Virtual Machining

Authors: Alexandru Epureanu, Virgil Teodor

Abstract:

One of the main research directions in the CAD/CAM machining area is the reduction of machining time. Feedrate scheduling is one of the advanced techniques that keeps the uncut chip area constant and, as a consequence, keeps the main cutting force constant. There are two main ways to optimize the feedrate. The first consists of monitoring the cutting force, which requires complex equipment for force measurement, and then setting the feedrate according to the cutting force variation. The second way is to optimize the feedrate by keeping the material removal rate constant with respect to the cutting conditions. In this paper, a new approach is proposed, using an extended database that replaces the system model. The feedrate schedule is determined based on the identification of the reconfigurable machine tool and on the feed value derived from the uncut chip section area, the contact length between tool and blank, and the geometrical roughness. The first stage consists of monitoring the blank and the tool to determine their actual profiles. The next stage is the determination of the programmed tool path that allows the target profile of the piece to be obtained. The graphic representation environment models the tool and blank regions and, after this, the tool model is positioned with respect to the blank model according to the programmed tool path. For each of these positions, the geometrical roughness value, the uncut chip area and the contact length between tool and blank are calculated. Each of these parameters is compared with its admissible value and, according to the result, the feed value is established. We can consider that this approach has the following advantages: in the case of complex cutting processes the prediction of the cutting force is possible; the real cutting profile, which deviates from the theoretical profile, is considered; limitation of the blank-tool contact length is possible; and it is possible to correct the programmed tool path so that the target profile can be obtained. Applying this method, data sets are obtained which allow feedrate scheduling so that the uncut chip area is constant and, as a result, the cutting force is constant, which allows the machine tool to be used more efficiently and the machining time to be reduced.

Keywords: Reconfigurable machine tool, system identification, uncut chip area, cutting conditions scheduling.

1284 Environmental Capacity and Sustainability of European Regional Airports: A Case Study

Authors: Nicola Gualandi, Luca Mantecchini, Davide Serrau

Abstract:

Airport capacity has traditionally been perceived as the number of aircraft operations during a specified time corresponding to a tolerable level of average delay, and it mostly depends on the airside characteristics, the fleet mix variability and the ATM. The adoption of Directive 2002/30/EC in the EU countries, however, drives stakeholders to conceive of airport capacity in a different way. Airport capacity in this sense is fundamentally driven by environmental criteria, and since acoustical externalities represent the most important factors, they are the ones that could pose a serious threat to the growth of airports, and to the aviation market itself, in the short to medium term. The importance of regional airports in the deregulated market grew fast during the last decade, since they represent spokes for network carriers and a preferential destination for low-fare carriers. Regional airports have witnessed not only fast and unexpected growth in traffic but also fast growth in complaints about nuisance from people living near those airports. In this paper, the results of a study conducted in cooperation with the airport of Bologna G. Marconi are presented, in order to investigate airport acoustical capacity as a de facto constraint on airport growth.

Keywords: Airport acoustical capacity, airport noise, air traffic noise, sustainability of regional airports.

1283 Bridging the Green-Value-Gap: A South African Approach

Authors: E.J. Cilliers

Abstract:

Green spaces might be very attractive, but where are the economic benefits? What value do nature and landscape have for us? What difference will it make to jobs, health and the economic strength of areas struggling with deprivation and social problems? [1]. There is a need to consider green spaces from a different perspective. Green planning is not just about flora and fauna, but also about planning for economic benefits [2]. It is worth trying to quantify the value of green spaces, since nature and landscape are crucially important to our quality of life and sustainable development. The reality, however, is that urban development often takes place at the expense of green spaces. Urbanization is an ongoing process throughout the world; however, hyper-urbanization without environmental planning is destructive, not constructive [3]. Urban spaces are believed to be more valuable than other land uses, particularly green areas, simply because of the market value connected to urban spaces. However, attractive landscapes can help raise the quality and value of the urban market even more. In order to reach these objectives of integrated planning, the Green-Value-Gap needs to be bridged. Economists have to understand the concept of green planning and its spin-offs, and environmentalists have to understand the importance of urban economic development and its benefits to green planning. An interface between Environmental Management, Economic Development and sustainable Spatial Planning is needed to bridge the Green-Value-Gap.

Keywords: Spatial Planning, Environmental Management, Green-Value-Gap, Compensation, Participation.

1282 The Ability of Forecasting the Term Structure of Interest Rates Based On Nelson-Siegel and Svensson Model

Authors: Tea Poklepović, Zdravka Aljinović, Branka Marasović

Abstract:

Due to the importance of the yield curve and its estimation, it is essential to have valid methods for yield curve forecasting in cases where there are scarce issues of securities and/or weak trade on a secondary market. Therefore in this paper, after the estimation of weekly yield curves on the Croatian financial market from October 2011 to August 2012 using the Nelson-Siegel and Svensson models, yield curves are forecasted using a vector autoregressive model and neural networks. In general, it can be concluded that both forecasting methods have good prediction abilities, where forecasting of yield curves based on the Nelson-Siegel estimation model gives better results, in the sense of lower Mean Squared Error, than forecasting based on the Svensson model. Also, in this case, neural networks provide slightly better results. Finally, it can be concluded that the most appropriate way of predicting the yield curve is neural networks using the Nelson-Siegel estimation of yield curves.

Keywords: Nelson-Siegel model, Neural networks, Svensson model, Vector autoregressive model, Yield curve.
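
A hedged sketch of the Nelson-Siegel estimation step: fit the four-parameter curve to observed zero-coupon yields with scipy. The maturities and yields below are illustrative placeholders, not the Croatian market data used in the paper; in the study, the weekly series of fitted parameters would then be forecast with a vector autoregressive model or a neural network.

```python
# Fit the Nelson-Siegel yield curve by nonlinear least squares.
import numpy as np
from scipy.optimize import curve_fit

def nelson_siegel(tau, beta0, beta1, beta2, lam):
    """Nelson-Siegel yield at maturity tau (years)."""
    x = tau / lam
    term = (1 - np.exp(-x)) / x
    return beta0 + beta1 * term + beta2 * (term - np.exp(-x))

maturities = np.array([0.25, 0.5, 1, 2, 3, 5, 7, 10])            # years
observed = np.array([3.1, 3.3, 3.6, 4.0, 4.3, 4.7, 4.9, 5.0])    # yields in %

params, _ = curve_fit(nelson_siegel, maturities, observed,
                      p0=[5.0, -2.0, 1.0, 1.5], maxfev=10000)
print("beta0, beta1, beta2, lambda =", np.round(params, 3))
print("fitted 4-year yield:", round(nelson_siegel(4.0, *params), 3))
```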

1281 On the Transition of Europe’s Power Sector: Economic Consequences of National Targets

Authors: Geoffrey J. Blanford, Christoph Weissbart

Abstract:

The prospects for the European power sector indicate that it has to almost fully decarbonize in order to reach the economy-wide target of CO2-emission reduction. We apply the EU-REGEN model to explain the penetration of RES from an economic perspective, their spatial distribution, and the complementary role of conventional generation technologies. Furthermore, we identify the economic consequences of national energy and climate targets. Our study shows that onshore wind power will be the most crucial generation technology for the future European power sector. Its geographic distribution is driven by resource quality. Gas power will be the major conventional generation technology for backing up wind power. Moreover, a complete phase-out of coal power proves not to be economically optimal. The paper demonstrates that existing national targets have a negative impact, especially on the German region, with higher prices and lower revenues. The profits of the remaining regions are hardly affected. We encourage EU-wide coordination on the expansion of wind power with harmonized policies. Yet, this requires profitable market structures for both RES and conventional generation technologies.

Keywords: European decarbonization pathway, power market investment, public policies, technology choice.

1280 The Influences of Marketing Mix on Customer Purchasing Behavior at Chatuchak Plaza Market

Authors: Bundit Pungnirund

Abstract:

The objective of this research was to study the influence of the marketing mix on customers' purchasing behavior. A total of 397 responses were collected from customers who were patrons of the Chatuchak Plaza market. A questionnaire was utilized as the tool to collect data. Statistics utilized in this research included frequency, percentage, mean, standard deviation, and multiple regression analysis. Data were analyzed by using the Statistical Package for the Social Sciences. The findings revealed that the majority of respondents were male, aged between 25-34 years old, held an undergraduate degree, and were married and living together. The average income of respondents was between 10,001-20,000 baht. In terms of occupation, the majority worked for private companies. The research analysis disclosed that three marketing-mix variables, price (X2), place (X3), and product (X1), had an influence on the frequency of customer purchasing. These three variables predict about 30 percent of purchasing frequency, using the equation Y1 = 6.851 + 0.921(X2) + 0.949(X3) + 0.591(X1). It was also found that, in terms of the marketing mix, two variables had an influence on the amount of customer purchasing: physical characteristics (X6) and process (X7). These two variables predict about 17 percent of purchasing amount, using the equation Y2 = 2276.88 + 2980.97(X6) + 2188.09(X7).

Keywords: Influences, Marketing Mix, Purchasing Behavior.
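
A minimal sketch of the multiple regression behind the reported equations, using statsmodels OLS. The data frame is synthetic (generated so that the fit roughly recovers the coefficients of the Y1 equation); in the study the predictors were the marketing-mix scores from the questionnaire and the responses were purchasing frequency (Y1) and purchasing amount (Y2).

```python
# OLS regression of purchasing frequency on marketing-mix scores.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 397
df = pd.DataFrame({
    "product_X1": rng.uniform(1, 5, n),
    "price_X2": rng.uniform(1, 5, n),
    "place_X3": rng.uniform(1, 5, n),
})
df["purchase_freq_Y1"] = (6.851 + 0.921 * df.price_X2 + 0.949 * df.place_X3
                          + 0.591 * df.product_X1 + rng.normal(0, 3, n))

X = sm.add_constant(df[["price_X2", "place_X3", "product_X1"]])
model = sm.OLS(df["purchase_freq_Y1"], X).fit()
print(model.params)                       # coefficients close to the reported equation
print("R-squared:", round(model.rsquared, 3))
```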

1279 Metaverse as a Form of Reality and the Impact of Metaverse in Higher Education

Authors: Josefina Bengoechea, Alex Bell

Abstract:

In the metaverse, characters are avatars acting in a three-dimensional virtual reality that exists beyond physical reality. The metaverse is a “post-reality universe”: a perpetual and persistent multiuser environment in which physical reality and digital virtuality are merged. The virtual infrastructure needed to build a metaverse (which is in the process of being created) comprises web3 technologies, non-fungible tokens (NFTs), blockchain, smart contracts, and cryptocurrencies. Web3 refers to a new iteration of the current web2, which is dominated by powerful providers like Google, Apple, Amazon, and other corporate tech companies. The vision for web3 is a decentralized, and thus more equitable, version of the web. The aim of this paper is, first, to present the metaverse as a form of reality in which physical reality and digital virtuality combine to provide new experiences to users; and second, to discuss the implications for education, specifically for higher education, and how programs will have to be modified so that the skills obtained by graduates match those demanded by the virtual labour market. This paper builds upon a constructivist approach, combining a literature review and research on key publications.

Keywords: Ethics in technology, cross realities, cryptocurrencies, labour market, metaverse, technology in higher education.

1278 Marketing Strategy Analysis of Thai Asia Pacific Brewery Company

Authors: Sinee Sankrusme

Abstract:

The study was a case study analysis of the Thai Asia Pacific Brewery Company. The purpose was to analyze the company's marketing objective, marketing strategy at the company level, and marketing mix before liquor liberalization in 2000. The methods used in this study were qualitative and descriptive research approaches, and the results were as follows: (1) the marketing objective was to increase the market share of Heineken and Amstel; (2) the company's marketing strategies were a brand-building strategy and a distribution strategy. Additionally, the company also pursued the following marketing mix strategies. Product strategy: the company added more beer brands, namely Amstel and Tiger, to provide additional choice to consumers, and carried out product and marketing research as well as product development. Price strategy: the company took cost, competitors, the market, the economic situation and tax into consideration. Promotion strategy: the company conducted sales promotion and advertising. Distribution strategy: the company extended its channels of distribution into food shops, pubs and various entertainment places. This strategy benefited interested persons and people engaged in the beer business.

Keywords: Marketing Strategy, Beer, Thai Asia Pacific Brewery Company.

1277 Implementation of an Improved Secure System Detection for E-passport by using EPC RFID Tags

Authors: A. Baith Mohamed, Ayman Abdel-Hamid, Kareem Youssri Mohamed

Abstract:

Current proposals for the E-passport or ID-Card are similar to a regular passport, with the addition of a tiny contactless integrated circuit (computer chip) inserted in the back cover, which will act as a secure storage device for the same data visually displayed on the photo page of the passport. In addition, it will include a digital photograph that will enable biometric comparison through the use of facial recognition technology at international borders. Moreover, the e-passport will have a new interface, incorporating additional antifraud and security features. However, its problems are reliability, security and privacy. Privacy is a serious issue since there is no encryption between the readers and the E-passport. Moreover, security issues such as authentication, data protection and control techniques cannot be embedded in one process. In this paper, the design and prototype implementation of an improved E-passport reader is presented. The passport holder is authenticated online by using the GSM network, which is the main interface between the identification center and the e-passport reader. The communication data between the server and the e-passport reader is protected by using AES to encrypt the data while it is transferred through the GSM network. Performance measurements indicate a 19% improvement in encryption cycles versus previously reported results.

Keywords: RFID "Radio Frequency Identification", EPC "Electronic Product Code", ICAO "International Civil Aviation Organization", IFF "Identify Friend or Foe"
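
A hedged sketch of protecting reader-server traffic with AES, using the Python `cryptography` package's AES-GCM mode. The paper does not specify the AES mode of operation or key management; GCM is shown here only because it also authenticates the ciphertext, and the key, payload and reader identifier are illustrative.

```python
# AES-GCM encryption of a passport record before it crosses the GSM link.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)    # shared between reader and server (assumed)
aesgcm = AESGCM(key)

passport_record = b"MRZ + facial template + chip ID"   # data read from the e-passport
nonce = os.urandom(12)                                 # unique per message
ciphertext = aesgcm.encrypt(nonce, passport_record, b"reader-42")

# ... ciphertext and nonce travel over the GSM network to the identification center ...
plaintext = aesgcm.decrypt(nonce, ciphertext, b"reader-42")
assert plaintext == passport_record
```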

1276 Object-Oriented Programming Strategies in C# for Power Conscious System

Authors: Kayun Chantarasathaporn, Chonawat Srisa-an

Abstract:

Low power consumption is a major constraint for battery-powered systems like notebook computers or PDAs. In the past, specialists usually designed both specially optimized equipment and code to address this concern. This approach worked for quite a long time; however, in this era, there is another significant constraint: time to market. To meet the power constraint while launching products in a shorter production period, object-oriented programming (OOP) has stepped into this field. Though OOP is known to have much more overhead than assembly and procedural languages, the development trend still heads toward this new world, which contradicts the target of low power consumption. Most prior power-related software research reported that OOP consumed considerable resources; however, as industry has had to accept it for business reasons, no papers have yet addressed how to choose the best OOP practice within this power-limited boundary. This article is a pioneering attempt to specify and propose optimized strategies for writing OOP software in an energy-conscious environment, based on quantitative real-world results. The language chosen for the study is C# on .NET Framework 2.0, which is one of the popular OOP development environments. The recommendations obtained from this research form a roadmap that can help developers write code that balances time to market against battery life.

Keywords: Low power consumption, object oriented programming, power conscious system, software.

1275 Through Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

Authors: Hariton N. Costin, Iulian Ciocoiu, Tudor Barbu, Cristian Rotariu

Abstract:

In this paper, three different approaches for person verification and identification, i.e. by means of fingerprints, face and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach. The assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Components Analysis (ICA); c) NMF with sparse constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection was approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. As for voice / speaker recognition, mel cepstral and delta-delta mel cepstral analysis were used as the main methods, in order to construct a supervised speaker-dependent voice recognition system. The final decision (e.g. "accept-reject" for a verification task) is taken by using a majority voting technique applied to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces and voice recordings, indicate the feasibility of our study and an overall recognition precision (about 92%) permitting the utilization of our system for a future complex biometric card.

Keywords: Biometry, image processing, pattern recognition, speech analysis.
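
A minimal sketch of the final decision stage described above: each of the three biometric matchers (face, fingerprint, voice) returns an accept/reject vote, and the holder is verified by majority voting. The individual matchers are stubbed out; their names, arguments and return values are illustrative assumptions, not the paper's implementation.

```python
# Majority voting over three biometric matchers.
from collections import Counter

def face_matcher(sample, template):         # e.g. LNMF/ICA/Laplacianfaces-based score
    return "accept"

def fingerprint_matcher(sample, template):  # minutiae matching + neural decision block
    return "accept"

def voice_matcher(sample, template):        # mel-cepstral speaker verification
    return "reject"

def verify(sample, template):
    votes = [face_matcher(sample, template),
             fingerprint_matcher(sample, template),
             voice_matcher(sample, template)]
    decision, count = Counter(votes).most_common(1)[0]
    return decision if count >= 2 else "reject"

print(verify("live capture", "card template"))   # -> "accept" (2 of 3 matchers agree)
```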

1274 Basic Business-Forces behind the Surviving and Sustainable Organizations: The Case of Medium Scale Contractors in South Africa

Authors: Iruka C. Anugwo, Winston M. Shakantu

Abstract:

The objective of this study is to uncover the basic business-forces that underpin the survival and sustainable performance of medium scale contractors in the South African construction market. This study is essential as it sets out to contribute towards long-term strategic solutions for combating the incessant failure of start-up construction organizations within South Africa. The study used a qualitative research methodology as the most appropriate approach to elicit, understand and uncover the phenomena that act as basic business-forces for the active contractors in the market. The study also adopted a phenomenological approach, and in-depth interviews were conducted with 20 medium scale contractors in Port Elizabeth, South Africa, between August and October 2015. This allowed for an in-depth understanding of the critical and basic business-forces that influenced their survival and performance beyond the first five years of business operation. Findings of the study showed that for potential contractors (start-ups) to survive in a competitive business environment such as the construction industry, they must possess the basic business-forces. These forces are educational knowledge in construction and business management related disciplines, adequate industrial experience, the competencies and capabilities to deliver excellent services and products, and embracing the spirit of entrepreneurship. It can therefore be concluded that, as a strategic approach to minimizing the endless failure of start-up construction businesses, potential construction contractors must endeavor to access and acquire basic educational knowledge, training and qualifications, and to gain industrial experience together with the required competencies, capabilities and entrepreneurial acumen. Without these basic business-forces, as discovered in this study, the majority of contractors entering the market will find it difficult to develop and grow a competitive and sustainable construction organization in South Africa.

Keywords: Basic business-forces, medium scale contractors, South Africa, sustainable organisations.

1273 Territorial Availability of Social and Economic Infrastructure in Kazakhstan: Comparative Analysis of Urban and Rural Households

Authors: Nazym Shedenova, Aigul Beimisheva

Abstract:

The market transformation in Kazakhstan during the last two decades has substantially widened the gap between the development of urban and rural areas. The implementation of market institutions, the transition from public financing to paid provision of social services, and changes in the forms of financing of social and economic infrastructure have led to growing economic inequality among social groups, including increasing stratification between the city and the village. A sociological survey of urban and rural households in Almaty city and villages of Almaty region has been carried out within the international research project “Livelihoods Strategies of Private Households in Central Asia: A Rural–Urban Comparison in Kazakhstan and Kyrgyzstan” (Germany, Kazakhstan, Kyrgyzstan). The analysis of statistical data and the results of the sociological research on urban and rural households allow us to reveal issues of territorial development, to investigate the availability of medical, educational and other services in the city and the village, to examine how urban and rural dwellers evaluate their living conditions, and to compare the economic strategies of households in the city and the village.

Keywords: Urban and rural households, social and economic infrastructure, territorial availability.

1272 Improving Worm Detection with Artificial Neural Networks through Feature Selection and Temporal Analysis Techniques

Authors: Dima Stopel, Zvi Boger, Robert Moskovitch, Yuval Shahar, Yuval Elovici

Abstract:

Computer worm detection is commonly performed by antivirus software tools that rely on prior explicit knowledge of the worm's code (detection based on code signatures). We present an approach for detecting the presence of computer worms based on Artificial Neural Networks (ANN) using the computer's behavioral measures. Identification of significant features, which describe the activity of a worm within a host, is commonly acquired from security experts. We suggest acquiring these features by applying feature selection methods. We compare three different feature selection techniques for dimensionality reduction and identification of the most prominent features, in order to capture the computer's behavior efficiently in the context of worm activity. Additionally, we explore three different temporal representation techniques for the most prominent features. In order to evaluate the different techniques, several computers were infected with five different worms, and 323 different features of the infected computers were measured. We evaluated each technique by preprocessing the dataset according to it and training the ANN model with the preprocessed data. We then evaluated the ability of the model to detect the presence of a new computer worm, in particular during heavy user activity on the infected computers.

Keywords: Artificial Neural Networks, Feature Selection, Temporal Analysis, Worm Detection.
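
A hedged sketch of the pipeline described above: select the most prominent behavioral features and train an ANN (multilayer perceptron) to flag worm activity. The feature matrix and labels are synthetic, and univariate F-score selection stands in for whichever of the three feature selection techniques the study compared; in the paper, 323 host measures were collected from computers infected with five worms.

```python
# Feature selection followed by an MLP classifier for worm-activity detection.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
X = rng.normal(size=(1000, 323))             # 323 behavioral measures per time window
y = rng.integers(0, 2, size=1000)            # 1 = worm activity present (synthetic)

selector = SelectKBest(f_classif, k=20)      # keep the 20 most discriminative features
X_sel = selector.fit_transform(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, test_size=0.3, random_state=0)
ann = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X_tr, y_tr)
print("detection accuracy on held-out windows:", round(ann.score(X_te, y_te), 3))
```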

1271 Design and Application of NFC-Based Identity and Access Management in Cloud Services

Authors: Shin-Jer Yang, Kai-Tai Yang

Abstract:

In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming more important, especially with Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag. This type of verification does not support mobile device login and user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests analyzed with Key Performance Indicators (KPIs) suggest that the NFC-IAM not only takes less time for identity identification but also cuts the time of two-factor authentication by 80% and improves verification accuracy to 99.9% or better. In the functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM app (application software) and back-end system, to be developed and deployed on mobile devices, support IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, our NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.

Keywords: Cloud service, multi-tenancy, NFC, IAM, mobile device.

1270 Embedded Semi-Fragile Signature Based Scheme for Ownership Identification and Color Image Authentication with Recovery

Authors: M. Hamad Hassan, S.A.M. Gilani

Abstract:

In this paper, a novel scheme is proposed for ownership identification and color image authentication by deploying cryptography and digital watermarking. The color image is first transformed from RGB to the YST color space, exclusively designed for watermarking. Following the color space transformation, each channel is divided into 4×4 non-overlapping blocks with selection of the central 2×2 sub-blocks. Depending upon the channel selected, two to three LSBs of each central 2×2 sub-block are set to zero to hold the ownership, authentication and recovery information. The size and position of the sub-block are important for correct localization, enhanced security and fast computation. As YS ⊥ T, it is suitable to embed the recovery information apart from the ownership and authentication information; therefore the 4×4 block of the T channel, along with the ownership information, is processed by SHA160 to compute a content-based hash that is unique and invulnerable to birthday attacks or hash collisions, instead of using MD5, which may raise the condition H(m)=H(m'). For recovery, the intensity mean of the 4×4 block of each channel is computed and encoded into eight bits. For watermark embedding, key-based mapping of blocks is performed using a 2D Torus Automorphism. Our scheme is oblivious, generates highly imperceptible images with correct localization of tampering within reasonable time, and has the ability to recover the original work with probability near one.

Keywords: Hash Collision, LSB, MD5, PSNR, SHA160
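
A minimal sketch of two building blocks mentioned above: hashing a 4×4 block together with ownership data using SHA-1 (the 160-bit "SHA160"), and embedding bits into the LSBs of the central 2×2 sub-block. The block values, owner string and the choice of two LSBs per pixel are illustrative; the paper's full scheme (YST transform, torus mapping, recovery coding) is not reproduced.

```python
# SHA-1 content hash of a 4x4 block plus LSB embedding into its central 2x2 sub-block.
import hashlib
import numpy as np

block = np.arange(16, dtype=np.uint8).reshape(4, 4)         # one 4x4 channel block
owner_id = b"OWNER-1234"                                     # hypothetical ownership info

digest = hashlib.sha1(block.tobytes() + owner_id).digest()   # 160-bit content-based hash
watermark_bits = np.unpackbits(np.frombuffer(digest, dtype=np.uint8))[:8]

central = block[1:3, 1:3].flatten()                          # central 2x2 sub-block
central &= 0b11111100                                        # clear two LSBs per pixel
for i, pixel in enumerate(central):
    central[i] = pixel | (watermark_bits[2 * i] << 1) | watermark_bits[2 * i + 1]
block[1:3, 1:3] = central.reshape(2, 2)
print(block)
```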

1269 Identification of Most Frequently Occurring Lexis in Body-enhancement Medicinal Unsolicited Bulk e-mails

Authors: Jatinderkumar R. Saini, Apurva A. Desai

Abstract:

e-mail has become an important means of electronic communication, but the viability of its usage is marred by Unsolicited Bulk e-mail (UBE) messages. UBE consists of many types, such as pornographic, virus-infected and 'cry-for-help' messages, as well as fake and fraudulent offers for jobs, winnings and medicines. UBE poses technical and socio-economic challenges to the usage of e-mail. To meet this challenge and combat this menace, we need to understand UBE. Towards this end, the current paper presents a content-based textual analysis of more than 2700 body-enhancement medicinal UBE. Technically, this is an application of Text Parsing and Tokenization for an unstructured textual document, and we approach it using Bag Of Words (BOW) and Vector Space Document Model techniques. We have attempted to identify the most frequently occurring lexis in the UBE documents that advertise various products for body enhancement. An analysis of the top 100 such lexis is also presented. We exhibit the relationship between the occurrence of a word from the identified lexis-set in a given UBE and the probability that the given UBE will be one advertising a fake medicinal product. To the best of our knowledge and survey of related literature, this is the first formal attempt at identification of the most frequently occurring lexis in such UBE by textual analysis. Finally, this is a sincere attempt to bring about alertness against, and mitigate the threat of, such luring but fake UBE.

Keywords: Body Enhancement, Lexis, Medicinal, Unsolicited Bulk e-mail (UBE), Vector Space Document Model, Viagra

1268 Adaptive Neuro-Fuzzy Inference System for Financial Trading using Intraday Seasonality Observation Model

Authors: A. Kablan

Abstract:

The prediction of financial time series is a very complicated process. If the efficient market hypothesis holds, then the predictability of most financial time series would be a rather controversial issue, due to the fact that the current price already contains all available information in the market. This paper extends the Adaptive Neuro-Fuzzy Inference System for High Frequency Trading, an expert system that is capable of using fuzzy reasoning combined with the pattern recognition capability of neural networks for high-frequency financial forecasting and trading. However, in order to eliminate unnecessary input in the training phase, a new event-based volatility model is proposed. Taking volatility and the scaling laws of financial time series into consideration has brought about the development of the Intraday Seasonality Observation Model. This new model allows the observation of specific events and seasonalities in data and subsequently removes any unnecessary data. This new event-based volatility model provides the ANFIS system with more accurate input and has increased the overall performance of the system.

Keywords: Adaptive Neuro-fuzzy Inference system, High Frequency Trading, Intraday Seasonality Observation Model.

1267 A Novel Approach for Coin Identification using Eigenvalues of Covariance Matrix, Hough Transform and Raster Scan Algorithms

Authors: J. Prakash, K. Rajesh

Abstract:

In this paper, we present a new method for coin identification. The proposed method adopts a hybrid scheme using the eigenvalues of the covariance matrix, the Circular Hough Transform (CHT) and Bresenham's circle algorithm. The statistical and geometrical properties of the small and large eigenvalues of the covariance matrix of a set of edge pixels over a connected region of support are explored for the purpose of circular object detection. The sparse matrix technique is used to perform the CHT. Since sparse matrices squeeze out zero elements and contain only a small number of non-zero elements, they provide an advantage in matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate position of the circumference pixels is identified using a raster scan algorithm, which exploits the geometrical symmetry property. After finding the circular objects, the proposed method uses the texture on the surface of the coins, called textons, which are unique properties of coins and refer to the fundamental micro-structure in generic natural images. This method has been tested on several real-world images, including coin and non-coin images. The performance is also evaluated based on the noise-withstanding capability.

Keywords: Circular Hough Transform, Coin detection, Covariance matrix, Eigenvalues, Raster scan Algorithm, Texton.
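
A hedged sketch of the circle-detection front end using OpenCV's Circular Hough Transform; the eigenvalue analysis, raster scan refinement and texton-based recognition stages of the paper are not reproduced. The image path and parameter values are placeholders.

```python
# Detect circular coin candidates with the Circular Hough Transform (OpenCV).
import cv2
import numpy as np

image = cv2.imread("coins.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical input image
assert image is not None, "provide a grayscale image of coins"
blurred = cv2.GaussianBlur(image, (9, 9), 2)

circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                           param1=100, param2=60, minRadius=20, maxRadius=120)

if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"coin candidate at ({x}, {y}) with radius {r}px")
        # each detected disc would then be passed to the texture/texton classifier
```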

1266 Evaluating Portfolio Performance by Highlighting Network Property and the Sharpe Ratio in the Stock Market

Authors: Zahra Hatami, Hesham Ali, David Volkman

Abstract:

Selecting a portfolio for investing is a crucial decision for individuals and legal entities. In the last two decades, with economic globalization, a stream of financial innovations has rushed to the aid of financial institutions. Selecting stocks for a portfolio is always a challenging task for investors. This study aims to create a financial network to identify optimal portfolios using network centrality metrics. This research presents a community detection technique for superior stocks that can be described as an optimal stock portfolio to be used by investors. By using the advantages of a network and its properties in the extracted communities, a group of stocks was selected for each of the various time periods. The performance of the optimal portfolios was compared to the well-known S&P 500 index. Their Sharpe ratios were calculated over time to evaluate their profitability for decision making. The analysis shows that portfolios selected from stocks with low centrality measurements can outperform the market; however, they have a lower Sharpe ratio than portfolios of stocks with high centrality scores. In other words, stocks with low centralities could outperform the S&P 500 yet have a lower Sharpe ratio than highly central stocks.

Keywords: Portfolio management performance, network analysis, centrality measurements, Sharpe ratio.
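
A minimal sketch of the idea described above: build a correlation network of stocks, rank them by a centrality measure, form a low-centrality portfolio, and compute its Sharpe ratio. The returns, tickers, correlation threshold and the use of degree centrality are illustrative assumptions; the paper's community detection step is not reproduced.

```python
# Correlation network, centrality ranking, and Sharpe ratio of the selected portfolio.
import numpy as np
import networkx as nx

rng = np.random.default_rng(5)
tickers = [f"S{i}" for i in range(30)]
returns = rng.normal(0.0005, 0.01, size=(252, 30))           # daily returns, one year

corr = np.corrcoef(returns.T)
G = nx.Graph()
for i in range(30):
    for j in range(i + 1, 30):
        if abs(corr[i, j]) > 0.3:                             # threshold to sparsify
            G.add_edge(tickers[i], tickers[j], weight=abs(corr[i, j]))

centrality = nx.degree_centrality(G)
low_central = sorted(tickers, key=lambda t: centrality.get(t, 0.0))[:10]

portfolio = returns[:, [tickers.index(t) for t in low_central]].mean(axis=1)
sharpe = portfolio.mean() / portfolio.std() * np.sqrt(252)    # annualized, rf ~ 0
print("low-centrality portfolio Sharpe ratio:", round(sharpe, 2))
```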

1265 Self-Perceived Employability of Students of International Relations of University of Warmia and Mazury in Poland

Authors: Marzena Świgoń

Abstract:

Nowadays, graduates should be prepared for serious challenges in the internal and external labor markets. The notion that a degree is a “passport to employment” has been relegated to the past. In the last few years, a phenomenon of increasing unemployment of highly educated young people has been observed in EU countries, including Poland. Empirical studies were conducted among Polish students in the scope of the so-called self-perceived employability review. In this study, a special scale was used which consisted of 19 statements regarding five components: the student's perception of the university; the field of study; self-belief; the state of the external labor market; and personal knowledge management. The respondent group consisted of final-year master's students of International Relations at the University of Warmia and Mazury in Olsztyn, Poland. The findings of the empirical studies were compiled using statistical methods: descriptive statistics and inferential statistics. In general, in light of the conducted studies, the self-perceived employability of the Polish students was not high. Limitations of the studies are discussed, as well as the implications for future research in the scope of students' employability.

Keywords: Self-perceived employability, students of international relations, university education.

1264 Extraction of Data from Web Pages: A Vision Based Approach

Authors: P. S. Hiremath, Siddu P. Algur

Abstract:

With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify the relevant pieces of information, since web pages are often cluttered with irrelevant content like advertisements, navigation panels, copyright notices, etc., surrounding the main content of the web page. Hence, tools for mining data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques to mine data regions from web pages are still unsatisfactory because of their poor performance and tag-dependence. In this paper, a novel method to automatically extract data items from web pages is proposed. It comprises two steps: (1) identification and extraction of data regions based on visual clue information, and (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed based on visual clues, which finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items. The EDIP technique is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions, irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than the existing techniques.

Keywords: Web data records, web data regions, web mining.

1263 Assessing the Theoretical Suitability of Sentinel-2 and WorldView-3 Data for Hydrocarbon Mapping of Spill Events, Using HYSS

Authors: K. Tunde Olagunju, C. Scott Allen, F.D. (Freek) van der Meer

Abstract:

Identification of hydrocarbon oil in remote sensing images is often the first step in monitoring oil during spill events. Most remote sensing methods adopt techniques for hydrocarbon identification to achieve detection, in order to model an appropriate cleanup program. Identification on optical sensors allows not only for detection but also for characterization and quantification. Until recently, in optical remote sensing, quantification and characterization were only potentially possible using high-resolution laboratory and airborne imaging spectrometers (hyperspectral data). Unlike multispectral data, hyperspectral data are not freely available, as this data category is mainly obtained via airborne survey at present. In this research, two operational high-resolution multispectral satellites (WorldView-3 and Sentinel-2) are theoretically assessed for their suitability for hydrocarbon characterization, using the Hydrocarbon Spectra Slope model (HYSS). This method utilizes the two most persistent hydrocarbon diagnostic/absorption features, at 1.73 µm and 2.30 µm, for hydrocarbon mapping on multispectral data. In this research, spectral measurements of seven different hydrocarbon oils (crude and refined) taken on 10 different substrates with a laboratory ASD FieldSpec were convolved to Sentinel-2 and WorldView-3 resolution using their full width at half maximum (FWHM) parameters. The resulting hydrocarbon slope values obtained from the studied samples enable clear qualitative discrimination of most hydrocarbons, despite the presence of different background substrates, particularly on WorldView-3. Due to the close conformity of central wavelengths and narrow bandwidths to the key hydrocarbon bands used in HYSS, the statistical significance of the qualitative analysis on the WorldView-3 sensor for all studied hydrocarbon oils returned with a 95% confidence level (p-value < 0.01), except for diesel. Using multifactor analysis of variance (MANOVA), the discriminating power of HYSS is statistically significant for most hydrocarbon-substrate combinations on Sentinel-2 and WorldView-3 FWHM, revealing the potential of these two operational multispectral sensors as rapid-response tools for hydrocarbon mapping. One notable exception is highly transmissive hydrocarbons on Sentinel-2 data, due to the non-conformity of its spectral bands with the key hydrocarbon absorptions and the relatively coarse bandwidth (> 100 nm).

Keywords: hydrocarbon, oil spill, remote sensing, hyperspectral, multispectral, hydrocarbon – substrate combination, Sentinel-2, WorldView-3
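
A hedged sketch of the band-convolution step: a laboratory reflectance spectrum is resampled to a multispectral band using a Gaussian response defined by the band centre and FWHM, and a simple slope between a shoulder band and the 1.73 µm feature is computed as a stand-in for the HYSS statistic. The synthetic spectrum, the band centres/FWHMs and the slope formula are illustrative assumptions, not the actual sensor response functions or the published HYSS definition.

```python
# Gaussian band convolution of a lab spectrum and a simple absorption-slope proxy.
import numpy as np

wavelengths = np.linspace(0.4, 2.5, 2101)                 # micrometres, lab spectrum grid
reflectance = (0.4
               - 0.1 * np.exp(-((wavelengths - 1.73) / 0.02) ** 2)   # 1.73 um feature
               - 0.1 * np.exp(-((wavelengths - 2.30) / 0.02) ** 2))  # 2.30 um feature

def band_value(center, fwhm):
    """Weight the spectrum by a Gaussian band response (center and FWHM in um)."""
    sigma = fwhm / 2.3548                                  # FWHM -> standard deviation
    response = np.exp(-0.5 * ((wavelengths - center) / sigma) ** 2)
    return np.sum(reflectance * response) / np.sum(response)

swir1 = band_value(center=1.73, fwhm=0.04)    # band near the 1.73 um feature
swir2 = band_value(center=2.30, fwhm=0.04)    # band near the 2.30 um feature
shoulder = band_value(center=1.60, fwhm=0.04)

slope = (shoulder - swir1) / (1.73 - 1.60)    # simple absorption-depth slope proxy
print("band values:", round(swir1, 3), round(swir2, 3), "slope proxy:", round(slope, 3))
```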

1262 From Electroencephalogram to Epileptic Seizures Detection by Using Artificial Neural Networks

Authors: Gaetano Zazzaro, Angelo Martone, Roberto V. Montaquila, Luigi Pavone

Abstract:

Seizures are the main factor that affects the quality of life of epileptic patients. The diagnosis of epilepsy, and hence the identification of the epileptogenic zone, is commonly made by using continuous Electroencephalogram (EEG) signal monitoring. Seizure identification on EEG signals is performed manually by epileptologists, and this process is usually very long and error prone. The aim of this paper is to describe an automated method able to detect seizures in EEG signals, using the knowledge discovery in databases process and data mining methods and algorithms, which can support physicians during the seizure detection process. Our detection method is based on an Artificial Neural Network classifier, trained by applying the multilayer perceptron algorithm, and on a software application, called Training Builder, that has been developed for the massive extraction of features from EEG signals. This tool is able to cover all the data preparation steps, ranging from signal processing to data analysis techniques, including the sliding window paradigm, dimensionality reduction algorithms, information theory, and feature selection measures. The final model shows excellent performance, reaching an accuracy of over 99% during tests on data from a single patient retrieved from a publicly available EEG dataset.

Keywords: Artificial Neural Network, Data Mining, Electroencephalogram, Epilepsy, Feature Extraction, Seizure Detection, Signal Processing.
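
A minimal sketch of the sliding-window feature extraction step mentioned above: an EEG channel is cut into fixed-length windows and a few simple per-window features (mean, variance, dominant frequency) are computed, producing the matrix that a multilayer-perceptron classifier would be trained on. The synthetic signal, sampling rate, window length and feature set are illustrative assumptions, not the Training Builder tool's actual feature catalogue.

```python
# Sliding-window feature extraction from a single EEG channel.
import numpy as np

fs = 256                                        # sampling rate in Hz (assumed)
t = np.arange(0, 60, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.default_rng(6).normal(size=t.size)

window, step = 2 * fs, fs                       # 2-second windows, 1-second step
features = []
for start in range(0, eeg.size - window + 1, step):
    seg = eeg[start:start + window]
    spectrum = np.abs(np.fft.rfft(seg)) ** 2
    freqs = np.fft.rfftfreq(window, d=1 / fs)
    features.append([seg.mean(), seg.var(), freqs[spectrum.argmax()]])

X = np.array(features)                          # one row per window -> classifier input
print(X.shape, X[:3].round(3))
```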
