Search results for: business models
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3395

2735 The Effect of Modification and Initial Concentration on Ammonia Removal from Leachate by Zeolite

Authors: Fulya Aydın, Ayşe Kuleyin

Abstract:

The purpose of this study is to investigate the capacity of natural Turkish zeolite for NH4-N removal from landfill leachate. The effects of modification and initial concentration on the removal of NH4-N from leachate were also investigated. The kinetics of adsorption of NH4-N have been discussed using three kinetic models, i.e., the pseudo-second-order model, the Elovich equation, and the intraparticle diffusion model. Kinetic parameters and correlation coefficients were determined. Equilibrium isotherms for the adsorption of NH4-N were analyzed using the Langmuir, Freundlich and Temkin isotherm models. The Langmuir isotherm model was found to best represent the data for NH4-N.
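
For reference, the standard forms of the kinetic and isotherm models named above are sketched below; the symbols follow common adsorption-literature notation, and the paper's exact parameterization may differ.

```latex
% Pseudo-second-order kinetics (q_t: uptake at time t, q_e: equilibrium uptake, k_2: rate constant)
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
% Elovich equation (\alpha: initial sorption rate, \beta: desorption constant)
q_t = \frac{1}{\beta}\ln(\alpha\beta) + \frac{1}{\beta}\ln t
% Intraparticle diffusion model (k_{id}: diffusion rate constant, C: boundary-layer term)
q_t = k_{id}\, t^{1/2} + C
% Langmuir isotherm (q_m: monolayer capacity, K_L: Langmuir constant, C_e: equilibrium concentration)
q_e = \frac{q_m K_L C_e}{1 + K_L C_e}
```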

Keywords: Leachate, ammonium, zeolite

2734 Exploring Socio-Economic Barriers of Green Entrepreneurship in Iran and Their Interactions Using Interpretive Structural Modeling

Authors: Younis Jabarzadeh, Rahim Sarvari, Negar Ahmadi Alghalandis

Abstract:

Entrepreneurship at both the individual and organizational level is one of the main driving forces in economic development; it fosters growth and competition, job creation and social development. Especially in developing countries, the role of entrepreneurship in economic and social prosperity is more pronounced. But the effect of global economic development on the environment is undeniable, especially in negative ways, and there is a need to rethink current business models and the way entrepreneurs act when introducing new businesses, in order to address and embed environmental issues and achieve sustainable development. In this paper, green or sustainable entrepreneurship is addressed in Iran to identify the challenges and barriers entrepreneurs in the economic and social sectors face in developing green business solutions. Sustainable or green entrepreneurship has been gaining interest among scholars in recent years, and addressing its challenges and barriers needs much more attention to fill the gap in the literature and smooth the path those entrepreneurs are pursuing. This research comprises two main phases: qualitative and quantitative. In the qualitative phase, after a thorough literature review, the fuzzy Delphi method is used to verify the identified challenges and barriers by gathering and surveying a panel of experts. In this phase, several other contextually relevant factors were added to the list of barriers and challenges mentioned in the literature. Then, in the quantitative phase, Interpretive Structural Modeling is applied to construct a network of interactions among the barriers identified in the previous phase (see the sketch below). Again, a panel of subject matter experts comprising academic and industry experts was surveyed. The results of this study can be used by policymakers in both the public and industry sectors to introduce more systematic solutions to eliminate those barriers and help entrepreneurs overcome the challenges of sustainable entrepreneurship. It also contributes to the literature as the first research of this type to deal with the barriers of sustainable entrepreneurship and explore their interactions.
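
A minimal sketch of the core Interpretive Structural Modeling computation (transitive closure of an initial reachability matrix followed by level partitioning). The barriers and matrix entries below are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical initial reachability matrix for 4 barriers
# (1 = row barrier influences column barrier; diagonal set to 1 by ISM convention).
M = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
])

# Transitive closure (Warshall): if i reaches k and k reaches j, then i reaches j.
R = M.copy()
n = len(R)
for k in range(n):
    for i in range(n):
        for j in range(n):
            R[i, j] = R[i, j] or (R[i, k] and R[k, j])

# Level partitioning: a barrier whose reachability set equals the intersection of its
# reachability and antecedent sets belongs to the current (top) level.
levels, remaining = [], set(range(n))
while remaining:
    level = []
    for i in list(remaining):
        reach = {j for j in remaining if R[i, j]}
        ante = {j for j in remaining if R[j, i]}
        if reach == reach & ante:
            level.append(i)
    levels.append(level)
    remaining -= set(level)

print(levels)  # barriers grouped from the top level downwards
```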

Keywords: Green entrepreneurship, barriers, Fuzzy Delphi Method, interpretive structural modeling.

2733 Classifying and Predicting Efficiencies Using Interval DEA Grid Setting

Authors: Yiannis G. Smirlis

Abstract:

The classification and prediction of efficiencies in Data Envelopment Analysis (DEA) is an important issue, especially in large-scale problems or when new units frequently enter the under-assessment set. In this paper, we contribute to the subject by proposing a grid structure based on interval segmentations of the range of values for the inputs and outputs. Such intervals, combined, define hyper-rectangles that partition the space of the problem. This structure, exploited by interval DEA models and a dominance relation, acts as a DEA pre-processor, enabling the classification and prediction of efficiency scores without applying any DEA models.
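
A minimal illustration of the pre-processing idea, not the authors' exact formulation: units are binned into hyper-rectangular grid cells defined by interval segmentation of each input/output range, and a simple dominance relation between cells flags units whose classification can be decided without running a DEA model. The data, the cell boundaries and the dominance rule below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(1, 10, size=(20, 2))   # inputs of 20 units (hypothetical data)
Y = rng.uniform(1, 10, size=(20, 1))   # outputs

def cell_index(values, n_bins=4):
    """Assign each unit to an interval segment (bin) per dimension."""
    lo, hi = values.min(axis=0), values.max(axis=0)
    return np.minimum(((values - lo) / (hi - lo + 1e-12) * n_bins).astype(int), n_bins - 1)

xc, yc = cell_index(X), cell_index(Y)

def dominates(a, b):
    """Illustrative dominance: unit a sits in a cell with no larger inputs and no smaller outputs than unit b."""
    return np.all(xc[a] <= xc[b]) and np.all(yc[a] >= yc[b]) and (
        np.any(xc[a] < xc[b]) or np.any(yc[a] > yc[b]))

# A dominated unit cannot be efficient, so it is pre-classified without any DEA run.
dominated = {b for b in range(len(X)) for a in range(len(X)) if a != b and dominates(a, b)}
print(f"{len(dominated)} of {len(X)} units pre-classified as inefficient")
```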

Keywords: Data envelopment analysis, interval DEA, efficiency classification, efficiency prediction.

2732 The Strategic Engine Model: Redefined Strategy Structure, as per Market- and Resource-Based Theory Application, Tested in the Automotive Industry

Authors: Krassimir Todorov

Abstract:

The purpose of the paper is to redefine the structure of corporate, business and functional strategy levels, established over the past several decades, into a conceptual model consisting of corporate, business and operations strategies that are reinforced by functional strategies. We propose a conceptual framework with a different perspective on the role of strategic operations, treating operations as a separate strategic level and repositioning the remaining functional strategies as supporting tools existing at all three levels. The proposed model is called 'the strategic engine', since the mutual relationships of its ingredients mirror the main elements and working principle of an internal combustion engine. Based on the theoretical essence of each strategic level, we show that the strategic engine model is useful for managers seeking to safeguard the competitive advantage of their companies. Each strategy level is examined through its basic elements. At the corporate level we examine the scope of the firm's products and its vertical and geographical coverage. At the business level, the point of interest is limited to the basic elements of SWOT analysis. At the operations level, the key research issue relates to the scope of the following performance indicators: cost, quality, speed, flexibility and dependability. In this respect, the paper provides a different view of the role of operations strategy within the overall strategy concept. We show that the theoretical essence of operations goes far beyond the scope of traditionally accepted business functions. Exploring the applications of resource-based theory and market-based theory within the strategic levels framework, we show that there is a logical consequence of the theoretical impact in corporate, business and operations strategy: at every strategic level, the validity of one theory gives way to that of the other. Practical application of the conceptual model is tested in the automotive industry. The proposed theoretical concept is inspired by a leading global automotive group, Inchcape PLC, listed on the London Stock Exchange and a constituent of the FTSE 250 Index.

Keywords: Business strategy, corporate strategy, functional strategies, operations strategy.

2731 Prediction of Computer and Video Game Playing Population: An Age Structured Model

Authors: T. K. Sriram, Joydip Dhar

Abstract:

Models based on stage structure have found varied applications in population studies. This paper proposes a stage-structured model to study trends in the computer and video game playing population of the US. The game playing population is divided into three compartments based on age group. After simulating the mathematical model, a forecast of the number of game players in each stage, as well as an approximation of the average age of game players in the future, is made.
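
The abstract does not reproduce the authors' equations; a generic three-compartment stage-structured system of the kind described, with recruitment, ageing and exit rates as hypothetical parameters, would take the form:

```latex
\begin{aligned}
\frac{dx_1}{dt} &= r(t) - (\mu_1 + \tau_1)\,x_1,\\
\frac{dx_2}{dt} &= \tau_1 x_1 - (\mu_2 + \tau_2)\,x_2,\\
\frac{dx_3}{dt} &= \tau_2 x_2 - \mu_3 x_3,
\end{aligned}
```

where x_i is the number of players in age group i, r(t) is the recruitment of new players into the youngest group, the tau_i are ageing (transition) rates and the mu_i are quit rates.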

Keywords: Age structure, Forecasting, Mathematical modeling, Stage structure.

2730 Information System Security Effectiveness Attributes: A Tanzanian Company Case Study

Authors: Nerey H. Mvungi, Mosses Makoko

Abstract:

In today's highly globalised and competitive world, access to information plays a key role in gaining an upper hand over business rivals. Hence, proper protection of such a crucial resource is core to any modern business. Implementing a successful information security system is centered on three pillars: a technical solution involving both software and hardware, information security controls that translate the policies and procedures into the system, and the people who implement them. This paper shows that a lot needs to be done by countries adopting information technology to process, store and distribute information in order to adequately secure such a core resource.

Keywords: security, information systems, controls, technology, practices.

2729 Hybrid Equity Warrants Pricing Formulation under Stochastic Dynamics

Authors: Teh Raihana Nazirah Roslan, Siti Zulaiha Ibrahim, Sharmila Karim

Abstract:

A warrant is a financial contract that confers the right, but not the obligation, to buy or sell a security at a certain price before expiration. The standard procedure of valuing equity warrants with call option pricing models such as the Black-Scholes model has been shown to contain many flaws, such as the assumptions of constant interest rate and constant volatility. In fact, existing alternative models were found to focus more on demonstrating pricing techniques than on empirical testing. Therefore, a mathematical model for pricing and analyzing equity warrants which comprises stochastic interest rates and stochastic volatility is essential to incorporate the dynamic relationships between the identified variables and reflect the real market. Here, the aim is to develop dynamic pricing formulations for hybrid equity warrants by incorporating stochastic interest rates from the Cox-Ingersoll-Ross (CIR) model, along with stochastic volatility from the Heston model. The development of the model involves the derivation of the stochastic differential equations that govern the model dynamics. The resulting equations, which involve a Cauchy problem and heat equations, are then solved using partial differential equation approaches. The analytical pricing formulas obtained in this study comply with the form of the analytical expressions embedded in the Black-Scholes model and other existing pricing models for equity warrants. This facilitates the use of the proposed formula for comparison purposes and further empirical study.
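
For reference, the standard building blocks named in the abstract take the following form under the risk-neutral measure; the notation is generic, and the paper's exact correlation structure and parameterization may differ.

```latex
\begin{aligned}
dS_t &= r_t S_t\,dt + \sqrt{v_t}\,S_t\,dW_t^{S}, \\
dv_t &= \kappa(\theta - v_t)\,dt + \sigma_v\sqrt{v_t}\,dW_t^{v} \quad \text{(Heston stochastic volatility)},\\
dr_t &= a(b - r_t)\,dt + \sigma_r\sqrt{r_t}\,dW_t^{r} \quad \text{(CIR stochastic interest rate)},
\end{aligned}
```

with a correlation typically imposed between the asset and variance drivers, e.g. dW_t^S dW_t^v = rho dt.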

Keywords: Cox-Ingersoll-Ross model, equity warrants, Heston model, hybrid models, stochastic.

2728 Specialized Reduced Models of Dynamic Flows in 2-Stroke Engines

Authors: S. Cagin, X. Fischer, E. Delacourt, N. Bourabaa, C. Morin, D. Coutellier, B. Carré, S. Loumé

Abstract:

The complexity of scavenging by ports and its impact on engine efficiency create the need to understand and model it as realistically as possible. However, there are few empirical scavenging models, and these are highly specialized. In a design optimization process, they appear very restricted and their field of use is limited. This paper presents a comparison of two methods to establish and reduce a model of the scavenging process in 2-stroke diesel engines. To address the lack of scavenging models, a CFD model has been developed and is used as the reference case. However, its large size requires a reduction. Two techniques have been tested according to their fields of application: the NTF method and neural networks. Both appear highly appropriate, drastically reducing the model's size (over 90% reduction) with a low relative error rate (under 10%). Furthermore, each method produces a reduced model that can be used in a distinct specialized field of application: the distribution of a quantity (mass fraction, for example) in the cylinder at each time step (pseudo-dynamic model), or the qualification of scavenging at the end of the process (pseudo-static model).
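
A minimal sketch of the neural-network route to a reduced model: a small regressor is trained on sampled CFD results (design parameters mapped to a scavenging quantity) and then queried in place of the full simulation. The feature names, target and data below are placeholders, not the paper's dataset or network architecture.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
# Placeholder CFD samples: port geometry / timing parameters -> residual-gas mass fraction.
X = rng.uniform(0, 1, size=(500, 4))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] - 0.3 * X[:, 3] + 0.01 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
surrogate.fit(X_tr, y_tr)

# The reduced model replaces the CFD run at a fraction of the cost; check its relative error.
rel_err = np.abs(surrogate.predict(X_te) - y_te).mean() / np.abs(y_te).mean()
print(f"mean relative error of the surrogate: {rel_err:.2%}")
```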

Keywords: Diesel engine, Design optimization, Model reduction, Neural network, NTF algorithm, Scavenging.

2727 Optimisation of Intermodal Transport Chain of Supermarkets on Isle of Wight, UK

Authors: Jingya Liu, Yue Wu, Jiabin Luo

Abstract:

This work investigates an intermodal transportation system for delivering goods from a Regional Distribution Centre to supermarkets on the Isle of Wight (IOW) via the port of Southampton or Portsmouth in the UK. We consider this integrated logistics chain as a 3-echelon transportation system. In such a system, there are two types of transport methods used to deliver goods across the Solent Channel: one is accompanied transport, which is used by most supermarkets on the IOW, such as Spar, Lidl and Co-operative Food; the other is unaccompanied transport, which is used by Aldi. Five transport scenarios are studied based on different transport modes and ferry routes. The aim is to determine an optimal delivery plan for supermarkets of different business scales on the IOW, in order to minimise total running cost, fuel consumption and carbon emissions. The problem is modelled as a vehicle routing problem with time windows and solved by a genetic algorithm. The computational results suggest that accompanied transport is more cost-efficient for small and medium business-scale supermarket chains on the IOW, while unaccompanied transport has the potential to improve the efficiency and effectiveness of large business-scale supermarket chains.
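
A minimal sketch of the kind of fitness evaluation used inside such a genetic algorithm: a candidate route is checked against delivery time windows and costed; ferry crossings and emission factors would enter as further cost terms. All store data, cost rates and penalties below are illustrative assumptions, not the IOW case data.

```python
# Illustrative stops: (travel time from previous stop in hours, window open, window close, service time)
route = [(1.0, 6.0, 9.0, 0.5), (0.5, 7.0, 10.0, 0.5), (1.5, 8.0, 12.0, 0.5)]
COST_PER_HOUR = 40.0   # assumed running cost rate
LATE_PENALTY = 1000.0  # penalty used by the GA to discourage infeasible chromosomes

def evaluate(route, start=5.0):
    """Return the penalised cost of one route (one GA chromosome)."""
    t, cost = start, 0.0
    for travel, open_t, close_t, service in route:
        t += travel
        cost += travel * COST_PER_HOUR
        t = max(t, open_t)      # wait if arriving before the window opens
        if t > close_t:         # time-window violation
            cost += LATE_PENALTY
        t += service
    return cost

print(evaluate(route))
```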

Keywords: Genetic algorithm, intermodal transport system, Isle of Wight, optimization, supermarket.

2726 On the Performance of Information Criteria in Latent Segment Models

Authors: Jaime R. S. Fonseca

Abstract:

Despite the widespread application of finite mixture models in segmentation, finite mixture model selection is still an important issue. In fact, the selection of an adequate number of segments is a key issue in deriving latent segment structures, and it is desirable that the selection criteria used for this end are effective. In order to select among several information criteria which may support the selection of the correct number of segments, we conduct a simulation study. In particular, this study is intended to determine which information criteria are more appropriate for mixture model selection when considering data sets with only categorical segmentation base variables. The generation of mixtures of multinomial data supports the proposed analysis. As a result, we establish a relationship between the level of measurement of segmentation variables and the performance of eleven information criteria. The criterion AIC3 shows better performance (it indicates the correct number of segments in the simulated structure more often) when referring to mixtures of multinomial segmentation base variables.
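
For a fitted mixture with maximized log-likelihood ln L, k free parameters and sample size n, the criteria most relevant to the result reported above take the usual forms (smaller is better); AIC3 differs from AIC only in its per-parameter penalty of 3.

```latex
\mathrm{AIC} = -2\ln L + 2k, \qquad
\mathrm{AIC3} = -2\ln L + 3k, \qquad
\mathrm{BIC} = -2\ln L + k\ln n .
```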

Keywords: Quantitative Methods, Multivariate Data Analysis, Clustering, Finite Mixture Models, Information Theoretical Criteria, Simulation experiments.

2725 TheAnalyzer: Clustering-Based System for Improving Business Productivity by Analyzing User Profiles to Enhance Human-Computer Interaction

Authors: D. S. A. Nanayakkara, K. J. P. G. Perera

Abstract:

E-commerce platforms have revolutionized the shopping experience, offering convenient ways for consumers to make purchases. To improve interactions with customers and optimize marketing strategies, it is essential for businesses to understand user behavior, preferences, and needs on these platforms. This paper focuses on recommending that businesses customize interactions with users based on their behavioral patterns, leveraging data-driven analysis and machine learning techniques. Businesses can improve engagement and boost the adoption of e-commerce platforms by aligning behavioral patterns with user goals of usability and satisfaction. We propose TheAnalyzer, a clustering-based system designed to enhance business productivity by analyzing user profiles and improving human-computer interaction. TheAnalyzer seamlessly integrates with business applications, collecting relevant data points based on users' natural interactions without additional burdens such as questionnaires or surveys. It defines five key user analytics as features for its dataset, which are easily captured through users' interactions with e-commerce platforms. This research presents a study demonstrating the successful distinction of users into specific groups based on the five key analytics considered by TheAnalyzer. With the assistance of domain experts, customized business rules can be attached to each group, enabling TheAnalyzer to influence business applications and provide an enhanced, personalized user experience. The outcomes are evaluated quantitatively and qualitatively, demonstrating that utilizing TheAnalyzer's capabilities can optimize business outcomes, enhance customer satisfaction, and drive sustainable growth. The findings of this research contribute to the advancement of personalized interactions in e-commerce platforms. By leveraging user behavioral patterns and analyzing both new and existing users, businesses can effectively tailor their interactions to improve customer satisfaction and loyalty, and ultimately drive sales.
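
A minimal sketch of the kind of pipeline the keywords describe (standardization, dimensionality reduction, clustering). The five analytics named in the comment are hypothetical placeholders, not the features actually used by TheAnalyzer.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
# Hypothetical user analytics: session length, pages per visit, cart adds, searches, return frequency.
users = rng.gamma(shape=2.0, scale=1.0, size=(1000, 5))

pipeline = make_pipeline(
    StandardScaler(),        # put the analytics on a common scale
    PCA(n_components=2),     # reduce dimensionality before clustering
    KMeans(n_clusters=4, n_init=10, random_state=0),
)
labels = pipeline.fit_predict(users)

# Each resulting cluster can then be attached to a customized business rule.
print(np.bincount(labels))
```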

Keywords: Data clustering, data standardization, dimensionality reduction, human-computer interaction, user profiling.

2724 Mapping Knowledge Model Onto Java Codes

Authors: B. A. Gobin, R. K. Subramanian

Abstract:

This paper gives an overview of the mapping mechanism of SEAM, a methodology for the automatic generation of knowledge models, and its mapping onto Java code. It discusses the rules that will be used to automatically map the different components in the knowledge model onto Java classes, properties and methods. The aim of developing this mechanism is to help in the creation of a prototype which will be used to validate the knowledge model that has been generated automatically. It will also help to link the modeling phase with the implementation phase, as existing knowledge engineering methodologies do not provide proper guidelines for the transition from the knowledge modeling phase to the development phase. This will decrease the development overheads associated with the development of knowledge-based systems.
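
The paper's exact SEAM mapping rules are not reproduced here; the sketch below only illustrates the general idea of such a mapping (a concept becomes a Java class, its attributes become private fields with accessors). The concept definition and naming convention are hypothetical.

```python
# Hypothetical knowledge-model concept: a name plus typed attributes.
concept = {"name": "Customer", "attributes": [("id", "int"), ("fullName", "String")]}

def to_java_class(concept):
    """Map a concept to a Java class stub: attributes become private fields plus getters."""
    lines = [f"public class {concept['name']} {{"]
    for attr, jtype in concept["attributes"]:
        lines.append(f"    private {jtype} {attr};")
    for attr, jtype in concept["attributes"]:
        lines.append(f"    public {jtype} get{attr[0].upper()}{attr[1:]}() {{ return {attr}; }}")
    lines.append("}")
    return "\n".join(lines)

print(to_java_class(concept))
```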

Keywords: KBS, OWL, ontology, knowledge models

2723 Review of Studies on Agility in Knowledge Management

Authors: Ferdi Sönmez, Başak Buluz

Abstract:

Agility in Knowledge Management (AKM) tries to capture agility requirements and their respective answers within the framework of knowledge and learning for organizations. Since it is a rather new construct, it is difficult to claim that it has been sufficiently discussed and analyzed in practical and theoretical realms. Like the term 'agile learning', it is commonly addressed in the software development and information technology fields and across the related areas where those technologies can be applied. The organizational perspective on AKM seems to need some more time to become scholarly mature. Nevertheless, one can occasionally come across implicit usages of this term in the literature. This research aims to explore the conceptual background of agility in KM, re-conceptualize it and extend it to business applications with a special focus on e-business.

Keywords: Knowledge management, agility requirements, agility in knowledge management, knowledge.

2722 Wangle the Organizational Internal and External Knowledge – A New Horizon for Sustaining the Business Stability

Authors: Asim N., M. Mazhar Manzoor, Shariq A.

Abstract:

Knowledge is renowned as a significant component for sustaining competitive advantage and gives a leading edge in business. This study emphasizes the proper and effective utilization of internal and external knowledge (whether explicit or tacit) that comes from stakeholders, which is highly supportive in combating challenges and enhancing organizational productivity. Furthermore, it proposes a model in the context of the IRSA framework that facilitates the flow of knowledge and experience sharing among employees within the organization. The discussion section presents an innovative model that incorporates all the functionality identified in the analysis section.

Keywords: Effective Decision-Making, Internal & External Knowledge, Knowledge Management, Tacit & Explicit Knowledge.

2721 A 15 Minute-Based Approach for Berth Allocation and Quay Crane Assignment

Authors: Hoi-Lam Ma, Sai-Ho Chung

Abstract:

In traditional integrated berth allocation with quay crane assignment models, the time dimension is usually assumed to be hourly based. However, nowadays, transshipment has become the main business of many container terminals, especially in Southeast Asia (e.g. Hong Kong and Singapore). In these terminals, vessel arrivals are usually very frequent, with small handling volumes and very short staying times. Therefore, the traditional hourly-based modeling approach may cause significant berth and quay crane idling and consequently cannot meet their practical needs. In this connection, a 15-minute-based modeling approach is requested by industrial practitioners. Accordingly, a Three-level Genetic Algorithm (3LGA) with Quay Crane (QC) shifting heuristics is designed to fill this research gap. The objective function here is to minimize the total service time. Preliminary numerical results show that the proposed 15-minute-based approach can reduce berth and QC idling significantly.
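
A minimal illustration of why the finer grid matters: berth occupation expressed in 15-minute slots reserves far less capacity than the same schedule rounded up to whole hours. The vessel calls below are illustrative, not terminal data.

```python
import math

# (arrival hour, handling time in hours) for a few short transshipment calls (illustrative)
vessels = [(0.0, 1.25), (1.5, 0.75), (2.5, 1.5)]

def occupied(slot_hours):
    """Total berth time reserved when handling times are rounded up to the slot size."""
    return sum(math.ceil(h / slot_hours) * slot_hours for _, h in vessels)

print("hourly slots :", occupied(1.0), "h reserved")
print("15-min slots :", occupied(0.25), "h reserved")  # less idling between calls
```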

Keywords: Transshipment, integrated berth allocation, variable-in-time quay crane assignment, quay crane assignment.

2720 Identification, Prediction and Detection of the Process Fault in a Cement Rotary Kiln by Locally Linear Neuro-Fuzzy Technique

Authors: Masoud Sadeghian, Alireza Fatehi

Abstract:

In this paper, we use a nonlinear system identification method to predict and detect process faults of a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To capture the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used. This model is trained by the LOLIMOT algorithm, an incremental tree-structure algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln. One of the models is for the normal condition of the kiln, with a 15-minute prediction horizon. The other two models, for the two faulty situations in the kiln, have a 7-minute prediction horizon. Finally, we detect these faults in validation data. The data collected from the White Saveh Cement Company are used in this study.
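
The locally linear neuro-fuzzy structure identified by LOLIMOT has the standard form of a weighted sum of local linear models with normalized Gaussian validity functions; the notation below is generic, not the paper's specific input set.

```latex
\hat{y}(\mathbf{u}) = \sum_{i=1}^{M}\big(w_{i0} + w_{i1}u_1 + \dots + w_{ip}u_p\big)\,\Phi_i(\mathbf{u}),
\qquad
\Phi_i(\mathbf{u}) = \frac{\mu_i(\mathbf{u})}{\sum_{j=1}^{M}\mu_j(\mathbf{u})},
\qquad
\mu_i(\mathbf{u}) = \exp\!\Big(-\tfrac{1}{2}\sum_{k=1}^{p}\frac{(u_k-c_{ik})^2}{\sigma_{ik}^2}\Big),
```

where LOLIMOT grows the set of M local models incrementally by splitting the worst-performing partition of the input space.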

Keywords: Cement Rotary Kiln, Fault Detection, Delay Estimation Method, Locally Linear Neuro Fuzzy Model, LOLIMOT.

2719 Chinese Entrepreneurship in the Internet Age: Lessons from Alibaba.com

Authors: Linda Sau-ling LAI

Abstract:

The story of Alibaba demonstrates a credible example of how a small start-up company can eventually make it big in the global economy through the Internet. This case study does not attempt to present Alibaba as a perfect formula; rather, it discusses the strategies carried out by the firm and, in the process, culls out the important lessons that can guide start-ups and aspiring entrepreneurs in the complex world of online trading. Similar to the interesting and exotic Asian cuisine that continuously evolves from the diversity of Asia's people and their unique culture and personality, Alibaba has successfully transformed itself over the years, adapting to the changes in and demands of online business-to-business (B2B) commerce.

Keywords: Entrepreneurship, electronic commerce, leadership, business model, small and medium enterprises.

2718 Institutional Aspects of Information Security in Russian Economy

Authors: Mingaleva Zhanna, Kapuskina Tatiana

Abstract:

The article touches upon questions of information security in the Russian economy. It covers the theoretical bases of information security and the causes of its development. The theory is supported by an analysis of business activities and the main tendencies of information security development. The Perm region has been chosen as the basis for the analysis, being the fastest-developing region that uses information security methods in managing its economy. As a result of the study, the authors have formulated their own vision of the problem of information security in various branches of the economy and stated the prospects of information security development and its growing role in the Russian economy.

Keywords: security of business, management of information security, institutional analyses.

2717 Multi-models Approach for Describing and Verifying Constraints Based Interactive Systems

Authors: Mamoun Sqali, Mohamed Wassim Trojet

Abstract:

Requirements analysis, modeling, and simulation have consistently been among the main challenges during the development of complex systems. Scenarios and state machines are two successful models for describing the behavior of an interactive system. Scenarios represent examples of system execution in the form of sequences of messages exchanged between objects and give a partial view of the system. In contrast, state machines can represent the overall system behavior. Automating the transformation of scenarios into state machines provides some answers to various problems, such as system behavior validation and scenario consistency checking. In this paper, we propose a method for translating scenarios into state machines represented by the Discrete Event System Specification (DEVS) formalism, together with a procedure to detect implied scenarios. Each induced DEVS model represents the behavior of one object of the system. The global system behavior is described by coupling the atomic DEVS models and validated through simulation. We improve the validation process by integrating formal methods to eliminate logical inconsistencies in the global model. To that end, we use the Z notation.
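
A minimal skeleton of an atomic DEVS model (state, time advance ta, internal/external transition functions and output function), of the kind each object's behavior would be translated into. The states and events used here are hypothetical, not taken from the paper.

```python
class AtomicDEVS:
    """Minimal atomic DEVS skeleton: state set, ta, delta_int, delta_ext and output (lambda)."""

    def __init__(self):
        self.state = "idle"

    def ta(self):
        # Time advance: how long the model remains in the current state.
        return float("inf") if self.state == "idle" else 2.0

    def delta_ext(self, event, elapsed):
        # External transition: react to an incoming event after 'elapsed' time in the state.
        if self.state == "idle" and event == "request":
            self.state = "busy"

    def delta_int(self):
        # Internal transition: fires when ta() expires.
        self.state = "idle"

    def output(self):
        # Output function (lambda): emitted just before the internal transition.
        return "response" if self.state == "busy" else None
```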

Keywords: Scenarios, DEVS, synthesis, validation and verification, simulation, formal verification, Z notation.

2716 Hybrid Project Management Model Based on Lean and Agile Approach

Authors: F. Z. Eddoug, J. Benhra, R. Benabbou

Abstract:

Excellence and success are the ultimate goals of any project, and in order to achieve them, every project manager looks for convenient tools and methods. This work proposes a framework that seeks efficient management of general projects through a lean and agile approach. To reach this objective, the work was divided into two stages: the first one focused on exploring and analyzing existing project management models, and in the second one the desired framework was created, starting from seven existing models and then proposing, for each phase of the framework, the appropriate lean and agile tools.

Keywords: Agility, hybrid project management, lean, scrum.

2715 Co-payment Strategies for Chronic Medications: A Qualitative and Comparative Analysis at European Level

Authors: Pedro M. Abreu, Bruno R. Mendes

Abstract:

The management of pharmacotherapy and the process of dispensing medicines are becoming critical in clinical pharmacy due to the increasing incidence and prevalence of chronic diseases, the complexity and customization of therapeutic regimens, the introduction of innovative and more expensive medicines, the unbalanced relation between expenditure and revenue, and the lack of rationalization associated with medication use. For these reasons, co-payments emerged in Europe in the 1970s and have been applied in healthcare over the past few years. Co-payments lead to a rationing and rationalization of user access to healthcare services and products and, simultaneously, to a qualification and improvement of the services and products for the end-user. This analysis, focused on hospital practices in particular and co-payment strategies in general, was carried out across all European regions and identified four reference countries that apply this tool repeatedly and with different approaches. The structure, content and adaptation of European co-payments were analyzed through 7 qualitative attributes and 19 performance indicators, and the results were expressed in a scorecard, allowing the conclusion that the German models (total scores of 68.2% and 63.6% for the two selected co-payments) achieve greater compliance and effectiveness, the English models (total score of 50%) are more accessible, and the French models (total score of 50%) are better suited to the socio-economic and legal framework. Other European models did not show the same quality and/or performance, so they were not taken as a standard for the future design of co-payment strategies. In this sense, co-payments can be seen as a strategy not only to moderate the consumption of healthcare products and services, but especially to improve them, as well as to increase the value that the end-user assigns to these services and products, such as medicines.

Keywords: Clinical pharmacy, co-payments, healthcare, medicines.

2714 On Hyperbolic Gompertz Growth Model

Authors: Angela Unna Chukwu, Samuel Oluwafemi Oyamakin

Abstract:

We propose a Hyperbolic Gompertz Growth Model (HGGM), developed by introducing an allometric shape parameter. This was achieved by convoluting the hyperbolic sine function with the intrinsic rate of growth in the classical Gompertz growth equation. The resulting integral solution, obtained deterministically, was reprogrammed into a statistical model and used in modeling the height and diameter of pines (Pinus caribaea). Its predictive ability was compared with that of the classical Gompertz growth model using goodness-of-fit tests and model selection criteria; the approach mimics the natural variability of height/diameter increment with respect to age and therefore provides more realistic height/diameter predictions. The Kolmogorov-Smirnov and Shapiro-Wilk tests were also used to check the compliance of the error term with the normality assumption, while the independence of the error term was confirmed using the runs test. The mean function of top height/Dbh over age predicted the observed values of top height/Dbh more closely under the hyperbolic Gompertz growth model than under the source model (the classical Gompertz growth model), while the R2, adjusted R2, MSE and AIC results confirmed the predictive power of the hyperbolic Gompertz growth model over its source model.
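
For reference, the classical Gompertz growth curve against which the proposed model is compared can be written as below; the hyperbolic variant modifies the constant intrinsic rate k through the hyperbolic sine function, and its exact expression is given in the paper (only the classical baseline is shown here).

```latex
H(t) = A\,\exp\!\big(-b\,e^{-kt}\big),
\qquad
\frac{dH}{dt} = k\,H\,\ln\!\frac{A}{H},
```

where A is the asymptotic height/diameter, b a location parameter and k the intrinsic growth rate.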

Keywords: Height, Dbh, forest, Pinus caribaea, hyperbolic, Gompertz.

2713 Efficient Tuning Parameter Selection by Cross-Validated Score in High Dimensional Models

Authors: Yoonsuh Jung

Abstract:

As DNA microarray data contain relatively small sample sizes compared to the number of genes, high dimensional models are often employed. In high dimensional models, the selection of the tuning parameter (or penalty parameter) is often one of the crucial parts of the modeling. Cross-validation is one of the most common methods for tuning parameter selection; it selects the parameter value with the smallest cross-validated score. However, selecting a single value as the 'optimal' value for the parameter can be very unstable due to sampling variation, since the sample sizes of microarray data are often small. Our approach is to choose multiple candidates for the tuning parameter first, and then average the candidates with different weights depending on their performance. The additional step of estimating the weights and averaging the candidates rarely increases the computational cost, while it can considerably improve on traditional cross-validation. We show, via real data and simulated data sets, that the value selected by the suggested methods often leads to stable parameter selection as well as improved detection of significant genetic variables compared to traditional cross-validation.
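
A minimal sketch of the idea, not the authors' exact weighting scheme: instead of keeping only the single penalty value with the lowest cross-validated error, several of the best candidates are retained and averaged with weights derived from their CV scores. The model, grid and inverse-error weights below are illustrative choices.

```python
import numpy as np
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold, cross_val_score
from sklearn.datasets import make_regression

# Small-n, large-p data, loosely mimicking a microarray setting (synthetic).
X, y = make_regression(n_samples=60, n_features=200, n_informative=10, noise=1.0, random_state=0)
lambdas = np.logspace(-2, 1, 20)

# Cross-validated error for every candidate tuning parameter.
cv = KFold(n_splits=5, shuffle=True, random_state=0)
errors = np.array([
    -cross_val_score(Lasso(alpha=lam, max_iter=10000), X, y, cv=cv,
                     scoring="neg_mean_squared_error").mean()
    for lam in lambdas
])

# Keep the few best candidates and weight them inversely to their CV error (illustrative weights).
best = np.argsort(errors)[:5]
weights = 1.0 / errors[best]
lam_avg = np.sum(weights * lambdas[best]) / weights.sum()
print("single best lambda:", lambdas[errors.argmin()], " weighted-average lambda:", lam_avg)
```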

Keywords: Cross Validation, Parameter Averaging, Parameter Selection, Regularization Parameter Search.

2712 Seismic Vulnerability Assessment of Buildings in Algiers Area

Authors: F. Lazzali, M. Farsi

Abstract:

Several models of vulnerability assessment have been proposed. The selection of one of these models depends on the objectives of the study. The classical methodologies for seismic vulnerability analysis, as a part of seismic risk analysis, have been formulated with statistical criteria based on rapid observation. The information relating to the buildings' performance is statistically elaborated. In this paper, we use the European Macroseismic Scale EMS-98 to define the relationship between damage and macroseismic intensity in order to assess seismic vulnerability. Applying the approach to the Algiers area, the first step is to identify building typologies and to assign vulnerability classes. In the second step, damage is investigated according to EMS-98.

Keywords: Damage, EMS-98, inventory building, vulnerability classes

2711 Students as Global Citizens: Lessons from the International Study Tour

Authors: Ana Hol

Abstract:

Study and work operations are being transformed by the use of technologies and are consequently becoming global. This paper outlines lessons learned from the international study tour that Australian Bachelor of Information Systems students undertook. This research identifies that, for the study tour to be successful, students need to gain the skills that global citizens require. For example, students need to gain an understanding of local cultures, customs and habits. Furthermore, students also need to gain an understanding of how the field of their future career expertise operates in the host country, how study and business are conducted internationally, which tools and technologies are currently being utilized on a global scale, what trends drive future developments world-wide, and how business negotiations and collaborations are undertaken across borders. Furthermore, this research provides a guide to educators who are planning, guiding and running study tours, as it outlines the requirements of having a pre-tour preparatory session, carefully planned and executed tour itineraries, and post-tour sessions during which students can reflect on their experiences and lessons learned so that they can apply them to future international business visits and ventures.

Keywords: Global education, international experiences, international study tours, students as global citizens, student centered education.

2710 Effects of Level Densities and Those of a-Parameter in the Framework of Preequilibrium Model for 63,65Cu(n,xp) Reactions in Neutrons at 9 to 15 MeV

Authors: L. Yettou

Abstract:

In this study, calculations of the proton emission spectra produced by the 63Cu(n,xp) and 65Cu(n,xp) reactions are performed in the framework of preequilibrium models using the EMPIRE and TALYS codes. Exciton Model predictions combined with the Kalbach angular distribution systematics and the Hybrid Monte Carlo Simulation (HMS) were used. The effects of level densities and of the level density a-parameter have been investigated in our calculations. The comparison with experimental data shows a clear improvement over the Exciton Model and HMS calculations.
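
The a-parameter enters through the Fermi-gas level density commonly used in such codes; the basic form is recalled below (the prescriptions implemented in EMPIRE and TALYS include further refinements such as energy-dependent a and back-shifts).

```latex
\rho(U) \simeq \frac{\sqrt{\pi}}{12}\,
\frac{\exp\!\big(2\sqrt{aU}\big)}{a^{1/4}\,U^{5/4}},
```

where U is the (back-shifted) excitation energy and a the level density parameter.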

Keywords: Preequilibrium models, level density, level density a-parameter, 63Cu(n, xp) and 65Cu(n, xp) reactions.

2709 Comparison of Methods of Estimation for Use in Goodness of Fit Tests for Binary Multilevel Models

Authors: I. V. Pinto, M. R. Sooriyarachchi

Abstract:

It can frequently be observed that data arising in our environment have a hierarchical or nested structure attached to them. Multilevel modelling is a modern approach to handling this kind of data. When multilevel modelling is combined with a binary response, the estimation methods become complex in nature and the usual techniques are derived from the quasi-likelihood method. The estimation methods compared in this study are marginal quasi-likelihood of order 1 and order 2 (MQL1, MQL2) and penalized quasi-likelihood of order 1 and order 2 (PQL1, PQL2). A statistical model is of no use if it does not reflect the given dataset. Therefore, checking the adequacy of the fitted model through a goodness-of-fit (GOF) test is an essential stage in any modelling procedure. However, prior to usage, it is equally important to confirm that the GOF test performs well and is suitable for the given model. This study assesses the suitability of the GOF test developed for binary response multilevel models with respect to the method used in model estimation. An extensive set of simulations was conducted using MLwiN (v 2.19) with varying numbers of clusters, cluster sizes and intra-cluster correlations. The test maintained the desirable Type-I error for models estimated using PQL2, and it failed for almost all the combinations of MQL. The power of the test was adequate for most of the combinations under all estimation methods except MQL1. Moreover, models were fitted to a real-life dataset using the four methods, and the performance of the test was compared for each model.
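
The binary multilevel models in question are typically of the random-intercept logistic form (generic notation, not the study's specific covariates):

```latex
\operatorname{logit}\big(\Pr(y_{ij}=1)\big) = \beta_0 + \boldsymbol{\beta}^{\top}\mathbf{x}_{ij} + u_j,
\qquad u_j \sim N(0,\sigma_u^2),
```

where i indexes units within cluster j. MQL and PQL differ in whether the linearizing Taylor expansion is taken around zero random effects (MQL) or around the current estimated residuals (PQL), and in the order of that expansion (1 or 2).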

Keywords: Goodness-of-fit test, marginal quasi-likelihood, multilevel modelling, type-I error, penalized quasi-likelihood, power, quasi-likelihood.

2708 Photogrammetry and GIS Integration for Archaeological Documentation of Ahl-Alkahf, Jordan

Authors: Rami Al-Ruzouq, Abdallah Al-Zoubi, Abdel-Rahman Abueladas, Petya Dimitrova

Abstract:

Protection and proper management of archaeological heritage are essential to studying and interpreting it for present and future generations. Protecting the archaeological heritage is based upon multidisciplinary professional collaboration. This study aims to gather data from different sources, photogrammetry and a Geographic Information System (GIS), integrated for the purpose of documenting one of the significant archaeological sites in Jordan (Ahl-Alkahf). 3D modeling deals with the actual image of the features, shapes and textures to represent reality as realistically as possible. The 3D coordinates that result from the photogrammetric adjustment procedures are used to create 3D models of the study area. Adding textures to the 3D model surfaces gives a 'real world' appearance to the displayed models. The GIS combines all data, including boundary maps indicating the location of archaeological sites, a transportation layer, a digital elevation model and orthoimages. For a realistic representation of the study area, a 3D GIS model was prepared, in which efficient generation, management and visualization of such spatial data can be achieved.

Keywords: Archaeology, close range photogrammetry, ortho-photo, 3D-GIS

2707 Implementation of State-Space and Super-Element Techniques for the Modeling and Control of Smart Structures with Damping Characteristics

Authors: Nader Ghareeb, Rüdiger Schmidt

Abstract:

Minimizing weight in flexible structures means reducing material and costs as well. However, these structures could become prone to vibrations. Attenuating these vibrations has become a pivotal engineering problem and has shifted the focus of many research endeavors. One technique to do this is to design and implement an active control system. Such a system is mainly composed of a vibrating structure, a sensor to perceive the vibrations, an actuator to counteract the influence of disturbances, and finally a controller to generate the appropriate control signals. In this work, two different techniques are explored to create two different mathematical models of an active control system. The first model is a finite element model with a reduced number of nodes, called a super-element. The second model is in the form of a state-space representation, i.e., a set of first-order differential equations. The damping coefficients are calculated and incorporated into both models. The effectiveness of these models is demonstrated when the system is excited at its first natural frequency and an active control strategy is developed and implemented to attenuate the resulting vibrations. Results from both modeling techniques are presented and compared.
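
The state-space representation mentioned above follows the standard second-order-to-first-order reduction for a damped structure; the notation is generic, and the actual matrices come from the paper's specific finite element and damping model.

```latex
\mathbf{M}\ddot{\mathbf{q}} + \mathbf{C}\dot{\mathbf{q}} + \mathbf{K}\mathbf{q} = \mathbf{B}_u\,u(t),
\qquad
\mathbf{x}=\begin{bmatrix}\mathbf{q}\\ \dot{\mathbf{q}}\end{bmatrix},
\quad
\dot{\mathbf{x}} =
\begin{bmatrix}\mathbf{0} & \mathbf{I}\\ -\mathbf{M}^{-1}\mathbf{K} & -\mathbf{M}^{-1}\mathbf{C}\end{bmatrix}\mathbf{x}
+\begin{bmatrix}\mathbf{0}\\ \mathbf{M}^{-1}\mathbf{B}_u\end{bmatrix}u(t),
```

where M, C and K are the mass, damping and stiffness matrices and u(t) the actuator input.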

Keywords: Finite element analysis, super-element, state-space model.

2706 Thai Perception on Litecoin Value

Authors: Toby Gibbs, Suwaree Yordchim

Abstract:

This research analyzes factors affecting the success of Litecoin value within Thailand and develops a guideline for self-reliance for effective business implementation. The sample in this study included 119 people reached through surveys. The results revealed four main factors affecting success: 1) Future career training should be pursued in applied Litecoin development. 2) Respondents did not grasp the concept of a digital currency or see the benefit of a digital currency. 3) There is a great need to educate the next generation of learners on the benefits of Litecoin within the community. 4) A great majority did not know what Litecoin was. The guideline for self-reliance planning consisted of four aspects: 1) Development planning: arranging meet-up groups to conduct further education on Litecoin and share solutions on its adoption into everyday usage; local communities need to develop awareness of the usefulness of Litecoin and share the value of Litecoin among friends and family. 2) Computer Science and Business Management staff should develop skills to expand on the benefits of Litecoin within their departments. 3) Further research should be pursued on how Litecoin value can improve business and tourism within Thailand. 4) Local communities should focus on developing Litecoin awareness by encouraging street vendors to accept Litecoin as another form of payment for services rendered.

Keywords: Litecoin, Mining, Confirmations.
