Search results for: contractual complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1664

1484 Analysis of Complex Business Negotiations: Contributions from Agency-Theory

Authors: Jan Van Uden

Abstract:

The paper reviews classical agency theory and its contributions to the analysis of complex business negotiations, and proposes an approach for modifying the basic agency model to examine the negotiation-specific dimensions of agency problems. By illustrating fundamental potentials for modifying agency theory in the context of business negotiations, the paper highlights recent empirical research that investigates agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiations would be based on a two-level approach: first, the modification of the basic agency model to capture the organizational context of business negotiations (i.e., multi-agent issues, common agencies, multi-period models and the concept of bounded rationality); second, the application of the modified agency model to complex business negotiations to identify agency problems and related areas of risk in the negotiation process. The paper is placed on the first level of analysis – the modification. The method builds on insights from behavioral decision research (BDR) on the one hand, and on findings from agency theory as normative directives for the modification of the basic model on the other. Through neoclassical assumptions concerning the fundamental aspects of agency relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences and conflicts of interest), agency theory helps to derive solutions for worst-case scenarios taken from daily negotiation routine. As agency theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape business negotiation complexity. The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective: negotiation teams require a multi-agent approach, particularly since decision-makers often participate in the team as superior agents. The diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency theory and varies greatly across different forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. Horizontal dynamics within the negotiation team, which play an important role in negotiation success, are therefore not considered in the investigation of agency problems. Also, the trade-off between short-term relationships within the negotiation sphere and the long-term relationships of the corporate sphere calls for a multi-period perspective that takes into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality draws closely on findings from BDR to assess the impact of negotiation behavior on the underlying principal-agent relationships. As empirical findings show, the disclosure and reservation of information to the agent affect the agent's negotiation behavior as well as final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior agents or principals, which calls for a bilateral risk approach to agency relations.

Keywords: business negotiations, agency theory, negotiation analysis, inter-team negotiations

Procedia PDF Downloads 115
1483 An Improved Data-Aided Channel Estimation Technique Using Genetic Algorithm for Massive Multiple-Input Multiple-Output

Authors: M. Kislu Noman, Syed Mohammed Shamsul Islam, Shahriar Hassan, Raihana Pervin

Abstract:

With the increasing number of wireless devices and high-bandwidth operations, wireless networks are becoming overcrowded. To cope with this congestion, massive MIMO is designed to operate with hundreds of low-cost serving antennas at a time while also improving spectral efficiency. TDD has been used to support beamforming, a major part of massive MIMO, by transmitting and receiving pilot sequences. All of these benefits are only possible if the channel state information is estimated properly. The common methods used so far to estimate the channel matrix are least squares (LS), minimum mean square error (MMSE), and a linear version of MMSE proposed in many research works. We optimize these methods using a genetic algorithm (GA) to minimize the mean squared error and find the best channel matrix among the existing algorithms with less computational complexity. Our simulation results show that GA optimization works well on the existing algorithms in a Rayleigh slow-fading channel in the presence of additive white Gaussian noise. We found that GA-optimized LS outperforms the existing algorithms, as the GA reaches a near-optimal result within a few iterations in terms of MSE with respect to SNR and computational complexity.
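
A minimal sketch of this idea in Python/NumPy (illustrative, not the authors' code; the antenna counts, pilot length, population size and mutation scales are all assumed): a least-squares channel estimate is refined by a simple genetic algorithm whose fitness is the pilot reconstruction MSE, in a Rayleigh channel with AWGN.

```python
import numpy as np

rng = np.random.default_rng(0)
Nt, Nr, Np = 4, 8, 16            # transmit antennas, receive antennas, pilot length

H = (rng.normal(size=(Nr, Nt)) + 1j * rng.normal(size=(Nr, Nt))) / np.sqrt(2)  # Rayleigh channel
X = (rng.normal(size=(Nt, Np)) + 1j * rng.normal(size=(Nt, Np))) / np.sqrt(2)  # pilot symbols
noise = 0.1 * (rng.normal(size=(Nr, Np)) + 1j * rng.normal(size=(Nr, Np)))     # AWGN
Y = H @ X + noise

H_ls = Y @ X.conj().T @ np.linalg.inv(X @ X.conj().T)   # classical LS estimate

def mse(H_hat):
    """Fitness: mean squared pilot reconstruction error."""
    return np.mean(np.abs(Y - H_hat @ X) ** 2)

# GA: population seeded around the LS estimate, truncation selection plus Gaussian mutation
pop = [H_ls] + [H_ls + 0.05 * (rng.normal(size=H_ls.shape) + 1j * rng.normal(size=H_ls.shape))
                for _ in range(39)]
for generation in range(50):
    pop.sort(key=mse)
    parents = pop[:10]                                   # keep the fittest (elitism)
    children = [p + 0.02 * (rng.normal(size=p.shape) + 1j * rng.normal(size=p.shape))
                for p in parents for _ in range(3)]
    pop = parents + children

best = min(pop, key=mse)
print(f"LS MSE: {mse(H_ls):.4f}  GA-refined MSE: {mse(best):.4f}")
```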

Keywords: channel estimation, LMMSE, LS, MIMO, MMSE

Procedia PDF Downloads 166
1482 Factors Impacting Geostatistical Modeling Accuracy and Modeling Strategy of Fluvial Facies Models

Authors: Benbiao Song, Yan Gao, Zhuo Liu

Abstract:

Geostatistical modeling is the key technique for reservoir characterization, and the quality of geological models greatly influences the prediction of reservoir performance, but few studies have been done to quantify the factors impacting geostatistical reservoir modeling accuracy. In this study, 16 fluvial prototype models were established to represent different levels of geological complexity, and 6 cases ranging from 16 to 361 wells were defined to reproduce all 16 prototype models using different methodologies, including SIS, object-based and MPFS algorithms, accompanied by different constraint parameters. A modeling accuracy ratio was defined to quantify the influence of each factor, and ten realizations were averaged to represent each accuracy ratio under the same modeling condition and parameter association. In total, 5,760 simulations were run to quantify the relative contribution of each factor to simulation accuracy, and the results can be used as a strategy guide for facies modeling under similar conditions. It was found that data density, geological trend and geological complexity have a great impact on modeling accuracy. Modeling accuracy may reach up to 90% when channel sand width reaches 1.5 times the well spacing, under any condition, for the SIS and MPFS methods. When well density is low, a geological trend may increase the modeling accuracy from 40% to 70%, while the use of a proper variogram may make only a very limited contribution for the SIS method. This implies that when well data are dense enough to cover simple geobodies, little effort is needed to construct an acceptable model; when geobodies are complex and data are insufficient, it is better to construct a robust geological trend than to rely on a reliable variogram function. For the object-based method, modeling accuracy does not increase with data density as clearly as for the SIS method, but the models keep a geologically reasonable appearance when data density is low. The MPFS method shows a similar trend to the SIS method, but SIS with a proper geological trend and a reasonable variogram may achieve better modeling accuracy than the MPFS method. This implies that the geological modeling strategy for a real reservoir case needs to be optimized by evaluating the dataset, geological complexity, geological constraint information and the modeling objective.
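
A minimal sketch of such an accuracy ratio in Python/NumPy (the grid size, facies coding and perturbation level are assumed for illustration): the ratio is taken as the fraction of grid cells where a realization reproduces the prototype facies model, averaged over ten realizations as in the study.

```python
import numpy as np

def accuracy_ratio(prototype, realizations):
    """Average cell-by-cell facies match between realizations and the prototype grid."""
    return float(np.mean([np.mean(r == prototype) for r in realizations]))

rng = np.random.default_rng(1)
prototype = (rng.random((100, 100)) < 0.3).astype(int)      # 0 = floodplain, 1 = channel sand
# ten synthetic "realizations" that each mismatch the prototype in ~10% of cells
realizations = [np.where(rng.random((100, 100)) < 0.1, 1 - prototype, prototype)
                for _ in range(10)]
print(f"modeling accuracy ratio: {accuracy_ratio(prototype, realizations):.1%}")
```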

Keywords: fluvial facies, geostatistics, geological trend, modeling strategy, modeling accuracy, variogram

Procedia PDF Downloads 239
1481 Deep Routing Strategy: Deep Learning-Based Intelligent Routing in Software Defined Internet of Things

Authors: Zabeehullah, Fahim Arif, Yawar Abbas

Abstract:

Software Defined Network (SDN) is a next-generation networking model that simplifies traditional network complexities and improves the utilization of constrained resources. Currently, most SDN-based Internet of Things (IoT) environments use traditional routing strategies that work on the basis of a maximum or minimum metric value. However, IoT network heterogeneity, dynamic traffic flow and complexity demand intelligent and self-adaptive routing algorithms, because traditional routing algorithms lack self-adaptation, intelligence and efficient utilization of resources. To some extent, SDN, due to its flexibility and centralized control, has managed IoT complexity and heterogeneity, but Software Defined IoT (SDIoT) still lacks intelligence. To address this challenge, we propose a model called Deep Routing Strategy (DRS), which uses a deep learning algorithm to perform routing in SDIoT intelligently and efficiently. Our model uses real-time traffic for training and learning. Results demonstrate that the proposed model achieves high accuracy and a low packet loss rate during path selection, and outperforms the benchmark routing algorithm (OSPF). Moreover, the proposed model provides encouraging results under highly dynamic traffic flow.
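
A minimal sketch of the underlying idea in Python (illustrative only; the feature set, network size and delay model are assumptions, not the authors' DRS design): a small neural network trained on observed path statistics scores candidate paths, and the controller installs the path with the lowest predicted delay.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# training data: [hop count, mean link utilization, mean queue length] -> measured delay (ms)
X_train = rng.random((500, 3)) * [8, 1.0, 50]
y_train = 2 * X_train[:, 0] + 30 * X_train[:, 1] + 0.5 * X_train[:, 2] + rng.normal(0, 1, 500)

model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
model.fit(X_train, y_train)                       # learn from collected traffic statistics

candidate_paths = {"path_A": [3, 0.2, 5.0],       # hypothetical candidate paths
                   "path_B": [5, 0.7, 20.0],
                   "path_C": [4, 0.4, 8.0]}
scores = {name: float(model.predict([feat])[0]) for name, feat in candidate_paths.items()}
best = min(scores, key=scores.get)
print(f"selected {best} with predicted delay {scores[best]:.1f} ms")
```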

Keywords: SDN, IoT, DL, ML, DRS

Procedia PDF Downloads 87
1480 Challenges for Interface Designers in Designing Sensor Dashboards in the Context of Industry 4.0

Authors: Naveen Kumar, Shyambihari Prajapati

Abstract:

Industry 4.0 is the fourth industrial revolution, which focuses on the interconnectivity of machine to machine, human to machine and human to human via the Internet of Things (IoT). Industry 4.0 technologies facilitate communication between human and machine through IoT and form Cyber-Physical Production Systems (CPPS). In a CPPS, sensor data from multiple shop floors are connected through IoT and displayed to the operator through sensor dashboards. These sensor dashboards must present an enormous amount of information, which makes monitoring, controlling and interpretation tasks complex for operators. Designing handheld sensor dashboards for supervision tasks is therefore a challenge for interface designers. This paper reports on emerging technologies of Industry 4.0, the changing context of increasing information complexity across consecutive industrial revolutions, and upcoming design challenges for interface designers in the context of Industry 4.0. The authors conclude that the information complexity of sensor dashboard design has increased with consecutive industrial revolutions and that such dashboard designs impose cognitive load on users. Designing these complex dashboard interfaces in the Industry 4.0 context will be a main challenge for interface designers.

Keywords: Industry 4.0, sensor dashboard design, cyber-physical production system, interface designer

Procedia PDF Downloads 105
1479 Managing Information Technology: An Overview of Information Technology Governance

Authors: Mehdi Asgarkhani

Abstract:

Today, investment in Information Technology (IT) solutions is the largest component of capital expenditure in most organizations. As capital investment in IT continues to grow, IT managers and strategists are expected to develop and put into practice effective decision-making models (frameworks) that improve decision-making processes for the use of IT in organizations and optimize investment in IT solutions. To be exact, there is an expectation that organizations not only maximize the benefits of adopting IT solutions but also avoid the many pitfalls associated with the rapid introduction of technological change. Different organizations, depending on their size, the complexity of the solutions required and the processes used for financial management and budgeting, may use different techniques for managing strategic investment in IT solutions. Decision-making processes for the strategic use of IT within organizations are often referred to as IT governance (or corporate IT governance). This paper examines IT governance as a tool for best practice in decision-making about IT strategies. Discussions in this paper represent phase I of a project initiated to investigate trends in strategic decision-making on IT strategies. Phase I is concerned mainly with a review of the literature and a number of case studies, establishing that the practice of IT governance, depending on the complexity of IT solutions, the organization's size and its stage of maturity, varies significantly – from informal approaches to sophisticated formal frameworks.

Keywords: IT governance, corporate governance, IT governance frameworks, IT governance components, aligning IT with business strategies

Procedia PDF Downloads 382
1478 Influence of Procurement Methods on Cost Performance of Building Projects in Gombe State, Nigeria

Authors: S. U. Kunya, S. Abdulkadir, M. A. Anas, L. Z. Adam

Abstract:

Procurement methods are described as systems of contractual arrangements used by the contractor to secure design and construction services based on the stipulated cost and within the required time and quality. Despite this, major projects in the Nigerian construction industry have failed because of inappropriate procurement methods, with cost overrun as a major consequence for which a lasting solution is needed. The aim of the study is to evaluate the influence of procurement methods on the cost performance of building projects in Gombe State, Nigeria. The study adopts a descriptive and explorative design approach. Data were collected by administering one hundred questionnaires using convenience sampling techniques, and were analyzed using percentages, mean values and ANOVA. Major findings show that more than fifty percent (50%) of the available procurement methods are utilized in the study area, and that the procurement methods with the highest impact on cost performance, compared with the other methods, are the project management and direct labour procurement methods. The results of the hypothesis tests, with p-values of 0.12 and 0.07, confirmed that there was no significant variation in stakeholders' perceptions of the impact of procurement methods on cost performance. The study therefore concluded that project management and direct labour are the most appropriate procurement methods to ensure the successful completion of building projects at the stipulated cost.
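
A minimal sketch of the hypothesis test in Python with SciPy (the Likert ratings below are invented for illustration): a one-way ANOVA comparing how three stakeholder groups rate a procurement method's impact on cost performance, mirroring the paper's finding that perceptions do not differ significantly (p > 0.05).

```python
from scipy import stats

# hypothetical 5-point Likert ratings of "impact on cost performance" by group
clients     = [4, 3, 5, 4, 4, 3, 5, 4]
consultants = [3, 4, 4, 5, 3, 4, 4, 3]
contractors = [4, 4, 3, 4, 5, 3, 4, 4]

f_stat, p_value = stats.f_oneway(clients, consultants, contractors)
print(f"F = {f_stat:.2f}, p = {p_value:.2f}")
if p_value > 0.05:
    print("no significant variation in stakeholders' perceptions")
```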

Keywords: cost, effects, performance, procurement, projects

Procedia PDF Downloads 199
1477 Final Costs of Civil Claims

Authors: Behnam Habibi Dargah

Abstract:

The economics of cost-benefit theory seeks to monitor claims and determine their final price. The cost of litigation is important because it is a measure of the efficiency of the justice system. From an economic point of view, the cost of litigation is considered the equilibrium point of litigation, whereby litigation is regarded as a high-risk investment and is initiated when the costs are less than the probable and expected benefits. Costs are economically separated into private and social costs. Private costs include material (direct and indirect) and moral (non-material) costs. The social costs of litigation are subsidy-centric due to the public and governmental nature of litigation, and cover both bureaucratic costs and the costs of judicial misconduct. Macroeconomic policy in the economics of justice is the reverse engineering of controlling the social costs of litigation by employing selective litigation and working on judicial culture to achieve rationality in the monopoly system. Procedures for controlling and managing court costs follow two economic patterns in the field: the rational cost allocation model and the cost transfer model. The rational allocation model deals with cost-tolerance systems, while the transfer model considers three models of transferability, namely legal, judicial and contractual transferability, which are described and explored in the present article in a comparative manner.

Keywords: cost of litigation, economics of litigation, private cost, social cost

Procedia PDF Downloads 99
1476 Testing the Simplification Hypothesis in Constrained Language Use: An Entropy-Based Approach

Authors: Jiaxin Chen

Abstract:

Translations have been labeled as more simplified than non-translations, featuring less diversified and more frequent lexical items and simpler syntactic structures. Such simplified linguistic features have been identified in other bilingualism-influenced language varieties, including non-native and learner language use. It has therefore been proposed that translation could be studied within a broader framework of constrained language, with simplification as one of the universal features shared by constrained language varieties due to similar cognitive-physiological and social-interactive constraints. Yet contradictory findings have also been presented. To address this issue, this study adopts Shannon's entropy-based measures to quantify complexity in language use. Entropy measures the level of uncertainty or unpredictability in message content, and it has been adapted in linguistic studies to quantify linguistic variance, including morphological diversity and lexical richness. In this study, the complexity of lexical and syntactic choices will be captured by word-form entropy and POS-form entropy, and a comparison will be made between constrained and non-constrained language use to test the simplification hypothesis. The entropy-based method is employed because it captures both the frequency of linguistic choices and the evenness of their distribution, neither of which is fully available through traditional indices. Another advantage of the entropy-based measure is that it is reasonably stable across languages and thus allows for reliable comparison among studies on different language pairs. As for the data, one established corpus (CLOB) and two self-compiled corpora will be used to represent native written English and two constrained varieties (L2 written English and translated English), respectively. Each corpus consists of around 200,000 tokens, and genre (press) and text length (around 2,000 words per text) are comparable across corpora. More specifically, word-form entropy and POS-form entropy will be calculated as indicators of lexical and syntactic complexity, and ANOVA tests will be conducted to explore whether there is any corpus effect. It is hypothesized that both L2 written English and translated English have lower entropy than non-constrained written English. The similarities and divergences between the two constrained varieties may indicate the constraints shared by, and peculiar to, each variety.
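
A minimal sketch of the word-form measure in Python (toy sentences, standard Shannon formulation): entropy is computed over the relative frequencies of word forms, so a corpus that repeats the same forms more often, as the simplification hypothesis predicts for constrained varieties, yields a lower value.

```python
import math
from collections import Counter

def word_form_entropy(tokens):
    """H = -sum(p_i * log2 p_i) over the relative frequencies of word forms."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

native      = "the court ruled that the contract was void and the parties settled".split()
constrained = "the court said that the contract was bad and the court said so".split()

print(f"native H      = {word_form_entropy(native):.2f} bits")
print(f"constrained H = {word_form_entropy(constrained):.2f} bits (lower = less diverse)")
```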

Keywords: constrained language use, entropy-based measures, lexical simplification, syntactical simplification

Procedia PDF Downloads 68
1475 Testing a Flexible Manufacturing System Facility Production Capacity through Discrete Event Simulation: Automotive Case Study

Authors: Justyna Rybicka, Ashutosh Tiwari, Shane Enticott

Abstract:

In the age of automation and computation aiding manufacturing, it is clear that manufacturing systems have become more complex than ever before. Although technological advances provide the capability to gain more value with fewer resources, utilisation of the manufacturing capabilities available to organisations is sometimes difficult to achieve. Flexible manufacturing systems (FMS) provide a unique capability to manufacturing organisations where there is a need for product range diversification, by providing line efficiency through production flexibility. This is very valuable in trend-driven production set-ups or niche-volume production requirements. Although FMS provides flexible and efficient facilities, its optimal set-up is key to achieving production performance. As many variables are interlinked due to the flexibility provided by the FMS, analytical calculations are not always sufficient to predict the FMS’ performance. Simulation modelling is capable of capturing the complexity and constraints associated with FMS. This paper demonstrates how discrete event simulation (DES) can address complexity in an FMS to optimise production line performance. A case study of an automotive FMS is presented. The DES model demonstrates different configuration options depending on the prioritised objective: utilisation or throughput. Additionally, the paper provides insight into the impact of system set-up constraints on FMS performance and explores the optimal production set-up.
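
A minimal sketch of the DES principle in plain Python (a single machine with assumed exponential arrival and processing times, far simpler than the case-study FMS model): an event queue drives the simulation, and the run reports utilisation and throughput, the two objectives the configurations trade off.

```python
import heapq
import random

random.seed(0)
SIM_TIME = 480.0                        # one shift, in minutes
MEAN_ARRIVAL, MEAN_PROC = 6.0, 5.0      # assumed exponential inter-arrival and processing times

events = [(random.expovariate(1 / MEAN_ARRIVAL), "arrival")]
queue, machine_free_at, busy_time, throughput = 0, 0.0, 0.0, 0
now = 0.0

while events and now < SIM_TIME:
    now, kind = heapq.heappop(events)               # advance to the next event
    if kind == "arrival":
        queue += 1
        heapq.heappush(events, (now + random.expovariate(1 / MEAN_ARRIVAL), "arrival"))
    if queue and now >= machine_free_at:            # machine idle: start the next job
        queue -= 1
        proc = random.expovariate(1 / MEAN_PROC)
        machine_free_at = now + proc
        busy_time += proc
        throughput += 1
        heapq.heappush(events, (machine_free_at, "departure"))

print(f"utilisation: {busy_time / SIM_TIME:.0%}, throughput: {throughput} jobs per shift")
```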

Keywords: discrete event simulation, flexible manufacturing system, capacity performance, automotive

Procedia PDF Downloads 308
1474 The Impact of Major Accounting Events on Managerial Ability and the Accuracy of Environmental Capital Expenditure Projections of the Environmentally Sensitive Industries

Authors: Jason Chen, Jennifer Chen, Shiyu Li

Abstract:

We examine whether managerial ability (MA), the passage of Sarbanes-Oxley in 2002 (SOX), and corporate operational complexity affect the accuracy of the environmental capital expenditure projections of the environmentally sensitive industries (ESI). Prior studies found that firms in the ESI manipulated their projected environmental capital expenditures as a tool to achieve corporate legitimation, and suggested that human factors must be examined to determine whether they are among the determinants. We use MA as a proxy for these latent human factors to examine whether MA affects the accuracy of financial disclosures in the ESI. Extending Chen and Chen (2020), we further investigate whether (1) SOX and (2) complex operations and financial reporting, in conjunction with MA, affect firms' projection accuracy. Overall, we find that MA is positively correlated with a firm's projection accuracy in the annual 10-Ks. Furthermore, the results suggest that SOX has a positive, yet temporary, effect on MA, which leads to better accuracy. Finally, MA enables firms with more complex operations and financial reporting to make fewer projection errors than their less complex counterparts. These results suggest that MA is a determinant of the accuracy of environmental capital expenditure projections for firms in the ESI.

Keywords: managerial ability, environmentally sensitive industries, SOX, corporate operational complexity

Procedia PDF Downloads 117
1473 Developing Offshore Energy Grids in Norway as Capability Platforms

Authors: Vidar Hepsø

Abstract:

The energy and oil companies on the Norwegian continental shelf are moving from a situation where each asset controls and manages its own energy supply (island mode) towards one in which assets need to collaborate and coordinate energy use with others, sharing the energy that is provided, due to the increased cost and scarcity of electric energy. Currently, several areas are electrified either with an onshore grid cable or receive intermittent energy from offshore wind parks. While the onshore grid in Norway is well regulated, the offshore grid is still in the making, with several oil and gas electrification projects and offshore wind developments just started. The paper describes the shift in mindset that comes with operating this new offshore grid. This transition heralds an increase in collaboration across boundaries and the integration of energy management across companies, businesses and technical disciplines, along with engagement with stakeholders in the larger society. The transition is described as a function of the new challenges arising from the increased complexity of the energy mix (wind, oil/gas, hydrogen and others) coupled with increased technical and organizational complexity in energy management. Organizational complexity denotes increasing integration across boundaries, whether these boundaries are companies, vendors, professional disciplines, regulatory regimes and bodies, businesses, or the numerous societal stakeholders. New practices must be developed, legitimated and institutionalized across these boundaries. Only part of this complexity can be mitigated technically, e.g., by the use of batteries, mixed energy systems and simulation/forecasting tools; many challenges must be mitigated through legitimated and institutionalized societal governance practices on many levels. Offshore electrification supports Norway's 2030 climate targets but is also controversial, since it draws on the larger society's energy resources. This means that new systems and practices must be transparent, not only for the industry and the authorities, but also acceptable and just for the larger society. The paper reports on ongoing work in Norway, based on participant observation and interviews in projects and with people working on offshore grid development. The first case presented is the development of an offshore floating wind farm connected to two offshore installations; the second is an offshore grid development initiative providing six installations with electric energy via an onshore cable. The development of the offshore grid is analyzed using a capability platform framework that describes the technical, competence, work process and governance capabilities under development in Norway. A capability platform is a ‘stack’ with the following layers: intelligent infrastructure; information and collaboration; knowledge sharing and analytics; and, finally, business operations. The need for better collaboration and energy forecasting tools/capabilities in this stack is given special attention in the two use cases presented.

Keywords: capability platform, electrification, carbon footprint, control rooms, energy forecasting, operational model

Procedia PDF Downloads 44
1472 Potentials of Additive Manufacturing: An Approach to Increase the Flexibility of Production Systems

Authors: A. Luft, S. Bremen, N. Balc

Abstract:

The task of flexibility planning and design, just like factory planning, for example, is to create the long-term systemic framework that constitutes the restriction for short-term operational management. This is a strategic challenge since, due to the decision defect character of the underlying flexibility problem, multiple types of flexibility need to be considered over the course of various scenarios, production programs, and production system configurations. In this context, an evaluation model has been developed that integrates both conventional and additive resources on a basic task level and allows the quantification of flexibility enhancement in terms of mix and volume flexibility, complexity reduction, and machine capacity. The model helps companies to decide in early decision-making processes about the potential gains of implementing additive manufacturing technologies on a strategic level. For companies, it is essential to consider both additive and conventional manufacturing beyond pure unit costs. It is necessary to achieve an integrative view of manufacturing that incorporates both additive and conventional manufacturing resources and quantifies their potential with regard to flexibility and manufacturing complexity. This also requires a structured process for strategic production system design that spans the design of various scenarios and allows for multi-dimensional and comparative analysis. A corresponding guideline for the planning of additive resources on a strategic level is laid out in this paper.

Keywords: additive manufacturing, production system design, flexibility enhancement, strategic guideline

Procedia PDF Downloads 98
1471 Pushing the Boundary of Parallel Tractability for Ontology Materialization via Boolean Circuits

Authors: Zhangquan Zhou, Guilin Qi

Abstract:

Materialization is an important reasoning service for applications built on the Web Ontology Language (OWL). To make materialization efficient in practice, current research focuses on deciding the tractability of an ontology language and designing parallel reasoning algorithms. However, some well-known large-scale ontologies, such as YAGO, have been shown to perform well under parallel reasoning even though they are expressed in ontology languages that are not parallelly tractable, i.e., for which reasoning is inherently sequential in the worst case. This motivates us to study the problem of parallel tractability of ontology materialization from a theoretical perspective. That is, we aim to identify the ontologies for which materialization is parallelly tractable, i.e., in the complexity class NC. Since NC is defined in terms of Boolean circuits, which are widely used to investigate parallel computing problems, we first transform the materialization problem into the evaluation of Boolean circuits, and then study parallel tractability based on circuits. In this work, we focus on datalog rewritable ontology languages. We use Boolean circuits to identify two classes of datalog rewritable ontologies (called parallelly tractable classes) such that materialization over them is parallelly tractable. We further investigate the parallel tractability of materialization for a datalog rewritable OWL fragment, DHL (Description Horn Logic). Based on the above results, we analyze real-world datasets and show that many ontologies expressed in DHL belong to the parallelly tractable classes.
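
A minimal sketch of the materialization task itself in Python (the two rules and the facts are toy stand-ins for a datalog rewritable ontology): forward chaining iterates rule application to a least fixpoint, which is the step whose parallelizability the paper characterizes via Boolean circuits.

```python
# facts are triples; two datalog-style rules: subClassOf transitivity and type propagation
facts = {("subClassOf", "Car", "Vehicle"),
         ("subClassOf", "Vehicle", "Thing"),
         ("type", "myCar", "Car")}

def apply_rules(fs):
    derived = set()
    for (p1, a, b) in fs:
        for (p2, c, d) in fs:
            if p1 == "subClassOf" and p2 == "subClassOf" and b == c:
                derived.add(("subClassOf", a, d))    # transitivity
            if p1 == "type" and p2 == "subClassOf" and b == c:
                derived.add(("type", a, d))          # propagate instance types upward
    return derived

changed = True
while changed:                                       # iterate to the least fixpoint
    new = apply_rules(facts) - facts
    changed = bool(new)
    facts |= new

for fact in sorted(facts):
    print(fact)
```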

Keywords: ontology materialization, parallel reasoning, datalog, Boolean circuit

Procedia PDF Downloads 245
1470 Urban Networks as Model of Sustainable Design

Authors: Agryzkov Taras, Oliver Jose L., Tortosa Leandro, Vicent Jose

Abstract:

This paper aims to demonstrate how considering cities as a special kind of complex network, called an urban network, may lead to the use of design tools drawn from network theories, which in fact results in a quite sustainable approach. There is no doubt that the irruption into contemporary thought of Gaia as an essential political agent proposes a narrative that has been extended to the field of creative processes, in which the activity of urban design is, of course, found. The rationalist paradigm is in crisis, and from the so-called sciences of complexity, its way of describing reality and of intervening in it is questioned. Thus, a new way of understanding reality emerges, which has to do with a redefinition of the human being's own place in what is now understood as a delicate and complex network. In this sense, we know that in these systems of connected and interdependent elements, the influences they generate give rise to emergent properties and behaviors of the whole that, studied individually, would not make sense. We believe that the design of cities cannot remain oblivious to these principles, and this research therefore aims to demonstrate their potential for decision-making in the urban environment. We present one example of action in the field of public mobility, another in the design of commercial areas, and a third in the redensification of sprawl areas, in which different aspects of network theory have been applied to change the urban design. We think that even though these actions have been developed in European cities, and more specifically in the Mediterranean area of Spain, the reflections and tools could have a broader scope of action.
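
A minimal sketch of one such network-theoretic tool in Python with NetworkX (the toy street graph is invented): betweenness centrality ranks the nodes that carry the most shortest-path traffic, the kind of indicator that can inform public-mobility or commercial-area decisions.

```python
import networkx as nx

G = nx.Graph()                          # hypothetical street network
G.add_edges_from([("plaza", "market"), ("market", "station"), ("station", "park"),
                  ("plaza", "station"), ("park", "residential"), ("market", "residential")])

centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
    print(f"{node:12s} {score:.2f}")    # higher score = more structurally important
```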

Keywords: graphs, complexity sciences, urban networks, urban design

Procedia PDF Downloads 125
1469 Portfolio Management for Construction Company during Covid-19 Using AHP Technique

Authors: Sareh Rajabi, Salwa Bheiry

Abstract:

In general, Covid-19 has created much financial and non-financial damage to the economy and the community. The level and severity of Covid-19 as a pandemic vary across regions and types of projects. The Covid-19 virus has recently emerged as one of the most important risk management factors worldwide. Therefore, as part of portfolio management assessment, it is essential to evaluate the severity of such a risk on projects and programs at the portfolio management level in order to avoid a risky portfolio. Covid-19 struck particularly hard in South America, parts of Europe and the Middle East. The pandemic affected the whole world through lockdowns, interruptions in supply chain management, health and safety requirements, transportation restrictions and commercial impacts. Therefore, this research proposes the Analytical Hierarchy Process (AHP) to analyze and assess a pandemic such as Covid-19 and its impact on construction projects. The AHP technique uses four sub-criteria: health and safety, commercial risk, completion risk and contractual risk to evaluate each project and program. The result provides decision-makers with information on which projects carry higher or lower risk under a Covid-19 or similar pandemic scenario. Decision-makers can then choose the most feasible solution, based on effectively weighted criteria for project selection within their portfolio, to match the organization's strategies.
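
A minimal sketch of the AHP step in Python/NumPy (the pairwise judgments are hypothetical): weights for the four sub-criteria are derived from the principal eigenvector of a Saaty-scale comparison matrix, with the consistency ratio as a sanity check on the judgments.

```python
import numpy as np

criteria = ["health & safety", "commercial", "completion", "contractual"]
A = np.array([[1,   3,   5,   7],       # hypothetical Saaty-scale pairwise judgments
              [1/3, 1,   3,   5],
              [1/5, 1/3, 1,   3],
              [1/7, 1/5, 1/3, 1]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                            # principal eigenvector = priority weights

n = len(A)
ci = (eigvals[k].real - n) / (n - 1)    # consistency index
cr = ci / 0.90                          # random index RI = 0.90 for n = 4
for name, weight in zip(criteria, w):
    print(f"{name:16s} {weight:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
```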

Keywords: portfolio management, risk management, COVID-19, analytical hierarchy process technique

Procedia PDF Downloads 89
1468 Binary Logistic Regression Model in Predicting the Employability of Senior High School Graduates

Authors: Cromwell F. Gopo, Joy L. Picar

Abstract:

This study aimed to predict the employability of senior high school graduates for S.Y. 2018-2019 in the Davao del Norte Division through a quantitative research design using descriptive and predictive approaches on the indicated parameters, namely gender, school type, academics, academic award receipt, skills, values, and strand. The respondents were the 33 secondary schools offering senior high school programs, identified through simple random sampling, which yielded 1,530 cases of graduates' secondary data; these were analyzed using frequencies, percentages, means, standard deviations, and binary logistic regression. Results showed that the majority of the senior high school graduates who come from large schools were female, and less than half of the graduates received an academic award in any semester. In general, the graduates' performance in academics, skills, and values was proficient. Moreover, less than half of the graduates were not employed, and those who were employed worked as contractual, casual, or part-time workers, dominated by GAS graduates. The significant predictors of employability were gender and the Information and Communications Technology (ICT) strand, while the remaining variables did not add significantly to the model. The null hypothesis was rejected, as the coefficients of the predictors in the binary logistic regression equation did not take the value of 0. After utilizing the model, it was concluded that Technical-Vocational-Livelihood (TVL) graduates other than ICT had greater estimates of employability.
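
A minimal sketch of the model family in Python with scikit-learn (fully synthetic data with assumed effect sizes, not the Davao del Norte dataset): a binary logistic regression predicting employment from a gender dummy and an ICT-strand dummy, reported as coefficients and odds ratios.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1530
gender = rng.integers(0, 2, n)                 # 1 = male (assumed coding)
ict    = rng.integers(0, 2, n)                 # 1 = ICT strand
logit  = -1.0 + 0.6 * gender + 0.9 * ict       # hypothetical true effects
employed = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([gender, ict])
model = LogisticRegression().fit(X, employed)
print("coefficients (gender, ICT):", np.round(model.coef_[0], 2))
print("odds ratios:               ", np.round(np.exp(model.coef_[0]), 2))
```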

Keywords: employability, senior high school graduates, Davao del Norte, Philippines

Procedia PDF Downloads 115
1467 Hotel Deposit Contract and Coverage of Risks Resulting, through Insurance Contracts, in Tourism within the HoReCa Domain: Alternative Dispute Resolution Methods on These Contracts

Authors: Laura Ramona Nae

Abstract:

The issue of the risks faced by companies providing tourist and hotel services in the HoReCa field, related to goods belonging to consumer tourists left in hotel storage, has acquired a new dimension in the context of the economic and geo-political influences that have recently intervened at the global level. Hoteliers, among others, have had to create contractual mechanisms addressing these risks in order to protect businesses in this field of activity. This situation has led to a reassessment of the importance of insurance, in particular with regard to hotel liability insurance: premises liability and the safety and security of goods. The interpretation of clauses in contracts concluded between hoteliers and tourists consuming hotel services and products, all the more so in the Covid-19 pandemic context, has highlighted the increase in the number of disputes they generate. This article presents a general picture of the significance of the risks related to activity in the hospitality industry and tourism, specifically within the HoReCa field. The study mainly addresses the specificities of the hotel deposit contract, as well as the related field-specific insurance, as a way to cover these risks. The article also refers to alternative methods of out-of-court dispute resolution (ADR) in the HoReCa domain, generally used in both Romania and the European Union.

Keywords: consumer tourist, disputes and ADR methods, deposit contract, hotel warehouse and hotelier insurance, hotel services and tourist products, HoReCa

Procedia PDF Downloads 32
1466 Optimization Based Extreme Learning Machine for Watermarking of an Image in DWT Domain

Authors: Ram Pal Singh, Vikash Chaudhary, Monika Verma

Abstract:

In this paper, we propose an implementation of an optimization-based Extreme Learning Machine (ELM) for watermarking the B-channel of a color image in the discrete wavelet transform (DWT) domain. ELM, a regularization algorithm, is based on generalized single-hidden-layer feed-forward neural networks (SLFNs); however, the hidden layer parameters, generally called the feature mapping in the context of ELM, need not be tuned every time. This paper shows the watermark embedding and extraction processes with the help of ELM, and the results are compared with machine learning models already used for watermarking. Here, a cover image is divided into a suitable number of non-overlapping blocks of the required size, and the DWT is applied to each block to transform it into the low-frequency sub-band domain. Basically, ELM provides a unified learning platform in which a feature mapping, that is, the mapping between the hidden layer and the output layer of the SLFN, is used for watermark embedding and extraction in a cover image. ELM has widespread applications ranging from binary and multiclass classification to regression and function estimation. Unlike SVM-based algorithms, which achieve suboptimal solutions with high computational complexity, ELM can provide better generalization performance with very low complexity. The efficacy of the optimization-based ELM algorithm is measured using quantitative and qualitative parameters on the watermarked image, even when the image is subjected to different types of geometrical and conventional attacks.
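
A minimal sketch of a generic ELM in Python/NumPy (a regression toy, not the paper's DWT watermarking pipeline; the node count and regularization constant are assumed): the hidden layer is a fixed random feature mapping, and only the output weights are solved, in closed form, by regularized least squares.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)      # noisy target function

H_NODES, LAMBDA = 50, 1e-3
W = rng.normal(size=(X.shape[1], H_NODES))            # random input weights (never tuned)
b = rng.normal(size=H_NODES)                          # random biases

def hidden(X_in):
    """Fixed random feature mapping of the single hidden layer."""
    return np.tanh(X_in @ W + b)

H = hidden(X)
# output weights in closed form: beta = (H^T H + lambda I)^(-1) H^T y
beta = np.linalg.solve(H.T @ H + LAMBDA * np.eye(H_NODES), H.T @ y)

X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
print("prediction:", np.round(hidden(X_test) @ beta, 2))
print("target:    ", np.round(np.sin(X_test[:, 0]), 2))
```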

Keywords: BER, DWT, extreme learning machine (ELM), PSNR

Procedia PDF Downloads 285
1465 Maintenance Optimization for a Multi-Component System Using Factored Partially Observable Markov Decision Processes

Authors: Ipek Kivanc, Demet Ozgur-Unluakin

Abstract:

Over the past years, technological innovations and advancements have played an important role in the industrial world. Due to these technological improvements, the degree of complexity of systems has increased; hence, systems are becoming more uncertain as a result of increased complexity, which leads to higher costs. Coping with this situation is challenging, so efficient planning of maintenance activities in such systems is becoming more essential. Partially Observable Markov Decision Processes (POMDPs) are powerful tools for stochastic sequential decision problems under uncertainty. Although maintenance optimization in a dynamic environment can be modeled as such a sequential decision problem, POMDPs are not widely used for tackling maintenance problems, even though they can be well-suited frameworks for obtaining optimal maintenance policies. In the classical representation of the POMDP framework, the system is denoted by a single node with multiple states. The main drawback of this classical approach is that the state space grows exponentially with the number of state variables. The factored representation of POMDPs, on the other hand, simplifies the state space by taking advantage of the factored structure already available in the nature of the problem. The main idea of factored POMDPs is that they can be compactly modeled through dynamic Bayesian networks (DBNs), which are graphical representations of stochastic processes, by exploiting the structure of this representation. This study aims to demonstrate how maintenance planning of dynamic systems can be modeled with factored POMDPs. An empirical maintenance planning problem of a dynamic system consisting of four partially observable components deteriorating in time is designed. To solve the empirical model, we resort to the Symbolic Perseus solver, one of the state-of-the-art factored POMDP solvers enabling approximate solutions. We also generate predefined policies based on corrective or proactive maintenance strategies. We execute the policies on the empirical problem over many replications and compare their performance under various scenarios. The results show that the policies computed from the POMDP model are superior to the others. Acknowledgment: This work is supported by the Scientific and Technological Research Council of Turkey (TÜBİTAK) under grant no: 117M587.
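
A minimal sketch of the core POMDP computation in Python/NumPy (a single component with hypothetical transition and observation probabilities, much smaller than the four-component model): the belief over the hidden deterioration state is updated from noisy observations after each "do nothing" action.

```python
import numpy as np

states = ["ok", "worn", "failed"]
T = np.array([[0.90, 0.09, 0.01],        # P(s' | s, a = no maintenance): deterioration
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])
O = np.array([[0.80, 0.15, 0.05],        # P(obs | s'): rows = s', cols = obs quality
              [0.20, 0.60, 0.20],
              [0.05, 0.15, 0.80]])

belief = np.array([1.0, 0.0, 0.0])       # start fully confident the component is ok
for obs in [0, 1, 1, 2]:                 # observed: good, medium, medium, bad
    belief = O[:, obs] * (T.T @ belief)  # predict one step, weight by observation likelihood
    belief /= belief.sum()               # normalize to a probability distribution
    print({s: round(p, 3) for s, p in zip(states, belief)})
```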

Keywords: factored representation, maintenance, multi-component system, partially observable Markov decision processes

Procedia PDF Downloads 112
1464 Community Perceptions and Attitudes Regarding Wildlife Crime in South Africa

Authors: Louiza C. Duncker, Duarte Gonçalves

Abstract:

Wildlife crime is a complex problem with many interconnected facets, which are generally responded to in parts or fragments in an effort to “break down” the complexity into manageable components. However, fragmentation increases complexity, as coherence and cooperation become diluted. A whole-of-society approach has been developed towards finding a common goal and an integrated approach to preventing wildlife crime. As part of this development, research was conducted in rural communities adjacent to conservation areas in South Africa to define and comprehend the challenges they face and to understand their perceptions of wildlife crime. The results showed that the perceptions of community members varied: most were in favor of conservation and of protecting rhinos, but only if they derive adequate benefit from it. Regardless of gender, income level, education level, or access to services, conservation was perceived as both good and bad by the same people. Even though people in the communities are poor, a willingness to stop rhino poaching does exist among them, but their perception that parks do not care about people triggered an unwillingness to stop, prevent, or report poaching. Understanding the nuances, history, interests and values of community members, and the drivers behind poaching mindsets (intrinsic or driven by transnational organized crime), is imperative to creating sustainable and resilient communities on multiple levels, communities that make a substantial positive impact on people's lives but also conserve wildlife for posterity.

Keywords: community perceptions, conservation, rhino poaching, whole-of-society approach, wildlife crime

Procedia PDF Downloads 215
1463 The Friction of Oil-Contaminated Granular Soils: Experimental Study

Authors: Miron A, Tadmor R, Pinkert S

Abstract:

Soil contamination is a pressing environmental concern, drawing considerable focus due to its adverse ecological and health outcomes and the frequent occurrence of contamination incidents in recent years. The interaction between an oil pollutant and the host soil can alter the mechanical properties of the soil in a manner that crucially affects engineering challenges associated with the stability of soil systems. The geotechnical investigation of contaminated soils has gained momentum since the Gulf War in the 1990s, when a massive amount of oil was spilled into the ocean. Over recent years, various types of soil contamination have been studied to understand the impact of pollution type, uncovering the mechanical complexity that arises not just from the pollutant type but also from the properties of the host soil and the interplay between them. This complexity is associated with diametrically opposite effects in different soil types. For instance, while certain oils may enhance the frictional properties of cohesive soils, they can reduce the friction of granular soils. This striking difference can be attributed to the different mechanisms at play: physico-chemical interactions predominate in the former case, whereas lubrication effects are more significant in the latter. This study introduces an empirical law designed to quantify the mechanical effect of oil contamination in granular soils, factoring in the properties of both the contaminating oil and the host soil. The law is derived from comprehensive experimental research spanning a wide array of oil types and soils with distinct configurations and morphologies. By integrating these diverse data points, the proposed law facilitates accurate predictions of how oil contamination modifies the frictional characteristics of granular soils in general.

Keywords: contaminated soils, lubrication, friction, granular media

Procedia PDF Downloads 29
1462 Benthic Cover in Coral Reef Environments under Influence of Submarine Groundwater Discharges

Authors: Arlett A. Rosado-Torres, Ismael Marino-Tapia

Abstract:

Shifts in the benthic cover of coral-dominated systems towards macroalgae dominance are widely studied worldwide, and watershed pollutants are potentially as important as overfishing in causing such phase shifts. In certain regions of the world, most continental inputs arrive through submarine groundwater discharges (SGD), which can play a significant ecological role because their nutrient concentrations are usually greater than those found in surface seawater. These stressors have adversely affected coral reefs, particularly in the Caribbean. Measurements of benthic cover (video tracing with a GoPro camera), reef roughness (acoustic estimates with an acoustic Doppler current velocity profiler and a differential GPS), thermohaline conditions (conductivity-temperature-depth (CTD) instrument) and nutrients were taken at different sites in the reef lagoon of Puerto Morelos, Q. Roo, Mexico, including sites with and without SGD influence. The results suggest a link between SGD, macroalgae cover and structural complexity. Point water samples and data series from a CTD diver confirm the presence of the SGD. At the SGD site, macroalgae cover is larger than at the other sites. To establish a causal link between this phase shift and the SGD, the DELFT 3D hydrodynamic model (FLOW and WAVE modules) was run under different environmental conditions and discharge magnitudes. The model was validated using measurements from oceanographic instruments anchored in the lagoon and forereef. The SGD consistently favors macroalgae populations and affects the structural complexity of the reef.

Keywords: hydrodynamic model, macroalgae, nutrients, phase shift

Procedia PDF Downloads 126
1461 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases

Authors: Mohammad A. Bani-Khaled

Abstract:

In this work we use the discrete Proper Orthogonal Decomposition (POD) transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical databases obtained from finite element simulations. The outcomes of the analysis will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, both in large-scale structures (aeronautical structures) and in nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to use computational mechanics to numerically simulate the dynamics. When using numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. In order to extract the system's dynamic properties and the strength of coupling among the various fields of the motion, processing techniques are required. The time Proper Orthogonal Decomposition transform is a powerful tool for processing such databases. It will be used to study the coupled dynamics of thin-walled basic structures, which are ideal to form a basis for a systematic study of coupled dynamics in structures of complex geometry.
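
A minimal sketch of the POD computation in Python/NumPy (synthetic snapshots standing in for a finite element database): the singular value decomposition of the mean-subtracted space-time snapshot matrix yields the spatial modes and their energy fractions, from which the strength of coupling between fields can be read off.

```python
import numpy as np

x = np.linspace(0, 1, 200)                 # spatial coordinate along the beam
t = np.linspace(0, 10, 400)                # snapshot times
# synthetic response database: two coupled standing modes plus measurement noise
U = (np.outer(np.sin(np.pi * x), np.cos(2 * t))
     + 0.3 * np.outer(np.sin(2 * np.pi * x), np.cos(5 * t))
     + 0.01 * np.random.default_rng(0).normal(size=(200, 400)))

U_mean = U.mean(axis=1, keepdims=True)
Phi, s, Vt = np.linalg.svd(U - U_mean, full_matrices=False)

energy = s**2 / np.sum(s**2)               # energy fraction captured by each POD mode
print("energy of first 4 modes:", np.round(energy[:4], 4))
# Phi[:, :2] are the dominant spatial modes; Vt[:2] are their time coefficients
```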

Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams

Procedia PDF Downloads 400
1460 Index t-SNE: Tracking Dynamics of High-Dimensional Datasets with Coherent Embeddings

Authors: Gaelle Candel, David Naccache

Abstract:

t-SNE is an embedding method that the data science community has widely adopted. It helps with two main tasks: displaying results by coloring items according to their class or feature value, and, in forensics, giving a first overview of the dataset distribution. Two interesting characteristics of t-SNE are its structure preservation property and its answer to the crowding problem, in which all neighbors in high-dimensional space cannot be represented correctly in low-dimensional space. t-SNE preserves the local neighborhood, and similar items are nicely spaced by adjusting to the local density. These two characteristics produce a meaningful representation in which a cluster's area is proportional to its size in number of items, and relationships between clusters are materialized by closeness in the embedding. The algorithm is non-parametric: the transformation from high- to low-dimensional space is described but not learned, and two initializations of the algorithm lead to two different embeddings. In a forensic approach, analysts would like to compare two or more datasets using their embeddings. A naive approach would be to embed all datasets together; however, this process is costly, as the complexity of t-SNE is quadratic, and it would be infeasible for too many datasets. Another approach would be to learn a parametric model over an embedding built with a subset of the data. While this approach is highly scalable, points could be mapped to the exact same position, making them indistinguishable, and such a model would be unable to adapt to new outliers or to concept drift. This paper presents a methodology for reusing an embedding to create a new one in which cluster positions are preserved. The optimization process minimizes two costs, one relative to the embedding shape and the second relative to the match with the support embedding. The embedding-with-support process can be repeated more than once using the newly obtained embedding, and the successive embeddings can be used to study the impact of one variable on the dataset distribution or to monitor changes over time. This method has the same complexity as t-SNE per embedding, and memory requirements are only doubled. For a dataset of n elements sorted and split into k subsets, the total embedding complexity is reduced from O(n²) to O(n²/k), and the memory requirement from n² to 2(n/k)², which enables computation on recent laptops. The method showed promising results on a real-world dataset, allowing the birth, evolution, and death of clusters to be observed. The proposed approach facilitates identifying significant trends and changes, which empowers the monitoring of the dynamics of high-dimensional datasets.
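
A minimal sketch that approximates the idea in Python with scikit-learn (this is not the authors' index t-SNE; it only reuses a previous embedding as the initialization of the next run so that cluster positions stay roughly comparable):

```python
import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X_old = np.vstack([rng.normal(0, 1, (100, 20)), rng.normal(4, 1, (100, 20))])
X_new = X_old + rng.normal(0, 0.1, X_old.shape)     # same items, slightly drifted

emb_old = TSNE(n_components=2, init="pca", random_state=0).fit_transform(X_old)
# seed the second run with the first embedding: clusters keep roughly the same places
emb_new = TSNE(n_components=2, init=emb_old, random_state=0).fit_transform(X_new)

drift = np.linalg.norm(emb_new - emb_old, axis=1).mean()
print(f"mean point displacement between successive embeddings: {drift:.2f}")
```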

Keywords: concept drift, data visualization, dimension reduction, embedding, monitoring, reusability, t-SNE, unsupervised learning

Procedia PDF Downloads 119
1459 Seaworthiness and Liability Risks Involving Technology and Cybersecurity in Transport and Logistics

Authors: Eugene Wong, Felix Chan, Linsey Chen, Joey Cheung

Abstract:

The widespread use of technologies and cyber/digital means for complex maritime operations has led to a sharp rise in global cyber-attacks. These attacks have generated an increasing number of liability disputes, insurance claims, and legal proceedings. An array of antiquated case law, regulations, international conventions, and obsolete contractual clauses drafted in the pre-technology era has become grossly inadequate for addressing the contemporary challenges. This paper offers a critique of the ambiguity of cybersecurity liabilities under the obligation of seaworthiness entailed in the Hague-Visby Rules, which apply either by law in a large number of jurisdictions or by express incorporation into shipping documents. The paper also evaluates the legal and technological criteria for assessing whether a vessel is properly equipped with the latest offshore technologies for navigation and cargo delivery operations; examples include computer applications, networks and servers, enterprise systems, global positioning systems, and data centers. A critical analysis of carriers' obligations to exercise due diligence in preventing or mitigating cyber-attacks is also conducted. It is hoped that the present study will offer original and crucial insights to policymakers, regulators, carriers, cargo interests, and insurance underwriters closely involved in preventing and resolving disputes arising from cybersecurity liabilities.

Keywords: seaworthiness, cybersecurity, liabilities, risks, maritime, transport

Procedia PDF Downloads 114
1458 Circular Economy and Remedial Frameworks in Contract Law

Authors: Reza Beheshti

Abstract:

This paper examines remedies for defective manufactured goods in commercial circular-economy transactions. The linear ‘take-make-dispose’ model fits well with the conventional remedial framework, in which damages are considered the primary remedy. Damages under English sales law encourage buyers to look for a substitute seller offering goods broadly similar to those agreed on in the original contract, to enter into a contract with this new seller, and hence to terminate the original contract. By doing so, the buyer ends the contractual relationship. This seems contrary to the core principles of the circular economy: keeping products, components, and materials in use for longer, which can partly be achieved by product refurbishment. This process involves returning a product to good working condition by replacing or repairing major components that are faulty or close to failure and making ‘cosmetic’ changes to update the appearance of the product. This remedy has not been widely accepted or applied in commercial cases, which in turn flags up the secondary nature of performance-related remedies. This paper critically analyses the rules on the seller’s duty to cure in English law and the extent to which they correspond with the core principles of the circular economy. In addition, the paper considers the potential for circular-economy transactions to be characterised as something other than sales; in such situations, the likely outcome will be a licence to use products, which may limit the choice of remedy further. Consequently, this paper suggests an outline remedial framework specifically for commercial circular-economy transactions in manufactured goods.

Keywords: circular economy, contract law, remedies, English Sales Law

Procedia PDF Downloads 118
1457 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects

Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed

Abstract:

Multiscale entropy (MSE) analysis is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study investigates electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting the coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects, acquired using 64 electrodes, were taken from the publicly available machine learning repository of the University of California, Irvine (UCI). The MSE analysis was performed on the EEG data acquired from all electrodes of the alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic curve was computed to quantify the degree of separation between the groups. The mean ranks of the MSE values at all time scales and for all electrodes were higher for control subjects than for alcoholic subjects; higher mean ranks represent higher complexity, and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3 and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significance at different time scales. Likewise, the highest accuracy and separation were obtained in the central region (C3 and C4) and the front polar regions (P3, O1, F3, F7, F8 and T8), while other electrodes such as Fp1, Fp2, P4 and F4 showed no significant results.
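
A minimal sketch of the MSE procedure in Python/NumPy (standard coarse-graining plus sample entropy; the parameters m = 2 and r = 0.2 x std are the usual defaults, and the white-noise input is only a demonstration signal, not EEG):

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.2):
    """SampEn(m, r): -ln of the ratio of (m+1)-length to m-length template matches."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    def matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (np.sum(d <= r) - len(templates)) / 2    # pairs within r, excluding self-matches
    B, A = matches(m), matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

def multiscale_entropy(x, max_scale=5):
    values = []
    for tau in range(1, max_scale + 1):
        n = len(x) // tau
        coarse = np.asarray(x[:n * tau]).reshape(n, tau).mean(axis=1)  # coarse-graining
        values.append(sample_entropy(coarse))
    return values

rng = np.random.default_rng(0)
signal = rng.normal(size=1000)             # white noise: entropy decreases with scale
print([round(v, 2) for v in multiscale_entropy(signal)])
```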

Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test (MWT), receiver operating characteristic (ROC) curve, complexity analysis

Procedia PDF Downloads 353
1456 Formulation and Test of a Model to explain the Complexity of Road Accident Events in South Africa

Authors: Dimakatso Machetele, Kowiyou Yessoufou

Abstract:

Whilst several studies have indicated that road accident events might be more complex than previously thought, we have a limited scientific understanding of this complexity in South Africa. The present project proposes and tests a more comprehensive metamodel that integrates multiple causality relationships among variables previously linked to road accidents. This was done by fitting a structural equation model (SEM) to data collected from various sources. The study also fitted a GARCH (generalized autoregressive conditional heteroskedasticity) model to predict the future of road accidents in the country. The analysis shows that the number of road accidents has been increasing since 1935. The road fatality rate follows a polynomial shape given by the equation y = -0.0114x² + 1.2378x - 2.2627 (R² = 0.76), with y = death rate and x = year; this trend results in an average death rate of 23.14 deaths per 100,000 people. Furthermore, the analysis shows that the number of crashes is significantly explained by the total number of vehicles (P < 0.001), the number of registered vehicles (P < 0.001), the number of unregistered vehicles (P = 0.003) and the population of the country (P < 0.001). Contrary to expectation, the number of driver licenses issued and the total distance traveled by vehicles do not correlate significantly with the number of crashes (P > 0.05). The number of casualties is linked significantly to the number of registered vehicles (P < 0.001) and the total distance traveled by vehicles (P = 0.03). As for fatal crashes, the total number of vehicles (P < 0.001), the numbers of registered (P < 0.001) and unregistered vehicles (P < 0.001), the population of the country (P < 0.001) and the total distance traveled by vehicles (P < 0.001) all correlate significantly with the number of fatal crashes, whereas the number of casualties and, again, the number of driver licenses do not (P > 0.05). Finally, the number of crashes is predicted to remain roughly constant over time at 617,253 accidents for the next 10 years, although the worst-case scenario suggests this number may reach 1,896,667. The number of casualties is also predicted to remain roughly constant at 93,531, although it may reach 661,531 in the worst-case scenario. Although the number of fatal crashes may decrease over time, it is forecast to reach 11,241 within the next 10 years, with the worst-case estimate at 19,034 over the same period. The number of fatalities is likewise predicted to remain roughly constant at 14,739 but may reach 172,784 in the worst-case scenario. Overall, the present study reveals the complexity of road accidents and allows us to propose several recommendations aimed at reducing the trends in road accidents, casualties, fatal crashes, and deaths in South Africa.
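
A minimal sketch of the forecasting step in Python with the arch package (a synthetic series and an assumed AR(1)-GARCH(1,1) specification; the paper's exact model is not given):

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
y = rng.normal(0, 1, 500).cumsum() * 0.1 + rng.normal(0, 1, 500)   # toy accident-like series
changes = np.diff(y)                                               # model the period-on-period changes

am = arch_model(changes, mean="AR", lags=1, vol="GARCH", p=1, q=1)
res = am.fit(disp="off")
forecast = res.forecast(horizon=10)
print(res.params.round(3))
print(forecast.mean.iloc[-1].round(3))     # 10-step-ahead mean forecast
```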

Keywords: road accidents, South Africa, statistical modelling, trends

Procedia PDF Downloads 134
1455 Enhancing Coaching Development in South African Women’s Rugby: Insights from Coaches and Players on Effectiveness

Authors: Jocelyn Solomons, Sheree Bekker, Ryan Groom, Wilbur Kraak

Abstract:

Sports coaching is marked by inherent elements of complexity. Coaches constantly encounter ambiguity, as they are unable to have complete certainty regarding the perspectives and expectations of stakeholders. Moreover, the coaching environment is characterised by its dynamic nature and intricate micro-political dynamics which further add to the complexity that coaches must navigate. This research study offers a unique perspective on the practical manifestation of coaching effectiveness in the South African (SA) context, where the sport is in its early stages of development. With a predominant presence of male coaches training female players and players originating from diverse sporting backgrounds, including a majority of those who commence their rugby careers at the university level, this exploration, along with practical recommendations, becomes essential. It allows for a nuanced understanding of coaching practices within a rugby system that concurrently focuses on development and high performance. By integrating the views of both players and coaches, insights are gained that extend traditional assessments, enabling a comprehensive understanding of coaching effectiveness and its implications in this evolving Women’s Rugby landscape. Through semi-structured interviews, the research delves into their assessments of coaching strategies, methodologies, and outcomes, aiming to understand coaching efficacy and its impact on player development. The findings contribute to a nuanced understanding of coaching effectiveness, paving the way for evidence-based recommendations to enhance coaching development and positively impact the sport's growth and success in SA.

Keywords: women’s rugby, coaching effectiveness, coaching, rugby, coaching education

Procedia PDF Downloads 23