Search results for: Marcel Wehrle
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 35

35 Building a Hierarchical, Granular Knowledge Cube

Authors: Alexander Denzler, Marcel Wehrle, Andreas Meier

Abstract:

A knowledge base stores facts and rules about the world that applications can use for the purpose of reasoning. By applying the concept of granular computing to a knowledge base, several advantages emerge. These can be harnessed by applications to improve their capabilities and performance. In this paper, the concept behind such a construct, called a granular knowledge cube, is defined, and its intended use as an instrument that manages to cope with different data types and detect knowledge domains is elaborated. Furthermore, the underlying architecture, consisting of the three layers of the storing, representing, and structuring of knowledge, is described. Finally, benefits as well as challenges of deploying it are listed alongside application types that could profit from having such an enhanced knowledge base.
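The layered, hierarchical organisation described above can be sketched in a few lines. This is purely illustrative: the class and method names below are assumptions for the sketch, not the authors' architecture, and a real granular knowledge cube would also store rules and detect domains automatically.

```python
# Minimal sketch of hierarchically structured knowledge granules
# (illustrative only; names are assumptions, not the authors' design).

class Granule:
    """A knowledge granule: a set of facts grouped at one level of detail."""
    def __init__(self, label, facts=None):
        self.label = label
        self.facts = set(facts or [])
        self.children = []

    def add_child(self, child):
        self.children.append(child)
        return child

    def all_facts(self):
        """Facts of this granule plus all finer-grained granules below it."""
        facts = set(self.facts)
        for child in self.children:
            facts |= child.all_facts()
        return facts

# A coarse granule refined into finer domain granules
root = Granule("animals", {"animals breathe"})
birds = root.add_child(Granule("birds", {"birds lay eggs"}))
birds.add_child(Granule("penguins", {"penguins cannot fly"}))

print(len(root.all_facts()))  # aggregated view across all levels of granularity
```

Querying at a coarse granule aggregates everything beneath it, while querying a fine granule narrows the view, which is the kind of multi-resolution access the cube is meant to provide.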

Keywords: granular computing, granular knowledge, hierarchical structuring, knowledge bases

Procedia PDF Downloads 459
34 Gabriel Marcel and Friedrich Nietzsche: Existence and Death of God

Authors: Paolo Scolari

Abstract:

Nietzschean thought flows like a current throughout Marcel's philosophy. Marcel is in constant dialogue with Nietzsche and pays homage to him, regarding him as one of the most eminent representatives of existential thought. His enthusiasm is triggered by Nietzsche's phrase 'God is dead', the fil rouge that ties together all of the Nietzschean references scattered through Marcelian texts. The death of God is the theme that emphasises both the greatness and, simultaneously, the tragedy of Nietzsche. Marcel wants to restore to the idea 'God is dead' its original meaning: a tragic existential insight that Nietzsche's imitators seem to have blurred. His interpretation aims at a double target. On the one hand, he strips Nietzsche's aphorisms on the death of God of the heavy metaphysical suit that his interpreters, Heidegger especially, have made them wear. On the other hand, he removes a layer of trivialisation that takes the aphorisms out of context and transforms them into advertising slogans; here Sartre becomes the target. In the lecture Nietzsche: l'homme devant la mort de Dieu, Marcel hurls himself against Heidegger's metaphysical interpretation of the death of God: a hermeneutical proposal that is certainly original, but also too abstract, an interpretation without bite that does not grasp the tragic existential weight of the original Nietzschean idea. 'We are probably on the wrong road,' he announces, 'when at all costs, like Heidegger, we want to make a metaphysic out of Nietzsche.' Marcel also criticizes Sartre, who, on landing in Geneva, reacts to the journalists by saying: 'Gentlemen, God is dead.' Marcel needs only this impromptu exclamation to understand how Sartre misinterprets the meaning of the death of God: Sartre misses its existential sense in favour of the sensational and the trivial.
Marcel then wipes the slate clean of these two limited interpretations of the declaration of the death of God. It is much more than a metaphysical quarrel and not at all comparable to an advertising slogan. Behind the cry 'God is dead' stands the existence of an anguished man who experiences in his solitude the actual death of God: a man who has killed God with his own hands, haunted by the chill of knowing that from now on he will have to live in a completely different way. The death of God, however, is not the end. Marcel spots a new beginning at the point at which nihilism is overcome and the Übermensch is born. Dialoguing with Nietzsche, he realises that he is in the presence of a great spirit who has contributed to the renewal of a spiritual horizon. He descends to the most profound depths of Nietzsche's thought, aware that the way out lies far below, in the remotest areas of existence. The ambivalence of Nietzsche does not scare him; rather, such a thought, characterised by contradiction, is simultaneously infinitely dangerous and infinitely healthy.

Keywords: Nietzsche's Death of God, Gabriel Marcel, Heidegger, Sartre

Procedia PDF Downloads 184
32 Doing More with Less: Passion for Entrepreneurship in the Resource-Constrained Contexts of Developing and Emerging Economies

Authors: Marcel Hechler

Abstract:

Since passion is considered one of the most important motivating factors for entrepreneurship, we examined the influence of the availability of resources and information on the emergence of a harmonious passion for entrepreneurship (HPE). Drawing on self-determination theory and a cross-cultural sample of 1,085 entrepreneurs from seven developing countries, we argue that the availability of resources and information increases an entrepreneur's autonomy and, thus, promotes the emergence of HPE.

Keywords: harmonious passion, access to resources and information, developing and emerging countries, self-determination theory

Procedia PDF Downloads 125
32 Good Advice Is Hard to Come By: A Cross-Cultural Perspective on Opposing Views and Entrepreneurial Passion

Authors: Marcel Hechler

Abstract:

The purpose of this study is to understand the impact of entrepreneurs' receptiveness to opposing views on their entrepreneurial passion. Following a cross-cultural approach, we surveyed 1,228 entrepreneurs in seven developing and emerging countries. Besides a positive relationship between receptiveness to opposing views and harmonious passion for entrepreneurship, we found initial evidence of a significant moderating effect: access to information reinforces the positive main effect.

Keywords: harmonious passion, developing and emerging countries, self-determination theory, receptiveness to opposing views

Procedia PDF Downloads 174
31 Low-Emission Commuting with Micro Public Transport: Investigation of Travel Times and CO₂ Emissions

Authors: Marcel Ciesla, Victoria Oberascher, Sven Eder, Stefan Kirchweger, Wolfgang E. Baaske, Gerald Ostermayer

Abstract:

The omnipresent trend towards sustainable mobility is a major challenge, especially for commuters in rural areas. The use of micro public transport systems is expected to significantly reduce pollutant emissions, as several commuters travel the first mile together in a single pick-up bus instead of their own cars. In this paper, different aspects of such a micro public transport system are analyzed. The main aim of the investigation is to determine how commuters' travel times change and how much CO₂ can be saved if some commuters use micro public transport instead of their own vehicles.
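The core of the CO₂ comparison can be illustrated with a back-of-the-envelope calculation: one shared pick-up bus replaces several individual car trips on the first mile. The emission factors, group size, and trip length below are illustrative assumptions, not the paper's data, and a real analysis would also account for detours the bus makes to pick everyone up.

```python
# Back-of-the-envelope sketch of the first-mile CO2 comparison the paper
# investigates: a shared pick-up bus versus one car per commuter.
# Emission factors and distances are illustrative assumptions.

CAR_G_PER_KM = 150.0     # assumed average car emissions [g CO2 / km]
BUS_G_PER_KM = 350.0     # assumed minibus emissions [g CO2 / km]

def daily_first_mile_co2(commuters, km_per_trip, shared):
    """Total first-mile CO2 [g] for one morning: shared bus or solo cars."""
    if shared:
        return BUS_G_PER_KM * km_per_trip          # one bus serves them all
    return commuters * CAR_G_PER_KM * km_per_trip  # one car each

cars = daily_first_mile_co2(6, 4.0, shared=False)
bus = daily_first_mile_co2(6, 4.0, shared=True)
print(cars, bus, round(100 * (1 - bus / cars)))    # percentage saved by sharing
```

Even though the bus emits more per kilometre than a single car, pooling six commuters cuts first-mile emissions by well over half in this toy setting.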

Keywords: micro public transport, green transportation, sustainable mobility, low-emission commuting

Procedia PDF Downloads 419
30 Towards a Standardization in Scheduling Models: Assessing the Variety of Homonyms

Authors: Marcel Rojahn, Edzard Weber, Norbert Gronau

Abstract:

Terminology is a critical instrument for every researcher. Different terminologies for the same research object may arise in different research communities, and this inconsistency forfeits many synergies. Theories and models become more understandable and reusable when a common terminology is applied. This paper examines the terminological (in)consistency of the research field of job-shop scheduling through a literature review. There is an enormous variety in the choice of terms and mathematical notation for the same concept, and the comparability, reusability, and combinability of scheduling methods are unnecessarily hampered by the arbitrary use of homonyms and synonyms. The community's acceptance of particular variables and notation forms is quantified by means of a compliance quotient, derived from an evaluation of 240 scientific publications on planning methods.
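One plausible reading of such a compliance quotient is the share of surveyed publications that use the most common notation for a given concept. The sketch below implements that reading; the exact formula in the paper may differ, and the sample symbols are made up.

```python
from collections import Counter

def compliance_quotient(notations):
    """Share of publications using the most common notation for a concept.

    `notations` lists, per publication, the symbol chosen for the same
    concept (e.g. processing time). A quotient of 1.0 means full agreement.
    Illustrative definition; the paper's exact formula may differ.
    """
    counts = Counter(notations)
    most_common = counts.most_common(1)[0][1]
    return most_common / len(notations)

# Hypothetical symbols used for "processing time of job j on machine i"
sample = ["p_ij", "p_ij", "t_ij", "p_ij", "d_ij"]
print(compliance_quotient(sample))  # 0.6
```

A quotient near 1.0 would indicate a de facto standard notation, while low values flag concepts where the community's usage is fragmented.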

Keywords: job-shop scheduling, terminology, notation, standardization

Procedia PDF Downloads 73
29 Inverse Heat Transfer Analysis of a Melting Furnace Using Levenberg-Marquardt Method

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents a simple inverse heat transfer procedure for predicting the wall erosion and the time-varying thickness of the protective bank that covers the inside surface of the refractory brick wall of a melting furnace. The direct problem is solved by using the Finite-Volume model. The melting/solidification process is modeled using the enthalpy method. The inverse procedure rests on the Levenberg-Marquardt method combined with the Broyden method. The effect of the location of the temperature sensors and of the measurement noise on the inverse predictions is investigated. Recommendations are made concerning the location of the temperature sensor.
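The inverse step can be illustrated on a drastically simplified problem: recovering a wall's thermal resistance from noisy sensor temperatures with a hand-rolled Levenberg-Marquardt loop. This is a sketch only; the authors' direct model involves phase change and a finite-volume solver, and their Jacobian is updated with Broyden's method rather than computed analytically as here.

```python
import numpy as np

# Toy inverse heat transfer problem: recover beta = 1/k (thermal
# resistance) of a 1D slab wall from noisy temperature readings using a
# damped Gauss-Newton (Levenberg-Marquardt) loop. Illustrative sketch;
# all parameter values are assumptions, not the paper's.

L, q, T_cold = 0.1, 5000.0, 300.0        # thickness [m], heat flux [W/m^2], cold-face T [K]
x = np.linspace(0.0, L, 6)               # sensor positions

def model(beta):
    """Steady 1D conduction: T(x) = T_cold + q*(L - x)*beta, beta = 1/k."""
    return T_cold + q * (L - x) * beta

beta_true = 0.5                          # i.e. k = 2 W/(m K)
rng = np.random.default_rng(0)
T_meas = model(beta_true) + rng.normal(0.0, 0.5, x.size)   # noisy sensors

beta, lam = 0.05, 1e-3                   # poor initial guess, damping factor
for _ in range(30):
    r = T_meas - model(beta)             # residual vector
    J = q * (L - x)                      # Jacobian d(model)/d(beta)
    step = (J @ r) / (J @ J + lam)       # damped Gauss-Newton update
    if np.sum((T_meas - model(beta + step))**2) < np.sum(r**2):
        beta, lam = beta + step, lam / 2 # accept step, relax damping
    else:
        lam *= 10                        # reject step, increase damping

print(round(1.0 / beta, 2))             # recovered conductivity, near 2.0
```

The damping term is what distinguishes Levenberg-Marquardt from plain Gauss-Newton: rejected steps increase it (behaving like gradient descent), accepted steps decrease it, which is valuable when, as in the paper, the direct model is strongly nonlinear.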

Keywords: melting furnace, inverse heat transfer, enthalpy method, Levenberg-Marquardt method

Procedia PDF Downloads 288
28 An Explorative Study of the Application of Project Management in German Research Projects

Authors: Marcel Randermann, Roland Jochem

Abstract:

Research activities are mostly conducted in the form of projects; in fact, research projects account for the highest share of all project forms combined. However, project management is rarely applied purposefully by researchers and scientists. More specifically, no project management frameworks, methods, or tools are being used to plan, execute, or control research projects so as to ensure research success or improve project quality. In this qualitative study, several interviews were conducted with scientists and research managers from German institutions to gain insights into project management activities, to determine challenges and barriers, and to evaluate premises for successful project management. The analyses show that conventional project management is not easily applicable in scientific environments and that researchers' mindsets prevent a reasonable application.

Keywords: academics, project management methods, research and science projects, scientist's mindset

Procedia PDF Downloads 153
27 Delivery of Positively Charged Proteins Using Hyaluronic Acid Microgels

Authors: Elaheh Jooybar, Mohammad J. Abdekhodaie, Marcel Karperien, Pieter J. Dijkstra

Abstract:

In this study, hyaluronic acid (HA) microgels were developed for protein delivery. First, a hyaluronic acid-tyramine conjugate (HA-TA) was synthesized with a degree of substitution of 13 TA moieties per 100 disaccharide units. Then, HA-TA microdroplets were produced using a water-in-oil emulsion method and crosslinked in the presence of horseradish peroxidase (HRP) and hydrogen peroxide (H2O2). The loading capacity and release kinetics of lysozyme and BSA, as model proteins, were investigated. It was shown that lysozyme, a cationic protein, can be incorporated efficiently into the HA microgels, while the loading efficiency for BSA, a negatively charged protein, is low. The release profile of lysozyme showed a sustained release over a period of one month. The results demonstrate that HA-TA microgels are a good carrier for the spatial delivery of cationic proteins in biomedical applications.

Keywords: microgel, inverse emulsion, protein delivery, hyaluronic acid, crosslinking

Procedia PDF Downloads 131
26 Trade Liberalisation and South Africa’s CO2 Emissions

Authors: Marcel Kohler

Abstract:

The effect of trade liberalization on environmental conditions has generated a great deal of debate in the current energy economics literature. Although research on the relationship between income growth and CO2 emissions is not new in South Africa, few studies address the role that South Africa's foreign trade plays in this context. This paper empirically investigates the impact of South Africa's foreign trade reforms over the last four decades on its energy consumption and CO2 emissions, taking into account not only the direct effect of trade on each but also its indirect effect through induced income growth. Using cointegration techniques, we attempt to disentangle the long- and short-run relationships between trade openness, income per capita, energy consumption, and CO2 emissions in South Africa. The preliminary results of this study support a positive bi-directional relationship between output and CO2 emissions, as well as between trade openness and CO2 emissions. This evidence confirms the expectation that as the South African economy opens up to foreign trade and experiences growth in per capita income, the country's CO2 emissions will increase.
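The cointegration logic the abstract relies on can be sketched with the two-step Engle-Granger idea on synthetic data: regress one nonstationary series on another, then check that the residual mean-reverts. This is illustrative only (the study uses actual South African openness, income, and emissions series, and proper inference requires Engle-Granger critical values rather than a bare sign check).

```python
import numpy as np

# Sketch of the Engle-Granger two-step idea behind cointegration analysis,
# on synthetic data (numpy only; illustrative, not the paper's data).

rng = np.random.default_rng(1)
n = 500
trade = np.cumsum(rng.normal(size=n))              # I(1) "trade openness"
co2 = 2.0 * trade + rng.normal(scale=0.5, size=n)  # cointegrated with it

# Step 1: OLS estimate of the long-run relation co2 = b * trade
b = (trade @ co2) / (trade @ trade)
resid = co2 - b * trade

# Step 2: Dickey-Fuller-style regression of d(resid) on lagged resid.
# A clearly negative slope means the residual mean-reverts, i.e. the two
# series share a long-run equilibrium. (Proper inference would compare
# the statistic against Engle-Granger critical values.)
de = np.diff(resid)
lag = resid[:-1]
rho = (lag @ de) / (lag @ lag)

print(round(b, 2), rho < 0)   # long-run slope near 2.0, mean-reverting residual
```

If the two series were independent random walks instead, the residual would itself wander (rho near 0) and the regression would be spurious, which is exactly the pitfall cointegration testing guards against.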

Keywords: trade openness, CO2 emissions, cointegration, South Africa

Procedia PDF Downloads 371
25 Prediction of the Thermal Parameters of a High-Temperature Metallurgical Reactor Using Inverse Heat Transfer

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study presents an inverse analysis for predicting the thermal conductivities and the heat flux of a high-temperature metallurgical reactor simultaneously. Once these thermal parameters are predicted, the time-varying thickness of the protective phase-change bank that covers the inside surface of the brick walls of a metallurgical reactor can be calculated. The enthalpy method is used to solve the melting/solidification process of the protective bank. The inverse model rests on the Levenberg-Marquardt Method (LMM) combined with the Broyden method (BM). A statistical analysis for the thermal parameter estimation is carried out. The effect of the position of the temperature sensors, total number of measurements and measurement noise on the accuracy of inverse predictions is investigated. Recommendations are made concerning the location of temperature sensors.

Keywords: inverse heat transfer, phase change, metallurgical reactor, Levenberg–Marquardt method, Broyden method, bank thickness

Procedia PDF Downloads 296
24 Comparison of Different Electrical Machines with Permanent Magnets in the Stator for Use as an Industrial Drive

Authors: Marcel Lehr, Andreas Binder

Abstract:

This paper compares three different permanent magnet synchronous machines (Doubly-Salient-Permanent-Magnet-Machine (DSPM), Flux-Reversal-Permanent-Magnet-Machine (FRPM), Flux-Switching-Permanent-Magnet-Machine (FSPM)) with the permanent magnets in the stator of the machine for use as an industrial drive for 400 V Y, 45 kW and 1000 to 3000 min⁻¹. The machines are compared based on the magnetic co-energy and Finite-Element-Method simulations regarding the torque density. The results show that the FSPM provides the highest torque density of the three machines. Therefore, an FSPM prototype was built, tested on a test bench, and finally compared with an already built conventional permanent magnet synchronous machine (PMSM) of the same size (stator outer diameter dso = 314 mm, axial length lFe = 180 mm) and rating with surface-mounted rotor magnets. These measurements show that the conventional PMSM and the FSPM machine are roughly equivalent in their electrical behavior.

Keywords: doubly-salient-permanent-magnet-machine, flux-reversal-permanent-magnet-machine, flux-switching-permanent-magnet-machine, industrial drive

Procedia PDF Downloads 328
23 An Inverse Heat Transfer Algorithm for Predicting the Thermal Properties of Tumors during Cryosurgery

Authors: Mohamed Hafid, Marcel Lacroix

Abstract:

This study aimed at developing an inverse heat transfer approach for predicting the time-varying freezing front and the temperature distribution of tumors during cryosurgery. Using a temperature probe pressed against the layer of tumor, the inverse approach is able to predict simultaneously the metabolic heat generation and the blood perfusion rate of the tumor. Once these parameters are predicted, the temperature field and time-varying freezing fronts are determined with the direct model. The direct model rests on the one-dimensional Pennes bioheat equation. The phase change problem is handled with the enthalpy method. The Levenberg-Marquardt Method (LMM) combined with the Broyden Method (BM) is used to solve the inverse model. The effects (a) of the thermal properties of the diseased tissues; (b) of the initial guesses for the unknown thermal properties; (c) of the data capture frequency; and (d) of the noise on the recorded temperatures are examined. It is shown that the proposed inverse approach remains accurate for all the cases investigated.
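The direct model named above, the 1D Pennes bioheat equation, can be sketched with an explicit finite-difference solve. The phase change (enthalpy method) and the inverse LM-Broyden loop are deliberately omitted, and the parameter values are generic soft-tissue figures chosen for illustration, not the paper's.

```python
import numpy as np

# Direct-model sketch: explicit finite-difference solve of the 1D Pennes
# bioheat equation  rho*c dT/dt = k d2T/dx2 + w_cb*(T_a - T) + q_m.
# Phase change / enthalpy handling is omitted; parameters are generic
# soft-tissue assumptions, not the paper's values.

k = 0.5          # tissue conductivity [W/(m K)]
rho_c = 4.0e6    # volumetric heat capacity [J/(m^3 K)]
w_cb = 2000.0    # blood perfusion rate x blood heat capacity [W/(m^3 K)]
q_m = 4000.0     # metabolic heat generation [W/m^3]
T_a = 37.0       # arterial temperature [C]

nx, dx, dt = 41, 1.0e-3, 0.1     # grid and (stable) explicit time step
T = np.full(nx, 37.0)
T[0] = -40.0                     # cryoprobe held against the left face

for _ in range(6000):            # 10 minutes of simulated time
    lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
    T[1:-1] += dt / rho_c * (k * lap + w_cb * (T_a - T[1:-1]) + q_m)
    T[-1] = 37.0                 # far boundary stays at core temperature

print(round(T[1], 1), round(T[-2], 1))  # cold near the probe, warm far away
```

In the paper's inverse setting, a solver like this is run repeatedly inside the LM loop: candidate values of metabolic heat generation and perfusion are adjusted until the simulated probe temperatures match the measured ones.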

Keywords: cryosurgery, inverse heat transfer, Levenberg-Marquardt method, thermal properties, Pennes model, enthalpy method

Procedia PDF Downloads 162
22 Validating Condition-Based Maintenance Algorithms through Simulation

Authors: Marcel Chevalier, Léo Dupont, Sylvain Marié, Frédérique Roffet, Elena Stolyarova, William Templier, Costin Vasile

Abstract:

Industrial end-users are currently facing an increasing need to reduce the risk of unexpected failures and optimize their maintenance. This calls for both short-term analysis and long-term ageing anticipation. At Schneider Electric, we tackle those two issues using both machine learning and first-principles models. Machine learning models are incrementally trained from normal data to predict expected values and detect statistically significant short-term deviations. Ageing models are constructed by breaking down physical systems into sub-assemblies, then determining relevant degradation modes and associating each one with the right kinetic law. Validating such anomaly detection and maintenance models is challenging, both because actual incident and ageing data are rare and distorted by human interventions, and because incremental learning depends on human feedback. To overcome these difficulties, we propose to simulate physics, systems, and humans (including asset maintenance operations) in order to validate the overall approaches in accelerated time and possibly choose between algorithmic alternatives.
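The short-term side of the approach, learning expected values from normal data and flagging statistically significant deviations, can be reduced to a minimal baseline. This z-score sketch is an assumption-laden simplification: Schneider Electric's actual models are incrementally trained and far richer.

```python
import numpy as np

# Minimal sketch of deviation detection against a baseline learned from
# normal data (illustrative only; the paper's models are incremental
# and considerably more sophisticated).

rng = np.random.default_rng(2)
normal = rng.normal(50.0, 2.0, 1000)        # healthy sensor history
mu, sigma = normal.mean(), normal.std()     # learned expected behaviour

def is_anomaly(x, z_threshold=4.0):
    """Flag readings that deviate significantly from the learned baseline."""
    return abs(x - mu) / sigma > z_threshold

print(is_anomaly(51.0), is_anomaly(70.0))   # False True
```

A simulated asset, as the paper proposes, lets one inject known faults into such a detector and measure detection rates in accelerated time, something rarely possible with scarce real incident data.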

Keywords: degradation models, ageing, anomaly detection, soft sensor, incremental learning

Procedia PDF Downloads 88
21 Multivariate Analytical Insights into Spatial and Temporal Variation in Water Quality of a Major Drinking Water Reservoir

Authors: Azadeh Golshan, Craig Evans, Phillip Geary, Abigail Morrow, Zoe Rogers, Marcel Maeder

Abstract:

Twenty-two physicochemical variables were determined in water samples collected weekly from January to December 2013 from three sampling stations located within a major drinking water reservoir. Classical Multivariate Curve Resolution Alternating Least Squares (MCR-ALS) analysis was used to investigate the environmental factors associated with the physicochemical variability of the water samples at each of the sampling stations. Matrix-augmentation MCR-ALS (MA-MCR-ALS) was also applied, and the two sets of results were compared for interpretative clarity. Links between these factors, reservoir inflows, and catchment land uses were investigated and interpreted in relation to the chemical composition of the water and the resolved geographical distribution profiles. The results suggested that the major factors affecting reservoir water quality were those associated with agricultural runoff, with evidence of influence on algal photosynthesis within the water column. Water quality variability within the reservoir was also found to be strongly linked to physical parameters such as water temperature and the occurrence of thermal stratification. The two methods applied (MCR-ALS and MA-MCR-ALS) led to similar conclusions; however, MA-MCR-ALS appeared to provide results more amenable to the interpretation of temporal and geographical variation than those obtained through classical MCR-ALS.
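At its core, MCR-ALS factors a data matrix D (samples by variables) into contributions C and signatures S, alternating constrained least-squares updates. The bare-bones sketch below uses only a non-negativity constraint on synthetic rank-2 data; real MCR-ALS adds closure constraints, initial estimates from purest variables, and more, and recovers factors only up to scaling and permutation.

```python
import numpy as np

# Bare-bones sketch of the MCR-ALS idea: factor D ~ C @ S.T by
# alternating least squares with a non-negativity constraint.
# Synthetic data; illustrative only.

rng = np.random.default_rng(3)
C_true = rng.uniform(0, 1, (30, 2))   # 2 hidden "factors" over 30 samples
S_true = rng.uniform(0, 1, (5, 2))    # factor signatures over 5 variables
D = C_true @ S_true.T                 # observed data matrix

C = rng.uniform(0, 1, (30, 2))        # random starting estimate
for _ in range(300):
    # solve for S given C, then C given S, clipping to non-negative values
    S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
    C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)

err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
print(round(err, 4))                  # relative reconstruction error
```

Matrix augmentation (MA-MCR-ALS) extends this by stacking the matrices from the three sampling stations into one augmented D, forcing a common set of signatures S while allowing station-specific contributions C.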

Keywords: drinking water reservoir, multivariate analysis, physico-chemical parameters, water quality

Procedia PDF Downloads 246
20 The Implications of Digital Art Passing the Turing Test

Authors: Marcel Becker

Abstract:

Application of the Turing test to digital art will lead to a better understanding of the meaning of digital art. This topic has gained importance with the emergence of AICAN ('AI Creative Adversarial Network') machines. Many digital artworks will pass the test, but it must be questioned whether this disqualifies human art. The paper ties into two threads of philosophical discussion. It starts with discussions of the Turing test. The paper first refutes the objection that the comparison is not justified given the interactive set-up of the Turing test. In discussing the Turing test, the paper also lays bare a rarely discussed peculiarity of the test: Turing talked about the 'universal machine', but in the test the core question is whether the computer can replace a particular human being (a man, distinguished from a woman). This tension is used to emphasise the highly contextual use of digital machines. The main part of the paper discusses aesthetic theories in order to understand the possibilities and consequences of digital art passing the Turing test, and applies these theories to several categories of art. The main findings of the study are that the forms of digital art that 'imitate' great artists might pass the Turing test but do not contribute to the further development of art and do not fulfil a vital condition of art. There is also a category of art in which the personality and background of the artist are important; here there are limitations to how far digital art can replace humans in expressing and cultivating meaning. Nevertheless, there are forms of art that pass the Turing test without complications. The application of aesthetic theories shows that these forms are strongly connected to the autonomous and formalistic forms of art that emerged in the 20th century.

Keywords: digital art, Turing test, aesthetics, creative adversarial network

Procedia PDF Downloads 30
19 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models

Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue

Abstract:

Time-series data are useful for modelling as they enable model evaluation. However, when reconstructing models from phosphoproteomic data, non-exact methods are often utilised, as knowledge regarding the network structure, such as which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Thus, such reactions are often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which, given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user visually. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated against the model, leading to the highlighting of the nodes causing unsatisfiability (i.e., error-causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user visually. Thus, in this research we present a framework that enables a user to explore phosphoproteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain and which nodes cause incorrect simulation outputs. Such a tool enables an end-user to determine which empirical analysis to perform to reduce uncertainty in the presented model, thus enabling a better understanding of the underlying system.

Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation

Procedia PDF Downloads 217
18 Application of the Finite Window Method to a Time-Dependent Convection-Diffusion Equation

Authors: Raoul Ouambo Tobou, Alexis Kuitche, Marcel Edoun

Abstract:

The FWM (Finite Window Method) is a new numerical meshfree technique for solving problems defined either in terms of PDEs (Partial Differential Equations) or by a set of conservation/equilibrium laws. The principle behind the FWM is that each element of the concerned domain interacts with its neighbors and will always try to adapt so as to stay in equilibrium with them. This leads to a very simple and robust problem-solving scheme, well suited for transfer problems. In this work, we have applied the FWM to an unsteady scalar convection-diffusion equation. Despite its simplicity, it is well known that convection-diffusion problems can be challenging to solve numerically, especially when convection is highly dominant. This has led researchers to adopt the scalar convection-diffusion equation as a benchmark for analyzing and deriving the conditions or artifacts needed to numerically solve problems where convection and diffusion occur simultaneously. We show here that the standard FWM can solve convection-diffusion equations robustly, as no adjustments (upwinding or added artificial diffusion) were required to obtain good results, even for high Peclet numbers and coarse space and time steps. A comparison was performed between the FWM scheme and both a first-order implicit Finite Volume scheme (upwind scheme) and a third-order implicit Finite Volume scheme (QUICK scheme). The comparison showed that, for equal space and time grid spacing, the FWM yields much better precision than the Finite Volume schemes used, all having similar computational cost and condition number.
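For reference, the first-order upwind scheme used as a comparison baseline can be written in a few lines for the 1D unsteady scalar convection-diffusion equation. The sketch below is explicit in time rather than implicit as in the paper, and the grid, coefficients, and boundary conditions are illustrative assumptions.

```python
import numpy as np

# Sketch of the first-order upwind scheme for unsteady 1D
# convection-diffusion: d(phi)/dt + u d(phi)/dx = gamma d2(phi)/dx2,
# step inlet phi=1 convecting into a zero initial field.
# Explicit in time; parameters are illustrative, not the paper's.

u, gamma = 1.0, 0.01           # convection speed, diffusivity
nx, dx, dt = 101, 0.01, 0.002  # cell Peclet number = u*dx/gamma = 1
phi = np.zeros(nx)
phi[0] = 1.0                   # inlet boundary value

for _ in range(250):           # advance to t = 0.5
    conv = -u * (phi[1:-1] - phi[:-2]) / dx           # upwind difference (u > 0)
    diff = gamma * (phi[2:] - 2 * phi[1:-1] + phi[:-2]) / dx**2
    phi[1:-1] += dt * (conv + diff)
    phi[-1] = phi[-2]          # zero-gradient outflow

print(round(phi[50], 2))       # about 0.5 at the front location x = u*t
```

Upwinding keeps this scheme stable at the cost of numerical diffusion that smears the front; the paper's claim is that the FWM achieves good results without such artifacts.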

Keywords: Finite Window Method, Convection-Diffusion, Numerical Technique, Convergence

Procedia PDF Downloads 298
17 The Effect of Loud Working Environment on Incidence of Back Pain

Authors: Marcel Duh, Jadranka Stricevic, David Halozan, Dusan Celan

Abstract:

Back pain is not only the result of structural or biomechanical abnormalities of the spine but is also associated with cognitive and behavioral aspects of pain and thus represents a biopsychosocial problem. Stressors include not only interpersonal conflicts, negative life events, and dangerous situations but also noise. The effects of noise on human beings are psychological (excitement, stress), sensory, and physiological. The harmful effects of noise can be seen in the 40-65 dB range and are manifested as fatigue, irritability, poor sleep, and psychological discomfort of the worker. Within the 65-90 dB range, body metabolism increases, oxygen consumption is higher, tachycardia and hypertension appear, and the tone of the skeletal muscles increases. The purpose of the study was to determine whether the stress caused by noise at the workplace increases the incidence of back pain. Measurements of noise levels were carried out in three different wards of a social care institution. The measurement on each ward was repeated 3 times (a total of 9 measurements) for 8 hours during the morning shift. The device was set up in the room where clients spent most of the day. The staff on the ward replied to a questionnaire consisting of closed-type questions about basic demographic information and back pain. We found that the noise levels measured in our study had no statistically significant effect on the incidence of back pain (p = 0.90). We also found that health care workers who perceive their work as stressful have more back pain than those who do not, but the correlation is statistically insignificant (p = 0.682). Our study thus confirms the findings of other authors that noise levels below 65 dB do not have a significant influence on the incidence of back pain.

Keywords: health care workers, musculoskeletal disorder, noise, sick leave

Procedia PDF Downloads 87
16 Demand-Oriented Supplier Integration in Agile New Product Development Projects

Authors: Guenther Schuh, Stephan Schroeder, Marcel Faulhaber

Abstract:

Companies have been facing increasing pressure in recent years to innovate faster, cheaper, and more radically, due to shrinking product lifecycles and the higher volatility of markets and customer demands. Established companies in particular struggle to meet those demands. Thus, many producing companies are adapting their development processes to address this increasing pressure. One approach taken by many companies is the use of agile, highly iterative development processes to reduce development times and costs as well as to increase the fulfilment of customer requirements and the realized level of innovation. At the same time, decreasing depths of added value, an increasing focus on core competencies, and growing product complexity result in a high dependency on suppliers and external development partners during product development. A successful introduction of agile development methods into the development of physical products therefore also requires the successful integration of the necessary external partners and suppliers into the new processes and procedures, and an adaptation of the organizational interfaces to external partners according to the new circumstances and requirements of agile development processes. For effective and efficient product development, the design of customer-supplier relationships should be demand-oriented. The characteristics of the procurement object have a significant influence on the required design; examples are the complexity of the technical interfaces between the supply object and the final product, or the importance of the supplied component for the major product functionalities. This paper therefore presents an approach to derive general requirements on the design of supplier integration from the characteristics of supply objects. First, the most relevant evaluation criteria and characteristics were identified based on a thorough literature review. Subsequently, the resulting requirements on the design of the supplier integration were derived depending on the different possible values of these criteria.

Keywords: iterative development processes, agile new product development, procurement, supplier integration

Procedia PDF Downloads 135
15 Marketing Strategy and Marketing Mix for Rural Tour Package in Bali: Case Study of Munduk

Authors: Made Darmiati, Ni Putu Evi Wijayanti, Ni Ketut Wiwiek Agustina, Putu Gde Arie Yudhistira, Marcel Hardono

Abstract:

The establishment of tourist villages has been a main concern of pro-poor tourism in Indonesia, especially in Bali, in order to create alternative tourist destinations. The case study of this research was Munduk, a tourist village located in Buleleng Regency, Bali Province. Tourist visits to Munduk were unstable from 2012 to 2016. The concepts of marketing strategy and the marketing mix are suitable for application in Munduk, as the prime owner of trekking and other rural tour packages, to increase the number of visitors, particularly during the low season. The research aims to determine the internal factors (strengths and weaknesses) and external factors (opportunities and threats) impacting the number of tourist visits, so that an appropriate marketing strategy can be formulated for Munduk Tourist Village. Data were obtained by observation, interviews with stakeholders, a questionnaire administered to 100 participants, and documentation. The study uses descriptive qualitative methods and SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis of the internal and external factors impacting the level of tourist visits to Munduk Tourist Village in Buleleng Regency, Bali. Sampling was done by an accidental sampling technique to obtain the participants for the SWOT analysis. Assessment of the internal and external factor weights yielded scores of 1.84 and 1.84 respectively, which fall in the first quadrant of the diagram, corresponding to an S-O (Strengths-Opportunities) strategy. As the prime owner of the trekking and other rural tour packages in the village, Munduk should maximise its strengths and take all possible opportunities to design trekking and other rural tour packages and offer them to travel agents in Bali.
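The quadrant placement reported above rests on weighted factor scoring: each internal and external factor receives a weight and a rating, and the two weighted sums locate the strategy quadrant. The factors, weights, and ratings below are invented for illustration and are not the study's data.

```python
# Sketch of the weighted SWOT scoring behind a quadrant placement:
# weights sum to 1 per axis, ratings are positive for strengths and
# opportunities, negative for weaknesses and threats. All values below
# are hypothetical, not the study's.

internal = [  # (factor, weight, rating)
    ("trekking routes", 0.4, +4),
    ("local guides", 0.3, +4),
    ("weak promotion", 0.3, -2),
]
external = [
    ("eco-tourism demand", 0.5, +4),
    ("nearby competitors", 0.5, -1),
]

x = sum(w * r for _, w, r in internal)   # internal axis score
y = sum(w * r for _, w, r in external)   # external axis score
quadrant = "S-O (growth)" if x > 0 and y > 0 else "other"
print(round(x, 2), round(y, 2), quadrant)
```

Positive scores on both axes land in the first quadrant, which is why the study's (1.84, 1.84) result prescribes a Strengths-Opportunities strategy.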

Keywords: marketing mix, marketing strategy, rural tourism, SWOT matrix

Procedia PDF Downloads 230
14 Changing Colours and Odours: Exploring Cues Used by Insect Pollinators in Two Brassicaceous Plants

Authors: Katherine Y. Barragan-Fonseca, Joop J. A. Van Loon, Marcel Dicke, Dani Lucas-Barbosa

Abstract:

Flowering plants use different traits to attract pollinators, which indicate flower location and reward quality. Visual and olfactory cues are among the most important floral traits exploited by pollinating insects. Pollination can alter physical and chemical cues of flowers, which can subsequently influence the behaviour of flower visitors. We investigated the main cues exploited by the syrphid fly Episyrphus balteatus and the butterfly Pieris brassicae when visiting flowers of Brassica nigra and Raphanus sativus plants. We studied post-pollination changes and their effects on the behaviour of flower visitors and on flower volatile emission. Pollinator preference was investigated by offering visual and olfactory cues simultaneously as well as separately in two-choice bioassays. We also assessed whether pollen is used as a cue by pollinating insects. In addition, we studied whether behavioural responses could be correlated with changes in plant volatile emission, by collecting volatiles from the flower headspace. P. brassicae and E. balteatus did not use pollen as a cue in either of the two plant species studied. Interestingly, pollinators showed a strong bias for visual cues over olfactory cues when exposed to B. nigra plants. Flower visits by pollinators were influenced by post-pollination changes in B. nigra. In contrast, plant responses to pollination did not influence pollinator preference for R. sativus flowers. These results correlate well with the floral volatile emission of B. nigra and R. sativus; pollination influenced the volatile profile of B. nigra flowers but not that of R. sativus. Collectively, our data show that different pollinators exploit different visual and olfactory traits when searching for nectar or pollen on the flowers of two closely related plant species. Although the syrphid fly consumes mostly pollen from brassicaceous flowers, it cannot detect pollen from a distance and likely associates other flower traits with the quantity and quality of pollen.

Keywords: plant volatiles, pollinators, post-pollination changes, visual and odour cues

Procedia PDF Downloads 119
13 Open Innovation in SMEs: A Multiple Case Study of Collaboration between Start-ups and Craft Enterprises

Authors: Carl-Philipp Valentin Beichert, Marcel Seger

Abstract:

Digital transformation and climate change require small and medium-sized enterprises (SMEs) to rethink their way of doing business. Inter-firm collaboration is recognized as a helpful means of promoting innovation and competitiveness. In this context, collaborations with start-ups offer valuable opportunities through their innovative products, services, and business models. SMEs, and in particular German craft enterprises, play an important role in the country's society and economy. Companies in this heterogeneous economic sector have unique characteristics and are limited in their ability to innovate due to their small size and lack of resources. Collaborating with start-ups could help overcome these shortcomings. To investigate how collaborations emerge and which factors are decisive in successfully driving collaboration, we apply an explorative, qualitative research design. A sample of ten case studies was selected, with the collaboration between a start-up and a craft enterprise forming the unit of analysis. Semi-structured interviews with 20 company representatives allowed for a two-sided perspective on each collaboration. The interview data were enriched by publicly available data and three expert interviews. As a result, objectives, initiation practices, applied collaboration types, barriers, and key success factors could be identified. The results indicate a three-phase collaboration process comprising an initiation, concept, and partner phase (ICP). The proposed ICP framework accordingly highlights the success factors (personal fit, communication, expertise, structure, network) for craft enterprises and start-ups in each collaboration phase. The role of a mediator in the start-up company with strong expertise in the respective craft sector is considered an important lever for overcoming barriers such as cultural and communication differences.
The ICP framework thus provides promising directions for further research and can help practitioners establish successful collaborations.

Keywords: open innovation, SME, craft businesses, startup collaboration, qualitative research

Procedia PDF Downloads 49
12 Monetary Evaluation of Dispatching Decisions in Consideration of Choice of Transport

Authors: Marcel Schneider, Nils Nießen

Abstract:

Microscopic simulation programs enable the description of the two processes of railway operation and the preceding timetabling. Occupation conflicts are often solved based on defined train priorities on both process levels. These conflict resolutions produce knock-on delays for the involved trains. The sum of knock-on delays is commonly used to evaluate the quality of railway operations; it is either compared to an acceptable level of service, or the delays are evaluated economically by linear monetary functions. Without a well-founded objective function, it is impossible to evaluate dispatching decisions properly. This paper presents a new approach for the evaluation of dispatching decisions. It uses models of choice of transport and considers the behaviour of the end-customers. These models evaluate the knock-on delays in more detail than linear monetary functions and take competing modes of transport into account. The new approach couples a microscopic model of railway operation with a macroscopic model of choice of transport. It will first be implemented for the railway operations process, but it can also be used for timetabling. The evaluation considers the possibility that end-customers switch to other transport modes. The new approach first covers rail and road transport, but it can also be extended to air transport. The split of the end-customers between modes is described by the modal split. The reactions of the end-customers affect the revenues of the railway undertakings. Different travel purposes come with different levels of willingness to pay and different tolerances towards delays. Longer journey times cause additional costs besides revenue changes. These costs depend either on time or on track use and arise from the circulation of workers and vehicles. Only the variable values are summarised in the contribution margin, which is the basis for the monetary evaluation of the delays.
The contribution margin is calculated for different resolution decisions of the same conflict. The conflict resolution is improved until the monetary loss is minimised. The iterative process thus determines an optimal conflict resolution by observing the change in the contribution margin. Furthermore, a monetary value can be determined for each dispatching decision.
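The iterative evaluation described above can be sketched as follows. All names, cost rates, and modal-shift figures are hypothetical placeholders, not the paper's model: each candidate conflict resolution is scored by its contribution margin (revenue after modal-split losses minus variable time and track costs), and the resolution with the smallest monetary loss is kept.

```python
# Hypothetical sketch of scoring dispatching decisions by contribution
# margin; all figures are invented for illustration.

def contribution_margin(resolution):
    """Revenue after modal-split losses minus variable operating costs."""
    revenue = resolution["base_revenue"] * (1 - resolution["modal_shift"])
    variable_costs = (resolution["knock_on_delay_min"] * resolution["cost_per_min"]
                      + resolution["extra_track_km"] * resolution["cost_per_km"])
    return revenue - variable_costs

# two candidate resolutions of the same occupation conflict
candidates = [
    {"name": "hold freight", "base_revenue": 10000, "modal_shift": 0.02,
     "knock_on_delay_min": 12, "cost_per_min": 40,
     "extra_track_km": 0, "cost_per_km": 5},
    {"name": "hold passenger", "base_revenue": 10000, "modal_shift": 0.06,
     "knock_on_delay_min": 4, "cost_per_min": 40,
     "extra_track_km": 0, "cost_per_km": 5},
]

# pick the resolution with the highest contribution margin,
# i.e. the smallest monetary loss
best = max(candidates, key=contribution_margin)
```

In a full implementation, the modal-shift term would come from the macroscopic choice-of-transport model rather than a fixed constant, and the loop would continue generating improved resolutions until the margin stops increasing.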

Keywords: choice of transport, knock-on delays, monetary evaluation, railway operations

Procedia PDF Downloads 289
11 Hounsfield-Based Automatic Evaluation of Volumetric Breast Density on Radiotherapy CT-Scans

Authors: E. M. D. Akuoko, Eliana Vasquez Osorio, Marcel Van Herk, Marianne Aznar

Abstract:

Radiotherapy is an integral part of treatment for many patients with breast cancer. However, side effects can occur, e.g., fibrosis or erythema. If patients at higher risk of radiation-induced side effects could be identified before treatment, they could be given more individualised information about the risks and benefits of radiotherapy. We hypothesize that breast density is correlated with the risk of side effects and present a novel method for its automatic evaluation based on radiotherapy planning CT scans. Methods: 799 supine CT scans of breast radiotherapy patients were available from the REQUITE dataset. The methodology was first established in a subset of 114 patients (cohort 1) before being applied to the whole dataset (cohort 2). All patients were scanned in the supine position, with arms up, and the treated (ipsilateral) breast was identified. Manual expert contours were available for 96 patients in cohort 1 for both the ipsilateral and contralateral breasts. Breast tissue was segmented using atlas-based automatic contouring software, ADMIRE® v3.4 (Elekta AB, Sweden). Once validated, the automatic segmentation method was applied to cohort 2. Breast density was then investigated by thresholding voxels within the contours, using the Otsu threshold and pixel intensity ranges based on Hounsfield units (-200 to -100 for fatty tissue, and -99 to +100 for fibro-glandular tissue). Volumetric breast density (VBD) was defined as the volume of fibro-glandular tissue / (volume of fibro-glandular tissue + volume of fatty tissue). A sensitivity analysis was performed to verify whether the calculated VBD was affected by the choice of breast contour. In addition, we investigated the correlation between VBD and patient age and breast size. VBD values were compared between ipsilateral and contralateral breast contours. Results: Estimated VBD values were 0.40 (range 0.17-0.91) in cohort 1, and 0.43 (range 0.096-0.99) in cohort 2.
We observed ipsilateral breasts to be denser than contralateral breasts. Breast density was negatively associated with breast volume (Spearman: R=-0.5, p-value < 2.2e-16) and age (Spearman: R=-0.24, p-value = 4.6e-10). Conclusion: VBD estimates could be obtained automatically on a large CT dataset. Patients’ age or breast volume may not be the only variables that explain breast density. Future work will focus on assessing the usefulness of VBD as a predictive variable for radiation-induced side effects.
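The VBD definition above reduces to a simple voxel count within the stated Hounsfield-unit ranges. The sketch below uses those ranges on a handful of synthetic HU values; in practice the computation would run over the voxels inside the breast contour of a CT volume (e.g. with NumPy boolean masks), which is not shown here.

```python
# Minimal sketch of the volumetric breast density (VBD) calculation,
# using the Hounsfield-unit ranges stated in the abstract.
# The voxel values below are synthetic, for illustration only.

def vbd(hu_values):
    """VBD = fibro-glandular volume / (fibro-glandular + fatty volume)."""
    fatty = sum(1 for v in hu_values if -200 <= v <= -100)
    fibroglandular = sum(1 for v in hu_values if -99 <= v <= 100)
    return fibroglandular / (fibroglandular + fatty)

voxels = [-150, -120, -180, -50, 20, 90]   # 3 fatty, 3 fibro-glandular
density = vbd(voxels)
```

Since each voxel in a given scan has the same volume, counting voxels is equivalent to comparing tissue volumes.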

Keywords: breast cancer, automatic image segmentation, radiotherapy, big data, breast density, medical imaging

Procedia PDF Downloads 98
10 Tourists' Perception to the Service Quality of White Water Rafting in Bali: Case Study of Ayung River

Authors: Ni Putu Evi Wijayanti, Made Darmiati, Ni Ketut Wiwiek Agustina, Putu Gde Arie Yudhistira, Marcel Hardono

Abstract:

This research study discusses tourists' perceptions of white water rafting service quality in Bali (case study: Ayung River). The aims are, firstly, to determine tourists' perceptions of the service quality of white water rafting trips in Bali; secondly, to determine which dimensions of service quality should be given the highest handling priority, based on the importance of each service aspect relative to the working performance of the white water rafting companies, especially on the Ayung River; and lastly, to identify the efforts needed to improve the service quality of white water rafting trips for tourists in Bali, specifically on the Ayung River. The research uses the concept of service quality with its five principal dimensions: tangibles, reliability, responsiveness, assurance, and empathy. The research location is the tourist destination area of the Ayung River, which lies on the boundary between Badung Regency to the west and Gianyar Regency to the east. There are three rafting companies located on the Ayung River. The research took 100 respondents, selected as a sample by the purposive sampling method. Data were collected through questionnaires distributed to domestic tourists, tabulated using a weighting scale (Likert scale), and analysed using importance-performance analysis in the form of a Cartesian diagram. The results are summarised in three points. Firstly, 23 service indicators were assessed by domestic tourists; the highest gap value was for familiarity between tourists and employees (0.29), and the lowest was for the clarity of the Ayung River water discharge (-0.35). This shows that the indicators have not yet fully met tourists' expectations of the service aspects. Secondly, the dimension of service quality that requires the most serious attention is tangibles.
Thirdly, the efforts needed are derived from the Cartesian diagram, which breaks down into four quadrants. Based on the results, it is suggested that the managers of white water rafting tours continuously improve service quality, introduce new product variations, and provide insight and training to employees to increase their competence, especially in excellent service, so that tourist satisfaction can be achieved.
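The importance-performance analysis behind the Cartesian diagram above can be sketched as follows. The indicator names and scores are invented for illustration (the study's 23 actual indicators are not reproduced): each indicator is placed in a quadrant by comparing its mean importance and performance against the grand means, and the quadrant determines its handling priority.

```python
# Hypothetical sketch of importance-performance analysis (IPA);
# indicator names and (importance, performance) scores are invented.

indicators = {
    "staff familiarity":       (4.00, 4.29),
    "water discharge clarity": (4.50, 4.15),
    "equipment condition":     (4.25, 4.22),
}

# grand means define the crosshairs of the Cartesian diagram
imp_mean = sum(i for i, _ in indicators.values()) / len(indicators)
perf_mean = sum(p for _, p in indicators.values()) / len(indicators)

def ipa_quadrant(importance, performance):
    if importance >= imp_mean and performance < perf_mean:
        return "concentrate here"        # quadrant A: fix first
    if importance >= imp_mean:
        return "keep up the good work"   # quadrant B
    if performance < perf_mean:
        return "low priority"            # quadrant C
    return "possible overkill"           # quadrant D
```

An indicator with high importance but below-average performance, such as the negative-gap water discharge clarity item reported above, lands in the "concentrate here" quadrant and takes main handling priority.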

Keywords: perception, rafting, service quality, tourist satisfaction

Procedia PDF Downloads 202
9 Applying the Global Trigger Tool in German Hospitals: A Retrospective Study in Surgery and Neurosurgery

Authors: Mareen Brosterhaus, Antje Hammer, Steffen Kalina, Stefan Grau, Anjali A. Roeth, Hany Ashmawy, Thomas Gross, Marcel Binnebosel, Wolfram T. Knoefel, Tanja Manser

Abstract:

Background: The identification of critical incidents in hospitals is an essential component of improving patient safety. To date, various methods have been used to measure and characterize such critical incidents. These methods are often viewed by physicians and nurses as external quality assurance, which creates obstacles to the reporting of events and the implementation of recommendations in practice. One way to overcome this problem is to use tools that directly involve staff in measuring indicators of the quality and safety of care in their department. One such instrument is the global trigger tool (GTT), which helps physicians and nurses identify adverse events by systematically reviewing randomly selected patient records. So-called 'triggers' (warning signals) provide indications of adverse events. While the tool is already used internationally, its implementation in German hospitals has been very limited. Objectives: This study aimed to assess the feasibility and potential of the global trigger tool for identifying adverse events in German hospitals. Methods: A total of 120 patient records were randomly selected from two surgical departments and one neurosurgery department of three university hospitals in Germany, over a period of two months per department, between January and July 2017. The records were reviewed using an adaptation of the German version of the Institute for Healthcare Improvement Global Trigger Tool to identify triggers and adverse event rates per 1000 patient-days and per 100 admissions. The severity of adverse events was classified using the index of the National Coordinating Council for Medication Error Reporting and Prevention. Results: A total of 53 adverse events were detected in the three departments. This corresponded to adverse event rates of 25.5 to 72.1 per 1000 patient-days and 25.0 to 60.0 per 100 admissions across the three departments.
98.1% of the identified adverse events were associated with non-permanent harm, either without (Category E, 71.7%) or with (Category F, 26.4%) the need for prolonged hospitalization. One adverse event (1.9%) was associated with potentially permanent harm to the patient. We also identified practical challenges in the implementation of the tool, such as the need to adapt the global trigger tool to the respective department. Conclusions: The global trigger tool is a feasible and effective instrument for quality measurement when adapted to departmental specifics. Based on our experience, we recommend continuous use of the tool, thereby directly involving clinicians in quality improvement.
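The two rate normalisations reported above are straightforward to compute. The sketch below uses invented counts, not the study's per-department data, to show how an event count is converted to rates per 1000 patient-days and per 100 admissions.

```python
# Sketch of the GTT rate calculations; the counts below are
# hypothetical, not taken from the study.

def rate_per_1000_patient_days(events, patient_days):
    return 1000 * events / patient_days

def rate_per_100_admissions(events, admissions):
    return 100 * events / admissions

events, patient_days, admissions = 18, 500, 40
r_days = rate_per_1000_patient_days(events, patient_days)
r_adm = rate_per_100_admissions(events, admissions)
```

A department with 18 detected adverse events over 500 patient-days and 40 admissions would thus report 36.0 events per 1000 patient-days and 45.0 per 100 admissions, figures within the ranges observed across the three departments.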

Keywords: adverse events, global trigger tool, patient safety, record review

Procedia PDF Downloads 204
8 The Effect of Improvement Programs in the Mean Time to Repair and in the Mean Time between Failures on Overall Lead Time: A Simulation Using the System Dynamics-Factory Physics Model

Authors: Marcel Heimar Ribeiro Utiyama, Fernanda Caveiro Correia, Dario Henrique Alliprandini

Abstract:

The correct allocation of improvement programs has attracted growing interest in recent years. Due to their limited resources, companies must ensure that financial resources are directed to the workstations where they are most effective in order to survive strong competition. However, to the best of our knowledge, the literature on the allocation of improvement programs does not analyze this problem in depth when the flow shop process has two capacity constrained resources. This research gap is studied in depth in this work. The purpose of this work is to identify the best strategy for allocating improvement programs in a flow shop with two capacity constrained resources. Data were collected from a flow shop process with seven workstations in an industrial control and automation company, which processes 13,690 units per month on average. The data were used to conduct a simulation with the System Dynamics-Factory Physics model. The main variables considered, due to their importance for lead time reduction, were the mean time between failures and the mean time to repair. Lead time reduction was the output measure of the simulations. Ten different strategies were created: (i) focused time to repair improvement, (ii) focused time between failures improvement, (iii) distributed time to repair improvement, (iv) distributed time between failures improvement, (v) focused time to repair and time between failures improvement, (vi) distributed time to repair and time between failures improvement, (vii) hybrid time to repair improvement, (viii) hybrid time between failures improvement, (ix) time to repair improvement directed towards the two capacity constrained resources, and (x) time between failures improvement directed towards the two capacity constrained resources. The ten strategies tested are variations of the three main strategies for improvement programs, named focused, distributed, and hybrid.
Several comparisons of the effects of the ten strategies on lead time reduction were performed. The results indicated that, for the flow shop analyzed, the focused strategies delivered the best results. When a large investment in the capacity constrained resources is not possible, companies should use hybrid approaches. An important contribution to the academic literature is the hybrid approach, which proposes a new way to direct improvement efforts. In addition, the study of a flow shop with two strongly capacity constrained resources (more than 95% utilization) is an important contribution to the literature, as are the allocation problem with two capacity constrained resources and the possibility of floating capacity constrained resources. The results provide the best improvement strategies considering the different strategies for allocating improvement programs and the different positions of the capacity constrained resources. Finally, both the hybrid time to repair and the hybrid time between failures strategies delivered better results than the respective distributed strategies. The main limitations of this study concern the flow shop analyzed. Future work can investigate different flow shop configurations, such as a varying number of workstations, a different number of products, or different positions of the two capacity constrained resources.
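The mechanism by which MTTR and MTBF improvements shorten lead time can be illustrated with a toy availability calculation; this is an assumption-laden sketch, not the System Dynamics-Factory Physics model itself. Station availability A = MTBF / (MTBF + MTTR) scales effective capacity, so reducing MTTR (or raising MTBF) at a constrained station lowers its utilization, and queueing delay falls sharply as utilization retreats from 1.

```python
# Toy illustration of why repair-time improvements reduce utilization
# at a capacity constrained resource. All figures are invented.

def availability(mtbf, mttr):
    """Fraction of time the station is up: MTBF / (MTBF + MTTR)."""
    return mtbf / (mtbf + mttr)

def utilisation(demand_rate, base_rate, mtbf, mttr):
    """Demand divided by failure-adjusted effective capacity."""
    return demand_rate / (base_rate * availability(mtbf, mttr))

# focused improvement: halve MTTR at the constrained station
u_before = utilisation(demand_rate=9.0, base_rate=10.0, mtbf=100.0, mttr=10.0)
u_after = utilisation(demand_rate=9.0, base_rate=10.0, mtbf=100.0, mttr=5.0)
```

Halving MTTR here drops utilization from 0.99 to 0.945, which in any congestion model translates into a large reduction in queueing time and hence in overall lead time, consistent with the advantage of focused strategies reported above.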

Keywords: allocation of improvement programs, capacity constrained resource, hybrid strategy, lead time, mean time to repair, mean time between failures

Procedia PDF Downloads 83
7 Analysis of the Relationship between Micro-Regional Human Development and Brazil's Greenhouse Gases Emission

Authors: Geanderson Eduardo Ambrósio, Dênis Antônio Da Cunha, Marcel Viana Pires

Abstract:

Historically, human development has been based on economic gains associated with energy-intensive activities, which are often heavy emitters of Greenhouse Gases (GHGs). This requires the establishment of GHG mitigation targets in order to decouple human development from emissions and prevent further climate change. Brazil is one of the largest GHG emitters, and it is critically important to discuss such reductions in an intra-national framework with the objective of distributional equity, in order to explore the country's full mitigation potential without compromising the development of less developed societies. This research presents some initial considerations about which of Brazil's micro-regions should reduce emissions, when the reductions should begin, and what their magnitude should be. We start from the methodological assumption that human development and GHG emissions will evolve in the future as they behaved in the past. Furthermore, we assume that once a micro-region has become developed, it is able to maintain gains in human development without further growth in GHG emissions. The human development index and carbon dioxide equivalent emissions (CO2e) were extrapolated to the year 2050, which allowed us to calculate when each micro-region will become developed and the mass of GHGs emitted. The results indicate that Brazil will emit 300 Gt CO2e into the atmosphere between 2011 and 2050, of which only 50 Gt will be emitted by micro-regions before they develop and 250 Gt after development. We also determined national mitigation targets and structured reduction schemes in which only the developed micro-regions would be required to reduce emissions. The micro-region of São Paulo, the most developed in the country, should also be the one that reduces emissions the most, emitting 90% less in 2050 than the value observed in 2010.
On the other hand, less developed micro-regions will be responsible for less demanding reductions; e.g., Vale do Ipanema will emit only 10% less in 2050 than the value observed in 2010. Under this methodological assumption, the country would emit 56.5% less in 2050 than in 2010, so that cumulative emissions between 2011 and 2050 would fall by 130 Gt CO2e relative to the initial projection. Tying the magnitude of the reductions to the micro-regions' level of human development encourages the adoption of policies that favor both variables, as governmental planners will have to deal both with the increasing demand for higher standards of living and with the increasing magnitude of emission reductions. However, if economic agents do not act proactively at the local and national levels, the country is closer to the scenario in which it emits more than to the one in which it mitigates emissions. The research highlights the importance of considering heterogeneity when determining individual mitigation targets and ratifies the theoretical and methodological feasibility of allocating a larger share of the contribution to those who have historically emitted more. The proposals and discussions presented should be considered in mitigation policy formulation in Brazil, regardless of the reduction target adopted.
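A development-scaled reduction scheme of the kind described above can be sketched as follows. The HDI threshold, the linear scaling, and all figures are assumptions for illustration only; they merely reproduce the reported endpoints, with the most developed micro-region cutting 90% and a micro-region just at the development threshold cutting 10%.

```python
# Hypothetical sketch of a development-scaled emission target; the
# threshold and linear scaling are illustrative assumptions, not the
# study's actual allocation rule.

DEVELOPED_HDI = 0.80   # assumed "developed" threshold

def target_2050_emissions(hdi_2050, emissions_2010):
    """More developed regions take deeper cuts (10% up to 90% below 2010)."""
    if hdi_2050 < DEVELOPED_HDI:
        return emissions_2010          # no reduction required
    # scale the cut linearly from 10% at the threshold to 90% at HDI = 1.0
    cut = 0.1 + 0.8 * (hdi_2050 - DEVELOPED_HDI) / (1.0 - DEVELOPED_HDI)
    return emissions_2010 * (1 - cut)
```

Under this toy rule, a fully developed micro-region (HDI 1.0) emits 90% less than its 2010 level, while one just crossing the threshold emits 10% less, mirroring the São Paulo and Vale do Ipanema cases discussed above.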

Keywords: greenhouse gases, human development, mitigation, intensive energy activities

Procedia PDF Downloads 286
6 Deficient Multisensory Integration with Concomitant Resting-State Connectivity in Adult Attention Deficit/Hyperactivity Disorder (ADHD)

Authors: Marcel Schulze, Behrem Aslan, Silke Lux, Alexandra Philipsen

Abstract:

Objective: Patients with Attention Deficit/Hyperactivity Disorder (ADHD) often report being flooded by sensory impressions. Studies investigating sensory processing show hypersensitivity to sensory inputs across the senses in children and adults with ADHD. The auditory modality is especially affected, by deficient acoustic inhibition and modulation of signals. While studying unimodal signal processing is relevant and well suited to a controlled laboratory environment, everyday situations are multimodal. A complex interplay of the senses is necessary to form a unified percept. To achieve this, the unimodal sensory modalities are bound together in a process called multisensory integration (MI). In the current study, we investigate MI in an adult ADHD sample using the McGurk effect, a well-known illusion in which incongruent speech-like phonemes lead, in the case of successful integration, to a newly perceived phoneme via late top-down attentional allocation. In ADHD, neuronal dysregulation at rest, e.g., aberrant within- or between-network functional connectivity, may also account for difficulties in integrating across the senses. Therefore, the current study includes resting-state functional connectivity to investigate a possible relation between deficient network connectivity and the ability to integrate stimuli. Method: Twenty-five ADHD patients (6 females, age: 30.08 (SD: 9.3) years) and twenty-four healthy controls (9 females; age: 26.88 (SD: 6.3) years) were recruited. MI was examined using the McGurk effect, in which, in the case of successful MI, incongruent speech-like phonemes between the visual and auditory modalities lead to the perception of a new phoneme. The Mann-Whitney U test was applied to assess statistical differences between groups. Echo-planar resting-state functional MRI was acquired on a 3.0 Tesla Siemens Magnetom MR scanner. A seed-to-voxel analysis was realized using the CONN toolbox.
Results: Susceptibility to the McGurk effect was significantly lower in ADHD patients (ADHD Mdn: 5.83%, Controls Mdn: 44.2%, U = 160.5, p = 0.022, r = -0.34). When ADHD patients did integrate phonemes, reaction times were significantly longer (ADHD Mdn: 1260 ms, Controls Mdn: 582 ms, U = 41.0, p < .000, r = -0.56). In functional connectivity, the middle temporal gyrus (seed) was negatively associated with the primary auditory cortex, inferior frontal gyrus, precentral gyrus, and fusiform gyrus. Conclusion: MI appears to be deficient in ADHD patients for stimuli that require top-down attentional allocation. This finding is supported by stronger functional connectivity from unimodal sensory areas to polymodal MI convergence zones for complex stimuli in ADHD patients.
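The group comparison above relies on the Mann-Whitney U statistic, which in practice would be computed with a library such as scipy.stats.mannwhitneyu. As a self-contained illustration of what U measures, the stdlib sketch below counts, over all cross-group pairs, how often a value from the first group falls below a value from the second (ties counted as 0.5); the sample values are invented, not the study's data.

```python
# Stdlib sketch of the Mann-Whitney U statistic (pair-counting form);
# the sample data below are invented for illustration.

def mann_whitney_u(a, b):
    """U for group a: number of (x, y) pairs with x < y, ties as 0.5."""
    u = 0.0
    for x in a:
        for y in b:
            if x < y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

adhd = [2.0, 5.8, 8.0]          # e.g. McGurk susceptibility in %
controls = [30.0, 44.2, 51.0]
u_stat = mann_whitney_u(adhd, controls)
```

With the two invented groups fully separated, U reaches its maximum of n1 x n2 = 9; the significance test then compares the observed U against its null distribution, which a library routine handles.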

Keywords: attention-deficit hyperactivity disorder, audiovisual integration, McGurk-effect, resting-state functional connectivity

Procedia PDF Downloads 89