Search results for: extra dimensions model
16901 Efficiency Measurement of Turkish Universities via the Stochastic Frontier Model
Authors: Yeliz Mert Kantar, İsmail Yeni̇lmez, Ibrahim Arik
Abstract:
In this study, an efficiency measurement of the top fifty Turkish universities has been conducted. The top fifty Turkish universities are listed every year by The Scientific and Technological Research Council of Turkey (TÜBITAK) according to the Entrepreneur and Innovative University Index. The index has been calculated based on four components since 2018: scientific and technological research competency, intellectual property pool, cooperation and interaction, and economic and social contribution. The four components consist of twenty-three sub-components. The 2021 list, announced in January 2022, is discussed in this study. The efficiency analysis has been carried out using the Stochastic Frontier Model. The statistical significance of the sub-components that make up the index with certain weights has been examined in terms of the efficiency measure calculated through the Stochastic Frontier Model. The relationship between the efficiency ranking estimated from the Stochastic Frontier Model and the Entrepreneur and Innovative University Index ranking is discussed in detail. Keywords: efficiency, entrepreneur and innovative universities, turkish universities, stochastic frontier model, tübi̇tak
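As an illustration of the estimation approach named in this abstract, the following is a minimal, hedged sketch of a cross-sectional stochastic frontier production model with the standard half-normal composed error, fitted by maximum likelihood. The single synthetic input, sample size, and parameter values are assumptions for illustration only; they are not the TÜBITAK index data or the authors' specification.

```python
# Minimal stochastic-frontier sketch (Aigner-Lovell-Schmidt half-normal form).
# y = X beta + v - u,  v ~ N(0, sigma_v^2),  u ~ |N(0, sigma_u^2)|.
# The inputs/outputs below are synthetic placeholders, not the TUBITAK index data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
n = 50
x = rng.uniform(1, 10, n)                      # hypothetical input (log scale)
X = np.column_stack([np.ones(n), x])
v = rng.normal(0, 0.3, n)                      # symmetric noise
u = np.abs(rng.normal(0, 0.5, n))              # one-sided inefficiency term
y = X @ [1.0, 0.8] + v - u                     # hypothetical log output

def neg_loglik(theta):
    b0, b1, log_sv, log_su = theta
    sv, su = np.exp(log_sv), np.exp(log_su)
    eps = y - X @ [b0, b1]
    sigma = np.hypot(sv, su)
    lam = su / sv
    # composed-error log-density for a production frontier
    ll = (np.log(2) - np.log(sigma) + norm.logpdf(eps / sigma)
          + norm.logcdf(-eps * lam / sigma))
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.0, 0.5, np.log(0.3), np.log(0.5)], method="Nelder-Mead")
b0, b1, log_sv, log_su = res.x
print("frontier coefficients:", b0, b1, "sigma_v:", np.exp(log_sv), "sigma_u:", np.exp(log_su))
```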
Procedia PDF Downloads 89
16900 Catchment Yield Prediction in an Ungauged Basin Using PyTOPKAPI
Authors: B. S. Fatoyinbo, D. Stretch, O. T. Amoo, D. Allopi
Abstract:
This study extends the use of the Drainage Area Regionalization (DAR) method in generating synthetic data and calibrating PyTOPKAPI stream yield for an ungauged basin at a daily time scale. The generation of runoff that determines a river's yield depends on various topographic and spatial meteorological variables, which together form the Catchment Characteristics Model (CCM). Many of the conventional CCM models adapted in Africa have been challenged by a paucity of adequate, relevant and accurate data to parameterize and validate them. The purpose of generating synthetic flow is to test a hydrological model in a way that does not suffer from the impact of very low or very high flows, thus allowing one to check whether the model is structurally sound. The employed physically-based, watershed-scale hydrologic model (PyTOPKAPI) was parameterized with GIS pre-processing parameters and remote-sensing hydro-meteorological variables. Validation against the mean annual runoff ratio shows good graphical agreement between observed and simulated discharge, and Nash-Sutcliffe efficiency and coefficient of determination (R²) values of 0.704 and 0.739 indicate strong model efficiency. Given current climate variability, water planners now have a tool for flow quantification and sustainable planning purposes. Keywords: catchment characteristics model, GIS, synthetic data, ungauged basin
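The abstract reports Nash-Sutcliffe efficiency and R² values of 0.704 and 0.739. The snippet below is a minimal sketch of how these two goodness-of-fit measures are commonly computed from paired observed and simulated daily flows; the arrays hold placeholder values, not data from the study.

```python
# Hedged sketch: Nash-Sutcliffe efficiency and coefficient of determination
# for observed vs. simulated daily discharge (values below are placeholders).
import numpy as np

obs = np.array([12.0, 15.5, 9.8, 20.1, 18.3, 7.6])   # observed flows (m3/s)
sim = np.array([11.2, 16.0, 10.5, 19.0, 17.1, 8.4])  # simulated flows (m3/s)

def nse(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Square of the Pearson correlation between observed and simulated series."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

print(f"NSE = {nse(obs, sim):.3f}, R^2 = {r_squared(obs, sim):.3f}")
```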
Procedia PDF Downloads 327
16899 Exploring Time-Series Phosphoproteomic Datasets in the Context of Network Models
Authors: Sandeep Kaur, Jenny Vuong, Marcel Julliard, Sean O'Donoghue
Abstract:
Time-series data are useful for modelling as they can enable model-evaluation. However, when reconstructing models from phosphoproteomic data, often non-exact methods are utilised, as the knowledge regarding the network structure, such as, which kinases and phosphatases lead to the observed phosphorylation state, is incomplete. Thus, such reactions are often hypothesised, which gives rise to uncertainty. Here, we propose a framework, implemented via a web-based tool (as an extension to Minardo), which given time-series phosphoproteomic datasets, can generate κ models. The incompleteness and uncertainty in the generated model and reactions are clearly presented to the user via the visual method. Furthermore, we demonstrate, via a toy EGF signalling model, the use of algorithmic verification to verify κ models. Manually formulated requirements were evaluated with regards to the model, leading to the highlighting of the nodes causing unsatisfiability (i.e. error causing nodes). We aim to integrate such methods into our web-based tool and demonstrate how the identified erroneous nodes can be presented to the user via the visual method. Thus, in this research we present a framework, to enable a user to explore phosphorylation proteomic time-series data in the context of models. The observer can visualise which reactions in the model are highly uncertain, and which nodes cause incorrect simulation outputs. A tool such as this enables an end-user to determine the empirical analysis to perform, to reduce uncertainty in the presented model - thus enabling a better understanding of the underlying system.Keywords: κ-models, model verification, time-series phosphoproteomic datasets, uncertainty and error visualisation
Procedia PDF Downloads 256
16898 Development of Digital Twin Concept to Detect Abnormal Changes in Structural Behaviour
Authors: Shady Adib, Vladimir Vinogradov, Peter Gosling
Abstract:
Digital Twin (DT) technology is a new technology that appeared in the early 21st century. The DT is defined as the digital representation of living and non-living physical assets. By connecting the physical and virtual assets, data are transmitted smoothly, allowing the virtual asset to fully represent the physical asset. Although there are lots of studies conducted on the DT concept, there is still limited information about the ability of DT models to monitor and detect unexpected changes in structural behaviour in real time. This is due to the large computational efforts required for the analysis and an excessively large amount of data transferred from sensors. This paper aims to develop the DT concept to be able to detect abnormal changes in structural behaviour in real time using advanced modelling techniques, deep learning algorithms, and data acquisition systems, taking model uncertainties into consideration. Finite element (FE) models were first developed offline to be used with a reduced basis (RB) model order reduction technique for the construction of a low-dimensional space to speed up the analysis during the online stage. The RB model was validated against experimental test results for the establishment of a DT model of a two-dimensional truss. The established DT model and deep learning algorithms were used to identify the location of damage once it appeared during the online stage. Finally, the RB model was used again to identify the damage severity. It was found that using the RB model, constructed offline, speeds up the FE analysis during the online stage. The constructed RB model showed higher accuracy for predicting the damage severity, while deep learning algorithms were found to be useful for estimating the location of damage with small severity. Keywords: data acquisition system, deep learning, digital twin, model uncertainties, reduced basis, reduced order model
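As a hedged illustration of the offline/online split described above, the sketch below builds a reduced basis from full-order snapshot solutions by singular value decomposition (a POD-type construction) and then solves only the small projected system online. The parametrised stiffness matrix, tolerance, and sizes are stand-in assumptions, not the truss model or RB formulation used in the study.

```python
# Hedged sketch of an offline/online reduced-basis workflow: offline, solve the
# full system for sampled parameter values and extract a reduced basis by SVD;
# online, solve only the small projected system for a new parameter value.
import numpy as np

n_dof = 200
main = np.full(n_dof, 2.0)
off = np.full(n_dof - 1, -1.0)
K0 = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)      # baseline stiffness (stand-in)
K1 = np.diag(np.linspace(0.0, 1.0, n_dof))                    # parameter-dependent part
f = np.ones(n_dof)                                            # fixed load vector

# --- offline stage: snapshots over a coarse parameter grid ---
mus = np.linspace(0.1, 2.0, 15)
snapshots = np.column_stack([np.linalg.solve(K0 + mu * K1, f) for mu in mus])
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
r = int(np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 1 - 1e-10)) + 1
V = U[:, :r]                                                  # reduced basis (n_dof x r)

# --- online stage: cheap solve for an unseen parameter value ---
mu_new = 1.234
K_new = K0 + mu_new * K1
u_rb = V @ np.linalg.solve(V.T @ K_new @ V, V.T @ f)          # r x r system
u_full = np.linalg.solve(K_new, f)                            # full reference solve
print("basis size:", r,
      "relative error:", np.linalg.norm(u_rb - u_full) / np.linalg.norm(u_full))
```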
Procedia PDF Downloads 99
16897 Switched Uses of a Bidirectional Microphone as a Microphone and Sensors with High Gain and Wide Frequency Range
Authors: Toru Shionoya, Yosuke Kurihara, Takashi Kaburagi, Kajiro Watanabe
Abstract:
Mass-produced bidirectional microphones have attractive characteristics. They work as a microphone as well as a sensor with high gain over a wide frequency range; they are also highly reliable and economical. We present novel multiple functional uses of the microphones. A mathematical model for explaining the high-pass-filtering characteristics of bidirectional microphones was presented. Based on the model, the characteristics of the microphone were investigated, and a novel use for the microphone as a sensor with a wide frequency range was presented. In this study, applications for using the microphone as a security sensor and a human biosensor were introduced. The mathematical model was validated through experiments, and the feasibility of the abovementioned applications for security monitoring and the biosignal monitoring were examined through experiments.Keywords: bidirectional microphone, low-frequency, mathematical model, frequency response
Procedia PDF Downloads 545
16896 Comparative Assessment of a Distributed Model and a Lumped Model for Estimating of Sediments Yielding in Small Urban Areas
Authors: J.Zambrano Nájera, M.Gómez Valentín
Abstract:
Increases in urbanization during the twentieth century have brought increased sediment production as one major problem. Hydraulic erosion is one of the major causes of increasing sediment loads in small urban catchments. Such increments in sediment yield in headwater urban catchments can cause obstruction of drainage systems, making it impossible to capture urban runoff, increasing runoff volumes and thus exacerbating problems of urban flooding. For these reasons, it is increasingly important to study sediment production in urban watersheds in order to properly analyze and solve sediment-related problems. The study of sediment production has improved with the use of mathematical modeling. For that reason, a new physically based model applicable to small headwater urban watersheds is proposed that retains the advantages of distributed physically based models but with more realistic data requirements. Additionally, in this paper the proposed model is compared with a lumped model, reviewing the results and the advantages and disadvantages of both of them. Keywords: erosion, hydrologic modeling, urban runoff, sediment modeling, sediment yielding, urban planning
Procedia PDF Downloads 348
16895 Positioning a Southern Inclusive Framework Embedded in the Social Model of Disability Theory Contextualised for Guyana
Authors: Lidon Lashley
Abstract:
This paper presents how the social model of disability can be used to reshape inclusive education practices in Guyana. Inclusive education in Guyana is metamorphosing but is still firmly held in the tenets of the Medical Model of Disability, which influences the experiences of children with Special Education Needs and/or Disabilities (SEN/D). An ethnographic approach to data gathering was employed in this study. Qualitative data were gathered from the voices of children with and without SEN/D, as well as their mainstream teachers, to present the interplay of discourses and subjectivities in the situation. The data were analyzed using Adele Clarke's postmodern approach to grounded theory analysis called situational analysis. The data suggest that it is possible, but will be challenging, to fully contextualize and adopt Loreman's synthesis and Booth and Ainscow's Index in the two mainstream schools studied. In addition, the data paved the way for the presentation of the social model framework specific to Guyana, called the 'Southern Inclusive Education Framework for Guyana', and its support tool, 'The Inclusive Checker', created for Southern mainstream primary classrooms. Keywords: social model of disability, medical model of disability, subjectivities, metamorphosis, special education needs, postcolonial Guyana, inclusion, culture, mainstream primary schools, Loreman's synthesis, Booth and Ainscow's index
Procedia PDF Downloads 162
16894 Analysis of the Diffusion Behavior of an Information and Communication Technology Platform for City Logistics
Authors: Giulio Mangano, Alberto De Marco, Giovanni Zenezini
Abstract:
The concept of City Logistics (CL) has emerged to improve the impacts of last-mile freight distribution in urban areas. In this paper, a System Dynamics (SD) model exploring the dynamics of the diffusion of an ICT platform for CL management across different populations is proposed. For the development of the model, two sources have been used. On the one hand, the major diffusion variables and feedback loops are derived from a literature review of existing diffusion models. On the other hand, the parameters are represented by the value propositions delivered by the platform as a response to some of the users' needs. To extract the most important value propositions, the Business Model Canvas approach has been used. Such an approach in fact focuses on understanding how a company can create value for its target customers. These variables and parameters are thus translated into an SD diffusion model with three different populations, namely municipalities, logistics service providers, and own-account carriers. Results show that the three populations under analysis fully adopt the platform within the simulation time frame, highlighting a strong demand by different stakeholders for CL projects aiming at carrying out more efficient urban logistics operations. Keywords: city logistics, simulation, system dynamics, business model
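For readers unfamiliar with SD diffusion structures, the following is a minimal, single-population Bass-type diffusion sketch of the kind such models are built from; the coefficients, population size, and time step are illustrative assumptions, not the calibrated three-population platform model of the paper.

```python
# Hedged sketch: Bass-style diffusion of a platform among one adopter population,
# integrated with a simple Euler loop as a stand-in for a stock-and-flow SD model.
# p (innovation) and q (imitation) coefficients below are illustrative only.
N = 500           # potential adopters (e.g., logistics service providers)
p, q = 0.01, 0.4  # external and word-of-mouth adoption coefficients (assumed)
dt, horizon = 0.25, 40.0

adopters = 0.0
t = 0.0
trajectory = []
while t < horizon:
    adoption_rate = (p + q * adopters / N) * (N - adopters)  # Bass adoption flow
    adopters += adoption_rate * dt
    t += dt
    trajectory.append((round(t, 2), round(adopters, 1)))

print(trajectory[::16])  # a few sampled points of the S-shaped adoption curve
```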
Procedia PDF Downloads 267
16893 Evaluation of a Piecewise Linear Mixed-Effects Model in the Analysis of Randomized Cross-over Trial
Authors: Moses Mwangi, Geert Verbeke, Geert Molenberghs
Abstract:
Cross-over designs are commonly used in randomized clinical trials to estimate the efficacy of a new treatment with respect to a reference treatment (placebo or standard). The main advantage of using a cross-over design over a conventional parallel design is its flexibility, where every subject becomes his or her own control, thereby reducing confounding effects. Jones & Kenward discuss in detail more recent developments in the analysis of cross-over trials. We revisit the simple piecewise linear mixed-effects model, proposed by Mwangi et al. (in press), for its first application in the analysis of cross-over trials. We compared the performance of the proposed piecewise linear mixed-effects model with two commonly cited statistical models, namely (1) the Grizzle model and (2) the Jones & Kenward model, used in estimation of the treatment effect in the analysis of a randomized cross-over trial. We estimate two performance measures (mean square error (MSE) and coverage probability) for the three methods, using data simulated from the proposed piecewise linear mixed-effects model. The piecewise linear mixed-effects model yielded the lowest MSE estimates compared to the Grizzle and Jones & Kenward models for both small (Nobs=20) and large (Nobs=600) sample sizes. Its coverage probabilities were highest compared to the Grizzle and Jones & Kenward models for both small and large sample sizes. A piecewise linear mixed-effects model is a better estimator of the treatment effect than its two competing estimators (the Grizzle and Jones & Kenward models) in the analysis of cross-over trials. The data-generating mechanism used in this paper captures two time periods for a simple 2-Treatments x 2-Periods cross-over design. Its application is extendible to more complex cross-over designs with multiple treatments and periods. In addition, it is important to note that, even for single response models, adding more random effects increases the complexity of the model and thus may be difficult or impossible to fit in some cases. Keywords: evaluation, Grizzle model, Jones & Kenward model, performance measures, simulation
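The sketch below illustrates how the two performance measures named here, MSE and coverage probability, can be estimated by simulation for a simple 2x2 crossover layout. The period-difference estimator and all data-generating values are assumptions for illustration; they are not the Grizzle, Jones & Kenward, or piecewise linear mixed-effects estimators themselves.

```python
# Hedged sketch: simulation-based MSE and coverage probability of a simple
# treatment-effect estimator in a 2x2 crossover trial (placeholder parameters).
import numpy as np
from scipy.stats import t

rng = np.random.default_rng(2)
true_effect, n_per_seq, n_rep = 1.5, 20, 2000
hits, sq_errors = 0, []

for _ in range(n_rep):
    subj = rng.normal(0, 1.0, 2 * n_per_seq)            # subject random effects
    eps = rng.normal(0, 0.8, (2 * n_per_seq, 2))        # within-subject noise
    # sequence AB gets treatment in period 1, sequence BA in period 2
    y1 = subj[:n_per_seq, None] + np.array([true_effect, 0.0]) + eps[:n_per_seq]
    y2 = subj[n_per_seq:, None] + np.array([0.0, true_effect]) + eps[n_per_seq:]
    d1, d2 = y1[:, 0] - y1[:, 1], y2[:, 0] - y2[:, 1]    # within-subject period differences
    est = (d1.mean() - d2.mean()) / 2.0
    se = 0.5 * np.sqrt(d1.var(ddof=1) / n_per_seq + d2.var(ddof=1) / n_per_seq)
    crit = t.ppf(0.975, df=2 * n_per_seq - 2)
    hits += (est - crit * se <= true_effect <= est + crit * se)
    sq_errors.append((est - true_effect) ** 2)

print("MSE:", np.mean(sq_errors), "coverage:", hits / n_rep)
```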
Procedia PDF Downloads 124
16892 Analysis of Production Forecasting in Unconventional Gas Resources Development Using Machine Learning and Data-Driven Approach
Authors: Dongkwon Han, Sangho Kim, Sunil Kwon
Abstract:
Unconventional gas resources have dramatically changed the future energy landscape. Unlike conventional gas resources, a key challenge in unconventional gas has been the need for advanced approaches to production forecasting due to the uncertainty and complexity of fluid flow. In this study, an artificial neural network (ANN) model, which integrates machine learning and a data-driven approach, was developed to predict productivity in shale gas. A database of 129 wells from the Eagle Ford shale basin was used for training and testing of the ANN model. Input data related to hydraulic fracturing, well completion, and productivity of shale gas were selected, and the output is cumulative production. The performance of the ANN using all data sets, clustering, and variable importance (VI) models was compared in terms of the mean absolute percentage error (MAPE). The MAPE of the ANN model using all data sets, clustering, and VI was obtained as 44.22%, 10.08% (cluster 1), 5.26% (cluster 2), 6.35% (cluster 3), 32.23% (ANN VI), and 23.19% (SVM VI), respectively. The results showed that the pre-trained ANN model provides more accurate results than the ANN model using all data sets. Keywords: unconventional gas, artificial neural network, machine learning, clustering, variables importance
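A minimal sketch of the workflow described above (cluster the wells, train a small neural network per cluster, score with MAPE) is given below. The features, cluster count, network size, and synthetic data are assumptions, not the Eagle Ford dataset or the authors' architecture.

```python
# Hedged sketch: k-means clustering of wells followed by a small neural network
# per cluster, scored with MAPE. Features and data are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n_wells = 129
X = rng.uniform(size=(n_wells, 4))        # e.g., lateral length, stages, proppant, fluid
y = 100 + 400 * X[:, 0] + 200 * X[:, 1] * X[:, 2] + rng.normal(0, 20, n_wells)

def mape(y_true, y_pred):
    return 100.0 * np.mean(np.abs((y_true - y_pred) / y_true))

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for c in range(3):
    Xc, yc = X[labels == c], y[labels == c]
    Xtr, Xte, ytr, yte = train_test_split(Xc, yc, test_size=0.3, random_state=0)
    model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0)
    model.fit(Xtr, ytr)
    print(f"cluster {c}: MAPE = {mape(yte, model.predict(Xte)):.1f}%")
```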
Procedia PDF Downloads 196
16891 Development of a Thermodynamic Model for Ladle Metallurgy Steel Making Processes Using Factsage and Its Macro Facility
Authors: Prasenjit Singha, Ajay Kumar Shukla
Abstract:
To produce high-quality steel in larger volumes, dynamic control of composition and temperature throughout the process is essential. In this paper, we developed a mass transfer model based on thermodynamics to simulate the ladle metallurgy steel-making process using FactSage and its macro facility. The overall heat and mass transfer processes consist of one equilibrium chamber, two non-equilibrium chambers, and one adiabatic reactor. The flow of material, as well as heat transfer, occurs across four interconnected unit chambers and a reactor. We used the macro programming facility of FactSage™ software to understand the thermochemical model of the secondary steel making process. In our model, we varied the oxygen content during the process and studied their effect on the composition of the final hot metal and slag. The model has been validated with respect to the plant data for the steel composition, which is similar to the ladle metallurgy steel-making process in the industry. The resulting composition profile serves as a guiding tool to optimize the process of ladle metallurgy in steel-making industries.Keywords: desulphurization, degassing, factsage, reactor
Procedia PDF Downloads 217
16890 Increasing Participation of KUD (Rural Unit Cooperative) Through 'Kemal Propuri' System to Independence Farmers
Authors: Ikrima Zaleda Zia, Devi Fitri Kumalasari, Rosita Khusna, Farah Hidayati, Ilham Fajrul Haq, Amin Yusuf Efendi
Abstract:
Fertilizer is one of the production factors that is important to agriculture. The contribution of fertilizers to improvement of the agricultural sector is quite high. Fertilizer scarcity in society affects the agricultural sector by decreasing farmers' production. Through a system called Kemal Propuri, society will be taught how to be independent, especially in terms of supplying fertilizer, and how to earn extra income besides relying on agricultural production. This research aims to determine the implementation measures of Kemal Propuri in realizing farmers' independence. This research was designed as descriptive research with a qualitative approach. In this case, the writers try to illustrate the increasing role of KUD (rural unit cooperative) through the Kemal Propuri system (Independence System Through Individual Fertilizer Production) towards farmer independence. It can be concluded that the Kemal Propuri system can contribute to achieving farmers' independence. Independent fertilizer production will overcome farmers' dependence on subsidized fertilizer from the government. Keywords: Kemal Propuri, KUD (Rural Unit Cooperative), independence farmers, fertilizer production
Procedia PDF Downloads 386
16889 Fuzzy Control of Thermally Isolated Greenhouse Building by Utilizing Underground Heat Exchanger and Outside Weather Conditions
Authors: Raghad Alhusari, Farag Omar, Moustafa Fadel
Abstract:
A traditional greenhouse is a metal-frame agricultural building used for cultivating plants in a controlled environment isolated from external climatic changes. Using greenhouses in agriculture is an efficient way to reduce water consumption, as agriculture is considered the biggest water consumer worldwide. Controlling the greenhouse environment yields better plant productivity but demands increased electric power. Although various control approaches have been used for greenhouse automation, most of them are applied to traditional greenhouses with ventilation fans and/or evaporative cooling systems. Such approaches still demand high energy and water consumption. The aim of this research is to develop a fuzzy control system that minimizes water and energy consumption by utilizing outside weather conditions and an underground heat exchanger to maintain the optimum climate of the greenhouse. The proposed control system is implemented on an experimental model of a thermally isolated greenhouse structure with dimensions of 6x5x2.8 meters. It uses fans for extracting heat from the ground heat exchanger system, motors for automatic opening/closing of the greenhouse windows, and LEDs as the lighting system. The controller is also integrated with environmental condition sensors. It was found that using the air-to-air horizontal ground heat exchanger with 90 mm diameter and 2 mm thickness, placed 2.5 m below the ground surface, results in decreasing the greenhouse temperature by 3.28 °C, which saves around 3 kW of consumed energy. It also eliminated the water consumption needed in evaporative cooling systems, which are traditionally used for cooling the greenhouse environment. Keywords: automation, earth-to-air heat exchangers, fuzzy control, greenhouse, sustainable buildings
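As a hedged illustration of the control approach named above, the snippet below implements a tiny Mamdani-style fuzzy rule base mapping greenhouse temperature error to a heat-exchanger fan duty. The membership breakpoints, rules, and defuzzification choice are illustrative assumptions, not the controller designed in the study.

```python
# Hedged sketch of a tiny fuzzy rule base: temperature error (inside minus
# setpoint, in deg C) -> heat-exchanger fan duty (0..1). Values are illustrative.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def fan_duty(temp_error):
    # fuzzify the input
    near_setpoint = tri(temp_error, -5.0, 0.0, 2.0)
    warm          = tri(temp_error,  0.0, 3.0, 6.0)
    hot           = tri(temp_error,  4.0, 8.0, 12.0)
    # rule consequents (singletons): low / medium / high fan duty
    duties  = np.array([0.1, 0.5, 0.9])
    weights = np.array([near_setpoint, warm, hot])
    # weighted-average defuzzification
    return float(np.sum(weights * duties) / (np.sum(weights) + 1e-9))

for e in [0.5, 3.0, 7.0]:
    print(f"temperature error {e:+.1f} C -> fan duty {fan_duty(e):.2f}")
```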
Procedia PDF Downloads 12916888 Onmanee Prajuabjinda, Pakakrong Thondeeying, Jipisute Chunthorng-Orn, Bhanuz Dechayont, Arunporn Itharat
Authors: Ekrem Erdem, Can Tansel Tugcu
Abstract:
Improved resource efficiency of production is a key requirement for sustainable growth worldwide. In this regard, by considering energy and tourism as extra inputs to the classical Cobb-Douglas production function, this study investigates efficiency changes in the North African countries. To this end, the study uses panel data for the period 1995-2010 and adopts the Malmquist index based on data envelopment analysis. Results show that tourism increases technical and scale efficiencies, while it decreases technological change and total factor productivity change. On the other hand, when the production function is augmented by the energy input, technical efficiency change decreases, while technological change, scale efficiency change and total factor productivity change increase. Thus, in order to satisfy the needs of sustainable growth, North African governments should take some measures to increase the contribution that tourism makes to economic growth and others for efficient use of resources in the energy sector. Keywords: data envelopment analysis, economic efficiency, North African countries, sustainable growth
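The Malmquist index used here is assembled from DEA distance functions. The following is a minimal sketch of one output-oriented, constant-returns DEA efficiency score solved as a linear program; the toy input/output table is an assumption for illustration, not the study's panel data.

```python
# Hedged sketch: output-oriented CRS (CCR) DEA efficiency for each decision-making
# unit. Data are placeholders (inputs: capital, labour, energy, tourism; output: GDP proxy).
import numpy as np
from scipy.optimize import linprog

inputs = np.array([[5.0, 3.0, 2.0, 1.0],
                   [6.0, 2.5, 3.0, 1.5],
                   [4.0, 4.0, 2.5, 0.8],
                   [7.0, 3.5, 4.0, 2.0]])      # one row per country (assumed)
outputs = np.array([[10.0], [11.0], [9.0], [15.0]])
n, m = inputs.shape
s = outputs.shape[1]

def dea_output_efficiency(k):
    # decision variables: [phi, lambda_1..lambda_n]; maximize phi
    c = np.zeros(n + 1)
    c[0] = -1.0
    A_ub, b_ub = [], []
    for i in range(m):   # input constraints: sum(lambda * x_i) <= x_i of DMU k
        A_ub.append(np.concatenate(([0.0], inputs[:, i])))
        b_ub.append(inputs[k, i])
    for r in range(s):   # output constraints: phi * y_r of DMU k <= sum(lambda * y_r)
        A_ub.append(np.concatenate(([outputs[k, r]], -outputs[:, r])))
        b_ub.append(0.0)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=[(0, None)] * (n + 1), method="highs")
    return 1.0 / res.x[0]                       # efficiency in (0, 1]

for k in range(n):
    print(f"DMU {k}: output-oriented CRS efficiency = {dea_output_efficiency(k):.3f}")
```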
Procedia PDF Downloads 343
16887 Flexible Capacitive Sensors Based on Paper Sheets
Authors: Mojtaba Farzaneh, Majid Baghaei Nejad
Abstract:
This article proposes a new flexible capacitive tactile sensor based on paper sheets. The method combines the parameters of the sensor's material and dielectric and forms a new model of flexible capacitive sensors. The present article tries to provide a practical explanation of this method's application and advantages. With the use of this new method, it is possible to make a more flexible and accurate sensor in comparison with current models. To assess the performance of this model, a common capacitive sensor is simulated, and the proposed model of this article and one of the existing models are assessed. The results indicate that the proposed model can enhance the speed and accuracy of the tactile sensor and has less error in comparison with current models. Based on the results of this study, it can be claimed that, in comparison with current models, the proposed model is capable of providing more flexibility and more accurate output parameters when the sensor is touched, especially in abnormal situations and on uneven surfaces, and increases accuracy and practicality. Keywords: capacitive sensor, paper sheets, flexible, tactile, uneven
Procedia PDF Downloads 353
16886 Revolutionizing Financial Forecasts: Enhancing Predictions with Graph Convolutional Networks (GCN) - Long Short-Term Memory (LSTM) Fusion
Authors: Ali Kazemi
Abstract:
In today's volatile and interconnected international financial markets, accurately predicting market trends holds substantial value for traders and financial institutions. Traditional machine learning strategies have made significant strides in forecasting market movements; however, the complex and networked nature of financial data calls for more sophisticated approaches. This study offers a groundbreaking method for financial market prediction that leverages the synergistic potential of Graph Convolutional Networks (GCNs) and Long Short-Term Memory (LSTM) networks. Our proposed algorithm is meticulously designed to forecast the trends of stock market indices and cryptocurrency prices, utilizing a comprehensive dataset spanning from January 1, 2015, to December 31, 2023. This era, marked by sizeable volatility and transformation in financial markets, provides a solid basis for training and testing our predictive model. Our algorithm integrates diverse data to construct a dynamic financial graph that accurately reflects market intricacies. We meticulously collect daily opening, closing, high, and low prices for key stock market indices (e.g., S&P 500, NASDAQ) and major cryptocurrencies (e.g., Bitcoin, Ethereum), ensuring a holistic view of market trends. Daily trading volumes are also incorporated to capture market activity and liquidity, providing critical insights into buying and selling dynamics. Furthermore, recognizing the profound influence of the macroeconomic environment on financial markets, we integrate key macroeconomic indicators such as interest rates, inflation rates, GDP growth, and unemployment rates into our model. Our GCN algorithm is adept at learning the relational patterns among different financial instruments represented as nodes in a comprehensive market graph. Edges in this graph encapsulate relationships based on co-movement patterns and sentiment correlations, enabling our model to grasp the complex network of influences governing market movements. Complementing this, our LSTM algorithm is trained on sequences of the spatial-temporal representation learned by the GCN, enriched with historical price and volume data. This lets the LSTM capture and predict temporal market trends accurately. In a comprehensive assessment of our GCN-LSTM algorithm across the stock market and cryptocurrency datasets, the model demonstrated superior predictive accuracy and profitability compared to conventional and alternative machine learning benchmarks. Specifically, the model achieved a Mean Absolute Error (MAE) of 0.85%, indicating high precision in predicting daily price movements. The RMSE was recorded at 1.2%, underscoring the model's effectiveness in minimizing large prediction errors, which is vital in volatile markets. Furthermore, when assessing the model's predictive performance on directional market movements, it achieved an accuracy rate of 78%, significantly outperforming the benchmark models, which averaged an accuracy of 65%. This high degree of accuracy is instrumental for strategies that predict the direction of price movements. This study showcases the efficacy of combining graph-based and sequential deep learning in financial market prediction and highlights the value of a comprehensive, data-driven evaluation framework.
Our findings promise to revolutionize investment techniques and risk management practices, offering investors and financial analysts a powerful tool to navigate the complexities of modern financial markets. Keywords: financial market prediction, graph convolutional networks (GCNs), long short-term memory (LSTM), cryptocurrency forecasting
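A hedged sketch of the architecture family described above is given below: a graph-convolution layer over an asset graph whose per-timestep embeddings feed an LSTM that predicts the next step per asset. The layer sizes, the symmetric-normalised adjacency, and the random placeholder data are assumptions, not the paper's exact model or dataset.

```python
# Hedged sketch of a GCN + LSTM forecaster: each timestep's node features pass
# through a graph convolution (symmetric-normalised adjacency), the embeddings
# feed an LSTM, and the last hidden state predicts the next-step value per asset.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj_norm):             # x: (batch, nodes, in_dim)
        return torch.relu(self.lin(adj_norm @ x))

class GCNLSTM(nn.Module):
    def __init__(self, n_nodes, in_dim, gcn_dim=16, lstm_dim=32):
        super().__init__()
        self.gcn = GCNLayer(in_dim, gcn_dim)
        self.lstm = nn.LSTM(n_nodes * gcn_dim, lstm_dim, batch_first=True)
        self.head = nn.Linear(lstm_dim, n_nodes)

    def forward(self, x, adj_norm):             # x: (batch, time, nodes, in_dim)
        b, t, n, f = x.shape
        h = self.gcn(x.reshape(b * t, n, f), adj_norm).reshape(b, t, -1)
        out, _ = self.lstm(h)
        return self.head(out[:, -1])            # next-step prediction per node

# toy usage with random data: 10 assets, 4 features (OHLC-style), 30 timesteps
n_nodes, in_dim = 10, 4
adj = torch.rand(n_nodes, n_nodes)
adj = ((adj + adj.T) > 1.0).float()
adj.fill_diagonal_(1.0)
deg = adj.sum(1)
adj_norm = adj / torch.sqrt(deg[:, None] * deg[None, :])   # symmetric normalisation
model = GCNLSTM(n_nodes, in_dim)
x = torch.randn(8, 30, n_nodes, in_dim)
print(model(x, adj_norm).shape)                 # torch.Size([8, 10])
```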
Procedia PDF Downloads 66
16885 Creative Peace Diplomacy Model by the Perspective of Dialogue Management for International Relations
Authors: Bilgehan Gültekin, Tuba Gültekin
Abstract:
Peace diplomacy is the most important international tool for keeping peace all over the world. The study titled “Peace Diplomacy for International Relations” consists of three parts. In the first part, peace diplomacy is introduced as a tool of peace communication and peace management, and peace communication is explained from an international communication perspective. In the second part of the study, public relations events and communication campaigns are developed specifically for peace diplomacy, with the aim of producing original public communication and dialogue management tools. The aim of the final part of the study is to produce an original public communication model for international relations. The model includes peace modules, peace management projects, original dialogue procedures and protocols, dialogue education, dialogue management strategies, peace actors, communication models, peace team management and public diplomacy steps. The creative part of the study aims to develop a model usable in international relations for all countries. The Creative Peace Diplomacy Model is developed for the cases of Turkey-France and Turkey-Greece relations, so the communication and public relations events and campaigns are developed originally for this study. Keywords: peace diplomacy, public communication model, dialogue management, international relations
Procedia PDF Downloads 541
16884 A Fuzzy Multiobjective Model for Bed Allocation Optimized by Artificial Bee Colony Algorithm
Authors: Jalal Abdulkareem Sultan, Abdulhakeem Luqman Hasan
Abstract:
With the development of competition among health care systems, hospitals face more and more pressure. Meanwhile, resource allocation has a vital effect on achieving competitive advantages in hospitals. Selecting the appropriate number of beds is one of the most important issues in hospital management. However, in real situations, bed allocation is a multiple-objective problem involving different items with vagueness and randomness in the data, and it is very complex. Hence, research on the bed allocation problem that considers multiple departments, nursing hours, and stochastic information about patient arrival and service is relatively scarce. In this paper, we develop a fuzzy multiobjective bed allocation model for handling uncertainty and multiple departments. Fuzzy objectives and weights are applied simultaneously to help managers select suitable bed numbers for different departments. The proposed model is solved by using the Artificial Bee Colony (ABC) algorithm, which is very effective. The paper describes an application of the model, dealing with a public hospital in Iraq. The results show that the fuzzy multi-objective model provides a suitable framework for bed allocation and optimal use. Keywords: bed allocation problem, fuzzy logic, artificial bee colony, multi-objective optimization
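The snippet below is a minimal sketch of the core Artificial Bee Colony loop (employed-bee, onlooker-bee, and scout phases) minimising a generic objective. For the bed-allocation application the objective would score a vector of beds per department against the fuzzy goals; here a simple quadratic placeholder objective and assumed bounds are used instead.

```python
# Hedged sketch of the core Artificial Bee Colony loop with a placeholder objective.
import numpy as np

rng = np.random.default_rng(4)
dim, n_food, limit, max_iter = 4, 10, 20, 200
lower, upper = 0.0, 100.0                      # beds per department (assumed bounds)
target = np.array([35.0, 20.0, 25.0, 15.0])    # placeholder "ideal" allocation

def cost(x):
    return float(np.sum((x - target) ** 2))    # stand-in for the fuzzy multiobjective score

foods = rng.uniform(lower, upper, (n_food, dim))
costs = np.array([cost(f) for f in foods])
trials = np.zeros(n_food, dtype=int)

def fitness(c):
    return 1.0 / (1.0 + c)

def try_neighbour(i):
    k = rng.integers(n_food)
    while k == i:
        k = rng.integers(n_food)
    j = rng.integers(dim)
    cand = foods[i].copy()
    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
    cand[j] = np.clip(cand[j], lower, upper)
    c = cost(cand)
    if c < costs[i]:
        foods[i], costs[i], trials[i] = cand, c, 0   # greedy replacement
    else:
        trials[i] += 1

for _ in range(max_iter):
    for i in range(n_food):                    # employed-bee phase
        try_neighbour(i)
    probs = fitness(costs)
    probs = probs / probs.sum()
    for _ in range(n_food):                    # onlooker-bee phase
        try_neighbour(rng.choice(n_food, p=probs))
    for i in range(n_food):                    # scout phase: abandon exhausted sources
        if trials[i] > limit:
            foods[i] = rng.uniform(lower, upper, dim)
            costs[i], trials[i] = cost(foods[i]), 0

best = foods[np.argmin(costs)]
print("best allocation found:", np.round(best, 1), "cost:", round(float(min(costs)), 3))
```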
Procedia PDF Downloads 324
16883 Business or Enjoyment: Study of Affected Dimensions on Lifestyle Entrepreneurship
Authors: Sarah Irani, Meisam Modarresi
Abstract:
Lifestyle entrepreneurship allows the business owner to create a business activity that aligns with their values, interests, and motivations. Examining the views and experiences of lifestyle entrepreneurs has an essential impact on the growth of the entrepreneurial economy and the concept of entrepreneurship. The primary purpose of this research is to discover the main and secondary influencing aspects of lifestyle entrepreneurship. This research is qualitative and tries to develop research in this field by presenting a framework from the literature. This study can provide a clear picture of lifestyle entrepreneurship. The results showed that lifestyle entrepreneurship is influenced by four main aspects.Keywords: entrepreneurship, entrepreneurs, innovation, lifestyle entrepreneurship, small businesses development
Procedia PDF Downloads 182
16882 Impact of Cultural Intelligence on Decision Making Styles of Managers: A Turkish Case
Authors: Fusun Akdag
Abstract:
Today, as business becomes increasingly global, managers/leaders of multinational companies or local companies work with employees or customers from a variety of cultural backgrounds. To do this effectively, they need to develop cultural competence. Therefore, cultural intelligence (CQ) becomes a vitally important aptitude and skill, especially for leaders. Organizational success or failure depends upon the kind of leadership which has been provided to its members. The culture we are born into deeply affects our values, beliefs, and behavior. Cultural intelligence (CQ) focuses on how well individuals can relate and work across cultures. CQ helps minimize conflict and maximize the performance of a diverse workforce. The term 'decision' refers to a commitment to a course of action that is intended to serve the interests and values of particular people. One dimension of culture that has received attention is individualism-collectivism, or independence-interdependence. These dimensions are associated with different conceptualizations of the 'self.' Individualistic cultures tend to value personal goal pursuit as opposed to the pursuit of others' goals. Collectivistic cultures, by contrast, view the 'self' as part of a whole. Each person is expected to work with his or her in-group toward goals and generally to pursue group harmony. These differences underlie cross-cultural variation in decision-making, such as the decision modes people use, their preferences, negotiation styles, creativity, and more. The aim of this study is to determine the effect of CQ on the decision-making styles of male and female managers in Turkey, an emergent economy framework. A survey is distributed to gather data from managers at various companies. The questionnaire consists of three parts: demographics, the Cultural Intelligence Scale (CQS) to measure the four dimensions of cultural intelligence, and the General Decision Making Style (GDMS) Inventory to measure the five subscales of decision making. The results will indicate the Turkish managers' scores on the metacognitive, cognitive, motivational and behavioral aspects of cultural intelligence and to what extent these scores affect their rational, avoidant, dependent, intuitive and spontaneous decision-making styles, since business leaders make dozens of decisions every day that influence the success of the company and also have an impact on employees, customers, shareholders and the market. Keywords: cultural intelligence, decision making, gender differences, management styles
Procedia PDF Downloads 370
16881 The Six 'P' Model: Principles of Inclusive Practice for Inclusion Coaches
Authors: Tiffany Gallagher, Sheila Bennett
Abstract:
Based on data from a larger study, this research is based in a small school district in Ontario, Canada, that has made a transition from self-contained classes for students with exceptionalities to inclusive classroom placements for all students with their age-appropriate peers. The school board aided this transition by hiring Inclusion Coaches with a background in special education to work alongside teachers as partners and inform their inclusive practice. Based on qualitative data from four focus groups conducted with Inclusion Coaches, as well as four blog-style reflections collected at various points over two years, six principles of inclusive practice were identified for coaches. The six principles form a model during transition: pre-requisite, process, precipice, promotion, proof, and promise. These principles are encapsulated in a visual model of a spiraling staircase displaying the conditions that exist prior to coaching, during coaching interactions and considerations for the sustainability of coaching. These six principles are re-iterative and should be re-visited each time a coaching interaction is initiated. Exploring inclusion coaching as a model emulates coaching in other contexts and allows us to examine an established process through a new lens. This research becomes increasingly important as more school boards transition toward inclusive classrooms, The Six ‘P’ Model: Principles of Inclusive Practice for Inclusion Coaches allows for a unique look into a scaffolding model of building educator capacity in an inclusive setting.Keywords: capacity building, coaching, inclusion, special education
Procedia PDF Downloads 250
16880 Space Tourism Pricing Model Revolution from Time Independent Model to Time-Space Model
Authors: Kang Lin Peng
Abstract:
Space tourism emerged in 2001 and became famous in 2021, following the development of space technology. The space market is distorted because of excess demand. Space tourism is currently rare and extremely expensive, with biased luxury product pricing; it is a seller's market in which consumers cannot bargain. Spaceship companies such as Virgin Galactic, Blue Origin, and SpaceX have charged space tourism prices from 200 thousand to 55 million dollars, depending on various heights in space. There should be a reasonable price set on a fair basis. This study aims to derive a spacetime pricing model, which is different from the general pricing model on the earth's surface. We apply general relativity theory to deduce the mathematical formula for the space tourism pricing model, which covers the traditional time-independent model. In the future, the price of space travel will be different from current flight travel when space travel is measured in lightyear units. The pricing of general commodities mainly considers the general equilibrium of supply and demand. A pricing model that considers risks and returns with a dependent time variable is acceptable when commodities are on the earth's surface, called flat spacetime. Current economic theories based on an independent time scale in flat spacetime do not consider the curvature of spacetime. Current flight services flying at heights of 6, 12, and 19 kilometers charge with a pricing model that measures the time coordinate independently. However, space tourism involves flying at heights of 100 to 550 kilometers, which enlarges the spacetime curvature, meaning that tourists escape from zero curvature on the earth's surface to the large curvature of space. Different spacetime spans should be considered in the pricing model of space travel to echo general relativity theory. Intuitively, pricing this spacetime commodity needs to consider the change of spacetime curvature from the earth to space. We can assume a value for each unit of spacetime curvature corresponding to the gradient change of the Ricci or energy-momentum tensor. Then we know how much to spend by integrating the spacetime from the earth to space. The concept is to add a price component p corresponding to general relativity theory. The space travel pricing model degenerates into a time-independent model, which becomes a model of traditional commodity pricing. The contribution is that deriving the space tourism pricing model will be a breakthrough in philosophical and practical issues for space travel. The results of the space tourism pricing model extend the traditional time-independent flat-spacetime model. A pricing model that embeds spacetime as in general relativity theory can better reflect the rationality and accuracy of space travel pricing on the universal scale. The move from an independent time scale to a spacetime scale will bring a brand-new pricing concept for space travelling commodities. Fair and efficient spacetime economics will also benefit human travel when we can travel in lightyear units in the future. Keywords: space tourism, spacetime pricing model, general relativity theory, spacetime curvature
Procedia PDF Downloads 129
16879 Examining the Changes in Complexity, Accuracy, and Fluency in Japanese L2 Writing Over an Academic Semester
Authors: Robert Long
Abstract:
The results of a one-year study on the evolution of complexity, accuracy, and fluency (CAF) in the compositions of Japanese L2 university students throughout a semester are presented in this study. One goal was to determine if any improvement in writing abilities over this academic term had occurred, while another was to examine methods of editing. Participants had 30 minutes to write each essay, with an additional 10 minutes allotted for editing. As for editing, participants were divided into two groups, one of which utilized an online grammar checker, while the other self-edited their initial manuscripts. There was a total of 159 students from the three different institutions. Research questions focused on determining if CAF had evolved over the previous year, identifying potential variations in editing techniques, and describing the connections between the CAF dimensions. According to the findings, there was some improvement in accuracy (fewer errors) in all three of the measures, whereas there was a marked decline in complexity and fluency. As for the second research aim, relating to the interaction among the three dimensions (CAF) and possible increases in fluency being offset by decreases in grammatical accuracy, results showed a logically high correlation between clauses and word counts, between mean length of T-unit (MLT) and coordinate phrases per T-unit (CP/T), and between MLT and clauses per T-unit (C/T); furthermore, word counts and the errors/100 ratio correlated highly with error-free clause totals (EFCT). Issues of syntactical complexity had a negative correlation with EFCT, indicating that more syntactical complexity relates to decreased accuracy. Concerning a difference in error correction between those who self-edited and those who used an online grammar correction tool, results indicated that the variable of error-free clause ratios (EFCR) showed the greatest difference regarding accuracy, with fewer errors noted among writers using an online grammar checker. As for possible differences between the first and second (edited) drafts regarding CAF, results indicated there were positive changes in accuracy, with the most significant change seen in complexity (CP/T and MLT), while there were relatively insignificant changes in fluency. Results also indicated significant differences among the three institutions, with Fujian University of Technology having the most fluency and accuracy. These findings suggest that to raise students' awareness of their overall writing development, teachers should support them in developing more complex syntactic structures, improving their fluency, and making more effective use of online grammar checkers. Keywords: complexity, accuracy, fluency, writing
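The complexity, accuracy, and fluency indices named in this study (MLT, C/T, CP/T, errors per 100 words, and the error-free clause ratio) are simple ratios of raw counts; the sketch below shows how they are typically computed, using placeholder counts rather than data from the study.

```python
# Hedged sketch: computing the CAF indices named in the study from raw counts.
# The counts below are placeholders for one essay, not data from the study.
counts = {
    "words": 245,
    "t_units": 18,
    "clauses": 31,
    "coordinate_phrases": 9,
    "errors": 12,
    "error_free_clauses": 22,
}

mlt  = counts["words"] / counts["t_units"]              # mean length of T-unit
c_t  = counts["clauses"] / counts["t_units"]            # clauses per T-unit
cp_t = counts["coordinate_phrases"] / counts["t_units"] # coordinate phrases per T-unit
errors_per_100 = 100 * counts["errors"] / counts["words"]
efcr = counts["error_free_clauses"] / counts["clauses"] # error-free clause ratio

print(f"MLT={mlt:.1f}, C/T={c_t:.2f}, CP/T={cp_t:.2f}, "
      f"errors/100w={errors_per_100:.1f}, EFCR={efcr:.2f}")
```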
Procedia PDF Downloads 41
16878 A Physically-Based Analytical Model for Reduced Surface Field Laterally Double Diffused MOSFETs
Authors: M. Abouelatta, A. Shaker, M. El-Banna, G. T. Sayah, C. Gontrand, A. Zekry
Abstract:
In this paper, a methodology for physically modeling the intrinsic MOS part and the drift region of the n-channel Laterally Double-diffused MOSFET (LDMOS) is presented. The basic physical effects like velocity saturation, mobility reduction, and nonuniform impurity concentration in the channel are taken into consideration. The analytical model is implemented using MATLAB. A comparison of the simulations from technology computer aided design (TCAD) and that from the proposed analytical model, at room temperature, shows a satisfactory accuracy which is less than 5% for the whole voltage domain.Keywords: LDMOS, MATLAB, RESURF, modeling, TCAD
Procedia PDF Downloads 199
16877 Viability of EBT3 Film in Small Dimensions to Be Used for in-Vivo Dosimetry in Radiation Therapy
Authors: Abdul Qadir Jangda, Khadija Mariam, Usman Ahmed, Sharib Ahmed
Abstract:
The Gafchromic EBT3 film has the characteristics of high spatial resolution, weak energy dependence and near tissue equivalence, which make it viable for in-vivo dosimetry in External Beam and Brachytherapy applications. The aim of this study is to assess the smallest film dimension that may be feasible for use in in-vivo dosimetry. To evaluate the viability, film sizes from 3 x 3 mm to 20 x 20 mm were calibrated with 6 MV photon and 6 MeV electron beams. The Gafchromic EBT3 (Lot no. A05151201, Make: ISP) film was cut into five different sizes in order to establish the relationship between absorbed dose and film dimensions. The film dimensions were 3 x 3, 5 x 5, 10 x 10, 15 x 15, and 20 x 20 mm. The films were irradiated on a Varian Clinac® 2100C linear accelerator over a dose range from 0 to 1000 cGy using a PTW solid water phantom. The irradiation was performed as per the clinical absolute dose rate calibration setup, i.e. 100 cm SAD, 5.0 cm depth and a field size of 10x10 cm2 for photons, and 100 cm SSD, 1.4 cm depth and a 15x15 cm2 applicator for electrons. The irradiated films were scanned in landscape orientation with a post-development time of at least 48 hours. Film scanning was accomplished using an Epson Expression 10000 XL flatbed scanner, and quantitative analysis was carried out with the ImageJ freeware software. Results show that the dose variation across film dimensions ranging from 3 x 3 mm to 20 x 20 mm is minimal, with a maximum standard deviation of 0.0058 in optical density for a dose level of 3000 cGy, and the standard deviation increases with the dose level. So precautions must be taken while using small-dimension films for higher doses. Analysis shows that there is insignificant variation in the absorbed dose with a change in the dimension of EBT3 film. The study concludes that film dimensions down to 3 x 3 mm can safely be used up to a dose level of 3000 cGy without the need for recalibration of the particular dimension in use for dosimetric applications. However, for higher dose levels, one may need to calibrate the films for the particular dimension in use for higher accuracy. It was also noticed that the crystalline structure of the film got damaged at the edges while cutting the film, which can contribute to a wrong dose reading if the region of interest includes the damaged area of the film. Keywords: external beam radiotherapy, film calibration, film dosimetry, in-vivo dosimetry
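A hedged sketch of the usual film-dosimetry pipeline implied above is given below: scanner pixel values are converted to net optical density and a dose-response calibration curve is fitted. The power-law functional form and all numbers are assumptions for illustration, not the measured EBT3 data of this study.

```python
# Hedged sketch: net optical density from scanned pixel values and a fitted
# dose-response calibration curve. D = a*netOD + b*netOD^n is a commonly used
# functional form; all numbers below are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def net_od(pv_exposed, pv_unexposed):
    """netOD = log10(PV_unexposed / PV_exposed) for a transmission scan."""
    return np.log10(pv_unexposed / pv_exposed)

doses = np.array([0, 100, 200, 400, 600, 800, 1000], dtype=float)   # cGy
pv_unexposed = 52000.0
pv_exposed = np.array([52000, 44500, 39800, 33500, 29600, 26800, 24700], dtype=float)
nod = net_od(pv_exposed, pv_unexposed)

def dose_model(nod, a, b, n):
    return a * nod + b * nod ** n

# fit on the non-zero points; the model passes through zero by construction
params, _ = curve_fit(dose_model, nod[1:], doses[1:], p0=[2000.0, 5000.0, 2.0])
print("fitted a, b, n:", np.round(params, 2))
print("dose at netOD=0.20:", round(dose_model(0.20, *params), 1), "cGy")
```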
Procedia PDF Downloads 494
16876 Invasive Ranges of Gorse (Ulex europaeus) in South Australia and Sri Lanka Using Species Distribution Modelling
Authors: Champika S. Kariyawasam
Abstract:
The distribution of gorse (Ulex europaeus) plants in South Australia has been modelled using 126 presence-only location data as a function of seven climate parameters. The predicted range of U. europaeus is mainly along the Mount Lofty Ranges in the Adelaide Hills and on Kangaroo Island. Annual precipitation and yearly average aridity index appeared to be the highest contributing variables to the final model formulation. The Jackknife procedure was employed to identify the contribution of different variables to gorse model outputs and response curves were used to predict changes with changing environmental variables. Based on this analysis, it was revealed that the combined effect of one or more variables could make a completely different impact to the original variables on their own to the model prediction. This work also demonstrates the need for a careful approach when selecting environmental variables for projecting correlative models to climatically distinct area. Maxent acts as a robust model when projecting the fitted species distribution model to another area with changing climatic conditions, whereas the generalized linear model, bioclim, and domain models to be less robust in this regard. These findings are important not only for predicting and managing invasive alien gorse in South Australia and Sri Lanka but also in other countries of the invasive range.Keywords: invasive species, Maxent, species distribution modelling, Ulex europaeus
Procedia PDF Downloads 134
16875 Elastoplastic and Ductile Damage Model Calibration of Steels for Bolt-Sphere Joints Used in China’s Space Structure Construction
Authors: Huijuan Liu, Fukun Li, Hao Yuan
Abstract:
The bolted spherical node is a common type of joint in space steel structures. The bolt-sphere joint portion almost always controls the bearing capacity of the bolted spherical node. The investigation of the bearing performance and progressive failure in service often requires high-fidelity numerical models. This paper focuses on the constitutive models of bolt steel and sphere steel used in China’s space structure construction. The elastoplastic model is determined by a standard tensile test and calibrated Voce saturated hardening rule. The ductile damage is found dominant based on the fractography analysis. Then Rice-Tracey ductile fracture rule is selected and the model parameters are calibrated based on tensile tests of notched specimens. These calibrated material models can benefit research or engineering work in similar fields.Keywords: bolt-sphere joint, steel, constitutive model, ductile damage, model calibration
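As a hedged illustration of the hardening-law calibration named above, the sketch below fits the Voce saturation law to true-stress versus plastic-strain points from a tensile test; the data points and starting values are placeholders, not the bolt- or sphere-steel measurements.

```python
# Hedged sketch: calibrating the Voce saturation hardening law
#   sigma = sigma0 + Q * (1 - exp(-b * eps_p))
# against true-stress / plastic-strain points from a tensile test (placeholder data).
import numpy as np
from scipy.optimize import curve_fit

eps_p = np.array([0.000, 0.005, 0.010, 0.020, 0.040, 0.060, 0.080, 0.100])
sigma = np.array([355.0, 390.0, 415.0, 450.0, 490.0, 510.0, 522.0, 530.0])  # MPa

def voce(eps_p, sigma0, Q, b):
    return sigma0 + Q * (1.0 - np.exp(-b * eps_p))

(p_sigma0, p_Q, p_b), _ = curve_fit(voce, eps_p, sigma, p0=[350.0, 200.0, 30.0])
print(f"sigma0 = {p_sigma0:.1f} MPa, Q = {p_Q:.1f} MPa, b = {p_b:.1f}")
# The calibrated curve can then feed an FE plasticity model, with the Rice-Tracey
# criterion handled separately in the ductile-damage calibration.
```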
Procedia PDF Downloads 137
16874 Market Competition and the Adoption of Clean Technology: Evidence from the Taxi Industry
Authors: Raúl Bajo-Buenestado
Abstract:
This paper studies the impact of the intensity of market competition on firms' willingness to adopt green technologies, which has become particularly relevant in the light of the debate on whether competition policies should be relaxed to achieve certain environmental targets. We exploit the staggered rollout of different ride-hailing platforms (most notably, Uber) across different metropolitan areas in Spain as a natural experiment that provides time- and city-specific exogenous variation in the intensity of competition to study the impact on taxi drivers' decisions to purchase “green” or “dirty” vehicles. It was shown that the entry of these platforms significantly increased the uptake of green vehicles among professional drivers in incumbent (dominant) conventional taxi companies and decreased that of dirty vehicles. The exact opposite effect is observed in the cities where these platforms were extremely unlikely to enter. Back-of-the-envelope calculations suggest that the entry of Uber is associated with an extra green vehicle purchase in every four purchases among taxi drivers, resulting in a substantial drop in the level of emissions from the taxi fleet, which is still mostly dominated by diesel vehicles. Keywords: technological change, green technology adoption, market competition, diffusion of technology, environmental externalities
Procedia PDF Downloads 138
16873 Internet-Of-Things and Ergonomics, Increasing Productivity and Reducing Waste: A Case Study
Authors: V. Jaime Contreras, S. Iliana Nunez, S. Mario Sanchez
Abstract:
Inside a manufacturing facility, we can find innumerable automatic and manual operations, all of which are relevant to the production process. Some of these processes add more value to the products than others. Manual operations tend to add value to the product since they can be found in the final assembly area or final operations of the process, areas where a mistake or accident can increase the cost of waste exponentially. To reduce or mitigate these costly mistakes, one approach is to rely on automation to eliminate the operator from the production line, which requires a hefty investment and the development of specialized machinery. In our approach, the operator is at the center of the solution, through sufficient and adequate instrumentation, real-time reporting and ergonomics. Efficiency and reduced cycle time can be achieved through the integration of Internet-of-Things (IoT) ready technologies into assembly operations to enhance the ergonomics of the workstations. Augmented reality visual aids, RFID-triggered personalized workstation dimensions, and real-time data transfer and reporting can help achieve these goals. In this case study, a standard work cell will be used for real-life data acquisition, and simulation software will be used to extend the data points beyond the test cycle. Three comparison scenarios will run in the work cell. Each scenario will introduce a dimension of ergonomics to measure its impact independently. Furthermore, the separate tests will determine the limitations of the technology and provide a reference for operating costs and the investment required. With the ability to monitor costs, productivity, cycle time and scrap/waste in real time, the ROI (return on investment) can be determined at the different levels of integration. This case study will help to show that ergonomics in assembly lines can make a significant impact when IoT technologies are introduced. Ergonomics can effectively reduce waste and increase productivity with minimal investment compared with setting up a custom machine. Keywords: augmented reality visual aids, ergonomics, real-time data acquisition and reporting, RFID triggered workstation dimensions
Procedia PDF Downloads 214
16872 Drape Simulation by Commercial Software and Subjective Assessment of Virtual Drape
Authors: Evrim Buyukaslan, Simona Jevsnik, Fatma Kalaoglu
Abstract:
Simulation of fabrics is more difficult than any other simulation due to the complex mechanics of fabrics. Most virtual garment simulation software uses a mass-spring model and incorporates fabric mechanics into the simulation models. The accuracy and fidelity of this virtual garment simulation software remain a question mark. Drape is a subjective phenomenon, and the evaluation of drape has been studied since the 1950s. On the other hand, fabric and garment simulation is relatively new. Understanding the drape perception of subjects when looking at fabric simulations is critical as virtual try-on becomes more of an issue with growing online apparel sales. The projected future of online apparel retailing is that users may view their avatars and try garments on their avatars in the virtual environment. It is a well-known fact that users will not be eager to accept this innovative technology unless it is realistic enough. Therefore, it is essential to understand what users see when fabrics are displayed in a virtual environment. Are they able to distinguish the differences between various fabrics in a virtual environment? The purpose of this study is to investigate human perception when looking at a virtual fabric and to determine the most visually noticeable drape parameter. To this end, five different fabrics are mechanically tested, and their drape simulations are generated by commercial garment simulation software (Optitex®). The simulation images are processed by image analysis software to calculate drape parameters, namely drape coefficient, node severity, and peak angles. A questionnaire is developed to evaluate drape properties subjectively in a virtual environment. Drape simulation images are shown to 27 subjects, who are asked to rank the samples according to the questioned drape property. The answers are compared to the calculated drape parameters. The results show that subjects are quite sensitive to drape coefficient changes, while they are not very sensitive to changes in node dimensions and node distributions. Keywords: drape simulation, drape evaluation, fabric mechanics, virtual fabric
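The drape coefficient mentioned above is conventionally computed from projected areas obtained in image analysis (a Cusick-type definition); the sketch below shows that calculation with placeholder radii and area, not measurements from the study.

```python
# Hedged sketch: drape coefficient from projected areas, as in Cusick-style
# image analysis: DC = (A_draped - A_disk) / (A_flat - A_disk) * 100.
# The values below are placeholders, not measurements from the study.
import math

r_specimen = 15.0   # cm, undeformed fabric sample radius (assumed)
r_disk = 9.0        # cm, supporting disk radius (assumed)
a_flat = math.pi * r_specimen ** 2
a_disk = math.pi * r_disk ** 2
a_draped = 520.0    # cm^2, projected area measured from the simulation image (assumed)

dc = (a_draped - a_disk) / (a_flat - a_disk) * 100.0
print(f"drape coefficient = {dc:.1f}%")
```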
Procedia PDF Downloads 339