Search results for: empirical validation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3812

2012 Measuring the Effect of Intercollegiate Athletic Success on Private Giving and Enrollment

Authors: Jamie L. Stangel

Abstract:

Increased popularity and visibility of college athletics have contributed to an environment in which institutions (most of which lack self-sufficient athletics department budgets) reallocate monies from the university general fund and seek additional funding sources to keep up with increasing levels of spending on athletics. Given the prevalence of debates on student debt levels, coach salaries, and athlete pay, empirical evidence on whether this spending yields an expected return on investment is necessary. This study considered the relationship between the independent variable of the winning percentage of the men’s basketball team at a mid-major university, moderated by NCAA tournament appearance, and the number of applicants, number of enrollments, average SAT score of students, and donor giving to the university general and athletic funds. The results indicate that, apart from a small correlation between athletic success and the number of applicants that emerges only when NCAA tournament appearance is used as a moderating variable, these purported benefits are not supported, suggesting the need for a reevaluation of athletic department spending and of perceptions of the tangible and intangible benefits for universities.

Keywords: athletic success, enrollment, NCAA, private giving

Procedia PDF Downloads 116
2011 How Does the Interaction between Environmental and Intellectual Property Rights Affect Environmental Innovation? A Study of Seven OECD Countries

Authors: Aneeq Sarwar

Abstract:

This study assesses the interaction between environmental and intellectual property policy on the rate of invention of environmental inventions and specifically tests for whether there is a synergy between stricter IP regimes and stronger environmental policies. The empirical analysis uses firm- and industry-level data from seven OECD countries from 2009 to 2015. We also introduce a new measure of environmental inventions using a Natural Language Processing topic modelling technique. We find that intellectual property policy strictness demonstrates greater effectiveness in encouraging inventiveness in environmental inventions when combined with stronger environmental policies. This study contributes to the existing literature in two ways. First, it devises a method for better identification of environmental technologies; we demonstrate that our method is more comprehensive than existing methods, as we are better able to identify not only environmental inventions but also major components of said inventions. Second, we test how various policy regimes affect the development of environmental technologies; we are the first study to examine the interaction of environmental and intellectual property policy on firm-level innovation.

Keywords: environmental economics, economics of innovation, environmental policy, firm level

Procedia PDF Downloads 144
2010 Empirical Study From Final Exams of Graduate Courses in Computer Science to Demystify the Notion of an Average Software Engineer and Offer a Direction to Address Diversity of Professional Backgrounds of a Student Body

Authors: Alex Elentukh

Abstract:

The paper is based on data collected from final exams administered during five years of teaching the graduate course in software engineering. The visualization instrument with four distinct personas has been used to improve the effectiveness of each class. The study offers a plethora of clues toward students' behavioral preferences. Diversity among students (professional background, physical proximity) is too significant to assume a single face of a learner. This is particularly true for a body of online graduate students in computer science. Conclusions of the study (each learner is unique, and each class is unique) are extrapolated to demystify the notion of an 'average software engineer.' An immediate direction for an educator is to ensure a course applies to a wide audience of very different individuals. On the other hand, a student should be clear about his/her abilities and preferences - to follow the most effective learning path.

Keywords: K.3.2 computer and information science education, learner profiling, adaptive learning, software engineering

Procedia PDF Downloads 87
2009 Analysis of Complex Business Negotiations: Contributions from Agency-Theory

Authors: Jan Van Uden

Abstract:

The paper reviews classical agency theory and its contributions to the analysis of complex business negotiations, and proposes an approach for modifying the basic agency model in order to examine the negotiation-specific dimensions of agency problems. By illustrating fundamental potentials for the modification of agency theory in the context of business negotiations, the paper highlights recent empirical research that investigates agent-based negotiations and inter-team constellations. A general theoretical analysis of complex negotiations would be based on a two-level approach: first, the modification of the basic agency model to reflect the organizational context of business negotiations (i.e., multi-agent issues, common agencies, multi-period models, and the concept of bounded rationality); second, the application of the modified agency model to complex business negotiations to identify agency problems and related areas of risk in the negotiation process. The paper is placed on the first level of analysis, the modification. The method builds on insights from behavior decision research (BRD) on the one hand and on findings from agency theory as normative directives for the modification of the basic model on the other. Through neoclassical assumptions concerning the fundamental aspects of agency relationships in business negotiations (i.e., asymmetric information, self-interest, risk preferences, and conflicts of interest), agency theory helps to derive solutions to stated worst-case scenarios taken from daily negotiation routine. As agency theory is the only universal approach able to identify trade-offs between certain aspects of economic cooperation, the insights obtained provide a deeper understanding of the forces that shape the complexity of business negotiations.
The need for a modification of the basic model is illustrated by highlighting selected issues of business negotiations from an agency-theory perspective. Negotiation teams require a multi-agent approach, given that decision-makers, as superior agents, are often part of the team. The diversity of competences and decision-making authority is a phenomenon that overrides the assumptions of classical agency theory and varies greatly across certain forms of business negotiations. Further, the basic model is bound to dyadic relationships preceded by the delegation of decision-making authority and builds on a contractually created (vertical) hierarchy. As a result, horizontal dynamics within the negotiation team, which play an important role in negotiation success, are not considered in the investigation of agency problems. Also, the trade-off between short-term relationships within the negotiation sphere and the long-term relationships of the corporate sphere calls for a multi-period perspective that takes into account the sphere-specific governance mechanisms already established (i.e., reward and monitoring systems). Within the analysis, the implementation of bounded rationality is closely related to findings from BRD in assessing the impact of negotiation behavior on the underlying principal-agent relationships. As empirical findings show, the disclosure and withholding of information to the agent affect his negotiation behavior as well as final negotiation outcomes. Last, in the context of business negotiations, asymmetric information is often intended by decision-makers acting as superior agents or principals, which calls for a bilateral risk approach to agency relations.

Keywords: business negotiations, agency-theory, negotiation analysis, interteam negotiations

Procedia PDF Downloads 128
2008 A Linear Autoregressive and Non-Linear Regime Switching Approach in Identifying the Structural Breaks Caused by Anti-Speculation Measures: The Case of Hong Kong

Authors: Mengna Hu

Abstract:

This paper examines the impact of an anti-speculation tax policy on trading activities and home price movements in the housing market in Hong Kong. The study focuses on the secondary residential property market, where transactions dominate. The policy intervention substantially raised the transaction cost for speculators as well as for genuine homeowners who dispose of their homes within a certain period. Through the demonstration of structural breaks, our empirical results show that the rise in transaction cost effectively reduced speculative trading activities. However, it accelerated price increases in the small-sized segment by strongly demotivating existing homeowners from trading up to better homes, causing congestion in the lower-end market, where demand from first-time buyers remains strong. Apart from that, by employing a regime switching approach, we further show that the unintended consequences are likely to be persistent under this policy together with other strengthened cooling measures.

Keywords: transaction costs, housing market, structural breaks, regime switching

Procedia PDF Downloads 251
2007 Information Technology and Professional Behavior: An Empirical Examination of Auditing and Accounting Tasks

Authors: Michael C. Nwaohia

Abstract:

Whereas anecdotal evidence supports the notion that an increase in information technology (IT) know-how may enhance the output of professionals in the accounting sector, this has not been systematically explored in the Nigerian context. Against this background, this paper examines the correlation between knowledgeability of IT and level of performance at everyday auditing and accounting tasks. It utilizes primary and secondary data from selected business organizations in Lagos, Nigeria. Accounting staff were administered structured questionnaires which, amongst other things, sought to examine their knowledge of and exposure to information technology prior to joining the firms, and their current level of performance based on self-reporting and supervisor comments. In addition, the relationship between exposure to on-the-job IT training and current level of performance was examined. The statistical analysis of the data was done using the SPSS package. The results strongly suggest that prior exposure to IT skills enabled accounting professionals to fit more flexibly into the dynamic environment in which contemporary business takes place. Ultimately, the paper attempts to explicate some of the implications of these findings for individuals and business firms.

Keywords: accounting, firms, information technology, professional behavior

Procedia PDF Downloads 222
2006 Walkability and Urban Social Identity

Authors: Reihaneh Rafiemanzelat

Abstract:

One of the most recent fields of investigation in urban studies focuses on walkability in urban spaces. The paper aims to establish the theoretical relationship between people's links with definite urban public spaces and the social identity processes derived from the relation with these places. The theoretical aspects examined for this purpose are the concept of walkability and its developments, and the social identity theories derived from walkable spaces. In fact, the paper presents the main results obtained from an empirical investigation concerning the genesis of urban social identity in a particular street, as one of the main elements of public space in cities. İsmet İnönü Blvd, known as Salamis Street, in Famagusta, North Cyprus, is one of the main streets in the city, with high levels of physical and social activity at all times. The urban social identity of users was analyzed, focusing on three main factors: walkability of the space, social identification, and image of the space. These three factors were analyzed in relation to a series of items in the initial questionnaire, an evaluation of existing natural resources, and environmental attitudes.

Keywords: walkability, urban public space, pedestrian, social activity, social identity

Procedia PDF Downloads 421
2005 Models Comparison for Solar Radiation

Authors: Djelloul Benatiallah

Abstract:

Due to current high consumption and recent industrial growth, supplies of fossil and nuclear energy sources such as oil, gas, and uranium are being depleted. Due to pollution and climate change, a swift switch to renewable energy sources is needed, and research on renewable energy is being done to meet energy needs. Solar energy is one of the renewable resources that could currently meet all of the world's energy needs. In most parts of the world, solar energy is a free and unlimited resource that can be used in a variety of ways, including photovoltaic systems for the generation of electricity and thermal systems for the generation of heat, for example for the production of hot water in the residential sector. In this article, we conduct a comparison. The first step entails identifying two empirical models that enable us to estimate the daily irradiations on a horizontal plane. We then compare them using data obtained from measurements made at the Adrar site over the four distinct seasons. According to a comparison of the results obtained by simulating the two models, Model 2 provides a better estimate of the global solar components, with an absolute mean error of less than 7%, a correlation coefficient of more than 0.95, a relative bias error coefficient of less than 6% in absolute value, and a relative RMSE of less than 10%.
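The comparison statistics quoted above (absolute mean error, correlation coefficient, relative bias error, relative RMSE) can be reproduced with a short script. The sketch below, with purely hypothetical irradiation values standing in for the Adrar measurements, shows one common way these indicators are computed:

```python
import math

def comparison_stats(measured, modeled):
    """Relative MAE, Pearson correlation, relative MBE and relative RMSE
    between measured and modeled daily irradiation series."""
    n = len(measured)
    mean_meas = sum(measured) / n
    errors = [p - m for p, m in zip(modeled, measured)]
    mae = sum(abs(e) for e in errors) / n
    mbe = sum(errors) / n
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    # Pearson correlation coefficient between the two series
    mean_mod = sum(modeled) / n
    cov = sum((m - mean_meas) * (p - mean_mod) for m, p in zip(measured, modeled))
    var_m = sum((m - mean_meas) ** 2 for m in measured)
    var_p = sum((p - mean_mod) ** 2 for p in modeled)
    r = cov / math.sqrt(var_m * var_p)
    return {
        "rMAE_%": 100 * mae / mean_meas,
        "r": r,
        "rMBE_%": 100 * mbe / mean_meas,
        "rRMSE_%": 100 * rmse / mean_meas,
    }

# Hypothetical measured vs. modeled daily global irradiation (Wh/m^2)
measured = [5200, 6100, 6800, 7200, 6500, 5900]
modeled  = [5000, 6300, 6600, 7400, 6400, 6100]
stats = comparison_stats(measured, modeled)
```

A model passing the thresholds the abstract describes would show r above 0.95 and a relative RMSE below 10% on such output.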

Keywords: solar radiation, renewable energy, fossil, photovoltaic systems

Procedia PDF Downloads 68
2004 Bible of Hospitality: Considering the Hotel Business through the Prism of the Evangelical Approach

Authors: Rimma Kiseleva

Abstract:

The hotel business has a long history. The basis of service in hospitality industry enterprises is the service, attitude, and consciousness of employees as hospitable “hosts of the house”. It is generally accepted that the founder and main expert of quality service is César Ritz, “the king of hoteliers and the hotelier of kings.” However, when deeply immersed in history, it turns out that the very first book about hospitality, the standardization of guest reception processes, and the basics of better service is nothing other than the Bible. This unique study, which considers the Church as a hotel and the hotel business itself as the most gracious work of Jesus Christ Himself, as confirmed by verses from the Gospel, employs the following approaches: analytical, comparative, and empirical. The study shows that it was Jesus Christ who laid down the rules of the most sacrificial service: real service to people, filled with brotherly love, humility, and love for strangers, the qualities that form the foundation, the “three pillars”, of the hospitality industry. It also shows that keeping an inn is a most charitable cause, one that is still relevant today.

Keywords: Augustine Aurelius, Bible, Gospel, guest house, hospitality, hotel, humility, inn, Jesus Christ, Joseph Fletcher, New Testament, Paul Tillich, service, strangeness

Procedia PDF Downloads 37
2003 The Vicissitudes of Monetary Policy Rates and Macro-Economic Variables in the West African Monetary Zone

Authors: Jonathan Olusegun Famoroti, Mathew Ekundayo Rotimi, Mishelle Doorasamy

Abstract:

This study offers an empirical investigation into selected macroeconomic drivers of the monetary policy rate in member countries of the West African Monetary Zone (WAMZ), considering both internal and external variables. We employed the Autoregressive Distributed Lag (ARDL) approach to investigate both the long-run and short-run relationships between monetary policy and selected macroeconomic variables. The results suggest that the drivers of the policy rate in this zone in the long run include, among others, the global oil price, exchange rate, inflation rate, and gross domestic product, while in the short run, the federal funds rate, trade openness, exchange rate, inflation rate, and gross domestic product are the core determinants of the policy rate. Therefore, in order to ensure long-run stability in the policy rate among the member states, these drivers should be given closer consideration so that a trajectory for an effective structure can be designed and integrated into economic structures and policy frameworks accordingly.
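The ARDL estimation itself was presumably carried out with dedicated econometric software; as a rough illustration of the idea, the sketch below fits a minimal ARDL(1,1) by ordinary least squares on simulated data (the series names and coefficients are hypothetical, not the study's):

```python
import numpy as np

def ardl_ols(y, x, p=1, q=1):
    """Estimate a minimal ARDL(p, q) model
        y_t = c + sum_i a_i * y_{t-i} + sum_j b_j * x_{t-j} + e_t
    by ordinary least squares. Returns [c, a_1..a_p, b_0..b_q]."""
    T, k = len(y), max(p, q)
    rows = []
    for t in range(k, T):
        row = [1.0]
        row += [y[t - i] for i in range(1, p + 1)]   # lagged dependent terms
        row += [x[t - j] for j in range(0, q + 1)]   # current and lagged regressor
        rows.append(row)
    X, Y = np.array(rows), np.array(y[k:])
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return beta

# Hypothetical series: a policy rate responding to inflation
rng = np.random.default_rng(0)
infl = rng.normal(5.0, 1.0, 120)
rate = np.zeros(120)
for t in range(1, 120):
    rate[t] = 0.5 + 0.6 * rate[t - 1] + 0.3 * infl[t] + rng.normal(0, 0.1)

beta = ardl_ols(rate, infl, p=1, q=1)  # recovers roughly [0.5, 0.6, 0.3, 0.0]
```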

Keywords: monetary policy rate, macroeconomic variables, WAMZ, ARDL

Procedia PDF Downloads 50
2002 Shaft Friction of Bored Pile Socketed in Weathered Limestone in Qatar

Authors: Thanawat Chuleekiat

Abstract:

Socketing of bored piles in rock is always seen as a matter of debate on construction sites between consultants and contractors. The socketing depth normally depends on the type of rock, the depth at which the rock is available below the pile cap, and the load-carrying capacity of the pile. In this paper, field load test data for drilled shafts socketed in weathered limestone, obtained from conventional static and dynamic pile load tests, were reviewed to evaluate a unit shaft friction for bored piles socketed in weathered limestone (weak rock). The borehole drilling data were also reviewed in conjunction with the pile test results. In addition, the back-calculated unit shaft friction was compared against various empirical methods for bored piles socketed in weak rock. The paper concludes with an estimated ultimate unit shaft friction from the case study in Qatar for preliminary design.

Keywords: piled foundation, weathered limestone, shaft friction, rock socket, pile load test

Procedia PDF Downloads 164
2001 Application of Seasonal Autoregressive Integrated Moving Average Model for Forecasting Monthly Flows in Waterval River, South Africa

Authors: Kassahun Birhanu Tadesse, Megersa Olumana Dinka

Abstract:

Reliable future river flow information is basic for the planning and management of any river system. For a data-scarce river system with only river flow records, like the Waterval River, univariate time series models are appropriate for river flow forecasting. In this study, a univariate Seasonal Autoregressive Integrated Moving Average (SARIMA) model was applied to forecast Waterval River flow using the GRETL statistical software. Mean monthly river flows from 1960 to 2016 were used for modeling. Different unit root tests and Mann-Kendall trend analysis were performed to test the stationarity of the observed flow time series. The time series was differenced to remove the seasonality. Using the correlogram of the seasonally differenced time series, different SARIMA models were identified, their parameters were estimated, and diagnostic checking of the model forecasts was performed using white noise and heteroscedasticity tests. Finally, based on the minimum Akaike information (AIC) and Hannan-Quinn (HQC) criteria, SARIMA(3,0,2)(3,1,3)12 was selected as the best model for Waterval River flow forecasting. Therefore, this model can be used to generate future river flow information for water resources development and management in the Waterval River system. SARIMA models can also be used for forecasting other similar univariate time series with seasonal characteristics.
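The seasonal differencing step mentioned above is easy to illustrate. The sketch below (using a synthetic monthly series, not the Waterval data) shows a first seasonal difference at lag 12 and its inversion, the operation SARIMA software performs internally when putting forecasts back on the original scale:

```python
import numpy as np

def seasonal_difference(x, s=12):
    """First seasonal difference: d_t = x_t - x_{t-s}."""
    x = np.asarray(x, dtype=float)
    return x[s:] - x[:-s]

def invert_seasonal_difference(d, x_head, s=12):
    """Rebuild the level series from seasonal differences d and the
    first s observed values x_head (needed as initial conditions)."""
    out = list(x_head)
    for t, dt in enumerate(d):
        out.append(out[t] + dt)   # x_{t+s} = x_t + d_t
    return np.array(out)

# Hypothetical monthly flows with an annual cycle plus a linear trend
t = np.arange(120)
flows = 50 + 0.1 * t + 10 * np.sin(2 * np.pi * t / 12)

d = seasonal_difference(flows, s=12)              # annual cycle removed
rebuilt = invert_seasonal_difference(d, flows[:12], s=12)
```

For this series the seasonal difference collapses to the constant trend increment per year, and the inversion recovers the original flows exactly.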

Keywords: heteroscedasticity, stationarity test, trend analysis, validation, white noise

Procedia PDF Downloads 196
2000 Trusting Smart Speakers: Analysing the Different Levels of Trust between Technologies

Authors: Alec Wells, Aminu Bello Usman, Justin McKeown

Abstract:

The growing usage of smart speakers raises many privacy and trust concerns compared to other technologies such as smartphones and computers. In this study, a proxy measure of trust is used to gauge users’ opinions of three different technologies based on an empirical study, and to understand which technology people are most likely to trust. The collected data were analysed using the Kruskal-Wallis H test to determine the statistical differences between users’ trust levels in the three technologies: smart speaker, computer, and smartphone. The findings of the study revealed that despite the wide acceptance, ease of use, and reputation of smart speakers, people find it difficult to trust smart speakers with their sensitive information via Direct Voice Input (DVI) and would prefer to use a keyboard or touchscreen offered by computers and smartphones. Findings from this study can inform future work on users’ trust in technology based on perceived ease of use, reputation, perceived credibility, and risk of using technologies via DVI.
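As an illustration of the statistical test used, the sketch below runs a Kruskal-Wallis H test on hypothetical Likert-style trust ratings for the three technologies (the numbers are invented for demonstration, not the study's data):

```python
from scipy.stats import kruskal

# Hypothetical 1-7 Likert trust ratings for the three technologies
smart_speaker = [2, 3, 2, 4, 3, 2, 3, 1, 2, 3]
computer      = [5, 6, 5, 4, 6, 5, 7, 5, 6, 5]
smartphone    = [4, 5, 4, 5, 6, 4, 5, 4, 5, 6]

# Non-parametric test: do the three samples come from the same distribution?
h_stat, p_value = kruskal(smart_speaker, computer, smartphone)

# A small p-value indicates that at least one technology's trust
# distribution differs from the others.
```

Kruskal-Wallis is appropriate here because ordinal Likert responses do not satisfy the normality assumption of a one-way ANOVA.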

Keywords: direct voice input, risk, security, technology, trust

Procedia PDF Downloads 177
1999 Fiscal Size and Composition Effects on Growth: Empirical Evidence from Asian Economies

Authors: Jeeban Amgain

Abstract:

This paper investigates the impact of the size and composition of government expenditure and tax on GDP per capita growth in 36 Asian economies over the period 1991-2012. The research employs panel regression techniques, Fixed Effects and the Generalized Method of Moments (GMM), as well as other statistical and descriptive approaches. The findings conclude that the size of government expenditure and tax revenue is generally low in this region. GDP per capita growth responds strongly negatively to government expenditure; however, no significant relationship can be measured for the size of taxation, although it is positively correlated with economic growth. Panel regression of the decomposed fiscal components also shows that the pattern of allocation of expenditure and taxation really matters for growth. Taxes on international trade and property have a significant positive impact on growth. In contrast, a major portion of expenditure, i.e., expenditure on general public services, health, and education, is found to have a significant negative impact on growth, implying that government expenditures are not being productive in the Asian region for some reason. A comparatively smaller and more efficient government would enhance growth.
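The fixed-effects estimator mentioned above can be sketched with the usual within transformation: demean each variable by country, then run pooled OLS on the demeaned data. The example below uses simulated panel data with invented coefficients, not the study's dataset:

```python
import numpy as np

def within_estimator(y, x, groups):
    """Fixed-effects (within) estimator: demean y and x by group,
    then run OLS on the demeaned data. Returns the slope vector."""
    y = np.asarray(y, float)
    x = np.asarray(x, float)
    groups = np.asarray(groups)
    yd, xd = y.copy(), x.copy()
    for g in np.unique(groups):
        m = groups == g
        yd[m] -= y[m].mean()          # remove the group (country) effect
        xd[m] -= x[m].mean(axis=0)
    beta, *_ = np.linalg.lstsq(xd, yd, rcond=None)
    return beta

# Hypothetical balanced panel: 36 countries x 22 years
rng = np.random.default_rng(1)
n_c, n_t = 36, 22
groups = np.repeat(np.arange(n_c), n_t)
alpha = rng.normal(0, 2, n_c)[groups]           # unobserved country effects
spend = rng.normal(20, 5, n_c * n_t)            # expenditure share of GDP
growth = alpha - 0.2 * spend + rng.normal(0, 0.5, n_c * n_t)

beta = within_estimator(growth, spend.reshape(-1, 1), groups)
```

Because the country effects are swept out by demeaning, the estimator recovers the true negative slope even though the effects are correlated with nothing observable.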

Keywords: government expenditure, tax, GDP per capita growth, composition

Procedia PDF Downloads 459
1998 The Influence of Air Temperature Controls in Estimation of Air Temperature over Homogeneous Terrain

Authors: Fariza Yunus, Jasmee Jaafar, Zamalia Mahmud, Nurul Nisa’ Khairul Azmi, Nursalleh K. Chang

Abstract:

Variation of air temperature from one place to another is caused by air temperature controls. In general, the most important control of air temperature is elevation. Another significant independent variable in estimating air temperature is the location of meteorological stations. Distance to the coastline and land use type also contribute to significant variations in air temperature. On the other hand, over homogeneous terrain, direct interpolation of discrete points of air temperature works well to estimate air temperature values in un-sampled areas. In this process, the estimation is based solely on discrete points of air temperature. However, this study shows that air temperature controls also play significant roles in estimating air temperature over the homogeneous terrain of Peninsular Malaysia. An Inverse Distance Weighting (IDW) interpolation technique was adopted to generate continuous air temperature data. This study compared two different datasets: observed mean monthly data of T, and the estimation error T–T’, where T’ is the value estimated from a multiple regression model. The multiple regression model considered eight independent variables (elevation, latitude, longitude, distance to coastline, and four land use types: water bodies, forest, agriculture, and built-up areas) to represent the role of air temperature controls. Cross-validation analysis was conducted to assess the accuracy of the estimated values. Final results show that estimation based on T–T’ produced lower errors for mean monthly air temperature over homogeneous terrain in Peninsular Malaysia.
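The IDW step can be sketched in a few lines. The example below estimates air temperature at an un-sampled point from surrounding station values (the station coordinates and temperatures are hypothetical):

```python
import math

def idw(sample_points, query, power=2):
    """Inverse Distance Weighting: estimate the value at `query`
    from (x, y, value) samples. A sample at zero distance is
    returned directly."""
    num = den = 0.0
    for x, y, v in sample_points:
        d = math.hypot(query[0] - x, query[1] - y)
        if d == 0.0:
            return v                 # query coincides with a station
        w = 1.0 / d ** power         # closer stations weigh more
        num += w * v
        den += w
    return num / den

# Hypothetical station temperatures: (x_km, y_km, deg C)
stations = [(0, 0, 27.0), (10, 0, 26.5), (0, 10, 26.0), (10, 10, 25.5)]
t_centre = idw(stations, (5, 5))     # equidistant from all four stations
```

At the centre point all weights are equal, so the estimate reduces to the simple mean of the four stations; IDW estimates are always bounded by the sample extremes.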

Keywords: air temperature control, interpolation analysis, peninsular Malaysia, regression model, air temperature

Procedia PDF Downloads 366
1997 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method

Authors: Angel G. De Leon Hernandez

Abstract:

A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles, and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic, and maintainable system, considering the constraints imposed in every case. With the advances in technology during the last decades, the capability of solving engineering problems has increased enormously. Nowadays computers play a critical role in structural analysis; unfortunately, for university students the vast majority of this software is inaccessible due to the high complexity and cost it represents, even when the software manufacturers offer student versions. This is exactly what motivated the idea of developing a more accessible and easy-to-use computing tool. The program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included to make the program even simpler to operate and to clarify the information requested and the results obtained. The present document includes the theoretical basics on which the program relies to solve the structural analysis, the logical path followed in developing the program, the theoretical results, a discussion of the results, and the validation of those results.
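Although the program described is written in Matlab®, the underlying stiffness matrix method is language-independent. The sketch below assembles and solves a minimal 2D truss in Python: a single bar fixed at one end with an axial load, whose exact axial displacement is u = FL/(EA):

```python
import numpy as np

def bar_stiffness(p_i, p_j, E, A):
    """Global 4x4 stiffness matrix of a 2D pin-jointed bar element."""
    dx, dy = p_j[0] - p_i[0], p_j[1] - p_i[1]
    L = (dx * dx + dy * dy) ** 0.5
    c, s = dx / L, dy / L                      # direction cosines
    T = np.array([[c * c, c * s], [c * s, s * s]])
    return (E * A / L) * np.block([[T, -T], [-T, T]])

def solve_truss(nodes, bars, E, A, loads, fixed_dofs):
    """Assemble the global K, apply loads, solve for free-DOF displacements."""
    ndof = 2 * len(nodes)
    K = np.zeros((ndof, ndof))
    for i, j in bars:
        ke = bar_stiffness(nodes[i], nodes[j], E, A)
        dofs = [2 * i, 2 * i + 1, 2 * j, 2 * j + 1]
        for a in range(4):
            for b in range(4):
                K[dofs[a], dofs[b]] += ke[a, b]
    F = np.zeros(ndof)
    for dof, val in loads.items():
        F[dof] = val
    free = [d for d in range(ndof) if d not in fixed_dofs]
    u = np.zeros(ndof)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
    return u

# Single horizontal bar, fixed at node 0, axial load at node 1
nodes = [(0.0, 0.0), (2.0, 0.0)]            # metres
E, A = 200e9, 1e-4                          # steel, 100 mm^2 section
u = solve_truss(nodes, [(0, 1)], E, A,
                loads={2: 10e3},            # 10 kN in +x at node 1
                fixed_dofs={0, 1, 3})       # pin node 0, restrain y at node 1
```

Here FL/(EA) = 10e3 × 2 / (200e9 × 1e-4) = 1 mm, which the solver reproduces; the same assembly loop scales to arbitrary truss topologies.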

Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming

Procedia PDF Downloads 110
1996 Supporting Young Emergent Multilingual Learners in Prekindergarten Classrooms: Policy Implications

Authors: Tiedan Huang, Chun Zhang, Caitlin Coe

Abstract:

This study investigated the quality of instructional support for young Emergent Multilingual Learners (EMLs) in 50 Universal Prekindergarten (UPK) classrooms in New York City (NYC). This is one of the first empirical studies examining instructional support for this student population. We collected data using a mixed method of structured observations of teacher-child interactions in 50 classrooms, along with surveys of and interviews with program leaders and the teaching teams. We found that NYC’s UPK classrooms offered warm and supportive environments for EMLs. Nevertheless, in general, instructional support was relatively low. This study identified large mindset, knowledge, and practice gaps (and a real opportunity) among NYC early childhood professionals, specifically in the areas of providing adequate instructional and linguistic support for EMLs and partnering with families to capture their cultural and home literacy assets. Consistent, rigorous, and meaningful use of data is necessary to support both EMLs’ language and literacy development and teachers’/leaders’ professional development.

Keywords: high quality instruction, culturally and linguistically responsive practices, professional development, workforce development

Procedia PDF Downloads 69
1995 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in the computer-aided diagnosis of liver diseases. Precise liver segmentation in abdominal CT images is one of the most important steps in the computer-aided diagnosis of liver pathology. In this paper, a semi-automated method is presented for liver and liver lesion segmentation using mathematical morphology. Our algorithm proceeds in two parts. In the first, we seek to determine the region of interest by applying morphological filters to extract the liver. The second step consists of detecting the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions. Our proposed method is based on anatomical information and mathematical morphology tools used in the image processing field. First, we try to improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. The second step consists of calculating the internal and external markers of the liver and hepatic lesions. Thereafter, we proceed to the segmentation of the liver and hepatic lesions by the watershed transform controlled by markers. The validation of the developed algorithm was done using several images. The obtained results show the good performance of our proposed algorithm.
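The pipeline described (morphological filtering, marker computation, marker-controlled watershed) can be sketched with SciPy's morphology tools on a synthetic image. The shapes and marker positions below are invented for illustration; real CT data would of course require the full preprocessing the paper describes:

```python
import numpy as np
from scipy import ndimage

# Synthetic slice: two bright "tissue" regions on a dark background
img = np.zeros((20, 40), dtype=np.uint8)
img[4:16, 3:17] = 120       # first region (stand-in for liver)
img[4:16, 23:37] = 200      # second, brighter region

# Morphological gradient (dilation - erosion) highlights region boundaries
grad = (ndimage.grey_dilation(img, size=(3, 3))
        - ndimage.grey_erosion(img, size=(3, 3)))

# Internal markers inside each region, external marker in the background
markers = np.zeros(img.shape, dtype=np.int16)
markers[10, 10] = 1         # inside the first region
markers[10, 30] = 2         # inside the second region
markers[0, 0] = 3           # background

# Watershed transform controlled by markers, flooding the gradient image
labels = ndimage.watershed_ift(grad.astype(np.uint8), markers)
```

The watershed lines settle on the gradient ridges, so each marker claims exactly its enclosing region; in the paper's method the markers are computed morphologically rather than placed by hand.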

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, Liver segmentation, morphological filter, the watershed algorithm

Procedia PDF Downloads 440
1994 A Linear Regression Model for Estimating Anxiety Index Using Wide Area Frontal Lobe Brain Blood Volume

Authors: Takashi Kaburagi, Masashi Takenaka, Yosuke Kurihara, Takashi Matsumoto

Abstract:

Major depressive disorder (MDD) is one of the most common mental illnesses today. It is believed to be caused by a combination of several factors, including stress. Stress can be quantitatively evaluated using the State-Trait Anxiety Inventory (STAI), one of the best indices for evaluating anxiety. Although STAI scores are widely used in applications ranging from clinical diagnosis to basic research, the scores are calculated from a self-reported questionnaire. An objective evaluation is required because the subject may intentionally change his/her answers if multiple tests are carried out. In this article, we present a modified index called the “multi-channel Laterality Index at Rest (mc-LIR)”, computed by recording brain activity from a wider area of the frontal lobe using multi-channel functional near-infrared spectroscopy (fNIRS). The presented index measures multiple positions near the Fpz position defined by the international 10-20 system. Using 24 subjects, the dependencies of the mc-LIR on the number of measuring points used to calculate it, and its correlation coefficients with the STAI scores, are reported. Furthermore, a simple linear regression was performed to estimate the STAI scores from the mc-LIR. The cross-validation error is also reported. The experimental results show that using multiple positions near the Fpz improves the correlation coefficients and the estimation compared with using only two positions.
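The estimation step, a simple linear regression of STAI scores on the mc-LIR together with a cross-validation error, can be sketched as below; the mc-LIR values and STAI scores are simulated, since the subject data are not public:

```python
import numpy as np

def fit_line(x, y):
    """Least-squares fit y ~ a*x + b; returns (a, b)."""
    a, b = np.polyfit(x, y, 1)
    return a, b

def loocv_rmse(x, y):
    """Leave-one-out cross-validation RMSE of the linear model."""
    errs = []
    for i in range(len(x)):
        mask = np.arange(len(x)) != i        # hold out subject i
        a, b = np.polyfit(x[mask], y[mask], 1)
        errs.append((a * x[i] + b) - y[i])   # prediction error on subject i
    return float(np.sqrt(np.mean(np.square(errs))))

# Simulated mc-LIR values and STAI scores for 24 subjects
rng = np.random.default_rng(42)
mc_lir = rng.uniform(-1, 1, 24)
stai = 45 + 15 * mc_lir + rng.normal(0, 3, 24)

a, b = fit_line(mc_lir, stai)
cv_rmse = loocv_rmse(mc_lir, stai)
r = np.corrcoef(mc_lir, stai)[0, 1]          # correlation with STAI
```

Leave-one-out cross-validation is the natural choice at n = 24, since it uses 23 subjects for every fit while still scoring each subject out-of-sample.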

Keywords: frontal lobe, functional near-infrared spectroscopy, state-trait anxiety inventory score, stress

Procedia PDF Downloads 241
1993 A Mixed-Integer Nonlinear Program to Optimally Pace and Fuel Ultramarathons

Authors: Kristopher A. Pruitt, Justin M. Hill

Abstract:

The purpose of this research is to determine the pacing and nutrition strategies which minimize completion time and carbohydrate intake for athletes competing in ultramarathon races. The model formulation consists of a two-phase optimization. The first-phase mixed-integer nonlinear program (MINLP) determines the minimum completion time subject to the altitude, terrain, and distance of the race, as well as the mass and cardiovascular fitness of the athlete. The second-phase MINLP determines the minimum total carbohydrate intake required for the athlete to achieve the completion time prescribed by the first phase, subject to the flow of carbohydrates through the stomach, liver, and muscles. Consequently, the second-phase model provides the optimal pacing and nutrition strategies for a particular athlete for each kilometer of a particular race. Validation of the model results over a wide range of athlete parameters against completion times for real competitive events suggests strong agreement. Additionally, the kilometer-by-kilometer pacing and nutrition strategies the model prescribes for a particular athlete suggest that unconventional approaches could result in lower completion times. Thus, the MINLP provides prescriptive guidance that athletes can leverage when developing pacing and nutrition strategies prior to competing in ultramarathon races. Given the highly variable topographical characteristics common to many ultramarathon courses and the potential inexperience of many athletes with such courses, the model provides valuable insight to competitors who might otherwise fail to complete the event due to exhaustion or carbohydrate depletion.

Keywords: nutrition, optimization, pacing, ultramarathons

Procedia PDF Downloads 176
1992 LEED Empirical Evidence in Northern and Southern Europe

Authors: Svetlana Pushkar

Abstract:

The Leadership in Energy and Environmental Design (LEED) green building rating system is recognized in Europe. LEED uses regional priority (RP) points that are adapted to different environmental conditions. However, the appropriateness of the RP points remains a controversial question. To clarify this issue, two different parts of Europe were considered: northern Europe (Finland and Sweden) and southern Europe (Turkey and Spain). Similarities and differences in the performance of LEED 2009 for new construction (LEED-NC 2009) in these four countries were analyzed. It was found that LEED-NC 2009 performance in the northern and southern parts of Europe was similar in terms of Sustainable Sites (SS), Water Efficiency (WE), Materials and Resources (MR), and Indoor Environmental Quality (EQ), whereas performance in Energy and Atmosphere (EA) differed. WE and SS revealed high performance (70-100%); EA and EQ demonstrated intermediate performance (40-60%); and MR displayed low performance (20-40%). To improve LEED adaptation in Europe, the following new RP points are recommended: water-related points for Turkey, and green-power-related points for all four observed countries.
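The performance bands reported above can be expressed as a trivial classifier; the category percentages below are illustrative placeholders rather than the study's data:

```python
# Bands as reported in the abstract: high 70-100%, intermediate 40-60%,
# low 20-40% (values in the 60-70% gap are treated here as intermediate).
def band(pct):
    if pct >= 70:
        return "high"
    if pct >= 40:
        return "intermediate"
    return "low"

scores = {"SS": 85, "WE": 90, "EA": 50, "EQ": 55, "MR": 30}  # hypothetical
for category, pct in scores.items():
    print(category, band(pct))
```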

Keywords: green building, Europe, LEED, leadership in energy and environmental design, regional priority points

Procedia PDF Downloads 242
1991 Geosynthetic Reinforced Unpaved Road: Literature Study and Design Example

Authors: D. Jayalakshmi, S. S. Bhosale

Abstract:

This paper, in its first part, presents the state-of-the-art literature on design approaches for geosynthetic-reinforced unpaved roads. The literature since 1970 is reviewed, together with the critical appraisals of flexible pavement design by Giroud and Han (2004) and Fannin (2006). A design example is then illustrated for Indian conditions. The example compares the results computed by Giroud and Han's (2004) design method with the Indian Roads Congress guidelines IRC SP 72-2015. The input data relate to the subgrade soil conditions of Maharashtra State in India. The unified soil classification of the subgrade soil is inorganic clay of high plasticity (CH), which is expansive, with a California bearing ratio (CBR) of 2% to 3%. The example covers the unreinforced case and geotextile reinforcement, varying the rut depth from 25 mm to 100 mm. The present results reveal that the base thickness for the unreinforced case from the IRC design catalogs is in good agreement with the Giroud and Han (2004) approach for rut depths in the range of 75 mm to 100 mm. Since the Giroud and Han (2004) method is applicable to both reinforced and unreinforced cases, the base thickness for the reinforced case was derived for the Indian condition using the same data, an appropriate Nc factor, and the same rut depth. From this trial, for a CBR of 2%, the base thickness reduction due to geotextile inclusion is 35%. For the CBR range of 2% to 5%, with geosynthetics of different stiffnesses, the reduction in base course thickness will be evaluated, and validation will be carried out on the full-scale accelerated pavement testing setup at the College of Engineering Pune (COE), India.
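Thickness-design equations of the Giroud and Han (2004) type contain the required thickness on both sides and are therefore solved iteratively. The sketch below shows only that solution pattern with an invented right-hand side (substitute the published equation and calibrated coefficients), followed by the 35% reduction arithmetic quoted above:

```python
def solve_thickness(design_rhs, h0=0.3, tol=1e-8, max_iter=200):
    """Fixed-point iteration h <- f(h) for a thickness equation."""
    h = h0
    for _ in range(max_iter):
        h_new = design_rhs(h)
        if abs(h_new - h) < tol:
            return h_new
        h = h_new
    raise RuntimeError("did not converge")

# Hypothetical right-hand side with thickness feedback (NOT the Giroud-Han
# equation; it only mimics the h-on-both-sides structure).
h = solve_thickness(lambda h: 0.25 + 0.1 / (1.0 + h))

# Base-thickness reduction from geotextile inclusion (abstract: 35% at CBR 2%).
h_unreinforced, h_reinforced = 0.40, 0.26   # metres, illustrative
reduction = 100 * (1 - h_reinforced / h_unreinforced)
print(f"converged h = {h:.4f} m, reduction = {reduction:.0f}%")
```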

Keywords: base thickness, design approach, equation, full scale accelerated pavement set up, Indian condition

Procedia PDF Downloads 180
1990 Parameters Identification of Granular Soils around PMT Test by Inverse Analysis

Authors: Younes Abed

Abstract:

The successful application of in-situ soil testing heavily depends on the development of interpretation methods for the tests. The pressuremeter test simulates the expansion of a cylindrical cavity, and because it has well-defined boundary conditions, it is more amenable to rigorous theoretical analysis (i.e., cavity expansion theory) than most other in-situ tests. In this article, and in order to make the identification process more convenient, we propose a relatively simple procedure for the numerical identification of certain mechanical parameters of a granular soil, in particular the elastic modulus and the friction angle, from a pressuremeter curve. The procedure, applied here to identify the parameters of the generalized Prager model associated with the Drucker-Prager criterion from a pressuremeter curve, is based on an inverse analysis approach, which consists of minimizing a function representing the difference between the experimental curve and the curve obtained by integrating the model along the loading path of the in-situ test. The numerical process implemented here is based on an established finite element program. We present a validation of the proposed approach against a database of cylindrical cavity expansion tests. This database consists of four types of tests: thick-cylinder tests carried out on Hostun RF sand, pressuremeter tests carried out on Hostun sand, in-situ pressuremeter tests carried out at the site of Fos with a marine self-boring pressuremeter, and in-situ pressuremeter tests carried out at the site of Labenne with a Ménard pressuremeter.
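The inverse-analysis step (minimize the misfit between the experimental expansion curve and the model curve) can be sketched with a toy two-parameter model standing in for the finite element computation; the pressure-strain relation, parameter values, and perturbation below are all invented for illustration:

```python
import numpy as np
from scipy.optimize import minimize

strains = np.linspace(0.0, 0.1, 20)

def model_curve(params, eps):
    E, phi = params                       # stand-ins for modulus / friction
    return E * eps / (1.0 + phi * eps)    # hypothetical pressure-strain law

# Pseudo-experimental curve: known parameters plus a small perturbation.
p_exp = model_curve((50.0, 8.0), strains) + 0.01 * np.sin(40 * strains)

def misfit(params):
    return float(np.sum((model_curve(params, strains) - p_exp) ** 2))

res = minimize(misfit, x0=(20.0, 2.0), method="Nelder-Mead")
print("identified parameters:", np.round(res.x, 2))
```

In the actual procedure, each evaluation of `model_curve` would be a finite element integration of the generalized Prager model along the loading path, not a closed-form expression.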

Keywords: granular soils, cavity expansion, pressuremeter test, finite element method, identification procedure

Procedia PDF Downloads 283
1989 Unsteady Flow Simulations for Microchannel Design and Its Fabrication for Nanoparticle Synthesis

Authors: Mrinalini Amritkar, Disha Patil, Swapna Kulkarni, Sukratu Barve, Suresh Gosavi

Abstract:

Micro-mixers play an important role in lab-on-a-chip applications and micro total analysis systems, where the correct level of mixing must be achieved for any given process. The mixing process can be classified as active or passive according to the use of external energy. The microfluidics literature reports that most work has been done on models of steady laminar flow; the study of unsteady laminar flow, however, is an active area of research at present. Among its wide applications, we consider nanoparticle synthesis in micro-mixers. In this work, we have developed an unsteady-flow model to study the mixing performance of a passive micro-mixer for the reactants used in such synthesis. The model is developed in the finite volume method (FVM)-based software OpenFOAM and is tested by carrying out simulations at a Reynolds number (Re) of 0.5. Mixing performance of the micro-mixer is investigated using simulated concentration values of the mixed species across the width of the micro-mixer and calculating the variance across a line profile. Experimental validation is done by passing dyes through a Y-shaped micro-mixer fabricated from polydimethylsiloxane (PDMS) polymer and comparing the variances with the simulated ones. Gold nanoparticles are later synthesized in the micro-mixer and collected at two different times, yielding significantly different size distributions. These times match the time scales over which the reactant concentrations vary, as obtained from the simulations. Our simulations could thus be used to create design aids for passive micro-mixers used in nanoparticle synthesis.
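The variance-based mixing score described above can be sketched for a normalized concentration profile; the profile values and the scaling convention below are illustrative:

```python
import numpy as np

# Concentrations normalized to [0, 1] across the outlet width; a perfectly
# mixed profile is uniform at 0.5 and has zero variance.
unmixed = np.array([1.0, 1.0, 1.0, 0.0, 0.0, 0.0])       # two streams side by side
mixed   = np.array([0.52, 0.49, 0.50, 0.51, 0.48, 0.50])  # near-uniform outlet

def mixing_index(profile):
    """1 - sigma/sigma_max: 0 = fully segregated, 1 = fully mixed."""
    sigma_max = 0.5   # std dev of a fully segregated 0/1 profile
    return 1.0 - float(np.std(profile)) / sigma_max

print(f"inlet:  {mixing_index(unmixed):.2f}")
print(f"outlet: {mixing_index(mixed):.2f}")
```

In the study, the profile values would come from the simulated (or dye-imaged) concentration field sampled along a line across the micro-mixer width.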

Keywords: Lab-on-chip, LOC, micro-mixer, OpenFOAM, PDMS

Procedia PDF Downloads 148
1988 Instructional Consequences of the Transiency of Spoken Words

Authors: Slava Kalyuga, Sujanya Sombatteera

Abstract:

In multimedia learning, written text is often transformed into spoken (narrated) text. This transient information may overwhelm the limited processing capacity of working memory and inhibit learning instead of improving it. The paper reviews recent empirical studies of the modality and verbal redundancy effects within a cognitive load framework and outlines the conditions under which negative effects of transiency may occur. According to the modality effect, textual information accompanying pictures should be presented in auditory rather than visual form in order to engage both available channels of working memory (auditory and visual) instead of only one of them. However, some studies failed to replicate the modality effect and found differences opposite to those expected. Also, according to the multimedia redundancy effect, the same information should not be presented simultaneously in different modalities, to avoid the unnecessary cognitive load imposed by integrating redundant sources of information. However, a few studies failed to replicate the multimedia redundancy effect as well. Transiency of information is used to explain these controversial results.

Keywords: cognitive load, transient information, modality effect, verbal redundancy effect

Procedia PDF Downloads 372
1987 Rules in Policy Integration, Case Study: Victoria Catchment Management

Authors: Ratri Werdiningtyas, Yongping Wei, Andrew Western

Abstract:

This paper contributes to ongoing attempts at bringing together land, water, and environmental policy in catchment management. A tension remains in defining the boundaries of policy integration. Much of Integrated Water Resource Management is valued only as rhetorical policy; it is far from being achieved on the ground because the socio-ecological system has not been understood and developed into a complete and coherent problem representation. To clarify what integration entails, this article draws on institutional fit for public policy integration and uses these insights in an empirical setting to identify mechanisms that can facilitate effective policy integration for catchment management. This research is based on the journey of Victoria's government from 1890 to 2016. A total of 274 Victorian Acts related to land, water, and environmental management published in this period were investigated. Four conditions of integration were identified in their co-evolution: (1) integration policy based on reserves, (2) integration policy based on authority interest, (3) policy based on integrated information, and (4) policy based on coordinated resources, authority, and information. The results suggest that, in the case of catchment management, coordination among policy instruments is superior to policy integration.

Keywords: catchment management, co-evolution, policy integration, phase

Procedia PDF Downloads 236
1986 Political Economy of Development Induced Re-Territorialization: A South African Uppercut

Authors: K. Lekshmi

Abstract:

Land became a predominant constituent of the transitional justice paradigm in South Africa following apartheid-inspired land grabs and conflict-induced forceful evictions, which effected land encroachment, expropriation, and alienation. In this context, the post-apartheid regime initiated land reconciliation measures intended to overcome politically appropriated historical injustices in conjunction with reconstructing transitional justice. As land grabs were one of the quintessential repercussions of ethnic cleansing in South Africa, it is pertinent to study how land reconciliation becomes necessary to impart transitional justice to the victims. The study also examines the nature of the developmental pattern after the re-territorialization process in a post-conflict country such as South Africa, and how the re-territorialization process shaped the functional distribution of income, and income inequality in particular. Further, the paper attempts to study how far land distribution and equal access, as part of the land reconciliation process, aligned with the principle of restitution. The research methodology applied is empirical, followed by analytical research.

Keywords: development, land reconciliation, transitional justice, income inequality and displacement, re-territorialization

Procedia PDF Downloads 190
1985 Convergence Analysis of Training Two-Hidden-Layer Partially Over-Parameterized ReLU Networks via Gradient Descent

Authors: Zhifeng Kong

Abstract:

Over-parameterized neural networks have attracted a great deal of attention in recent deep learning theory research, as they challenge the classic perspective that models with excessive parameters over-fit, and they have achieved empirical success in various settings. While a number of theoretical works have been presented to demystify the properties of such models, their convergence properties are still far from thoroughly understood. In this work, we study the convergence properties of training two-hidden-layer, partially over-parameterized, fully connected networks with the Rectified Linear Unit (ReLU) activation via gradient descent. To our knowledge, this is the first theoretical work to establish convergence properties of deep over-parameterized networks without the equally-wide-hidden-layer assumption and other unrealistic assumptions. We provide a probabilistic lower bound on the widths of the hidden layers and prove a linear convergence rate for gradient descent. We also conduct experiments on synthetic and real-world datasets to validate the theory.
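A minimal numerical companion to this setting (plain gradient descent on a two-hidden-layer ReLU network whose widths far exceed the number of training points) can be sketched as below; the widths, learning rate, and data are arbitrary choices for the sketch, not the paper's experimental configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy regression task: 8 points, hidden widths 64 >> 8 (over-parameterized).
X = rng.normal(size=(8, 2))
y = np.sin(X[:, :1]) + X[:, 1:]

m1, m2 = 64, 64
W1 = rng.normal(size=(2, m1)) / np.sqrt(2)
W2 = rng.normal(size=(m1, m2)) / np.sqrt(m1)
W3 = rng.normal(size=(m2, 1)) / np.sqrt(m2)

relu = lambda z: np.maximum(z, 0.0)
mse = lambda: float(np.mean((relu(relu(X @ W1) @ W2) @ W3 - y) ** 2))

loss0 = mse()
lr = 0.01
for _ in range(1000):
    H1 = relu(X @ W1)
    H2 = relu(H1 @ W2)
    err = (H2 @ W3 - y) / len(X)          # gradient of 0.5 * mean squared error
    d2 = (err @ W3.T) * (H2 > 0)          # backprop through second ReLU layer
    d1 = (d2 @ W2.T) * (H1 > 0)           # backprop through first ReLU layer
    W3 -= lr * (H2.T @ err)
    W2 -= lr * (H1.T @ d2)
    W1 -= lr * (X.T @ d1)

print(f"training MSE: {loss0:.4f} -> {mse():.4f}")
```

The theory concerns when such training provably drives the loss down at a linear rate; the sketch only demonstrates the training loop itself.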

Keywords: over-parameterization, rectified linear units ReLU, convergence, gradient descent, neural networks

Procedia PDF Downloads 132
1984 Large Eddy Simulation of Hydrogen Deflagration in Open Space and Vented Enclosure

Authors: T. Nozu, K. Hibi, T. Nishiie

Abstract:

This paper discusses the applicability of a numerical model as a damage prediction method for accidental hydrogen explosions occurring in a hydrogen facility. The numerical model was based on the unstructured finite volume method (FVM) code “NuFD/FrontFlowRed”. To simulate the unsteady turbulent combustion of leaked hydrogen gas, Large Eddy Simulation (LES) was combined with a combustion model. The combustion model was based on a two-scalar flamelet approach, in which a G-equation model and a conserved scalar model express the propagation of the premixed flame surface and the diffusion combustion process, respectively. To validate this numerical model, we simulated two previous types of hydrogen explosion tests. One is an open-space explosion test, in which the source was a prismatic 5.27 m³ volume containing a 30% hydrogen-air mixture. A reinforced concrete wall was set 4 m from the front surface of the source, which was ignited at the bottom center by a spark. The other is a vented-enclosure explosion test, in which the chamber was 4.6 m × 4.6 m × 3.0 m with a vent opening of 5.4 m² on one side. The test was performed with ignition at the center of the wall opposite the vent, using hydrogen-air mixtures with hydrogen concentrations close to 18 vol%. The results of the numerical simulations were compared with the previous experimental data to assess the accuracy of the numerical model, and we verified that the simulated overpressures and flame time-of-arrival data were in good agreement with the results of the two explosion tests.
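The G-equation component mentioned above can be illustrated in one dimension: a level-set field G is advected by the flow and propagates at the laminar flame speed, with the flame front at G = 0. The grid, speeds, and timing below are invented for the sketch, not taken from the simulations:

```python
import numpy as np

nx, dx, dt = 200, 0.01, 0.002
x = np.arange(nx) * dx
G = x - 0.5                    # signed distance; front starts at x = 0.5 m
u, S_L = 0.2, 0.4              # flow velocity and flame speed (illustrative)

# dG/dt + u dG/dx = -S_L |dG/dx|: the front advances at u + S_L here.
for _ in range(250):
    dGdx = np.gradient(G, dx)
    G = G - dt * (u * dGdx + S_L * np.abs(dGdx))

front = float(x[np.argmin(np.abs(G))])
print(f"front position after {250 * dt:.1f} s: {front:.2f} m")
```

In the paper's model this level-set transport is solved on the LES-filtered 3-D flow field and coupled to a conserved-scalar model for diffusion combustion.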

Keywords: deflagration, large eddy simulation, turbulent combustion, vented enclosure

Procedia PDF Downloads 234
1983 Development of Methods for Plastic Injection Mold Weight Reduction

Authors: Bita Mohajernia, R. J. Urbanic

Abstract:

Mold making techniques have focused on meeting customers' functional and process requirements; however, today, molds are increasing in size and sophistication and are difficult to manufacture, transport, and set up due to their size and mass. Present mold weight-saving techniques focus on pockets to reduce the mass of the mold, but the overall size remains large, which introduces costs related to stock material purchase, processing time for process planning, machining and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use finite element analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing weight, using the OptiStruct software. The optimized CAD model is compared with the primary geometry of the mold from the NX software. The results show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.
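The density-update loop at the heart of topology optimization can be caricatured without the finite element solve; the per-element strain energies below are a fixed random field rather than the output of a structural analysis, and the update rule is a simplified stand-in for what a solver such as OptiStruct actually does:

```python
import numpy as np

rng = np.random.default_rng(2)
energy = rng.random(100)        # hypothetical per-element strain energy
target_volume = 0.6             # keep 60% of the material

rho = np.ones(100)              # element densities in [0.01, 1]
for _ in range(50):
    # Shift density toward elements carrying the most strain energy...
    rho = rho * np.sqrt(energy / energy.mean())
    # ...then rescale onto the volume budget and clamp to valid densities.
    rho *= target_volume * rho.size / rho.sum()
    rho = np.clip(rho, 0.01, 1.0)

print(f"volume fraction: {rho.mean():.2f}, solid elements: {(rho > 0.99).sum()}")
```

In a real run, `energy` is recomputed from a finite element analysis at every iteration, so material migrates toward load paths while the volume constraint holds.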

Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction

Procedia PDF Downloads 280