Search results for: topic modelling
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3085

505 A First Step towards Automatic Evolutionary for Gas Lifts Allocation Optimization

Authors: Younis Elhaddad, Alfonso Ortega

Abstract:

Oil production by means of gas lift is a standard technique in the oil production industry. Optimizing the total amount of oil produced in terms of the amount of gas injected is a key question in this domain. Different methods have been tested to propose a general methodology. Many of them apply well-known numerical methods; some have exploited the power of evolutionary approaches. Our goal is to provide domain experts with a powerful automatic search engine into which they can introduce their knowledge in a format close to the one used in their domain, and obtain solutions comprehensible in the same terms. These proposals introduce into the genetic engine the most expressive formal models to represent the solutions to the problem. Such algorithms have proven to be as effective as other genetic systems but more flexible and convenient for the researcher, although they usually require huge search spaces to justify their use, due to the computational resources demanded by the formal models. The first step in evaluating the viability of applying our approaches to this realm is to fully understand the domain and to select an instance of the problem (gas lift optimization) for which genetic approaches seem promising. After analyzing the state of the art of this topic, we chose a previous work from the literature that tackles the problem by means of numerical methods. That contribution includes enough detail to be reproduced and complete data to be carefully analyzed. We designed a classical, simple genetic algorithm to try to reproduce the same results and to understand the problem in depth. We could easily incorporate the well model and well data used by the authors, and translate their mathematical model, originally posed for numerical optimization, into a proper fitness function.
We analyzed the 100 well curves used in their experiment and observed similar results; in addition, our system automatically inferred an optimum total amount of injected gas for the field compatible with the sum of the optimum gas injected in each well reported by them. We identified several constraints that would be interesting to incorporate into the optimization process but that may be difficult to express numerically. It would also be interesting to automatically propose other mathematical models to fit both the individual well curves and the behaviour of the complete field. All these facts and conclusions justify continuing to explore the viability of the more sophisticated approaches previously proposed by our research group.
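The classical genetic algorithm described above can be sketched as follows. This is a minimal illustration only: the well performance curves, gas budget, penalty weight, and GA settings are hypothetical placeholders, not the well data or model of the cited study.

```python
import random

# Hypothetical well performance curves: oil rate as a concave function of
# injected gas. Coefficients (a, b) are illustrative, not from the study.
WELLS = [(0.9, 0.020), (0.7, 0.012), (1.1, 0.025), (0.8, 0.015)]
GAS_BUDGET = 100.0  # total gas available for injection (arbitrary units)

def oil_rate(gas, a, b):
    """Concave response: production rises with gas, then saturates."""
    return a * gas - b * gas * gas

def fitness(alloc):
    """Total field oil; allocations exceeding the gas budget are penalised."""
    penalty = max(0.0, sum(alloc) - GAS_BUDGET) * 10.0
    return sum(oil_rate(g, a, b) for g, (a, b) in zip(alloc, WELLS)) - penalty

def evolve(pop_size=60, generations=200, sigma=2.0, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(0, GAS_BUDGET / len(WELLS)) for _ in WELLS]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            p, q = rng.sample(parents, 2)
            cut = rng.randrange(1, len(WELLS))  # one-point crossover
            child = p[:cut] + q[cut:]
            i = rng.randrange(len(WELLS))       # Gaussian mutation of one gene
            child[i] = max(0.0, child[i] + rng.gauss(0, sigma))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In this sketch the penalty term plays the role of the field-level gas constraint, so the GA converges toward per-well injections whose total is compatible with the budget.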

Keywords: evolutionary automatic programming, gas lift, genetic algorithms, oil production

Procedia PDF Downloads 147
504 Drug Design Modelling and Molecular Virtual Simulation of an Optimized BSA-Based Nanoparticle Formulation Loaded with Di-Berberine Sulfate Acid Salt

Authors: Eman M. Sarhan, Doaa A. Ghareeb, Gabriella Ortore, Amr A. Amara, Mohamed M. El-Sayed

Abstract:

Drug salting and nanoparticle-based drug delivery formulations are considered an effective means of achieving nano-scale dispersion of hydrophobic drugs in aqueous media, thus circumventing the pitfalls of their poor solubility and enhancing their membrane permeability. The current study aims to increase the bioavailability of quaternary ammonium berberine through acid salting and a biodegradable bovine serum albumin (BSA)-based nanoparticulate drug formulation. Berberine hydroxide (BBR-OH), chemically synthesized by alkalization of the commercially available berberine hydrochloride (BBR-HCl), was then acidified to obtain di-berberine sulfate (BBR)₂SO₄. The purified crystals were spectrally characterized. The desolvation technique was optimized for the preparation of size-controlled BSA-BBR-HCl, BSA-BBR-OH, and BSA-(BBR)₂SO₄ nanoparticles. Particle size, zeta potential, drug release, encapsulation efficiency, Fourier transform infrared spectroscopy (FTIR), tandem MS-MS spectroscopy, energy-dispersive X-ray spectroscopy (EDX), scanning and transmission electron microscopy (SEM, TEM), in vitro bioactivity, and in silico drug-polymer interaction were determined. The BSA (PDB ID: 4OR0) protonation state at different pH values was predicted using Amber12 molecular dynamics simulation. Blind docking was then performed using the Lamarckian genetic algorithm (LGA) in the AutoDock4.2 software. Results confirmed the purity and the size-controlled synthesis of berberine-BSA nanoparticles. The possible binding poses and the hydrophobic and hydrophilic interactions of berberine on BSA at different pH values were predicted. The antioxidant, anti-hemolytic, and cell differentiation abilities of the tested drugs and their nano-formulations were evaluated. Thus, through drug salting and a well-optimized desolvation technique, potentially effective albumin-berberine nanoparticle formulations can be developed that exhibit better in vitro cellular bioavailability.

Keywords: berberine, BSA, BBR-OH, BBR-HCl, BSA-BBR-HCl, BSA-BBR-OH, (BBR)₂SO₄, BSA-(BBR)₂SO₄, FTIR, AutoDock4.2 Software, Lamarckian genetic algorithm, SEM, TEM, EDX

Procedia PDF Downloads 153
503 The Impact of Floods and Typhoons on Housing Welfare: Case Study of Thua Thien Hue Province, Vietnam

Authors: Seyeon Lee, Suyeon Lee, Julia Rogers

Abstract:

This research investigates and records the post-flood and post-typhoon conditions of low-income housing in Thua Thien Hue Province, Vietnam, an area prone to extreme flooding in Central Vietnam. The cost of rebuilding houses after floods and typhoons has always been a burden for low-income households, and these costs often lead to the elimination of essential construction practices for disaster resistance. Despite relief efforts from international non-profit organizations and the Vietnamese government, flood and typhoon damage to residential construction has recurred in the same neighborhoods annually. Notwithstanding its importance, this topic has not been systematically investigated. The study is limited to assistance provided to low-income households, documenting the existing conditions of low-income homes impacted by floods and typhoons in Thua Thien Hue Province. The research identifies the leading causes of building failure from these natural disasters. Relief efforts and progress made since the last typhoon are documented. The quality of construction and repairs is assessed based on the Home Builders' Guide to Coastal Construction by the Federal Emergency Management Agency. Focus group discussions and individual interviews with local residents from four different communities were conducted to gain insights into the repair efforts by the non-profit organizations and the Vietnamese government, and into residents' needs after floods and typhoons. The findings from the field study indicate that many of the local people are now aware of the importance of improving housing conditions as a key coping strategy to withstand flood and typhoon events, as it makes housing and the community more resilient to future events.
While there has been remarkable improvement of housing and infrastructure with support from the local government as well as the non-profit organizations, many households in the study areas were found to still live in weak and fragile housing without access to aid to repair and strengthen their houses. Given that the major immediate recovery action taken by the local people tends to focus on repairing damaged houses, and that low-income households consequently spend a considerable share of their income on housing repair, providing proper and applicable construction practices will not only improve housing conditions but also contribute to reducing poverty in Vietnam.

Keywords: disaster coping mechanism, housing welfare, low-income housing, recovery reduction

Procedia PDF Downloads 259
502 The Emergence of Memory at the Nanoscale

Authors: Victor Lopez-Richard, Rafael Schio Wengenroth Silva, Fabian Hartmann

Abstract:

Memcomputing is a computational paradigm that combines information processing and storage on the same physical platform. Key elements for this topic are devices with inherent memory, such as memristors, memcapacitors, and meminductors. Despite the widespread emergence of memory effects in various solid-state systems, a clear understanding of the basic microscopic mechanisms that trigger them remains a puzzling task. We report basic ingredients of the theory of solid-state transport, intrinsic to a wide range of mechanisms, as sufficient conditions for a memristive response, pointing to the natural emergence of memory. This emergence should be discernible under an adequate set of driving inputs, as highlighted by our theoretical prediction; general common trends can thus be listed that become the rule and not the exception, with contrasting signatures according to symmetry constraints, either built in or induced by external factors at the microscopic level. Explicit analytical figures of merit for the memory modulation of the conductance are presented, unveiling concise and accessible correlations between general intrinsic microscopic parameters, such as relaxation times, activation energies, and efficiencies (encountered throughout various fields of physics), and external drives: voltage pulses, temperature, illumination, etc. These building blocks of memory can be extended to a vast universe of materials and devices, with combinations of parallel and independent transport channels, providing an efficient and unified physical explanation for a wide class of resistive memory devices that have emerged in recent years. The simplicity and practicality of the theory have also allowed a direct correlation with reported experimental observations, with the potential of pointing out optimal driving configurations.
The main methodological tools combine three quantum transport approaches, a Drude-like model, the Landauer-Buttiker formalism, and field-effect transistor emulators, with the microscopic characterization of nonequilibrium dynamics. Both qualitative and quantitative agreement with available experimental responses is provided to validate the main hypothesis. This analysis also sheds light on the basic universality of the complex natural impedances of systems out of equilibrium and might help pave the way for new trends in the area of memory formation as well as in its technological applications.
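The kind of memristive response the abstract describes, a conductance with an inherent relaxation time driven by an external input, can be illustrated with a toy first-order model. All parameters here are illustrative assumptions, not fitted to any device or to the authors' transport theory: the conductance g relaxes toward a drive-dependent equilibrium with time constant tau, so the I-V loop under a periodic voltage is pinched at the origin.

```python
import math

def simulate(tau=0.5, g0=1.0, alpha=0.3, freq=1.0, steps=4000, dt=0.001):
    """Euler integration of dg/dt = (g_eq(v) - g) / tau under a sine drive."""
    g = g0
    trace = []
    for n in range(steps):
        t = n * dt
        v = math.sin(2 * math.pi * freq * t)   # sinusoidal voltage drive
        g_eq = g0 * (1.0 + alpha * v)          # drive-dependent equilibrium
        g += dt * (g_eq - g) / tau             # first-order relaxation
        trace.append((v, g, g * v))            # (voltage, conductance, current)
    return trace

trace = simulate()
```

Because the current is g(t)*v(t), it vanishes whenever v does, while the lagging conductance makes the up- and down-sweep branches differ: the pinched hysteresis loop characteristic of a memristive device.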

Keywords: memories, memdevices, memristors, nonequilibrium states

Procedia PDF Downloads 78
501 Leadership Lessons from Female Executives in the South African Oil Industry

Authors: Anthea Carol Nefdt

Abstract:

In this article, observations are drawn from a number of interviews conducted with female executives in the South African oil industry in 2017. Globally, the oil industry represents one of the most male-dominated organisational structures and cultures in the business world. Some of the remarkable women who hold upper management positions have not only emerged from the science and finance spheres (equally gendered organisations) but have also navigated their way through an aggressive, patriarchal atmosphere of rivalry and competition. We examine various myths associated with the industry, such as the cowboy myth, the frontier ideology, and the queen bee syndrome directed at female executives. One of the themes to emerge from the interviews was the almost unanimous rejection of the 'glass ceiling' metaphor favoured by some feminists. The women of the oil industry instead described their rise to leadership positions as a strategic labyrinth of challenges and obstacles, both in terms of gender and race. This article aims to share the insights of women leaders in a complex industry through both their reflections and a theoretical feminist lens. The study is located within the South African context and, given the country's historical legacy, an intersectional approach was optimal, allowing issues of race, gender, ethnicity, and language to emerge. A qualitative research methodology was employed, with a thematic interpretative analysis used to analyse and interpret the data. This methodology was chosen precisely because it encourages and acknowledges the experiences women have and places these experiences at the centre of the research. Multiple methods of recruiting the research participants were utilised: the initial method was snowball sampling, and the second was purposive sampling.
In addition, semi-structured interviews gave the participants an opportunity to ask questions, add information, and discuss issues or aspects of the research area that interested them. One of the key objectives of the study was to investigate whether there is a difference in the leadership styles of men and women. Findings show that, despite the wealth of literature on the topic suggesting otherwise, some women do not perceive a significant difference between men's and women's leadership styles. Other respondents, however, felt that there were important differences in their experiences of male and female superiors, although they hesitated to generalise from these experiences. Further findings suggest that although the oil industry, as a gendered organisation, presents unique challenges to women, it also incorporates various progressive initiatives for their advancement.

Keywords: petroleum industry, gender, feminism, leadership

Procedia PDF Downloads 138
500 Optimization of Bills Assignment to Different Skill-Levels of Data Entry Operators in a Business Process Outsourcing Industry

Authors: M. S. Maglasang, S. O. Palacio, L. P. Ogdoc

Abstract:

Business process outsourcing (BPO) has been one of the fastest-growing and emerging industries in the Philippines today. Unlike most contact service centers, more popularly known as "call centers", the BPO industry's primary outsourced service in this case is performing audits of global clients' logistics. As a service industry, manpower is considered the most important yet most expensive resource in the company. Because of this, there is a need to maximize human resources so that people are utilized effectively and efficiently. The main purpose of the study is to optimize the current manpower resources through effective distribution and assignment of different types of bills to the different skill levels of data entry operators. The assignment model parameters include the average observed time matrix gathered through a time study, which incorporates the learning curve concept. Subsequently, a simulation model was built to replicate the arrival rate of demand, including the different batches and types of bills per day. Next, a mathematical linear programming model was formulated, whose objective is to minimize direct labor cost per bill by allocating the different types of bills to the different skill levels of operators. Finally, a hypothesis test was done to validate the model by comparing the actual and simulated results. The analysis revealed low utilization of effective capacity because of the failure to determine the product mix, skill mix, and simulated demand as model parameters. Moreover, failure to consider the effects of the learning curve leads to overestimation of labor needs. From the current 107 operators, the proposed model gives a result of 79 operators, which increases the utilization of effective capacity to 14.94%. It is recommended that the excess 28 operators be reallocated to other areas of the department.
Finally, a manpower capacity planning model is also recommended to support management's decisions on what to do when the current capacity reaches its limit under the expected increasing demand.
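The linear programming formulation described above can be sketched with SciPy's `linprog`: decision variables x[s,b] are the bills of type b handled by skill level s, demand for each bill type must be covered, operator-minutes per skill level are capped, and the objective is direct labor cost. All numbers (processing times, wages, demand, capacities) are hypothetical placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import linprog

times = np.array([[4.0, 6.0, 9.0],        # minutes per bill, senior operators
                  [5.0, 8.0, 14.0]])      # minutes per bill, junior operators
wages = np.array([0.60, 0.40])            # cost per operator-minute
demand = np.array([300.0, 200.0, 100.0])  # bills of each type per day
capacity = np.array([2400.0, 3600.0])     # available operator-minutes per level

c = (wages[:, None] * times).ravel()      # direct labor cost per bill, flattened
A_eq = np.zeros((3, 6))
for b in range(3):
    A_eq[b, b] = A_eq[b, 3 + b] = 1.0     # each bill type's demand is covered
A_ub = np.zeros((2, 6))
A_ub[0, :3] = times[0]                    # senior operator-minutes consumed
A_ub[1, 3:] = times[1]                    # junior operator-minutes consumed

res = linprog(c, A_ub=A_ub, b_ub=capacity, A_eq=A_eq, b_eq=demand,
              bounds=[(0, None)] * 6, method="highs")
```

With these toy numbers the optimum routes the two simpler bill types to the cheaper junior operators and the complex type to seniors, which is the skill-mix trade-off the model is meant to capture.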

Keywords: optimization modelling, linear programming, simulation, time and motion study, capacity planning

Procedia PDF Downloads 495
499 Research Trends in Fine Arts Education Dissertations in Turkey

Authors: Suzan Duygu Bedir Erişti

Abstract:

The present study offers a general evaluation of the dissertations conducted in the last decade in the field of art education in the Departments of Fine Arts Education at the Institutes of Education Sciences in Turkey. The study covered most of the universities in Turkey that house an Institute of Education Sciences. A total of one hundred dissertations conducted in the Departments of Fine Arts Education at several universities (Anadolu, Gazi, Ankara, Marmara, Dokuz Eylul, Ondokuz Mayıs, Selcuk, and Necmettin Erbakan) were identified via the universities' open access systems and the Thesis Search System of the Higher Education Council. Most of the dissertations were accessed via the latter system and, failing that, via the former. Consequently, most of the dissertations that had no access restriction and had appropriate content were obtained. The dissertations were examined by document analysis in terms of their research topics, research paradigms, contents, purposes, methodologies, data collection tools, and analysis techniques. The dissertations conducted at the Institutes of Education Sciences can be said to have improved in quality, especially in recent years. It was also found that a great majority of the dissertations were carried out at Gazi University and Marmara University and that a similar number of dissertations were conducted at the other universities. Taken as a whole, the dissertations were found to differ widely in their subject areas. Most of the dissertations adopted the quantitative paradigm, while in recent years more importance has been given to methods based on the qualitative paradigm.
In addition, most of the dissertations conducted with the quantitative paradigm were structured around the general survey model and the experimental research model. In terms of statistical techniques, university-specific approaches were observed: at some universities advanced statistical techniques were applied, while at others there was only moderate use of statistical techniques. Most of the studies produced results generalizable to postgraduate education and elementary school education. The studies were generally structured around face-to-face teaching processes, while some were designed in environments whose results do not generalize to the face-to-face education system. Overall, the dissertations conducted in the Departments of Fine Arts Education at the Institutes of Education Sciences in Turkey did not involve application-based approaches, such as art-based or visual research, in terms of either research topic or methodology.

Keywords: fine arts education, dissertations, evaluation of dissertations, research trends in fine arts education

Procedia PDF Downloads 181
498 Optimization of a High-Growth Investment Portfolio for the South African Market Using Predictive Analytics

Authors: Mia Françoise

Abstract:

This report aims to develop a strategy to help short-term investors benefit from the current economic climate in South Africa by utilizing technical analysis techniques and predictive analytics. As part of this research, value investing and technical analysis principles are combined to maximize returns for South African investors while controlling volatility. As an emerging market, South Africa offers many opportunities for high growth in sectors where more developed countries cannot grow at the same rate. Investing in South African companies with significant growth potential can be extremely rewarding; although the risk is greater in countries with less developed markets and infrastructure, there is also more room for growth. According to recent research, the offshore market is expected to outperform the local market over the long term; however, short-term investments in the local market will likely be more profitable, as the Johannesburg Stock Exchange is predicted to outperform the S&P 500 over the short term. Instabilities in the economy contribute to increased market volatility, which can benefit investors if appropriately exploited. The methodology comprises two primary components: price prediction and portfolio optimization. First, statistics and other predictive modeling techniques are used to predict the future performance of stocks listed on the Johannesburg Stock Exchange. Following the predictive data analysis, Modern Portfolio Theory, based on Markowitz's mean-variance framework, is applied to optimize the allocation of assets within an investment portfolio. By combining different assets within a portfolio, this optimization method produces a portfolio with an optimal ratio of expected return to expected risk.
By combining price prediction and portfolio optimization, this methodology aims to provide short-term investors with a stock portfolio that offers the best risk-to-return profile among stocks listed on the JSE.
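The mean-variance allocation step can be sketched in closed form. The expected returns, volatilities, and correlations below are made-up placeholders, not JSE data, and the unconstrained tangency-portfolio formula w ∝ Σ⁻¹(μ − r_f) stands in for whatever constrained optimizer the report ultimately uses.

```python
import numpy as np

mu = np.array([0.12, 0.10, 0.15, 0.08])      # expected annual returns (toy)
sigma = np.array([0.20, 0.15, 0.30, 0.10])   # annual volatilities (toy)
corr = np.array([[1.0, 0.3, 0.4, 0.1],
                 [0.3, 1.0, 0.2, 0.2],
                 [0.4, 0.2, 1.0, 0.1],
                 [0.1, 0.2, 0.1, 1.0]])
cov = np.outer(sigma, sigma) * corr          # covariance matrix
rf = 0.05                                    # assumed risk-free rate

raw = np.linalg.solve(cov, mu - rf)          # Sigma^-1 (mu - r_f)
w = raw / raw.sum()                          # normalise to fully-invested weights
port_ret = w @ mu
port_vol = np.sqrt(w @ cov @ w)
sharpe = (port_ret - rf) / port_vol
```

Among fully invested portfolios, these weights maximise the Sharpe ratio, i.e. the expected-return-to-risk trade-off the abstract describes; in practice the prediction step would supply mu and cov, and constraints (no short sales, sector caps) would call for a numerical solver.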

Keywords: financial stocks, optimized asset allocation, prediction modelling, South Africa

Procedia PDF Downloads 73
497 The Evolution of Deformation in the Southern-Central Tunisian Atlas: Parameters and Modelling

Authors: Mohamed Sadok Bensalem, Soulef Amamria, Khaled Lazzez, Mohamed Ghanmi

Abstract:

The southern-central Tunisian Atlas presents a typical example of an external zone. It occupies a particular position in the North African chains: first, it is the eastern limit of the atlassic structures; second, it marks the boundary between the belt structures to the north and the stable Saharan platform to the south. The study of the evolution of deformation is based on several methods, both classical and numerical. The principal parameters controlling the genesis of folds in the southern-central Tunisian Atlas are the reactivation of pre-existing faults during the later compressive phase, the evolution of the decollement level, and the relation between thin- and thick-skinned tectonics. One of the main characteristics of the southern-central Tunisian Atlas is the variation of belt structure directions: the NE-SW direction, named the atlassic direction in Tunisia; the NW-SE direction, carried along the Gafsa fault (the eastern limit of the southern atlassic accident); and the E-W direction, defined in the southern Tunisian Atlas. This variation of direction is the result of an important variation of deformation during the different tectonic phases. A classical modelling of the Jebel El Kebar anticline, based on the throws of the pre-existing faults and their reactivation during compressive phases, shows the importance of extensional deformation, particularly during the Aptian-Albian period, compared with that of the later compression (Alpine phases). A numerical modelling, based on the software Rampe E.M. 1.5.0 and applied to the Jebel Orbata anticline, confirms the "fault-related fold" interpretation, with the decollement level within the Triassic successions. The other important parameter in the evolution of deformation is the vertical migration of the decollement level: the higher the decollement level lies within the recent series, the more the deformation is accentuated.
The evolution of deformation is marked by the development of a duplex structure in Jebel At Taghli (the eastern limit of Jebel Orbata). Consequently, the evolution of deformation is proportional to the depth of the decollement level; the most important deformation occurs in the higher successions and is thus associated with thin-skinned deformation, the decollement level permitting the passive transfer of deformation into the cover.

Keywords: evolution of deformation, pre-existing faults, decollement level, thin-skinned

Procedia PDF Downloads 110
496 Optimizing 3D Shape Parameters of Sports Bra Pads in Motion by Finite Element Dynamic Modelling with Inverse Problem Solution

Authors: Jiazhen Chen, Yue Sun, Joanne Yip, Kit-Lun Yick

Abstract:

The design of sports bras poses a considerable challenge due to the difficulty of accurately predicting the wearing result after computer-aided design (CAD): repeated physical or virtual try-ons are needed to obtain a comfortable pressure range during motion. Specifically, in the context of running, the exact support area and force exerted on the breasts remain unclear. Consequently, an effective method for designing the shape of sports bra pads is particularly hard to obtain. This predicament hinders the successful creation and production of sports bras that cater to women's health needs. The purpose of this study is to propose an effective method of obtaining the 3D shape of sports bra pads and to understand the relationship between the supporting force and the 3D shape parameters of the pads. First, the static 3D shape of the sports bra pad and human motion data (running) are obtained using a 3D scanner and advanced 4D scanning technology. The 3D shape of the pad is parameterised and simplified by free-form deformation (FFD). Then the sub-models of the sports bra and the human body are constructed by segmenting and meshing them with the MSC Apex software. The material coefficients of the sports bra are obtained by material testing. The Marc software is then utilised to establish a dynamic contact model between the human breast and the sports bra pad. To realise the reverse design of the pad, this contact model serves as the forward model for the inverse calculation. Based on the forward contact model, the inverse problem for the 3D shape parameters of the pad is solved with the target bra-wearing pressure range as the boundary condition. Finally, the credibility and accuracy of the simulation are validated by comparing the experimental pressure distributions with those predicted by the FE model.
On the one hand, this research allows for a more accurate understanding of the support area and force distribution on the breasts during running. On the other hand, this study can contribute to the customization of sports bra pads for different individuals. It can help to obtain sports bra pads with comfortable dynamic pressure.
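The inverse-problem logic above, solving for a shape parameter so that a forward model hits a target pressure, can be illustrated with a toy one-parameter example. The forward model here is an invented monotone pressure-vs-thickness relation, not the FE contact model, and the bisection solver stands in for whatever inversion scheme the study uses.

```python
def forward_pressure(thickness):
    """Hypothetical forward model: thicker pad -> higher contact pressure."""
    return 0.4 * thickness ** 1.5 + 0.2

def invert(target, lo=0.0, hi=10.0):
    """Bisection on the monotone forward model to hit a target pressure."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if forward_pressure(mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

t = invert(2.0)  # pad thickness whose simulated pressure equals the target
```

In the actual study the forward evaluation is a full FE contact simulation and the unknown is a vector of FFD shape parameters, so the inversion is far more expensive, but the structure of the calculation is the same.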

Keywords: sports bra design, breast motion, running, inverse problem, finite element dynamic model

Procedia PDF Downloads 28
495 Human Capital Divergence and Team Performance: A Study of Major League Baseball Teams

Authors: Yu-Chen Wei

Abstract:

The relationship between organizational human capital and organizational effectiveness has been a common topic of interest among organization researchers, and much of this research has concluded that higher human capital predicts greater organizational outcomes. Whereas human capital research has traditionally focused on organizations, the current study turns to team-level human capital. In addition, there are no known empirical studies assessing the effect of human capital divergence on team performance. Team human capital refers to the sum of the knowledge, ability, and experience embedded in team members; team human capital divergence is defined as the variation of human capital within a team. This study is among the first to assess the role of human capital divergence as a moderator of the effect of team human capital on team performance. From the traditional perspective, team human capital represents the collective ability of all team members to solve problems and reduce operational risk; hence, the higher the team human capital, the higher the team performance. This study further employs social learning theory to explain the relationship between team human capital and team performance. According to this theory, individuals seek to progress by learning from teammates, expecting that greater human capital will in turn bring high productivity, great rewards, and eventual career success. Therefore, an individual has more chances to improve his or her capability by learning from peers if the team members have higher average human capital. As a consequence, all team members can develop a quick and effective learning path in their work environment, enhancing their knowledge, skill, and experience and leading to higher team performance. This is the first argument of this study. Furthermore, the current study argues that human capital divergence is negative for team development.
Individuals with lower human capital in the team constantly feel pressure from their outstanding colleagues; under this pressure, they cannot perform their own jobs fully and progressively lose confidence. The most capable members, in turn, are reluctant to work alongside teammates who are less able than they are, and may have lower motivation to move forward because they are already prominent compared with their teammates. Therefore, human capital divergence will moderate the relationship between team human capital and team performance. These two arguments were tested on 510 team-seasons drawn from Major League Baseball (1998-2014). Results demonstrate a positive relationship between team human capital and team performance, consistent with previous research. In addition, the variation of human capital within a team weakens this relationship. That is to say, individuals working with teammates of comparable ability produce better performance than those working with people whose ability is far above or below their own.
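The moderation hypothesis can be illustrated with a small synthetic example: ordinary least squares with an interaction term, where a negative interaction coefficient indicates that divergence weakens the capital-performance link. The data are simulated with made-up coefficients, not the MLB sample.

```python
import numpy as np

# Synthetic data: performance rises with mean human capital, but the slope
# shrinks as within-team divergence grows (true interaction effect = -0.3).
rng = np.random.default_rng(0)
n = 500
capital = rng.normal(0, 1, n)       # standardised team human capital
divergence = rng.uniform(0, 2, n)   # within-team human-capital variation
noise = rng.normal(0, 0.5, n)
perf = 1.0 + (0.8 - 0.3 * divergence) * capital + 0.1 * divergence + noise

# OLS fit of: perf = b0 + b1*capital + b2*divergence + b3*(capital*divergence)
X = np.column_stack([np.ones(n), capital, divergence, capital * divergence])
beta, *_ = np.linalg.lstsq(X, perf, rcond=None)
```

A recovered b3 significantly below zero is the statistical signature of the moderation argument: the benefit of team human capital is largest when teammates' abilities are comparable.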

Keywords: human capital divergence, team human capital, team performance, team level research

Procedia PDF Downloads 220
494 The Sources of Anti-Immigrant Sentiments in Russia

Authors: Anya Glikman, Anastasia Gorodzeisky

Abstract:

Since the late 1990th labor immigration and its consequences on the society have become one of the most frequently discussed and debated issues in Russia. Social scientists point that the negative attitudes towards immigrants among Russian majority population is widespread, and their level, at least, twice as high as their level in most other European countries. Moreover, recent study by Gorodzeisky, Glikman and Maskyleison (2014) demonstrates that the two sets of individual level predictors of anti-foreigner sentiment – socio-economic status and conservative views and ideologies – that have been repeatedly proved in research in Western countries are not effective in predicting of anti-foreigner sentiment in Post-Socialist Russia. Apparently, the social mechanisms underlying anti-foreigner sentiment in Western countries, which are characterized by stable regimes and relatively long immigration histories, do not play a significant role in the explanation of anti-foreigner sentiment in Post-Socialist Russia. The present study aims to examine alternative possible sources of anti-foreigner sentiment in Russia while controlling for socio-economic position of individuals and conservative views. More specifically, following the research literature on the topic worldwide, we aim to examine whether and to what extent human values (such as tradition, universalism, safety and power), ethnic residential segregation, fear of crime and exposure to mass media affect anti-foreigner sentiments in Russia. To do so, we estimate a series of multivariate regression equations using the data obtained from 2012 European Social Survey. The national representative sample consists of 2337 Russian born respondents. Descriptive results reveal that about 60% percent of Russians view the impact of immigrants on the country in negative terms. Further preliminary analysis show that anti-foreigner sentiments are associated with exposer to mass media as well as with fear of crime. 
Specifically, respondents who devote more time to watching news on TV channels and respondents who express higher levels of fear of crime tend to report higher levels of anti-immigrant sentiment. The findings are discussed in light of sociological perspectives and the context of Russian society.
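The multivariate regression step described above can be sketched with synthetic data; the variable names, scales and effect sizes below are illustrative assumptions, not the actual ESS coding:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2337  # sample size reported in the abstract

# Hypothetical stand-ins for the ESS predictors (scales are assumptions)
tv_news = rng.uniform(0, 4, n)                    # hours/day watching TV news
fear = rng.integers(1, 5, n).astype(float)        # fear-of-crime scale
education = rng.integers(8, 20, n).astype(float)  # years of education (control)

# Simulate the reported direction of effects: more TV news and more fear
# of crime are associated with stronger anti-immigrant sentiment
sentiment = 0.5 * tv_news + 0.4 * fear + rng.normal(0, 1, n)

# Ordinary least squares fit of the multivariate regression equation
X = np.column_stack([np.ones(n), tv_news, fear, education])
beta, *_ = np.linalg.lstsq(X, sentiment, rcond=None)
# beta[1] and beta[2] recover the positive TV-news and fear-of-crime effects
```

With real ESS data the same design matrix would simply gain the further controls named in the abstract (human values, segregation, socio-economic position).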

Keywords: anti-immigrant sentiments, fear of crime, human values, mass media, Russia

Procedia PDF Downloads 442
493 Development of a Coupled Thermal-Mechanical-Biological Model to Simulate Impacts of Temperature on Waste Stabilization at a Landfill in Quebec, Canada

Authors: Simran Kaur, Paul J. Van Geel

Abstract:

A coupled Thermal-Mechanical-Biological (TMB) model was developed for the analysis of the impacts of temperature on waste stabilization at a Municipal Solid Waste (MSW) landfill in Quebec, Canada, using COMSOL Multiphysics, a finite-element-based software. For waste placed in landfills in Northern climates during winter months, it can take months or even years before the waste approaches ideal temperatures for biodegradation to occur. Therefore, the proposed model links biodegradation-induced strain in MSW to waste temperatures and the corresponding heat generation rates resulting from anaerobic degradation. This provides a link between the thermal-biological and mechanical behavior of MSW. The thermal properties of MSW are further linked to density, which is tracked and updated in the mechanical component of the model, providing a mechanical-thermal link. The settlement of MSW is modelled based on the concept of viscoelasticity. The specific viscoelastic model used is a single Kelvin–Voigt viscoelastic body in which the finite element response is controlled by the elastic material parameters – Young's modulus and Poisson's ratio. The numerical model was validated with 10 years of temperature and settlement data collected from a landfill in Ste. Sophie, Quebec. The coupled TMB modelling framework, which simulates waste lifts as they are placed progressively in the landfill, allows for optimization of several thermal and mechanical parameters throughout the depth of the waste profile and helps in better understanding the temperature dependence of MSW stabilization. The model is able to illustrate how waste placed in the winter months can delay biodegradation-induced settlement and the generation of landfill gas. A delay in waste stabilization will impact the utilization of the approved airspace prior to the placement of a final cover and will impact post-closure maintenance.
The model provides a valuable tool to assess different waste placement strategies in order to increase airspace utilization within landfills operating under different climates, in addition to understanding conditions for increased gas generation for recovery as a green and renewable energy source.
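The single Kelvin–Voigt body used for the settlement component can be illustrated by its creep response under a constant waste load; the stiffness, viscosity and stress values below are placeholders, not calibrated Ste. Sophie parameters:

```python
import numpy as np

# Kelvin-Voigt creep under constant stress:
#   strain(t) = (sigma/E) * (1 - exp(-E*t/eta))
E = 5.0e5      # Pa, spring stiffness (illustrative modulus for MSW)
eta = 1.0e7    # Pa*day, dashpot viscosity (illustrative)
sigma = 1.0e5  # Pa, constant overburden stress from overlying waste lifts

t = np.linspace(0.0, 365.0, 366)  # days after load application
strain = (sigma / E) * (1.0 - np.exp(-E * t / eta))

# Strain rises monotonically and levels off at the elastic limit sigma/E,
# mimicking the delayed (viscous) part of MSW settlement
```

In the coupled model the biodegradation-induced strain would add a temperature-dependent term on top of this mechanical response; the sketch shows only the viscoelastic backbone.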

Keywords: coupled model, finite element modeling, landfill, municipal solid waste, waste stabilization

Procedia PDF Downloads 114
492 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes the world is currently experiencing have emphasized the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modelling of the tails of statistical distributions and characterization of low-occurrence events can be achieved by applying the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a predefined threshold u. To mathematically approximate the tail of the empirical distribution, the Generalised Pareto distribution is widely used, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull distribution and the Exponential distribution, the latter a specific case of the Generalised Pareto distribution, are frequently used as alternatives. Despite the practical cases where it is applied, the Generalised Pareto distribution is not universally recognized as the adequate model for exceedances over a threshold u; references that treat it as a secondary option for significant wave data can be found in the literature. In this framework, the current study addresses the application of statistical models to characterize exceedances of wave data. Comparisons of the Generalised Pareto, the two-parameter Weibull and the Exponential distributions are presented for different values of the threshold u.
Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. The results show that the application of statistical distributions to characterize significant wave data must be addressed carefully: in each particular case, one of the statistical models fits the data better than the others, and different results are obtained depending on the value of the threshold u. Other aspects of the fit, such as the number of points and the estimation of the model parameters, are analysed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modelling the tail of the distributions proves to be, in the present case, a highly non-linear task and, given its growing importance, should be addressed carefully for an efficient estimation of very-low-occurrence events.
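The POT workflow compared in the paper can be sketched as follows; the synthetic lognormal H_s sample and the choice of the Exponential tail model (the simplest of the three candidates, a Generalised Pareto with zero shape) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic significant wave heights, in metres (lognormal stand-in for buoy data)
hs = rng.lognormal(mean=0.5, sigma=0.4, size=10_000)

u = np.quantile(hs, 0.95)    # threshold: keep the top 5% as exceedances
excess = hs[hs > u] - u      # peaks-over-threshold sample
lam = excess.size / hs.size  # empirical exceedance rate

# Exponential tail model: the maximum-likelihood scale is the mean excess
scale = excess.mean()

# N-observation return level: the H_s value exceeded on average once in N records
N = 100_000
return_level = u + scale * np.log(lam * N)
```

Fitting the Generalised Pareto or the two-parameter Weibull to the same `excess` sample, and repeating the exercise over a grid of thresholds u, reproduces the comparison the abstract describes.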

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 253
491 A Sociolinguistic Study of the Outcomes of Arabic-French Contact in the Algerian Dialect Tlemcen Speech Community as a Case Study

Authors: R. Rahmoun-Mrabet

Abstract:

It is acknowledged that our style of speaking changes according to a wide range of variables such as gender, setting, the age of both the addresser and the addressee, the conversation topic, and the aim of the interaction. These differences in style are noticeable in monolingual and multilingual speech communities, yet they are more observable in speech communities where two or more codes coexist. The linguistic situation in Algeria reflects a state of bilingualism because of the coexistence of Arabic and French. Nevertheless, like all Arab countries, it is characterized by diglossia, i.e. the concomitance of Modern Standard Arabic (MSA) and Algerian Arabic (AA), the former standing for the ‘high variety’ and the latter for the ‘low variety’. The two varieties are derived from the same source but are used to fulfil distinct functions: MSA is used in the domains of religion, literature, education and formal settings, whereas AA is used in informal settings, in everyday speech. French has strongly affected the Algerian language and culture because of the historical background of Algeria; thus, what can easily be noticed in Algeria is that everyday speech is characterized by code-switching between dialectal Arabic and French, or by the use of borrowings. Tamazight is also very present in many regions of Algeria and is the mother tongue of many Algerians, yet it is not used in the west of Algeria, where the study was conducted. The present work, which was conducted in the speech community of Tlemcen, Algeria, aims at depicting some of the outcomes of the contact of Arabic with French, such as code-switching, borrowing and interference. The question asked is whether Algerians are aware of their use of borrowings or not. Three steps are followed in this research; the first is to depict the sociolinguistic situation in Algeria and to describe the linguistic characteristics of the dialect of Tlemcen, which are specific to this city.
The second step is concerned with data collection. Data were collected from 57 informants who were given questionnaires and were then classified according to their age, gender and level of education. Information was also collected through observation and note-taking. The third step is devoted to analysis. The results obtained reveal that most Algerians are aware of their use of borrowings. The present work clarifies how words are borrowed from French and then adapted to Arabic. It also illustrates the way in which singular words inflect into the plural. The results expose the main characteristics of borrowing as opposed to code-switching. The study also clarifies how interference occurs at the level of nouns, verbs and adjectives.

Keywords: bilingualism, borrowing, code-switching, interference, language contact

Procedia PDF Downloads 259
490 Estimating the Relationship between Education and Political Polarization over Immigration across Europe

Authors: Ben Tappin, Ryan McKay

Abstract:

The political left and right appear to disagree not only over questions of value but also over questions of fact—over what is true “out there” in society and the world. Alarmingly, a large body of survey data collected during the past decade suggests that this disagreement tends to be greatest among the most educated and most cognitively sophisticated opposing partisans. In other words, the data show that these individuals display the widest political polarization in their reported factual beliefs. Explanations of this polarization pattern draw heavily on cultural and political factors; yet the large majority of the evidence originates from one cultural and political context—the United States, a country with a rather unique cultural and political history. One consequence is that widening political polarization conditional on education and cognitive sophistication may be due to idiosyncratic cultural, political or historical factors endogenous to US society, rather than being a more general, international phenomenon. We examined widening political polarization conditional on education across Europe, over a topic that is culturally and politically contested: immigration. To do so, we analyzed data from the European Social Survey, a premier survey of countries in and around the European area, conducted biennially since 2002. Our main results are threefold. First, we see widening political polarization conditional on education over beliefs about the economic impact of immigration. The foremost countries showing this pattern are the most influential in Europe: Germany and France. However, we also see heterogeneity across countries, with some—such as Belgium—showing no evidence of such polarization. Second, we find that widening political polarization conditional on education is a product of sorting.
That is, highly educated partisans exhibit stronger within-group consensus in their beliefs about immigration—the data do not support the view that the more educated partisans are more polarized simply because the less educated fail to adopt a position on the question. Third, and finally, we find some evidence that shocks to the political climate of countries in the European area—for example, the “refugee crisis” of summer 2015—were associated with a subsequent increase in political polarization over immigration conditional on education. The largest increase was observed in Germany, which was at the centre of the so-called refugee crisis in 2015. These results reveal numerous insights: they show that widening political polarization conditional on education is not restricted to the US or native English-speaking culture; that such polarization emerges in the domain of immigration; that it is a product of within-group consensus among the more educated; and, finally, that exogenous shocks to the political climate may be associated with subsequent increases in political polarization conditional on education.
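The sorting interpretation above, wider between-group gaps together with tighter within-group consensus among the educated, can be illustrated with simulated belief data (all distributions below are hypothetical constructions, not ESS estimates):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5_000  # simulated respondents per partisan-education group

# Belief about immigration's economic impact (0 = bad ... 10 = good)
left_low = rng.normal(6.0, 2.5, n)    # left partisans, low education
right_low = rng.normal(5.0, 2.5, n)   # right partisans, low education
left_high = rng.normal(7.5, 1.5, n)   # high education: larger gap and
right_high = rng.normal(3.5, 1.5, n)  # tighter within-group consensus

# Polarization: mean belief gap between opposing partisans
gap_low = left_low.mean() - right_low.mean()
gap_high = left_high.mean() - right_high.mean()

# Consensus: average within-group spread
sd_low = 0.5 * (left_low.std() + right_low.std())
sd_high = 0.5 * (left_high.std() + right_high.std())
# gap_high > gap_low while sd_high < sd_low: sorting, not the less educated
# merely failing to adopt a position
```

The same two summary statistics, computed per country and survey round, would reproduce the kind of comparison the abstract reports.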

Keywords: beliefs, Europe, immigration, political polarization

Procedia PDF Downloads 132
489 Adsorptive Media Selection for Bilirubin Removal: An Adsorption Equilibrium Study

Authors: Vincenzo Piemonte

Abstract:

The liver is a complex, large-scale biochemical reactor that plays a unique role in human physiology. When the liver ceases to perform its physiological activity, a functional replacement is required. At present, liver transplantation is the only clinically effective method of treating severe liver disease. However, this therapeutic approach is hampered by the disparity between organ availability and the number of patients on the waiting list. To overcome this critical issue, research activities have focused on liver support device systems (LSDs) designed to bridge patients to transplantation or to keep them alive until the native liver function recovers. In recirculating albumin dialysis devices, such as MARS (Molecular Adsorbent Recirculating System), adsorption is one of the fundamental steps in albumin-dialysate regeneration. Among the albumin-bound toxins that must be removed from blood during liver-failure therapy, bilirubin and tryptophan can be considered representative of two different toxin classes: the first is not water soluble at physiological blood pH and is strongly bound to albumin; the second is loosely albumin-bound and partially water soluble at pH 7.4. Fixed-bed units are normally used for this task, and the design of such units requires information on both toxin adsorption equilibrium and kinetics. The most common adsorptive media used in LSDs are activated carbon, non-ionic polymeric resins and anionic resins. In this paper, bilirubin adsorption isotherms on different adsorptive media, such as polymeric resin, albumin-coated resin, anionic resin, activated carbon and alginate beads with entrapped albumin, are presented. Comparing all the results, the adsorption capacity for bilirubin of the five media increases in the following order: alginate beads < polymeric resin < albumin-coated resin < activated carbon < anionic resin.
The main focus of this paper is to provide useful guidelines for the optimization of liver support devices which implement adsorption columns to remove albumin-bound toxins from albumin dialysate solutions.
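Equilibrium data of the kind behind this ranking are often summarised with a Langmuir isotherm; the sketch below fits one via the standard Ce/qe linearisation, with purely illustrative parameter values (the abstract does not report the actual isotherm model or constants):

```python
import numpy as np

# Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)
qmax_true, KL_true = 30.0, 0.2  # mg/g and L/mg, illustrative values
Ce = np.linspace(1.0, 50.0, 10)  # equilibrium bilirubin concentration, mg/L
qe = qmax_true * KL_true * Ce / (1.0 + KL_true * Ce)

# Linearised form: Ce/qe = Ce/qmax + 1/(qmax*KL)
# -> a straight line whose slope and intercept give the two parameters
slope, intercept = np.polyfit(Ce, Ce / qe, 1)
qmax_fit = 1.0 / slope        # recovers the adsorption capacity
KL_fit = slope / intercept    # recovers the affinity constant
```

Fitted `qmax` values per medium would quantify the capacity ordering stated above, and the constants feed directly into fixed-bed column design.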

Keywords: adsorptive media, adsorption equilibrium, artificial liver devices, bilirubin, mathematical modelling

Procedia PDF Downloads 246
488 Improving Student Retention: Enhancing the First Year Experience through Group Work, Research and Presentation Workshops

Authors: Eric Bates

Abstract:

Higher education is recognised as being of critical importance in Ireland and has been linked to national well-being as a vital factor. Statistics show that Ireland has one of the highest rates of higher education participation in Europe. However, student retention and progression, especially in Institutes of Technology, is becoming an issue as rates of non-completion rise. Both within Ireland and across Europe, student retention is seen as a key performance indicator for higher education, and with these increasing rates the Irish higher education system needs to be flexible and adapt to the situation it now faces. The author is a Programme Chair on a Level 6 full-time undergraduate programme, and experience to date has shown that first-year undergraduate students take some time to identify themselves as a group within the setting of a higher education institute. Despite being part of a distinct class on a specific programme, some individuals can feel isolated as they take their first step into higher education. Such feelings can contribute to students eventually dropping out. This paper reports on an ongoing initiative that aims to accelerate the bonding experience of a distinct group of first-year undergraduates on a programme which has a high rate of non-completion. This research sought to engage the students in dynamic interactions with their peers to quickly evolve a group sense of coherence. Two separate modules – a Research module and a Communications module – delivered by the researcher were linked across two semesters. Students were allocated into random groups, and each group was given a topic to be researched. There were six topics – essentially the six sub-headings of the DIT Graduate Attribute Statement. The research took place in a computer lab, and students also used the library. The output from this was a document that formed part of the submission for the Research module.
In the second semester the groups had to present their findings, with each student speaking for a minimum amount of time. Presentation workshops formed part of that module, and students were given the opportunity to practise their presentation skills. The presentations were video-recorded to enable feedback. Although this was a small-scale study, preliminary results found a strong sense of coherence among this particular cohort, and feedback from the students was very positive. Other findings indicate that spreading the initiative across two semesters may have been an inhibitor. Future challenges include spreading such initiatives college-wide and indeed sector-wide.

Keywords: first year experience, student retention, group work, presentation workshops

Procedia PDF Downloads 218
487 Modelling the Physicochemical Properties of Papaya Based-Cookies Using Response Surface Methodology

Authors: Mayowa Saheed Sanusi, Musiliu Olushola Sunmonu, Abdulquadri Alaka Owolabi Raheem, Adeyemi Ikimot Adejoke

Abstract:

The development of healthy cookies for health-conscious consumers cannot be overemphasized in the present global health crisis. This study aimed to evaluate and model the influence of the ripeness level of papaya puree (unripe, ripe and overripe), oven temperature (130°C, 150°C and 170°C) and oven rack speed (stationary, 10 and 20 rpm) on the physicochemical properties of papaya-based cookies using Response Surface Methodology (RSM). The physicochemical properties (baking time, cookie mass, cookie thickness, spread ratio, proximate composition, calcium, vitamin C and Total Phenolic Content) were determined using standard procedures. The data obtained were statistically analysed using ANOVA at p≤0.05. The polynomial regression model of response surface methodology was used to model the physicochemical properties. The adequacy of the models was determined using the coefficient of determination (R²), and the response optimizer of RSM was used to determine the optimum physicochemical properties for the papaya-based cookies. Cookies produced from overripe papaya puree were observed to have the shortest baking time; ripe papaya puree favours the cookie spread ratio, while unripe papaya puree gives cookies with the highest mass and thickness. The highest crude protein content, fibre content, calcium content, vitamin C and Total Phenolic Content (TPC) were observed in papaya-based cookies produced from overripe puree. The models for baking time, cookie mass, cookie thickness, spread ratio, moisture content, crude protein and TPC were significant, with R² ranging from 0.73 to 0.95. The optimum conditions for producing papaya-based cookies with desirable physicochemical properties were an oven temperature of 149°C, an oven rack speed of 17 rpm and the use of overripe papaya puree.
Information on the use of puree from unripe, ripe and overripe papaya can help to increase the use of underutilized unripe or overripe papaya, and can also serve as a strategic means of obtaining a fat substitute for producing new products with lower production costs and health benefits.
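The RSM step, fitting a second-order polynomial in temperature and rack speed and locating its stationary point, can be sketched with a synthetic response; the surface below is an assumption constructed to peak at the reported optimum, not the study's fitted model:

```python
import numpy as np

# 3x3 factorial design: oven temperature (°C) x rack speed (rpm)
T, S = np.meshgrid([130.0, 150.0, 170.0], [0.0, 10.0, 20.0])
t, s = T.ravel() - 150.0, S.ravel() - 10.0  # centre the factors

# Hypothetical response (e.g. TPC) built to peak near 149 °C and 17 rpm
y = 50.0 - (t + 1.0) ** 2 / 100.0 - (s - 7.0) ** 2 / 10.0

# Second-order response surface: y ~ b0 + b1*t + b2*s + b3*t^2 + b4*s^2 + b5*t*s
X = np.column_stack([np.ones_like(t), t, s, t**2, s**2, t * s])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: solve grad = 0, i.e. [[2*b3, b5], [b5, 2*b4]] @ [t, s] = [-b1, -b2]
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
t_opt, s_opt = np.linalg.solve(A, -b[1:3])
temp_opt, speed_opt = t_opt + 150.0, s_opt + 10.0  # recovers ~149 °C and ~17 rpm
```

This is the same calculation an RSM response optimizer performs; with three factors (including puree ripeness coded as a level) the design matrix simply gains the extra linear, quadratic and interaction columns.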

Keywords: papaya based-cookies, modeling, response surface methodology, physicochemical properties

Procedia PDF Downloads 145
486 Assessment of Mediation of Community-Based Disputes in Selected Barangays of Batangas City

Authors: Daisyree S. Arrieta

Abstract:

The purpose of this study was to assess the mediation process applied to community-based disputes in selected barangays of Batangas City, namely Barangay Sta. Rita Karsada, Barangay Bolbok, and Barangay Alangilan. The researcher initially speculated that the procedures required under Republic Act No. 7160 were not religiously followed and satisfied by the Lupong Tagapamayapa members in most of the barangays in the subject locality, and this prompted the researcher to investigate this research topic. In this study, the subject barangays and their Lupon members still resorted to mediation processes to amicably settle conflicts among community members. The Lupong Tagapamayapa members are also evidently aware of the purpose of, and the processes required in, the mediation of cases brought before them. However, the manner in which they conduct these mediation processes seems to depend on the general characteristics of their respective barangays and of the people situated therein. It is also very noticeable that the strategies applied by the Lupon members in these cases depend on the ways and means by which the parties in dispute may arrive at agreements and conciliation. The researcher concludes that the Lupong Tagapamayapa members in Barangay Sta. Rita Karsada, Barangay Bolbok, and Barangay Alangilan are aware of, and are applying, the objectives and procedures of mediation. Also, the success or failure of the mediation processes applied by the Lupong Tagapamayapa members of the subject barangays to community-based disputes brought before them is generally attributed to the attitude and perspective of the parties in dispute towards the entire process of mediation, and not to the capacity or capability of the Lupon members to bring them to amicable settlements.
In view of the above, the researcher humbly recommends the following: (1) that the composition of the Lupong Tagapamayapa should include individuals from various sectors of the barangay; (2) that the Lupong Tagapamayapa members should undergo various trainings that may enhance their capability to mediate any type of community-based disputes at the expense of the barangay fund or budget; (3) that the Punong Barangay and the Sangguniang Pambarangay, in their own discretion, should allocate budget that will consistently provide regular honoraria for the Lupong Tagapamayapa members; (4) that the Punong Barangay and the Sangguniang Pambarangay should provide an ideal venue for the hearing of community-based disputes; (5) that the City/ Municipal Governments should allocate necessary financial assistance to the barangays under their jurisdiction in honing eligible Lupong Tagapamayapa members; and (6) that the Punong Barangay and other officials should initiate series of information campaigns for their constituents to be informed on the objectives, advantages, and procedures of mediation.

Keywords: amicable settlement, community-based disputes, dispute resolution, mediation

Procedia PDF Downloads 361
485 The Human Rights Code: Fundamental Rights as the Basis of Human-Robot Coexistence

Authors: Gergely G. Karacsony

Abstract:

Fundamental rights are the result of thousands of years of progress in legislation, adjudication and legal practice. They serve as the framework for the peaceful cohabitation of people, protecting the individual from abuse by the government or violation by other people. Artificial intelligence, however, is a development of the very recent past and one of the most important prospects for the future. Artificial intelligence is now capable of communicating and performing actions the same way as humans; such acts are sometimes impossible to tell apart from actions performed by flesh-and-blood people. In a world where human-robot interactions are more and more common, a new framework for peaceful cohabitation must be found. Artificial intelligence, able to take part in almost any kind of interaction where personal presence is not necessary without being recognized as a non-human actor, is now able to break the law, violate people’s rights, and disturb social peace in many other ways. Therefore, a code of peaceful coexistence must be found or created. We should consider whether human rights can serve as the code of ethical and rightful conduct in the new era of artificial intelligence and human coexistence. In this paper, we examine the applicability of fundamental rights to human-robot interactions as well as to actions of artificial intelligence performed without any human interaction whatsoever. Robot ethics had been a topic of discussion and debate in philosophy, ethics, computing, legal science and science-fiction writing long before the first functional artificial intelligence was introduced. Legal science and legislation have approached artificial intelligence from different angles, regulating different areas (e.g. data protection, telecommunications, copyright issues), but they are only chipping away at the mountain of legal issues concerning robotics.
For a widely acceptable and permanent solution, a more general set of rules would be preferable to the detailed regulation of specific issues. We argue that human rights as recognized worldwide can be adapted to serve as a guideline and a common basis for the coexistence of robots and humans. This solution has many virtues: people do not need to adjust to a completely unknown set of standards, the system has proved itself able to withstand the trials of time, legislation is easier, and the actions of non-human entities are more easily adjudicated within their own framework. In this paper we examine the system of fundamental rights (as defined in the most widely accepted sources, the 1966 UN International Covenants on Human Rights) and try to adapt each individual right to the actions of artificial intelligence actors; in each case we examine the possible effects of such an approach on the legal system and on society, and finally we also examine its effect on the IT industry.

Keywords: human rights, robot ethics, artificial intelligence and law, human-robot interaction

Procedia PDF Downloads 226
484 Regeneration of Geological Models Using Support Vector Machine Assisted by Principal Component Analysis

Authors: H. Jung, N. Kim, B. Kang, J. Choe

Abstract:

History matching is a crucial procedure for predicting reservoir performance and making future decisions. However, it is difficult due to the uncertainties of initial reservoir models. It is therefore important to have reliable initial models for successful history matching of highly heterogeneous reservoirs such as channel reservoirs. In this paper, we propose a novel scheme for regenerating geological models using a support vector machine (SVM) and principal component analysis (PCA). First, we perform PCA to identify the main geological characteristics of the models. Through this procedure, the permeability values of each model are transformed into new parameters by the principal components with eigenvalues of large magnitude. Second, the parameters are projected onto a two-dimensional plane by multi-dimensional scaling (MDS) based on Euclidean distances. Finally, we train an SVM classifier using the 20% of models that show the most similar or dissimilar well oil production rates (WOPR) relative to the true values (10% for each). The other 80% of models are then classified by the trained SVM, and we select the models on the side of low WOPR errors. One hundred channel reservoir models are initially generated by single normal equation simulation. By repeating the classification process, we can select models that share the geological trend of the true reservoir model. The average field of the selected models is utilized as a probability map for regeneration. Newly generated models preserve correct channel features and exclude wrong geological properties while maintaining suitable uncertainty ranges. History matching with the initial models cannot provide trustworthy results; it fails to find the correct geological features of the true model. History matching with the regenerated ensemble, however, offers reliable characterization results by capturing the proper channel trend. Furthermore, it gives dependable predictions of future performance with reduced uncertainties.
We propose a novel classification scheme that integrates PCA, MDS and SVM for regenerating reservoir models. The scheme can easily sort out reliable models that share the channel trend of the reference model in the lower-dimensional space.
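A dependency-free sketch of the dimension-reduction and classification idea follows; the synthetic "models" are illustrative, PCA is done by SVD, and, as a simplifying assumption, the SVM is replaced by a nearest-centroid rule trained on the 20% labelled subset:

```python
import numpy as np

rng = np.random.default_rng(4)
# 100 synthetic "reservoir models": flattened permeability grids with two
# distinct channel trends (the first 50 resemble the truth, the last 50 do not)
n_cells = 100
similar = rng.normal(2.0, 1.0, (50, n_cells))
dissimilar = rng.normal(-2.0, 1.0, (50, n_cells))
models = np.vstack([similar, dissimilar])

# PCA via SVD on mean-centred data; keep the two leading components
Xc = models - models.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:2].T  # 2-D plane analogous to the MDS projection

# Nearest-centroid stand-in for the SVM, trained on 10 + 10 labelled models
c_sim = scores[:10].mean(axis=0)
c_dis = scores[50:60].mean(axis=0)
pred_similar = (np.linalg.norm(scores - c_sim, axis=1)
                < np.linalg.norm(scores - c_dis, axis=1))
accuracy = (pred_similar[:50].mean() + (~pred_similar[50:]).mean()) / 2
```

In the actual scheme a kernel SVM draws the decision boundary in the projected plane, and the models classified to the low-error side are averaged into the probability map used for regeneration.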

Keywords: history matching, principal component analysis, reservoir modelling, support vector machine

Procedia PDF Downloads 142
483 Economic Factors Affecting Greenfield Petroleum Refinery and Petrochemical Projects in Africa

Authors: Daniel Muwooya

Abstract:

This paper analyses the economic factors that have affected the competitiveness of petroleum refinery and petrochemical projects in sub-Saharan Africa in the past and continue to plague greenfield projects today. Traditional factors such as plant sizing and complexity, low capacity utilization, a changing regulatory environment, and tighter product specifications have been important in the past. Additional factors include the development of excess refinery capacity in Asia and the growth of renewable sources of energy, especially for transportation. These factors create both challenges and opportunities for the development of greenfield refineries and petrochemical projects in areas of increased demand growth and new low-cost crude oil production, like sub-Saharan Africa. This paper evaluates the strategies available to project developers and host countries for addressing contemporary issues of the energy transition and the apparent reduction of funds available for greenfield oil and gas projects. The paper also evaluates the structuring of greenfield refinery and petrochemical projects for limited-recourse project finance bankability. The methodology of this paper includes analysis of current industry data, conference proceedings, academic papers, and academic books on the subjects of petroleum refinery economics, refinery financing, refinery operations, and project finance generally and specifically in the oil and gas industry; evaluation of expert opinions from journal articles and working papers from international bodies such as the World Bank and the International Energy Agency; and experience from playing an active role in the development and financing of a US$10 billion greenfield oil development project in Uganda. The paper also applies discounted cash flow modelling to illustrate the circumstances of an inland greenfield refinery project in Uganda.
Greenfield refinery and petrochemical projects are still necessary in sub-Saharan Africa to, among other aspirations, support the transition from traditional sources of energy like biomass to such modern forms as liquefied petroleum gas. Project developers and host governments will be required to structure projects that support global climate change goals without occasioning undue delays to project execution.
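The discounted cash flow illustration can be sketched in a few lines; the capital cost, cash flows, project life and discount rate below are placeholders, not figures from the Ugandan project:

```python
import numpy as np

capex = 4.0e9                    # US$ up-front capital cost (illustrative)
life = 20                        # years of operation
net_cash = np.full(life, 6.0e8)  # US$/yr net operating cash flow (illustrative)
r = 0.10                         # discount rate

years = np.arange(1, life + 1)
discounted = net_cash / (1.0 + r) ** years
npv = -capex + discounted.sum()  # net present value of the project

# Simple (undiscounted) payback and discounted profitability index
payback_years = capex / net_cash[0]
pi = discounted.sum() / capex    # > 1 means value creation at rate r
```

A real refinery model would replace the flat cash-flow vector with year-by-year crack spreads, utilization ramp-up, and the debt service implied by the limited-recourse financing structure.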

Keywords: financing, refinery and petrochemical economics, Africa, project finance

Procedia PDF Downloads 43
482 Green Ports: Innovation Adopters or Innovation Developers

Authors: Marco Ferretti, Marcello Risitano, Maria Cristina Pietronudo, Lina Ozturk

Abstract:

A green port is the result of a sustainable long-term strategy adopted by an entire port infrastructure, and therefore by the set of actors involved in port activities. The strategy aims at the development of sustainable port infrastructure focused on reducing negative environmental impacts without jeopardising economic growth. Green technologies are the core tools for implementing sustainable solutions; however, they are not a magic bullet. Ports have always been integrated into the local territory, affecting the environment in which they operate; therefore, a sustainable strategy should fit the entire local system. Adopting a sustainable strategy thus means knowing how to involve and engage a wide stakeholder network (industries, production, markets, citizens, and public authorities). Existing research on the topic has not integrated this perspective well with that of sustainability. Research on green ports has mixed sustainability aspects with those of the maritime industry, neglecting the dynamics that lead to the development of the green-port phenomenon. We propose an analysis of green ports through the lens of ecosystem studies in the field of management. The ecosystem approach provides a way to model the relations that enable green solutions and green practices in a port ecosystem. However, due to the local dimension of a port and the port trend towards innovation, i.e., sustainable innovation, we draw on a specific ecosystem concept: the local innovation system. More precisely, we explore whether a green port is a local innovation system engaged in developing sustainable innovation with a large impact on the territory, or merely an innovation adopter. To address this issue, we adopt a comparative case study, selecting two innovative ports in Europe: Rotterdam and Genoa. The case study is a research method focused on understanding the dynamics of a specific situation and can be used to provide a description of real circumstances.
Preliminary results show two different approaches to supporting sustainable innovation: one represented by Rotterdam, a pioneer in competitiveness and sustainability, and the other represented by Genoa, an example of a technology adopter. The paper intends to provide a better understanding of how sustainable innovations are developed and of the manner in which a network of port and local stakeholders supports this process. Furthermore, it proposes a taxonomy of green ports as developers and adopters of sustainable innovation, also suggesting best practices for modelling the relationships that enable the port ecosystem to apply a sustainable strategy.

Keywords: green port, innovation, sustainability, local innovation systems

Procedia PDF Downloads 101
481 Geometric, Energetic and Topological Analysis of (Ethanol)₉-Water Heterodecamers

Authors: Jennifer Cuellar, Angie L. Parada, Kevin N. S. Chacon, Sol M. Mejia

Abstract:

The purification of bio-ethanol through distillation is an unresolved issue in the biofuel industry because of the formation of the ethanol-water azeotrope, which increases the number of steps in the purification process and consequently the production costs. Understanding the nature of the mixture at the molecular level could therefore provide new insights for improving current methods and/or designing new, more efficient purification methods. For that reason, the present study focuses on the evaluation and analysis of (ethanol)₉-water heterodecamers, the systems with the minimum molecular proportion that represents the azeotropic concentration (96 %m/m in ethanol). The computational modelling was carried out at the B3LYP-D3/6-311++G(d,p) level in Gaussian 09. Initial explorations of the potential energy surface were done through two methods, simulated annealing runs and molecular dynamics trajectories, besides intuitive structures obtained from smaller (ethanol)n-water heteroclusters, n = 7, 8 and 9. The energetic ordering of the seven stable heterodecamers identifies the most stable heterodecamer (Hdec-1) as a structure forming a bicyclic geometry through O-H---O hydrogen bonds (HBs), in which the water acts as a double proton donor. Hdec-1 combines 1 water molecule with the same quantity of every ethanol conformer, that is, 3 trans, 3 gauche 1 and 3 gauche 2; its abundance is 89%, and its decamerization energy is -80.4 kcal/mol, i.e., 13 kcal/mol more stable than the least stable heterodecamer. In addition, as a way to understand why methanol does not form an azeotropic mixture with water, analogous systems ((ethanol)₁₀, (methanol)₁₀, and (methanol)₉-water) were optimized. Topological analysis of the electron density reveals that Hdec-1 forms 33 weak interactions in total: 11 O-H---O, 8 C-H---O and 2 C-H---C hydrogen bonds, and 12 H---H interactions.
The strength and abundance of the most unconventional interactions (H---H, C-H---O and C-H---C) seem to explain the preference of ethanol for forming heteroclusters instead of clusters. Moreover, the O-H---O HBs present a significant covalent character according to topological parameters such as the Laplacian of the electron density and the relationship between the potential and kinetic energy densities evaluated at the bond critical points, which take negative values and values between 1 and 2, respectively.
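The covalency criteria invoked above (a negative Laplacian of the electron density, and a ratio of potential to kinetic energy density between 1 and 2 at the bond critical point) can be expressed as a simple classification rule. The Python sketch below is illustrative only, not the study's code, and the bond-critical-point values in it are invented for demonstration:

```python
# Illustrative QTAIM-style classification of bond critical points (BCPs).
# The thresholds follow the criteria cited in the abstract: a negative
# Laplacian of the electron density and a |V|/G ratio between 1 and 2
# indicate partial covalent character. All numeric data are hypothetical.

def classify_bcp(laplacian, V, G):
    """Classify a BCP from the Laplacian of the electron density and the
    potential (V) and kinetic (G) energy densities evaluated there."""
    ratio = abs(V) / G
    if laplacian < 0 and ratio > 2:
        return "covalent"
    if ratio > 1:
        return "partially covalent"   # e.g. strong O-H---O hydrogen bonds
    return "closed-shell"             # weak H---H or C-H---O contacts

# Hypothetical BCP data: (label, Laplacian, V, G) in atomic units
bcps = [
    ("O-H---O", -0.02, -0.060, 0.040),   # |V|/G = 1.5
    ("C-H---O",  0.03, -0.010, 0.012),   # |V|/G < 1
]
for label, lap, V, G in bcps:
    print(label, classify_bcp(lap, V, G))
```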

Keywords: ADMP, DFT, ethanol-water azeotrope, Grimme dispersion correction, simulated annealing, weak interactions

Procedia PDF Downloads 91
480 A Professional Learning Model for Schools Based on School-University Research Partnering That Is Underpinned and Structured by a Micro-Credentialing Regime

Authors: David Lynch, Jake Madden

Abstract:

There exists a body of literature reporting on the many benefits of partnerships between universities and schools, especially in terms of teaching improvement and school reform. Such partnerships can build significant teaching capital by deepening and expanding the skillsets and mindsets needed to create the connections that support ongoing, embedded teacher professional development and career goals. At the same time, this literature is critical of such initiatives when the partnership outcomes are short-term or one-sided, misaligned with fundamental problems, or not expressly focused on building the desired teaching capabilities. In response to this situation, research conducted by Professor David Lynch and his TeachLab research team has begun to shed light on the strengths and limitations of school/university partnerships via the identification of key conceptual elements that appear to act as critical partnership success factors. These elements are theorised as an interplay between professional knowledge acquisition, readiness, talent management and organisational structure. However, knowledge of how these elements are established, and how they manifest within the school and its teaching workforce as an overall system, remains incomplete. Research designed to delineate these elements more clearly in relation to their impact on school/university partnerships is thus required. It is within this context that this paper reports on the development and testing of a Professional Learning (PL) model for schools and their teachers that incorporates school-university research partnering within a systematic, whole-of-school PL strategy underpinned and structured by a micro-credentialing (MC) regime.
MC involves earning a narrowly focused certificate (a micro-credential) in a specific topic area (e.g., 'How to Differentiate Instruction for English as a Second Language Students'), embedded in the teacher’s day-to-day teaching work. The use of MC is viewed as important to the efficacy and sustainability of teacher PL because it (1) provides an evidence-based framework for teacher learning, (2) has the ability to promote teacher social capital, and (3) engenders lifelong learning, keeping professional skills current in a manner that is embedded in, and seamless with, daily work. The associated research is centred on a primary school in Australia (P-6) that acted as an arena to co-develop, test/investigate and report on outcomes for teacher PL that uses MC to support a whole-of-school partnership with a university.

Keywords: teaching improvement, teacher professional learning, talent management, education partnerships, school-university research

Procedia PDF Downloads 68
479 D-Lysine Assisted 1-Ethyl-3-(3-Dimethylaminopropyl)Carbodiimide / N-Hydroxy Succinimide Initiated Crosslinked Collagen Scaffold with Controlled Structural and Surface Properties

Authors: G. Krishnamoorthy, S. Anandhakumar

Abstract:

The effect of D-Lysine (D-Lys) on collagen cross-linked with 1-ethyl-3-(3-dimethylaminopropyl)carbodiimide (EDC)/N-hydroxysuccinimide (NHS) is evaluated using experimental and modelling tools. The results for the Coll-D-Lys-EDC/NHS scaffold indicate an increase in tensile strength (TS), percentage of elongation (%E) and denaturation temperature (Td), and a decrease in the decomposition rate, compared to the L-Lys-EDC/NHS scaffold. Scanning electron microscopy (SEM) and atomic force microscopy (AFM) analyses revealed a well-ordered scaffold with a properly oriented and well-aligned structure. D-Lys stabilizes the scaffold against degradation by collagenase better than L-Lys. The cell assay showed more than 98% fibroblast (NIH3T3) viability and improved cell adhesion and protein adsorption after 72h of culture compared with the native scaffold. Cell attachment after 74h was robust, with cytoskeletal analysis showing that the attached cells were aligned along the fibers, assuming a spindle-shaped appearance; gene expression analyses revealed no apparent alterations in mRNA levels, and cell proliferation was not adversely affected. D-Lys plays a pivotal role in the self-assembly and conformation of collagen fibrils. D-Lys-assisted EDC/NHS-initiated cross-linking induces the formation of a carboxamide by activation of the side-chain -COOH group, followed by aminolysis of the O-isoacylurea intermediates by the -NH2 groups, which are directly joined via isopeptide bonds. This leads to the formation of intra- and inter-helical cross-links. Modeling studies indicated that D-Lys binds to a collagen-like peptide (CLP) through multiple H-bonding and hydrophobic interactions. Orientational changes in collagenase on CLP-D-Lys are observed, which may decrease its accessibility to degradation and stabilize CLP against the action of the former.
D-Lys has the lowest binding energy and improves fibrillar assembly and staggered alignment without undesired structural stiffness or aggregation. The proteolytic machinery is less well equipped to deal with the Coll-D-Lys scaffold than with the Coll-L-Lys scaffold. The information derived from the present study could help in designing collagenolytically stable heterochiral collagen-based scaffolds for biomedical applications.

Keywords: collagen, collagenase, collagen like peptide, D-lysine, heterochiral collagen scaffold

Procedia PDF Downloads 372
478 Attitudes Towards the Supernatural in Benjamin Britten’s The Turn of the Screw

Authors: Yaou Zhang

Abstract:

Background: Relatively little scholarly attention has been paid to productions of Benjamin Britten’s chamber opera The Turn of the Screw, one of Britten’s most remarkable operas. The libretto is based on Henry James’s novella of the same name, written in 1898, and one of the primary questions the story poses is how real the ghosts are, which leaves the story deeply ambiguous in readers’ minds. Aims: This research focuses on the experience of seeing the opera on stage over several decades. This study of opera productions over time not only provides insight into how stage performances can alter audience members' perceptions of the opera in the present, but also reveals a landscape of shifting aesthetics and receptions. Methods: To examine the hypotheses about interpretation and reception, qualitative analysis is used to examine the figures of the ghosts in different productions in the UK from 1954 to 2021, by accessing recordings, newspapers, and reviews of the productions sourced from online and physical archives. For instance, field research was conducted by arranging interviews with creative teams and by visiting Opera North in Leeds and the Britten-Pears Foundation. The collected data reveal the “hidden identity” in creative teams’ interpretations, social preferences, and rediscoveries that have previously remained unseen. Results: This research presents an angle on Britten’s Screw from a third position; it shows how attention has moved from the question of “do the ghosts really exist” to the traumatised children. Discussion: Critics and audiences have debated for decades whether the governess hallucinates the ghosts in the opera.
In recent years, however, directors of new productions have allowed themselves to go deeper into Britten's musical structure and to give the opera more space for interpretation, rather than debating whether "the ghosts actually exist" or "the governess has psychological problems." One can consider that the questionable actions of the children arise because they are suffering from trauma, whether that trauma comes from the ghosts, the hallucinating governess, or some prior experience: the various interpretations lead to one result, that the children are the recipients of trauma. Arguably, the role of the supernatural is neither simply an element of a ghost story nor simply part of the ambiguity between the supernatural and the governess's hallucination; rather, the ghosts and the hallucinating governess can exist at the same time. The combination of the supernatural's and the governess's behaviours on stage generates a sharper and more serious angle that draws our attention to the traumatised children.

Keywords: benjamin britten, chamber opera, production, reception, staging, the turn of the screw

Procedia PDF Downloads 93
477 Measuring the Resilience of e-Governments Using an Ontology

Authors: Onyekachi Onwudike, Russell Lock, Iain Phillips

Abstract:

The variability that exists across governments, their departments and the provisioning of services has been an area of concern in the E-Government domain. There is a need for reuse and integration across government departments, which is accompanied by varying degrees of risk and threat. There is also a need for assessment, prevention, preparation, response and recovery when dealing with these risks or threats. The ability of a government to cope with the emerging changes that occur within it is known as resilience. In order to forge ahead with concerted efforts to manage the risks or threats induced by reuse and integration, the ambiguities contained within resilience must be addressed. Enhancing resilience in the E-Government domain is synonymous with reducing the risks governments face in provisioning services and reusing components across departments. It can therefore be said that resilience is responsible for reducing a government’s vulnerability to change. In this paper, we present the use of an ontology to measure the resilience of governments. This ontology is built on a well-defined construct for the taxonomy of resilience. A specific class known as ‘Resilience Requirements’ is added to the ontology; this class incorporates the concept of resilience into the E-Government domain ontology. Considering that the E-Government domain is a highly complex one, made up of different departments offering different services, the reliability and resilience of the domain have become more complex and critical to understand. We present questions that can help a government assess how prepared it is in the face of risks and what steps can be taken to recover from them. These questions can be asked with the use of queries. The ontology includes a case study section that is used to explore ways in which government departments can become resilient to the different kinds of risks and threats they may face.
A collection of resilience tools and resources has been developed in our ontology to encourage governments to take steps to prepare for emergencies and for the risks a government may face with the integration of departments and the reuse of components across government departments. To achieve this, the ontology has been extended with rules. We present two tools for understanding resilience in the E-Government domain as a risk analysis target, together with the output of these tools when applied to resilience in the E-Government domain. We introduce the classification of resilience using the defined taxonomy and the modelling of existing relationships based on that taxonomy. The ontology is constructed on formal theory, and it provides a semantic reference framework for the concept of resilience. Key terms that fall under the purview of resilience with respect to E-Governments are defined; terms are made explicit, as are the relationships that exist between risks and resilience. The overall aim of the ontology is for it to be used within standards that would be followed by all governments for government-based resilience measures.
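As an illustration of the kind of preparedness query described above, the following Python sketch stands in for the ontology with a plain dictionary; all department, risk and requirement names are invented for demonstration, and a real implementation would use a formal ontology with rules and a query language such as SPARQL:

```python
# Hypothetical sketch of a resilience-preparedness query over a toy
# knowledge base. A plain dictionary stands in for the E-Government
# ontology; every name below is invented for illustration only.

knowledge_base = {
    "TaxDept": {
        "reuses": ["IdentityService"],
        "risks": ["single-point-of-failure"],
        "resilience_requirements": ["failover-plan"],
    },
    "HealthDept": {
        "reuses": ["IdentityService", "PaymentGateway"],
        "risks": ["single-point-of-failure", "data-breach"],
        "resilience_requirements": [],
    },
}

def unmitigated_risks(kb):
    """Return departments whose identified risks outnumber their stated
    resilience requirements -- a crude preparedness assessment query."""
    return [dept for dept, facts in kb.items()
            if len(facts["risks"]) > len(facts["resilience_requirements"])]

print(unmitigated_risks(knowledge_base))
```

The same question ("how prepared is this department?") would be phrased in the real ontology as a query over the 'Resilience Requirements' class and its relationships to risk concepts.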

Keywords: e-government, ontology, relationships, resilience, risks, threats

Procedia PDF Downloads 321
476 Model Order Reduction of Complex Airframes Using Component Mode Synthesis for Dynamic Aeroelasticity Load Analysis

Authors: Paul V. Thomas, Mostafa S. A. Elsayed, Denis Walch

Abstract:

Airframe structural optimization at different design stages results in new mass and stiffness distributions, which modify the critical design loads envelope. Determination of aircraft critical loads is an extensive analysis procedure that involves simulating the aircraft at thousands of load cases as defined in the certification requirements. It is computationally prohibitive to use a Global Finite Element Model (GFEM) for the load analysis, hence reduced order structural models are required which closely represent the dynamic characteristics of the GFEM. This paper presents the implementation of the Component Mode Synthesis (CMS) method for the generation of high-fidelity Reduced Order Models (ROMs) of complex airframes. Here, a sub-structuring technique is used to divide the complex higher-order airframe dynamical system into a set of subsystems. Each subsystem is reduced to fewer degrees of freedom using matrix projection onto a carefully chosen reduced-order basis subspace. The reduced structural matrices of all the subsystems are assembled through interface coupling, and the dynamic response of the total system is solved. The CMS method is employed to develop the ROM of a Bombardier Aerospace business jet, which is coupled with an aerodynamic model for dynamic aeroelasticity loads analysis under gust turbulence. Another set of dynamic aeroelastic loads is also generated employing a stick model of the same aircraft. The stick model is the reduced-order modelling methodology commonly used in the aerospace industry, based on stiffness generation by unitary load application. The extracted aeroelastic loads from both models are compared against those generated using the GFEM. Critical loads, modal participation factors and modal characteristics of the different ROMs are investigated and compared against those of the GFEM. The results obtained show that the ROM generated using the Craig-Bampton CMS reduction process has superior dynamic characteristics compared to the stick model.
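The Craig-Bampton reduction described above (static constraint modes for the interface, plus a truncated set of fixed-interface normal modes for the interior) can be sketched on a toy system. The following Python example applies it to a 4-DOF spring-mass chain with assumed unit masses; it is an illustrative sketch, not the aircraft model from the study:

```python
import numpy as np

# Toy Craig-Bampton reduction on a 4-DOF spring-mass chain.
# DOFs are partitioned into boundary (interface) and interior sets.

n, b_idx, i_idx = 4, [0], [1, 2, 3]   # 1 boundary DOF, 3 interior DOFs
k = 1000.0                            # spring stiffness [N/m], assumed
K = k * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))
M = np.eye(n)                         # unit masses for simplicity

Kib, Kii = K[np.ix_(i_idx, b_idx)], K[np.ix_(i_idx, i_idx)]

# Static constraint modes: interior response to a unit boundary displacement
Phi_c = -np.linalg.solve(Kii, Kib)

# Fixed-interface normal modes (Mii is the identity here, so the
# eigenproblem is standard); keeping only the first mode is the reduction.
w2, Phi = np.linalg.eigh(Kii)
Phi_n = Phi[:, :1]

# Craig-Bampton transformation matrix and reduced matrices
T = np.block([[np.eye(len(b_idx)), np.zeros((len(b_idx), 1))],
              [Phi_c,              Phi_n]])
K_red, M_red = T.T @ K @ T, T.T @ M @ T
print(K_red.shape)   # 4 physical DOFs reduced to 2 generalized DOFs
```

The reduced matrices retain the boundary DOF physically (so subsystems can be assembled through interface coupling, as the abstract describes) while condensing the interior dynamics into a few generalized modal coordinates.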

Keywords: component mode synthesis, craig-bampton reduction method, dynamic aeroelasticity analysis, model order reduction

Procedia PDF Downloads 192