Search results for: hand length

4935 Comparison of Steel and Composite Analysis of a Multi-Storey Building

Authors: Çiğdem Avcı Karataş

Abstract:

Mitigation of structural damage caused by earthquakes and reduction of fatalities are among the main concerns of engineers in seismically prone zones of the world. To achieve this aim, many technologies have been developed in the last decades and applied in the construction and retrofit of structures. On the one hand, Turkey is well known as a country with a high level of seismicity; on the other hand, steel-composite structures appear competitive today in this country by comparison with other types of structures, for example, steel-only or concrete structures. Composite construction is the dominant form of construction for the multi-storey building sector. The reason why composite construction is often so good can be expressed in one simple way: concrete is good in compression and steel is good in tension. By joining the two materials together structurally, these strengths can be exploited to result in a highly efficient design. The reduced self-weight of composite elements has a knock-on effect by reducing the forces in the elements supporting them, including the foundations. The floor depth reductions that can be achieved using composite construction can also provide significant benefits in terms of the costs of services and the building envelope. The scope of this paper covers the analysis, materials take-off, cost analysis, and economic comparison of a multi-storey building with composite and steel frames. The aim of this work is to show that designing load-carrying systems as composite is more economical than designing them as steel. The design of the nine-storey building under consideration is carried out according to the 2007 Turkish Earthquake Code, using static and dynamic analysis methods. For the analyses of the steel and composite systems, plastic analysis methods have been used; the steel system has been checked in compliance with EC3 and the composite system in compliance with EC4. At the end of the comparisons, it is revealed that the composite load-carrying system is more economical than the steel load-carrying system, considering the materials to be used in the load-carrying system and the workmanship to be spent on this job.

Keywords: composite analysis, earthquake, steel, multi-storey building

Procedia PDF Downloads 564
4934 A Game-Based Methodology to Discriminate Executive Function – a Pilot Study With Institutionalized Elderly People

Authors: Marlene Rosa, Susana Lopes

Abstract:

There are few studies that explore the potential of board games as a performance measure, although they can be an interesting strategy in the context of frail populations. In fact, board games are immersive strategies that can reduce the pressure of being evaluated. This study aimed to test the ability of game-based strategies to assess executive function in the elderly population. Sixteen older participants were included: 10 with affected executive functions (G1 – 85.30±6.00 yrs old; 10 male) and 6 with executive functions without clinically important modifications (G2 – 76.30±5.19 yrs old; 6 male). Executive function was assessed using the Frontal Assessment Battery (FAB), which is a quickly applicable cognitive screening test (score<12 means impairment). The board game used in this study was the TATI Hand Game, designed specifically for training rhythmic coordination of the upper limbs with multiple cognitive stimuli. This game features 1 table grid, 1 set of Single Game cards (to play with one hand), Double Game cards (to play simultaneously with two hands), 1 die to plan the Single Game mode, cards to plan the Double Game mode, 1 bell, and 2 cups. Each participant played 3 Single Game cards, and the following data were collected: (i) variability in time during board game challenges (SD); (ii) number of errors; (iii) execution time (sec). G1 demonstrated higher variability in execution time during board game challenges (G1 – 13.0s vs G2 – 0.5s), a higher number of errors (1.40 vs 0.67), and longer execution times (607.80s vs 281.83s). These results demonstrate the potential of implementing board games as a functional assessment strategy in geriatric care. Future studies might include larger samples and statistical methodologies to find cut-off values for impairment in executive functions during performance in the TATI game.
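As a rough illustration only of the three performance measures listed above (time variability, error count, total execution time), here is a minimal Python sketch with hypothetical per-card values rather than the study's data:

```python
# Sketch of the three game-derived measures per participant:
# (i) variability of time across the three Single Game cards (SD),
# (ii) total number of errors, (iii) total execution time in seconds.
# The card_times/card_errors below are hypothetical, not the study data.
from statistics import stdev

def game_metrics(card_times, card_errors):
    return {
        "time_sd": stdev(card_times),   # variability across cards (s)
        "errors": sum(card_errors),     # total errors
        "total_time": sum(card_times),  # execution time (s)
    }

print(game_metrics(card_times=[210.0, 195.5, 202.3], card_errors=[1, 0, 0]))
```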

Keywords: board game, aging, executive function, evaluation

Procedia PDF Downloads 137
4933 Performance Analysis of Next Generation OCDM-RoF-Based Hybrid Network under Diverse Conditions

Authors: Anurag Sharma, Rahul Malhotra, Love Kumar, Harjit Pal Singh

Abstract:

This paper demonstrates an OCDM-RoF-based hybrid architecture where data/voice communication is enabled via a combination of Optical Code Division Multiplexing (OCDM) and Radio-over-Fiber (RoF) techniques under various diverse conditions. An OCDM-RoF hybrid network of 16 users with DPSK modulation format has been designed, and the performance of the proposed network is analyzed for 100, 150, and 200 km fiber span lengths under the influence of linear and nonlinear effects. It has been reported that Polarization Mode Dispersion (PMD) has the least effect, while the other nonlinearities affect the performance of the proposed network.
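For context on the figures of merit named in the keywords (BER, Q factor), a minimal sketch of the standard Gaussian-noise relation between them; the Q values used are illustrative, not results from the paper:

```python
# Minimal sketch: relation between Q factor and BER under the usual
# Gaussian-noise approximation, BER = 0.5 * erfc(Q / sqrt(2)).
# Illustrative only -- the Q values below are not taken from the paper.
import math

def ber_from_q(q: float) -> float:
    """Bit error rate for a given Q factor (Gaussian approximation)."""
    return 0.5 * math.erfc(q / math.sqrt(2))

for q in (3, 6, 7):  # hypothetical Q factors
    print(f"Q = {q}: BER ~ {ber_from_q(q):.2e}")
```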

Keywords: OCDM, RoF, DPSK, PMD, eye diagram, BER, Q factor

Procedia PDF Downloads 627
4932 Methodology of Automation and Supervisory Control and Data Acquisition for Restructuring Industrial Systems

Authors: Lakhoua Najeh

Abstract:

Introduction: In most situations, an existing industrial system, conditioned by its history, its culture, and its context, is in difficulty when facing the necessity of restructuring itself in an organizational and technological environment in perpetual evolution. This is why all restructuring operations first of all require a diagnosis based on a functional analysis. After a presentation of the functionality of a supervisory system for complex processes, we present the concepts of industrial automation and supervisory control and data acquisition (SCADA). Methods: This global analysis exploits the various available documents on the one hand and, on the other hand, takes into consideration the various testimonies gathered through investigations, interviews, or collective workshops; it also takes as a basis observations made through visits and even specific operations. The exploitation of this diagnosis then enables us to elaborate the restructuring project. Starting from the system analysis for the restructuring of industrial systems, and after a technical diagnosis based on visits, an analysis of the various technical and management documents, as well as targeted interviews, a focus detailing the various levels of analysis has been carried out according to a general methodology. Results: Through the methodology adopted to contribute to the restructuring of industrial systems, participative and systemic in character and leaning on a large consultation of both human and documentary resources, various innovative actions have been proposed. These actions fall within the setting of a TQM approach requiring the quantification of applicable parameters and a treatment valorising certain information. The new management environment will enable us to institute an information and communication system with the possibility of migration toward an ERP system. Conclusion: Technological advancements in process monitoring, control, and industrial automation over the past decades have contributed greatly to improving the productivity of virtually all industrial systems throughout the world. This paper tries to identify the principal characteristics of process monitoring, control, and industrial automation in order to provide tools to help in the decision-making process.

Keywords: automation, supervision, SCADA, TQM

Procedia PDF Downloads 168
4931 The Effect of Core Training on Physical Fitness Characteristics in Male Volleyball Players

Authors: Sibel Karacaoglu, Fatma Ç. Kayapinar

Abstract:

The aim of the study is to investigate the effect of a core training program on physical fitness characteristics and body composition in male volleyball players. 26 male university volleyball team players aged between 19 and 24 years, who had no health problems or injuries, participated in the study. Subjects were randomly divided into training (TG) and control (CG) groups. Data from twenty-one players who completed all training sessions were used for statistical analysis (TG, n=11; CG, n=10). A core training program was applied to the training group three days a week for 10 weeks, whereas the control group did not receive any training. Before and after the 10-week training program, pre- and post-tests were performed, comprising body composition measurements (weight, BMI, bioelectrical impedance analysis) and physical fitness measurements, including flexibility (sit and reach test), muscle strength (back, leg, and grip strength by dynamometer), muscle endurance (sit-ups and push-ups tests), power (one-legged jump and vertical jump tests), speed (20 m and 30 m sprints), and balance (one-legged standing test). Changes between the pre- and post-test values of the groups were determined by using the dependent t-test. According to the statistical analysis of the data, no significant difference was found in terms of body composition in either group between pre- and post-test values. In the training group, all physical fitness measurements improved significantly after the core training program (p<0.05) except the 30 m speed and handgrip strength (p>0.05). On the other hand, in the control group only the 20 m speed test values improved (p<0.05), while the other physical fitness test values did not differ (p>0.05) between pre- and post-test measurements. The results of the study suggest that the core training program has a positive effect on physical fitness characteristics in male volleyball players.
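A minimal sketch of the dependent (paired) t-test used to compare pre- and post-test values; the flexibility scores below are invented for illustration, not the study's measurements:

```python
# Paired (dependent) t-test on pre- vs post-test values, as used in the study.
# The sit-and-reach scores below are hypothetical, for illustration only.
from scipy import stats

pre  = [24.1, 26.0, 22.5, 28.3, 25.2, 23.8, 27.0, 24.6, 26.4, 25.0, 23.1]
post = [26.0, 27.9, 24.0, 29.1, 27.5, 25.2, 28.8, 26.1, 27.0, 26.9, 24.5]

t_stat, p_value = stats.ttest_rel(pre, post)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # p < 0.05 -> significant change
```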

Keywords: body composition, core training, physical fitness, volleyball

Procedia PDF Downloads 342
4930 Numerical Static and Seismic Evaluation of Pile Group Settlement: A Case Study

Authors: Seyed Abolhassan Naeini, Hamed Yekehdehghan

Abstract:

Shallow foundations cannot be used when the bedding soil is soft. A suitable method for constructing foundations on soft soil is to employ pile groups to transfer the load to the bottom layers. The present research used results from tests carried out in northern Iran (Langarud) and the FLAC3D software to model a pile group for investigating the effects of various parameters on pile cap settlement under static and seismic conditions. According to the results, changes in the strength parameters of the soil, groundwater level, and the length of and distance between the piles affect settlement differently.

Keywords: FLAC3D software, pile group, settlement, soil

Procedia PDF Downloads 120
4929 Teaching Translation in Brazilian Universities: A Study about the Possible Impacts of Translators’ Comments on the Cyberspace about Translator Education

Authors: Erica Lima

Abstract:

The objective of this paper is to discuss relevant points about teaching translation in Brazilian universities and the possible impacts of blogs and social networks on translator education today. It is intended to analyze the curricula of Brazilian translation courses, contrasting them with information obtained from two social networking groups of great visibility in the area concerning the essential characteristics of a successful professional. Therefore, the research has, as its main corpus, a few undergraduate translation programs’ syllabuses, as well as a few postings on social network groups that specifically share professional opinions regarding the necessity for a translator to obtain a degree in translation to practice the profession. To a certain extent, such comments and their corresponding responses lead to the propagation of discourses which influence the ideas that aspiring translators and recent graduates end up having about themselves and their undergraduate courses. The postings also show that many professionals do not have a clear position regarding translator education; while refuting it, they also encourage “free” courses. It is thus observed that cyberspace constitutes, on the one hand, a place of mobilization of people in defense of similar ideas. On the other hand, however, it embodies a place of tension and conflict, in view of the fact that there are many participants and, as in any other situation of interlocution, disagreements may arise. From the postings, aspects related to professionalism were analyzed (including discussions about regulation), as well as questions about the classic dichotomies: theory/practice; art/technique; self-education/academic training. As a partial result, the common interest regarding the valorization of the profession can be mentioned, although there is no consensus on the essential characteristics of a good translator. It was also possible to observe that the set of socially constructed representations in the group reflects characteristics of the world situation of translation courses (especially in some European countries and in the United States), which, in the first instance, does not accurately reflect the Brazilian idiosyncrasies of the area.

Keywords: cyberspace, teaching translation, translator education, university

Procedia PDF Downloads 383
4928 Problem Gambling in the Conceptualization of Health Professionals: A Qualitative Analysis of the Discourses Produced by Psychologists, Psychiatrists and General Practitioners

Authors: T. Marinaci, C. Venuleo

Abstract:

Different conceptualizations of disease affect patient care, yet how health professionals conceptualize problem gambling has received little attention. This study aims to address this gap. It explores how health professionals conceptualize the gambling problem, addiction, and the goals of the recovery process. In-depth, semi-structured, open-ended interviews were conducted with Italian psychologists, psychiatrists, general practitioners, and support staff (N=114) working within health centres for the treatment of addiction (public health services or therapeutic communities) or medical offices. A Lexical Correspondence Analysis (LCA) was applied to the verbatim transcripts. LCA allowed the identification of two main factorial dimensions, which organize similarity and dissimilarity in the discourses of the interviewees. The first dimension, labelled 'Models of relationship with the problem', concerns two different models of relationship with the health problem: one related to the request for help and the process of taking charge, and the other related to the identification of the psychopathology underlying the disorder. The second dimension, labelled 'Organisers of the intervention', reflects the dialectic between two ways of addressing the problem. On the one hand, it is the gambling dynamics and their immediate life consequences that organize the intervention (whatever the request of the user is); on the other hand, it is the procedures and tools which characterize the health service that organize the way the professionals deal with the user’s problem (whatever it is, and despite the specificity of the user’s request). The results highlight how, despite the differences, the respondents share a central assumption: understanding the gambling problem implies a reference to the gambler’s identity, more than, for instance, to the relational, social, cultural, or political context where the gambler lives. A passive stance is attributed to the user, who does not play any role in the definition of the goal of the intervention. The results will be discussed to highlight the relationship between professional models and users’ ways of understanding and dealing with the problems related to gambling.

Keywords: cultural models, health professionals, intervention models, problem gambling

Procedia PDF Downloads 148
4927 Performance Analysis of BPJLT with Different Gate and Spacer Materials

Authors: Porag Jyoti Ligira, Gargi Khanna

Abstract:

The paper presents a simulation study of the electrical characteristics of the Bulk Planar Junctionless Transistor (BPJLT) using a spacer. The BPJLT is a transistor without any PN junctions in the vertical direction; it is a gate-controlled variable resistor. The characteristics of the BPJLT are analyzed by varying the oxide material under the gate. It can be shown from the simulation that an ideal subthreshold slope of ~60 mV/decade can be achieved by using a high-k dielectric. The effects of variation of spacer length and material on the electrical characteristics of the BPJLT are also investigated in the paper. An ION/IOFF ratio improvement of the order of 10⁷ and an OFF-current reduction of 10⁻⁴ are obtained by using a gate dielectric of HfO2 instead of SiO2.
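For context, the ~60 mV/decade figure quoted above corresponds to the room-temperature thermal limit of the textbook subthreshold-slope relation; a brief sketch, with a hypothetical capacitance ratio rather than values from the BPJLT simulation:

```python
# Sketch of the textbook subthreshold-slope relation
#   SS = (kT/q) * ln(10) * (1 + C_dep/C_ox),
# which approaches ~60 mV/decade at 300 K when C_dep/C_ox -> 0.
# The capacitance ratio below is hypothetical, not from the BPJLT simulation.
import math

K_B_OVER_Q = 8.617e-5  # Boltzmann constant over electron charge, V/K

def subthreshold_slope(temp_k: float, cdep_over_cox: float) -> float:
    """Subthreshold slope in mV/decade."""
    return 1e3 * K_B_OVER_Q * temp_k * math.log(10) * (1 + cdep_over_cox)

print(subthreshold_slope(300.0, 0.0))  # ideal limit, ~59.6 mV/dec
print(subthreshold_slope(300.0, 0.3))  # non-ideal example, ~77 mV/dec
```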

Keywords: spacer, BPJLT, high-k, double gate

Procedia PDF Downloads 425
4926 Effect of Yogurt on Blood and Liver Lipid Levels in Rats

Authors: Nora Mohammed Al-Kehayez

Abstract:

The present investigation was performed to study the effect of low-fat yogurt on the serum and liver lipid profiles of male albino rats (weighing 100 ± 5 g) when fed balanced or high-fat, high-cholesterol diets and given yogurt ad libitum, compared with control groups. Rats were divided into 4 groups, each containing 6 rats. The groups of rats were fed as follows: Group (1) was fed a balanced diet + water (control). Group (2) was fed a balanced diet + low-fat yogurt. Group (3) was fed a high-fat, high-cholesterol diet + water (control). Group (4) was fed a high-fat, high-cholesterol diet + low-fat yogurt. The results obtained can be summarized as follows: When rats were given low-fat yogurt and fed balanced or high-fat, high-cholesterol diets, significantly greater weight gains resulted in comparison with the control groups given water instead of yogurt. The weights of liver and heart, expressed as a percentage of body weight, increased in rats fed the balanced diet with low-fat yogurt, while in rats fed the high-fat, high-cholesterol diet with low-fat yogurt the increment seems to be less. Serum cholesterol levels in rats given balanced or high-fat, high-cholesterol diets and consuming low-fat yogurt showed significantly reduced values, with the low-fat yogurt producing the greatest significant decrease. The values of serum cholesterol go hand in hand with the serum lipoprotein fractions in rats given low-fat yogurt with either the balanced or the high-fat, high-cholesterol diet: an increase in high-density lipoprotein (HDL-C) and a decrease in low-density lipoprotein (LDL-C) values were obtained. When rats ingested low-fat yogurt, a significant decrease in serum and liver triglyceride content was obtained, whether with the balanced or the high-fat, high-cholesterol diet. Rats consuming the high-fat, high-cholesterol diet with water showed a significant increase in liver total lipids, total cholesterol, and phospholipid levels in comparison with the same liver parameters in rats given the balanced diet with water. Supplementation with low-fat yogurt significantly suppressed these effects.

Keywords: yogurt, lipid profile, albino rats

Procedia PDF Downloads 413
4925 A Numerical Study on Micromechanical Aspects in Short Fiber Composites

Authors: I. Ioannou, I. M. Gitman

Abstract:

This study focused on the contribution of micro-mechanical parameters to the macro-mechanical response of short fiber composites, namely a polypropylene matrix reinforced by glass fibers. In the framework of this paper, attention has been given to the glass fiber length as a micromechanical parameter that influences the overall macroscopic behavior of the material. Three-dimensional numerical models were developed and analyzed through the concept of a Representative Volume Element (RVE). Results of the RVE-based approach were compared with the analytical Halpin-Tsai model.
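Since the RVE results are benchmarked against the Halpin-Tsai model, here is a short sketch of the standard Halpin-Tsai estimate for the longitudinal modulus of an aligned short-fiber composite; the glass/polypropylene property values are typical handbook figures, not the inputs used in the paper:

```python
# Sketch of the Halpin-Tsai estimate for the longitudinal modulus E11 of an
# aligned short-fiber composite:
#   E11/Em = (1 + zeta*eta*Vf) / (1 - eta*Vf),  eta = (Ef/Em - 1)/(Ef/Em + zeta),
# with the shape factor zeta = 2*(l/d) in the longitudinal direction.
# Property values are typical for glass/polypropylene, not the paper's inputs.

def halpin_tsai_e11(e_fiber, e_matrix, vf, aspect_ratio):
    zeta = 2.0 * aspect_ratio
    eta = (e_fiber / e_matrix - 1.0) / (e_fiber / e_matrix + zeta)
    return e_matrix * (1.0 + zeta * eta * vf) / (1.0 - eta * vf)

# E-glass fiber ~72 GPa, polypropylene matrix ~1.5 GPa, 20% fiber volume, l/d = 20
print(f"E11 ~ {halpin_tsai_e11(72.0, 1.5, 0.20, 20.0):.1f} GPa")
```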

Keywords: effective properties, homogenization, representative volume element, short fiber reinforced composites

Procedia PDF Downloads 260
4924 Disaster Victim Identification: A Social Science Perspective

Authors: Victor Toom

Abstract:

Although it is never possible to anticipate the full range of difficulties after a catastrophe, efforts to identify victims of mass casualty events have become institutionalized and standardized with the aim of effectively and efficiently addressing the many challenges and contingencies. Such ‘disaster victim identification’ (DVI) practices are dependent on the forensic sciences, are the subject of national legislation, and are reliant on technical and organizational protocols to mitigate the many complexities in the wake of catastrophe. Apart from such technological, legal and bureaucratic elements constituting a DVI operation, victims’ families and their emotions are also part and parcel of any effort to identify casualties of mass human fatality incidents. Take for example the fact that forensic experts require (antemortem) information from the group of relatives to make identification possible. An identified body or body part is also repatriated to kin. Relatives are thus main stakeholders in DVI operations. Much has been achieved in years past regarding facilitating victims’ families’ issues and their emotions. Yet, how families are dealt with by experts and authorities is still considered a difficult topic. Due to sensitivities and the required empathic interaction with families on the one hand, and the rationalized DVI efforts on the other hand, there is still scope for improving communication, providing information and meaningfully including relatives in the DVI effort. This paper aims to bridge the standardized world of DVI efforts and families’ experienced realities and makes suggestions to further improve DVI efforts through the inclusion of victims’ families. Based on qualitative interviews, the paper narrates the involvement and experiences of inter alia DVI practitioners, victims’ families, advocates and clergy in the wake of the 1995 Srebrenica genocide, which killed approximately 8,000 men, and the 9/11 attacks in New York City, with 2,750 victims. The paper shows that there are several models of including victims’ families in a DVI operation, and it argues for a model where victims’ families become a partner in DVI operations.

Keywords: disaster victim identification (DVI), victims’ families, social science (qualitative), 9/11 attacks, Srebrenica genocide

Procedia PDF Downloads 229
4923 Impact of Surface Roughness on Light Absorption

Authors: V. Gareyan, Zh. Gevorkian

Abstract:

We study oblique-incidence light absorption in opaque media with rough surfaces. An analytical approach with modified boundary conditions taking into account the surface roughness in metallic or dielectric films has been discussed. Our approach reveals interference-linked terms that modify the dependence of absorption on different characteristics. We have discussed the limits within which our approach holds valid, from the visible to the microwave region. Polarization and angular dependences of roughness-induced absorption are revealed. The existence of an incident angle or a wavelength for which the absorptance of a rough surface becomes equal to that of a flat surface is predicted. Based on this phenomenon, a method of determining the roughness correlation length is suggested.

Keywords: light, absorption, surface, roughness

Procedia PDF Downloads 48
4922 An Alternative Proof for the Topological Entropy of the Motzkin Shift

Authors: Fahad Alsharari, Mohd Salmi Md. Noorani

Abstract:

A Motzkin shift is a mathematical model for constraints on genetic sequences. In terms of the theory of symbolic dynamics, the Motzkin shift is nonsofic, and therefore, we cannot use the Perron-Frobenius theory to calculate its topological entropy. The Motzkin shift M(M,N), which comes from language theory, is defined to be the shift system over an alphabet A that consists of N negative symbols, N positive symbols and M neutral symbols. For an x in the full shift A^ℤ, x is in M(M,N) if and only if every finite block appearing in x has a non-zero reduced form. Therefore, the constraint on x cannot be bounded in length. K. Inoue has shown that the entropy of the Motzkin shift M(M,N) is log(M + N + 1). In this paper, we find a new method of calculating the topological entropy of the Motzkin shift M(M,N) without any measure-theoretical discussion.
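For reference, a compact statement of the standard definition involved and the value quoted above:

```latex
% Topological entropy of a shift space X over alphabet A, with B_n(X) the set
% of admissible words of length n, and the value for the Motzkin shift:
h(X) \;=\; \lim_{n \to \infty} \frac{1}{n}\,\log \lvert B_n(X)\rvert,
\qquad
h\bigl(\mathcal{M}(M,N)\bigr) \;=\; \log (M + N + 1).
```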

Keywords: entropy, Motzkin shift, mathematical model, theory

Procedia PDF Downloads 471
4921 On Space Narrative and American Dream in Martin Eden

Authors: Yangyang Zhang

Abstract:

Martin Eden tells of the tragedy of the protagonist Martin Eden’s suicide after his disillusionment with the American dream. The author Jack London presents the various spatial routines of Martin Eden and reveals the involvement of space in realizing the American dream: on the one hand, the cities of Berkeley and Oakland contribute to Martin’s material success, making his American dream practical; on the other hand, the two cities are involved in the oppression of Martin by bourgeois ideology, promoting the domestic imperialization of the bourgeois ideology represented by the American dream. Molded by bourgeois ideology in the city, Martin constructed the oriental South Sea, revealing the overseas imperialization of bourgeois ideology behind the American dream. By exploring the social, historical and political aspects of space, Martin Eden tries to demonstrate the mere material success and imperialism represented by the American dream, revealing the involvement of the American dream in ideological oppression. When Jack London wrote Martin Eden, he had become a famous writer and realized his personal "American dream". He integrated his own experience of struggle into the protagonist Martin Eden, and also put his critique of the nature of the "American dream" into the novel. The concept of the "American Dream" made the United States the land of dreams, and it also made many Americans believe that, through personal struggle, they could climb the social ladder. In the context of rapid economic growth in the late 19th century, the American dream was reduced to satisfaction on a material level. When material wealth was fulfilled, many people felt shattered for a variety of reasons, and such a phenomenon was reflected in the literature of disillusionment in 19th-century America. Martin Eden is such a work about disillusionment, in which geographical space becomes the witness of the realization and disillusionment of the protagonist Martin’s "American dream". By analyzing the spatial narrative in Martin Eden, this paper reveals that the "American dream" only represents material success for individuals and the imperialization of capitalist ideology, and exposes the ideological nature of the "American Dream".

Keywords: Martin Eden, space, American dream, ideology of imperialism

Procedia PDF Downloads 136
4920 Starting the Hospitalization Procedure with a Medicine Combination in the Cardiovascular Department of the Imam Reza (AS) Mashhad Hospital

Authors: Maryamsadat Habibi

Abstract:

Objective: Pharmaceutical errors are avoidable occurrences that can result in inappropriate pharmaceutical use, patient harm, treatment failure, increased hospital costs and length of stay, and other outcomes that affect both the individual receiving treatment and the healthcare provider. This study aimed to perform a reconciliation of medications in the cardiovascular ward of Imam Reza Hospital in Mashhad, Iran, and to evaluate the prevalence of medication discrepancies between the best medication list created for the patient by the pharmacist and the medication order of the treating physician. Materials & Methods: The 97 patients in the cardiovascular ward of the Imam Reza Hospital in Mashhad were the subject of a cross-sectional study from June to September 2021. After giving their informed consent and being admitted to the ward, all patients with at least one underlying condition and at least two medications taken at home were included in the study. A medication reconciliation form was used to record patient demographics and medical histories during the first 24 hours of admission, and the information was compared with the doctors' orders. The physician then identified medication discrepancies between the two lists and double-checked them to separate the intentional from the unintentional discrepancies. Finally, using SPSS software version 22, the prevalence of medication discrepancies and the relationship between the different types of discrepancies and various variables were determined. Results: The average age of the participants in this study was 57.69±15.84 years, with 57.7% men and 42.3% women. Among these patients, 95.9% encountered at least one medication discrepancy, and 58.9% of them experienced at least one unintentional drug cessation. Out of the 659 medications registered in the study, 399 cases (60.54%) had discrepancies, of which 161 cases (40.35%) involved the intentional stopping of a medication, 123 cases (30.82%) involved the unintentional stopping of a medication, and 115 cases (28.82%) involved the continued use of a medication with an adjusted dose. Additionally, the categories of cardiovascular and gastrointestinal medications were found to have the highest medication discrepancies in the current study. Furthermore, there was no correlation between the frequency of medication discrepancies and the following variables: age, ward, date of visit, and type and number of underlying diseases (P=0.13, P=0.61, P=0.72, P=0.82, and P=0.44, respectively). On the other hand, there was a statistically significant correlation between the prevalence of medication discrepancies and the number of medications taken at home (P=0.037) as well as gender (P=0.029). Conclusion: The results of this study revealed that 96% of patients admitted to the cardiovascular unit at Imam Reza Hospital had at least one medication error, which was typically an intentional drug discontinuation. According to the study's findings, the medication reconciliation method has great potential for identifying and correcting various medication discrepancies as well as for avoiding prescription errors in patients admitted to Imam Reza Hospital's cardiovascular ward. As a result, it is essential to carry out a precise assessment to achieve the best treatment outcomes and to avoid unintended medication discontinuation, unwanted drug-related events, and drug interactions between the patient's home medications and those prescribed in the hospital.

Keywords: drug combination, drug side effects, drug incompatibility, cardiovascular department

Procedia PDF Downloads 79
4919 Linear Codes Afforded by the Permutation Representations of Finite Simple Groups and Their Support Designs

Authors: Amin Saeidi

Abstract:

Using a representation-theoretic approach and considering G to be a finite primitive permutation group of degree n, our aim is to determine linear codes of length n that admit G as a permutation automorphism group. We can show that in some cases, every binary linear code admitting G as a permutation automorphism group is a submodule of a permutation module defined by a primitive action of G. As an illustration of the method, we consider the sporadic simple group M₁₁ and the unitary group U(3,3). We also construct some point- and block-primitive 1-designs from the supports of some codewords of the codes in the discussion.

Keywords: linear code, permutation representation, support design, simple group

Procedia PDF Downloads 73
4918 Law, Resistance, and Development in Georgia: A Case of Namakhvani HPP

Authors: Konstantine Eristavi

Abstract:

The paper will contribute to the discussion on the pitfalls, limits, and possibilities of legal and rights discourse in opposing large infrastructural projects in the context of neoliberal globalisation. To this end, the paper will analyse the struggle against the Namakhvani HPP project in Georgia. The latter has been hailed by the government as one of the largest energy projects in the history of the country, with an enormous potential impact on energy security, energy independence, economic growth, and development. This takes place against the backdrop of decades of a market-led (or neoliberal) model of development in Georgia, characterised by structural adjustments, deregulation, privatisation, and a laissez-faire approach to foreign investment. In this context, the Georgian state vies with other low- and middle-income countries for foreign capital by offering to potential investors, on the one hand, exemptions from social and environmental regulations and, on the other hand, huge legal concessions and safeguards, thereby participating in what is often called a “race to the bottom.” The Namakhvani project is a good example of this. At every stage, the project has been marred by violations of laws and regulations concerning transparency, participation, social and environmental regulations, and so on. Moreover, the leaked contract between the state and the developer reveals the contractual safeguards which effectively insulate the investment throughout the duration of the contract from changes in national law that might adversely affect investors’ rights and returns. These clauses, aimed at preserving the investors’ economic position, place the contract above national law in many respects and even conflict with fundamental constitutional rights. In response to the perceived deficiencies of the project, one of the largest and most diverse social movements in the history of post-Soviet Georgia has been assembled, consisting of the local population, conservative and leftist groups, human rights and environmental NGOs, etc. Crucially, the resistance movement is actively using legal tools. In order to analyse both the limitations and possibilities of legal discourse, the paper will distinguish between internal and immanent critiques. Law as internal critique, in the context of the struggles around the Namakhvani project, while potentially fruitful in hindering the project, risks neglecting and reproducing those factors (e.g., the particular model of development) that made such contractual concessions and safeguards and the concomitant rights violations possible in the first place. On the other hand, the use of rights and law as part of immanent critique articulates a certain incapacity on the part of the addressee government to uphold existing laws and rights due to structural factors, hence pointing to the need for fundamental change. This ‘ruptural’ form of legal discourse that the movement employs makes it possible to go beyond the discussion around breaches of law and enables a critical deliberation on the development model within which these violations and extraordinary contractual safeguards become necessary. It will be argued that it is this form of immanent critique that expresses the emancipatory potential of legal discourse.

Keywords: law, resistance, development, rights

Procedia PDF Downloads 76
4917 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups

Authors: Érica Lima

Abstract:

In the digital era, in which we face an avalanche of information, it is nothing new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact his professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility each participant achieves not only on his personal profile page, but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two Brazilian groups of great influence on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These chosen groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class. The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator’s work something easy that therefore does not need to be well remunerated; on the other hand, translators know how complex a translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but it takes a more active role from the translator to achieve a greater appreciation of the profession and more recognition of the role of the translator, especially in the face of the increasing development of automatic translation programs.

Keywords: Facebook, social representation, translation, translator

Procedia PDF Downloads 144
4916 Profit Share in Income: An Analysis of Its Influence on Macroeconomic Performance

Authors: Alain Villemeur

Abstract:

The relationships between the profit share in income on the one hand and the growth rates of output and employment on the other hand have been studied for 17 advanced economies since 1961. The vast majority (98%) of annual values for the profit share fall between 20% and 40%, with an average value of 33.9%. For the 17 advanced economies, Gross Domestic Product and productivity growth rates tend to fall as the profit share in income rises. For the employment growth rates, the relationships are complex; nevertheless, over long periods (1961-2000), it appears that the more job-creating economies are Australia, Canada, and the United States; they have experienced a profit share close to 1/3. This raises a number of questions, not least the value of 1/3 for the profit share and its role in macroeconomic fundamentals. To explain these facts, an endogenous growth model is developed. This growth and distribution model reconciles the great ideas of Kaldor (economic growth as a chain reaction), of Keynes (effective demand and marginal efficiency of capital) and of Ricardo (importance of the wage-profit distribution) in an economy facing creative destruction. A production function is obtained, depending mainly on the growth of employment, the rate of net investment and the profit share in income. In theory, we show the existence of incentives: an incentive for job creation when the profit share is less than 1/3 and another incentive for job destruction in the opposite case. Thus, increasing the profit share can boost the employment growth rate until it reaches the value of 1/3; otherwise, it lowers the employment growth rate. Three key findings can be drawn from these considerations. The first reveals that the best GDP and productivity growth rates are obtained with a profit share of less than 1/3. The second is that maximum job growth is associated with a 1/3 profit share, given the existence of incentives to create more jobs when the profit share is less than 1/3 or to destroy more jobs otherwise. The third is the decline in performance (GDP growth rate and productivity growth rate) when the profit share increases. In conclusion, increasing the profit share in income weakens GDP growth or productivity growth as a long-term trend, contrary to the trickle-down hypothesis. The employment growth rate is maximum for a profit share in income of 1/3. All these lessons suggest that macroeconomic policies should take the profit share in income into account.

Keywords: advanced countries, GDP growth, employment growth, profit share, economic policies

Procedia PDF Downloads 58
4915 Review of the Safety of Discharge on the First Postoperative Day Following Carotid Surgery: A Retrospective Analysis

Authors: John Yahng, Hansraj Riteesh Bookun

Abstract:

Objective: This was a retrospective cross-sectional study evaluating the safety of discharge on the first postoperative day following carotid surgery, principally carotid endarterectomy. Methods: Between January 2010 and October 2017, 252 patients with a mean age of 72 years underwent carotid surgery by seven surgeons. Their medical records were consulted, and their operative and complication timelines were entered into a database. Descriptive statistics were used to analyse pooled responses and our indicator variables. The statistical package used was STATA 13. Results: There were 183 males (73%), and the comorbid burden was as follows: ischaemic heart disease (54%), diabetes (38%), hypertension (92%), stage 4 kidney impairment (5%) and current or ex-smoking (77%). The main indications were transient ischaemic attacks (42%), stroke (31%), asymptomatic carotid disease (16%) and amaurosis fugax (8%). 247 carotid endarterectomies (109 with patch arterioplasty, 88 with eversion and transection technique, 50 with endarterectomy only) were performed. 2 carotid bypasses, 1 embolectomy, 1 thrombectomy with patch arterioplasty and 1 excision of a carotid body tumour were also performed. 92% of the cases were performed under general anaesthesia. A shunt was used in 29% of cases. The mean length of stay was 5.1 ± 3.7 days, with a range of 2 to 22 days. No patient was discharged on day 1. The mean time from admission to surgery was 1.4 ± 2.8 days, ranging from 0 to 19 days. The mean time from surgery to discharge was 2.7 ± 2.0 days, with a range of 0 to 14 days. 36 complications were encountered over this period: 12 failed repairs (5 major strokes, 2 minor strokes, 3 transient ischaemic attacks, 1 cerebral bleed, 1 occluded graft), 11 bleeding episodes requiring a return to the operating theatre, 5 adverse cardiac events, 3 cranial nerve injuries, 2 respiratory complications, 2 wound complications and 1 acute kidney injury. There were no deaths. 17 complications occurred on postoperative day 0, 11 on postoperative day 1, 6 on postoperative day 2 and 2 on postoperative day 3. 78% of all complications happened before the second postoperative day. Of the complications which occurred on the second or third postoperative day, 4 (1.6%) were bleeding episodes, 1 (0.4%) was a failed repair, 1 (0.4%) was a respiratory complication and 1 (0.4%) was a wound complication. Conclusion: Although it has been common practice to discharge patients on the second postoperative day following carotid endarterectomy, we find here that discharge on the first postoperative day is safe. The overall complication rate is low, and most complications are captured before the second postoperative day. We suggest that patients having an uneventful first 24 hours post surgery be discharged on the first day. This should reduce the hospital length of stay and the health economic burden.
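The statement that 78% of all complications happened before the second postoperative day follows directly from the day-by-day counts reported above; a one-line check:

```python
# Cumulative share of complications by postoperative day, from the counts above
# (day 0: 17, day 1: 11, day 2: 6, day 3: 2; 36 complications in total).
complications_by_day = {0: 17, 1: 11, 2: 6, 3: 2}
total = sum(complications_by_day.values())
before_day_2 = complications_by_day[0] + complications_by_day[1]
print(f"{before_day_2}/{total} = {100 * before_day_2 / total:.0f}%")  # 28/36 = 78%
```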

Keywords: carotid, complication, discharge, surgery

Procedia PDF Downloads 161
4914 Predicting Provider Service Time in Outpatient Clinics Using Artificial Intelligence-Based Models

Authors: Haya Salah, Srinivas Sharan

Abstract:

Healthcare facilities use appointment systems to schedule their appointments and to manage access to their medical services. With the growing demand for outpatient care, it is now imperative to manage physicians' time effectively. However, high variation in consultation duration affects the clinical scheduler's ability to estimate the appointment duration and allocate provider time appropriately. Underestimating consultation times can lead to physician burnout, misdiagnosis, and patient dissatisfaction. On the other hand, appointment durations that are longer than required lead to doctor idle time and fewer patient visits. Therefore, a good estimation of consultation duration has the potential to improve timely access to care, resource utilization, quality of care, and patient satisfaction. Although the literature on factors influencing consultation length abounds, little work has been done to predict it using data-driven approaches. Therefore, this study aims to predict consultation duration using supervised machine learning (ML) algorithms, which predict an outcome variable (e.g., consultation duration) based on potential features that influence the outcome. In particular, ML algorithms learn from a historical dataset without explicitly being programmed and uncover the relationship between the features and the outcome variable. A subset of the data used in this study has been obtained from the electronic medical records (EMR) of four different outpatient clinics located in central Pennsylvania, USA. Also, publicly available information on doctors' characteristics, such as gender and experience, has been extracted from online sources. This research develops three popular ML algorithms (deep learning, random forest, gradient boosting machine) to predict the treatment time required for a patient and conducts a comparative analysis of these algorithms with respect to predictive performance. The findings of this study indicate that ML algorithms have the potential to predict provider service time with superior accuracy. While the current approach of experience-based appointment duration estimation adopted by the clinics resulted in a mean absolute percentage error (MAPE) of 25.8%, the deep learning algorithm developed in this study yielded the best performance with a MAPE of 12.24%, followed by the gradient boosting machine (13.26%) and random forests (14.71%). Besides, this research also identified the critical variables affecting consultation duration to be patient type (new vs. established), doctor's experience, zip code, appointment day, and doctor's specialty. Moreover, several practical insights are obtained based on the comparative analysis of the ML algorithms. The machine learning approach presented in this study can serve as a decision support tool and could be integrated into the appointment system for effectively managing patient scheduling.
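A minimal sketch of the kind of comparison described (random forest vs. gradient boosting on consultation durations, scored by MAPE); the synthetic data and feature names are placeholders, not the clinics' EMR variables:

```python
# Sketch: fit two of the models mentioned (random forest, gradient boosting) to
# predict consultation duration and compare them by MAPE on a held-out set.
# The synthetic data and feature names are placeholders, not the EMR fields.
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.metrics import mean_absolute_percentage_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.integers(0, 2, n),    # patient type: new (1) vs established (0)
    rng.integers(1, 30, n),   # doctor's experience in years
    rng.integers(0, 5, n),    # appointment day of week
])
y = 15 + 10 * X[:, 0] - 0.2 * X[:, 1] + rng.normal(0, 3, n)  # duration (minutes)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
for model in (RandomForestRegressor(random_state=0),
              GradientBoostingRegressor(random_state=0)):
    model.fit(X_tr, y_tr)
    mape = mean_absolute_percentage_error(y_te, model.predict(X_te))
    print(f"{type(model).__name__}: MAPE = {100 * mape:.1f}%")
```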

Keywords: clinical decision support system, machine learning algorithms, patient scheduling, prediction models, provider service time

Procedia PDF Downloads 118
4913 3D Numerical Simulation on Annular Diffuser Temperature Distribution Enhancement by Different Twist Arrangement

Authors: Ehan Sabah Shukri, Wirachman Wisnoe

Abstract:

The influence of twist arrangement on the temperature distribution in an annular diffuser fitted with a twisted rectangular hub is investigated. Different pitches (Y = 120 mm, 100 mm, 80 mm, and 60 mm) for the twist arrangements are simulated and compared. The geometry of the annular diffuser and the inlet condition for the hub arrangements are kept constant. The results reveal that using a twisted rectangular hub insert with different pitches forces the temperature to distribute in a circular direction. However, the temperature distribution is enhanced as the pitch length increases.

Keywords: numerical simulation, twist arrangement, annular diffuser, temperature distribution, swirl flow, pitches

Procedia PDF Downloads 405
4912 Impact of Diabetes Mellitus Type 2 on Clinical In-Stent Restenosis in First Elective Percutaneous Coronary Intervention Patients

Authors: Leonard Simoni, Ilir Alimehmeti, Ervina Shirka, Endri Hasimi, Ndricim Kallashi, Verona Beka, Suerta Kabili, Artan Goda

Abstract:

Background: Diabetes Mellitus type 2, small vessel calibre, stented vessel length, complex lesion morphology, and prior bypass surgery have been identified as risk factors for In-Stent Restenosis (ISR). However, there are some contradictory results about body mass index (BMI) as a risk factor for ISR. Purpose: We want to identify clinical, lesional and procedural factors that can predict clinical ISR in our patients. Methods: We enrolled 759 patients who underwent first-time elective PCI with Bare Metal Stents (BMS) from September 2011 to December 2013 in our Department of Cardiology and followed them for at least 1.5 years, with a median of 862 days (2 years and 4 months). Only the patients re-admitted with ischemic heart disease underwent control coronary angiography; no routine angiographic control was performed. Patients were categorized into ISR and non-ISR groups and compared. Multivariate analysis (Binary Logistic Regression: Forward Conditional Method) was used to identify independent predictive risk factors. P was considered statistically significant when <0.05. Results: ISR compared to non-ISR individuals had a significantly lower BMI (25.7±3.3 vs. 26.9±3.7, p=0.004), higher-risk anatomy (LM + 3-vessel CAD) (23% vs. 14%, p=0.03), a higher number of stents per person (2.1±1.1 vs. 1.75±0.96, p=0.004), a greater length of stents per person (39.3±21.6 vs. 33.3±18.5, p=0.01), and a lower use of clopidogrel and ASA (together) (95% vs. 99%, p=0.012). They also had a higher, although not statistically significant, prevalence of Diabetes Mellitus (42% vs. 32%, p=0.072) and a greater number of treated vessels (1.36±0.5 vs. 1.26±0.5, p=0.08). In the multivariate analysis, Diabetes Mellitus type 2 and multiple stents were independent predictive risk factors for In-Stent Restenosis, OR 1.66 [1.03-2.68], p=0.039, and OR 1.44 [1.16-1.78], p=0.001, respectively. On the other side, higher BMI and the use of clopidogrel and ASA together were protective factors, OR 0.88 [0.81-0.95], p=0.001 and OR 0.2 [0.06-0.72], p=0.013, respectively. Conclusion: Diabetes Mellitus and multiple stents are strong predictive risk factors, whereas the use of clopidogrel and ASA together is a protective factor for clinical In-Stent Restenosis. Paradoxically, high BMI is a protective factor for In-Stent Restenosis, probably related to a larger diameter of vessels and consequently a larger diameter of stents implanted in these patients. Further studies are needed to clarify this finding.
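A brief sketch of how odds ratios such as those reported (e.g., OR 1.66 for diabetes) are obtained from a binary logistic regression as exp(coefficients); the data below are synthetic, and statsmodels stands in for the SPSS forward-conditional procedure used in the study:

```python
# Sketch: binary logistic regression for in-stent restenosis (ISR) and odds
# ratios as exp(coefficients). Synthetic data; statsmodels is used here in
# place of the SPSS forward-conditional method described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 759
df = pd.DataFrame({
    "diabetes": rng.integers(0, 2, n),
    "n_stents": rng.integers(1, 5, n),
    "bmi": rng.normal(26.5, 3.5, n),
})
# Hypothetical outcome generated so the directions match the reported findings.
logit = -2.0 + 0.5 * df["diabetes"] + 0.36 * df["n_stents"] - 0.12 * (df["bmi"] - 26.5)
df["isr"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["diabetes", "n_stents", "bmi"]])
result = sm.Logit(df["isr"], X).fit(disp=False)
print(np.exp(result.params))      # odds ratios per unit increase in each predictor
print(np.exp(result.conf_int()))  # 95% confidence intervals for the odds ratios
```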

Keywords: body mass index, diabetes mellitus, in-stent restenosis, percutaneous coronary intervention

Procedia PDF Downloads 200
4911 Hemispheric Locus and Gender Predict the Delay between the Moment of Stroke and Hospitalization

Authors: D. Anderlini, G. Wallis

Abstract:

Background: The number of people experiencing stroke is steadily increasing due to changes in diet and lifestyle, to longer life expectancy resulting in an older population, and to higher survival rates as a consequence of improvements during the acute phase. This study considers what risk factors might contribute to delayed entry to hospital for treatment. Methods: We analyzed data from 2472 patients admitted to the Stroke Unit of the Royal Brisbane Women's Hospital, Australia, between 2002 and 2011. Results: Previous studies have reported that factors which can contribute to delay include the patient's age, the time of day, physical location, visiting the GP instead of going to the emergency department, means of transport, severity of symptoms and type of stroke. Contrary to the findings of other studies, we found a strong correlation between side of lesion and delay in admission: patients with right hemisphere lesions had an average delay of 3.78 days, while patients with left hemisphere lesions had an average delay of 1.49 days. Damage to the right hemisphere generally results in motor impairment of the non-dominant hand and no speech impediment. In contrast, left hemisphere lesions can result in deficits of dominant hand function and aphasia, which will be noticed even if their impact on performance is relatively minor. A finding which goes against many previous studies is the fact that women get to the hospital much sooner than men, with an average delay of 0.92 days in women vs. 3.36 days in men. Conclusion: Acute surgical-pharmacological therapies are most effective if applied immediately after stroke. Hence, delays to admission can be crucial to the degree of recovery. The tendency of patients to overlook symptoms of right hemisphere lesions should be the target of information campaigns both for the general public and for GPs. Why do men go to hospital so late? We don't know yet! Nevertheless, an awareness plan specifically directed at the male population should be on the agenda of Health Departments.

Keywords: gender, admission delay, stroke location, bioinformatics, biomedicine

Procedia PDF Downloads 224
4910 A Microsurgery-Specific End-Effector Equipped with a Bipolar Surgical Tool and Haptic Feedback

Authors: Hamidreza Hoshyarmanesh, Sanju Lama, Garnette R. Sutherland

Abstract:

In tele-operative robotic surgery, an ideal haptic device should be equipped with an intuitive and smooth end-effector to cover the surgeon’s hand/wrist degrees of freedom (DOF) and translate the hand joint motions to the end-effector of the remote manipulator with low effort and a high level of comfort. This research introduces the design and development of a microsurgery-specific end-effector, a gimbal mechanism possessing 4 passive and 1 active DOFs, equipped with a bipolar forceps and haptic feedback. The robust gimbal structure comprises three light-weight links/joints (pitch, yaw, and roll), each consisting of a low-friction support and an accurate 2-channel optical position sensor. The third link, which provides the tool roll, was specifically designed to grip the tool prongs and accommodate a low-mass geared actuator together with a miniaturized capstan-rope mechanism. The actuator is able to generate delicate torques, using a threaded cylindrical capstan, to emulate the sense of pinch/coagulation during conventional microsurgery. While the tool’s left prong is fixed to the rolling link, the right prong bears a miniaturized drum sector with a large diameter to expand the force scale and resolution. The drum transmits the actuator output torque to the right prong and generates haptic force feedback at the tool level. The tool is also equipped with a Hall-effect sensor and a magnet bar installed vis-à-vis on the inner sides of the two prongs to measure the tooltip distance and provide an analogue signal to the control system. We believe that such a haptic end-effector could significantly increase the accuracy of telerobotic surgery and help avoid high forces that are known to cause bleeding/injury.

Keywords: end-effector, force generation, haptic interface, robotic surgery, surgical tool, tele-operation

Procedia PDF Downloads 113
4909 Predicting Aggregation Propensity from Low-Temperature Conformational Fluctuations

Authors: Hamza Javar Magnier, Robin Curtis

Abstract:

There have been rapid advances in the upstream processing of protein therapeutics, which have shifted the bottleneck to downstream purification and formulation. Finding liquid formulations with shelf lives of up to two years is increasingly difficult for some of the newer therapeutics, which have been engineered for activity, but their formulations are often viscous, can phase separate, and have a high propensity for irreversible aggregation [1]. We explore means to develop improved predictive ability from a better understanding of how protein-protein interactions depend on formulation conditions (pH, ionic strength, buffer type, presence of excipients) and how these impact upon the initial steps in protein self-association and aggregation. In this work, we study the initial steps in the aggregation pathways using a minimal protein model based on square-well potentials and discontinuous molecular dynamics. The effect of the model parameters, including the range of interaction, stiffness, chain length, and chain sequence, implies that protein models fold according to various pathways. By reducing the range of interactions, the folding and collapse transitions come together and follow a single-step folding pathway from the denatured to the native state [2]. After parameterizing the model interaction parameters, we developed an understanding of low-temperature conformational properties and fluctuations, and their correlation to the folding transition of proteins in isolation. The model fluctuations increase with temperature. We observe a low-temperature point below which large fluctuations are frozen out. This implies that fluctuations at low temperature can be correlated to the folding transition at the melting temperature. Because proteins “breathe” at low temperatures, defining a native state as a single structure with conserved contacts and a fixed three-dimensional structure is misleading. Rather, we introduce a new definition of a native-state ensemble based on our understanding of core conservation, which takes into account the native fluctuations at low temperatures. This approach permits the study of the large range of length and time scales needed to link the molecular interactions to the macroscopically observed behaviour. In addition, the models studied are parameterized by fitting to experimentally observed protein-protein interactions characterized in terms of osmotic second virial coefficients.
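A compact sketch of the square-well pair potential underlying the minimal model, together with its standard analytical second virial coefficient; the well parameters shown are illustrative, not fitted values from the study:

```python
# Sketch: square-well pair potential u(r) and its analytical second virial
# coefficient B2 = (2*pi*sigma^3/3) * (1 - (lambda^3 - 1) * (exp(eps/kT) - 1)).
# Standard textbook expressions; the parameter values are illustrative only.
import math

def square_well(r, sigma=1.0, lam=1.5, eps=1.0):
    """Square-well potential: hard core below sigma, depth -eps out to lam*sigma."""
    if r < sigma:
        return math.inf
    if r < lam * sigma:
        return -eps
    return 0.0

def b2_square_well(sigma=1.0, lam=1.5, eps=1.0, kT=1.0):
    """Second virial coefficient of the square-well fluid."""
    hard_core = 2.0 * math.pi * sigma**3 / 3.0
    return hard_core * (1.0 - (lam**3 - 1.0) * math.expm1(eps / kT))

print(square_well(1.2), b2_square_well(kT=1.0), b2_square_well(kT=5.0))
```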

Keywords: protein folding, native-ensemble, conformational fluctuation, aggregation

Procedia PDF Downloads 356
4908 Electrochemical Performance of Femtosecond Laser Structured Commercial Solid Oxide Fuel Cells Electrolyte

Authors: Mohamed A. Baba, Gazy Rodowan, Brigita Abakevičienė, Sigitas Tamulevičius, Bartlomiej Lemieszek, Sebastian Molin, Tomas Tamulevičius

Abstract:

Solid oxide fuel cells (SOFC) efficiently convert hydrogen to energy without producing any disturbances or contaminants. The core of the cell is the electrolyte. To improve the performance of electrolyte-supported cells, it is desirable to extend the available exchange surface area by micro-structuring the electrolyte with laser-based micromachining. This study investigated the electrochemical performance of cells micromachined using a femtosecond laser. A commercial ceramic SOFC (Elcogen AS) with a total thickness of 400 μm was structured by a 1030 nm wavelength Yb:KGW fs-laser Pharos (Light Conversion) at a 100 kHz repetition frequency and 290 fs pulse length, with the beam scanned by a galvanometer scanner (ScanLab) and focused with an f-Theta telecentric lens (SillOptics). The sample height was positioned using a motorized z-stage. The microstructures were formed by laser spiral trepanning in the Ni/YSZ anode-supported membrane, within a 5.5 mm diameter region at the central part of the ceramic piece, in the active area of the cell. The whole surface was drilled with 275 µm diameter holes spaced by 275 µm. The machining processes were carried out under ambient conditions. The microstructural effects of the femtosecond laser treatment on the electrolyte surface were investigated prior to the electrochemical characterisation using a scanning electron microscope (SEM) Quanta 200 FEG (FEI). A Novocontrol Alpha-A analyzer was used for electrochemical impedance spectroscopy in a symmetrical cell configuration with an excitation amplitude of 25 mV and a frequency range of 1 MHz to 0.1 Hz. The fuel cell characterization was carried out on an open flanges test setup by Fiaxell; the cell was electrically contacted using a nickel mesh on the anode side and an Au mesh on the cathode side. The cell was placed in a Kittec furnace with a PID temperature controller, and the wires were connected to a Solartron 1260/1287 frequency analyzer for the impedance and current-voltage characterization. In order to determine the impact of the anode's microstructure on the performance of the commercial cells, the acquired results were compared with cells with an unstructured anode. Geometrical studies verified that the depth of the holes increased linearly with laser energy and number of scans; on the other hand, it decreased as the scanning speed increased. The electrochemical analysis demonstrates that the open-circuit voltage (OCV) values of the two cells are equal. Further, the modified cell's initial slope decreases to 0.209 from the 0.253 of the unmodified cell, revealing that the surface modification considerably decreases energy loss. In addition, the maximum power densities of the microstructured cell and the reference cell are 1.45 and 1.16 W cm⁻², respectively.
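As a back-of-the-envelope illustration of how the hole array enlarges the exchange surface, the Python sketch below counts 275 µm holes laid on an assumed square grid inside the 5.5 mm diameter structured region and estimates the relative area gained from the hole walls. The grid layout and the 100 µm hole depth are assumptions made only for this example; the paper does not report these values.

# Back-of-the-envelope sketch (not from the paper): number of blind holes in the
# structured region and the extra surface area contributed by their walls.

import math

def hole_centres(region_diameter_um, hole_diameter_um, edge_spacing_um):
    """Centres of holes on a square grid that fit entirely inside the circular region."""
    pitch = hole_diameter_um + edge_spacing_um           # centre-to-centre distance
    r_max = (region_diameter_um - hole_diameter_um) / 2  # keep the whole hole inside
    n = int(region_diameter_um // pitch) + 1
    centres = []
    for i in range(-n, n + 1):
        for j in range(-n, n + 1):
            x, y = i * pitch, j * pitch
            if math.hypot(x, y) <= r_max:
                centres.append((x, y))
    return centres

def relative_area_gain(region_diameter_um, hole_diameter_um, edge_spacing_um, depth_um):
    """Extra wall area from blind cylindrical holes relative to the flat region."""
    n_holes = len(hole_centres(region_diameter_um, hole_diameter_um, edge_spacing_um))
    flat = math.pi * (region_diameter_um / 2) ** 2
    walls = n_holes * math.pi * hole_diameter_um * depth_um  # lateral area per hole
    return n_holes, walls / flat

if __name__ == "__main__":
    # 275 um holes, 275 um edge-to-edge spacing, assumed 100 um depth
    n, gain = relative_area_gain(5500, 275, 275, 100)
    print(n, "holes, ~{:.0%} extra surface area".format(gain))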

Keywords: electrochemical performance, electrolyte-supported cells, laser micro-structuring, solid oxide fuel cells

Procedia PDF Downloads 61
4907 Nationalization of the Social Life in Argentina: Accumulation of Capital, State Intervention, Labor Market, and System of Rights in the Last Decades

Authors: Mauro Cristeche

Abstract:

This work begins with a very simple question: how does the State spend? Argentina is witnessing a process of growing nationalization of social life, so it is necessary to seek explanations for this phenomenon in the specific dynamics of the capitalist mode of production in Argentina and its transformations over the last decades. The question then becomes: what happened in Argentina that could explain this phenomenon? Since the seventies, capital accumulation in Argentina has faced deep competitive problems. Until that moment, agrarian wealth had worked as a compensation mechanism, but it began to reach its limits. In the meantime, important demographic and structural changes took place. The strategy of the capitalist class came to rest on the cheapening of the labor force as the main source of compensation for its weakness. As a result, a tendency toward worsening living conditions and fragmentation of the working class developed, manifested in unemployment, underemployment, and, most notably, the fall in the purchasing power of wages. As a consequence, it is suggested that the role of the State became stronger and public expenditure increased as a historical trend, because the State has to intervene to face the contradictions and constant growth problems posed by the development of capitalism in Argentina. On the one hand, the State has to guarantee the process of buying the cheapened workforce and, at the same time, the process of reproduction of the working class. On the other hand, it has to help reproduce the individual capitals but also needs to ‘attack’ them in different ways. This is why the State is said to act as the general political representative of the national portion of the total social capital. What will be studied is the dynamic of the Argentine State's intervention in the context of this particular national process of capital accumulation over the last decades. The paper aims to show the main general causes that could explain the phenomenon of the nationalization of social life and how it has affected the living conditions of the working class and the system of rights.

Keywords: Argentina, nationalization, public policies, rights, state

Procedia PDF Downloads 131
4906 Sulfanilamide/Epoxy Resin and Its Application as Tackifier in Epoxy Adhesives

Authors: Oiane Ruiz de Azua, Salvador Borros, Nuria Agullo, Jordi Arbusa

Abstract:

Tackiness is described as the ability to spontaneously form a bond to another material under light pressure within a short application time. During the first few minutes of the adhesive's curing, it is necessary to have enough tack to keep the substrates together while cohesion builds up within the adhesive. This property plays a key role in the manufacturing process of pieces. Epoxy adhesives, unlike other adhesives, usually present low tack before curing; however, there is very little literature on the use of tackifiers in epoxy adhesives, apart from high-molecular-weight epoxy additives. In the present work, a tetrafunctional epoxy resin based on Bisphenol-A and sulfanilamide has been synthesized for use as a tackifier. This additive offers improved specific adhesion to two-component (2K) epoxy adhesives. The tackifier has to be dosed carefully so as not to alter the mechanical and rheological properties of the adhesive. The synthesized product was analyzed by FTIR and ¹H-NMR, and the effect of adding 1 wt % of the tackifier on the rheological properties, viscoelastic behavior, and mechanical properties has been studied. On the one hand, adding the product to the epoxy resin part significantly increased tack with respect to the neat epoxy resin; on the other hand, the tack of the whole formulation was also increased. The curing time of the adhesive did not undergo any relevant changes with the tackifier addition. Regarding viscoelastic properties, the storage modulus (G') and loss modulus (G'') also remain unchanged at ambient temperature; at higher tackifier concentrations, differences in viscoelastic properties would probably be observed. The study of mechanical properties shows that hardness and tensile strength also keep their values unchanged with respect to the neat two-component adhesive. In conclusion, the addition of 1 wt % of the sulfanilamide/epoxy adduct enhances the tack of the epoxy resin part without significantly modifying the rheological, mechanical, or viscoelastic properties of the product. Thus, the sulfanilamide/epoxy resin presented could be a good candidate as an additive to 2K epoxy formulations for the manufacturing process of pieces.

Keywords: epoxy adhesive, manufacturing process of pieces, sulfanilamide, tackifiers

Procedia PDF Downloads 174