Search results for: free market foundation perspective
8754 Dielectric Properties in Frequency Domain of Main Insulation System of Printed Circuit Board
Authors: Xize Dai, Jian Hao, Claus Leth Bak, Gian Carlo Montanari, Huai Wang
Abstract:
The Printed Circuit Board (PCB) is a critical component of power electronics systems, especially in high-voltage applications involving high-voltage, high-frequency SiC/GaN devices. The insulation system of the PCB faces growing challenges from high-voltage and high-frequency stress, which can alter its dielectric properties. The dielectric properties of the PCB insulation system also determine the electric field distribution, which correlates with intrinsic and extrinsic aging mechanisms. Hence, investigating the dielectric properties of the PCB insulation system in the frequency domain is essential. The paper presents the frequency-, temperature-, and voltage-dependent dielectric properties (permittivity, conductivity, and dielectric loss tangent) of PCB insulation systems. The mechanisms behind these dependencies are discussed from a design perspective. It can be concluded that the dielectric properties of the PCB in the frequency domain show a strong dependence on voltage, frequency, and temperature. These voltage-, frequency-, and temperature-dependent properties are associated with intrinsic conduction behavior and polarization patterns from the perspective of dielectric theory. The results may serve as a reference for PCB insulation system design in high-voltage, high-frequency, and high-temperature power electronics applications.
Keywords: electrical insulation system, dielectric properties, high voltage and frequency, printed circuit board
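The loss tangent reported in such measurements follows the standard dielectric-theory definition: the ratio of the imaginary to the real part of the complex relative permittivity. A minimal sketch of that relation (the FR-4-like values are illustrative assumptions, not results from the paper):

```python
def loss_tangent(eps_real, eps_imag):
    """Dielectric loss tangent tan(delta) = eps'' / eps'."""
    if eps_real <= 0:
        raise ValueError("real permittivity must be positive")
    return eps_imag / eps_real

# Hypothetical FR-4-like permittivity at a single frequency point:
tan_d = loss_tangent(4.4, 0.088)  # ratio of lossy to stored response
```

In frequency-domain sweeps such as those in the paper, this quantity is evaluated at each frequency, temperature, and voltage point to build the dependence curves.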
Procedia PDF Downloads 99
8753 Environmental Accounting: A Conceptual Study of Indian Context
Authors: Pradip Kumar Das
Abstract:
As the entire world continues its rapid move towards industrialization, it has seriously threatened mankind’s ability to maintain an ecological balance. Geographical and natural forces have a significant influence on the location of industries. Industrialization is the foundation stone of the development of any country, while unplanned industrialization and the discharge of waste by industries are causes of environmental pollution. There is a growing degree of awareness and concern globally among nations about environmental degradation and pollution. Environmental resources, endowed by the gift of nature rather than manmade, are invaluable natural resources of a country like India. Any developmental activity is directly related to natural and environmental resources. Economic development without environmental considerations brings about environmental crises and damages the quality of life of present as well as future generations. As corporate sectors in the global market, especially in India, become anxious about environmental degradation, naturally more and more emphasis will be placed on how environment-friendly the outcomes are. Maintaining accounts of such environmental and natural resources has become more urgent. Moreover, international awareness and acceptance of the importance of environmental issues have motivated the development of a branch of accounting called “Environmental Accounting”. Environmental accounting attempts to identify and highlight the resources consumed and the costs imposed on the environment by an industrial unit. For the sustainable development of mankind, a healthy environment is indispensable. Gradually, therefore, in many countries including India, environmental matters are being given topmost priority. Accounting for and disclosure of environmental matters have increasingly manifested as an important dimension of corporate accounting and reporting practices.
But, as conventional accounting deals mainly with non-living things, the formulation of valuation, measurement, and accounting techniques for incorporating environment-related matters in the corporate financial statement sometimes creates problems for the accountant. In light of this situation, the conceptual analysis of the study is concerned with the rationale of environmental accounting for the economy and society as a whole, and highlights the failures of the traditional accounting system. A modest attempt has been made to throw light on environmental awareness in developing nations like India and to discuss the problems associated with the implementation of environmental accounting. The conceptual study also reflects that, despite different anomalies, environmental accounting is becoming an increasingly important aspect of the accounting agenda within the corporate sector in India. Lastly, a conclusion, along with recommendations, has been given to overcome the situation.
Keywords: environmental accounting, environmental degradation, environmental management, environmental resources
Procedia PDF Downloads 346
8752 Experimental Investigation of the Impact of Biosurfactants on Residual-Oil Recovery
Authors: S. V. Ukwungwu, A. J. Abbas, G. G. Nasr
Abstract:
The increasing prices of natural gas and oil, with the attendant increase in energy demand on world markets in recent years, have stimulated interest in recovering residual oil saturation across the globe. To meet energy security needs, efforts have been made to develop new technologies for enhancing the recovery of oil and gas, utilizing techniques like CO2 flooding, water injection, hydraulic fracturing, and surfactant flooding. Surfactant flooding optimizes production but poses risks to the environment due to the surfactants' toxic nature. Building on proven records of using other types of bacteria to produce biosurfactants for enhancing oil recovery, this research uses a technique combining biosurfactants to achieve enhanced oil recovery (EOR) through lowering interfacial tension and contact angle. In this study, three biosurfactants were produced from three Bacillus species from freeze-dried cultures using sucrose 3 % (w/v) as their carbon source. Two of these biosurfactants were screened with the TEMCO Pendant Drop Image Analysis system for reduction in interfacial tension (IFT) and contact angle. Interfacial tension was greatly reduced, from 56.95 mN/m to 1.41 mN/m, when biosurfactants in the cell-free culture of Bacillus licheniformis were used, compared to 4.83 mN/m for the cell-free culture of Bacillus subtilis. As a result, the cell-free culture of Bacillus licheniformis changed the wettability in the contact angle measurement to more water-wet, as the angle decreased from 130.75° to 65.17°. The influence of microbial treatment on crushed rock samples was also observed through qualitative wettability experiments. Samples treated with biosurfactants remained in the aqueous phase, indicating a water-wet system. These results suggest that biosurfactants can effectively change the chemistry of the wetting conditions on diverse surfaces, providing a desirable condition for efficient oil transport and thereby serving as a mechanism for EOR.
The environmentally friendly nature of biosurfactant applications for industrial purposes gives them important advantages over chemically synthesized surfactants, including a variety of possible structures, low toxicity, and biodegradability.
Keywords: bacillus, biosurfactant, enhanced oil recovery, residual oil, wettability
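The wettability readings above follow the usual convention: a water contact angle below 90° indicates a water-wet surface and above 90° an oil-wet one. A small sketch applying that convention to the angles reported in the abstract (the 90° threshold is the standard convention, not something specified by the authors):

```python
def wettability(contact_angle_deg):
    """Classify wettability from a water contact angle in degrees:
    < 90 water-wet, > 90 oil-wet, exactly 90 intermediate."""
    if contact_angle_deg < 90:
        return "water-wet"
    if contact_angle_deg > 90:
        return "oil-wet"
    return "intermediate"

# Angles reported in the abstract: treatment shifted 130.75 deg to 65.17 deg
before = wettability(130.75)
after = wettability(65.17)
```

Applied to the reported measurements, the classification flips from oil-wet to water-wet after the biosurfactant treatment, matching the abstract's conclusion.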
Procedia PDF Downloads 282
8751 Numerical Study of Rayleigh Number and Eccentricity Effect on Free Convection Fluid Flow and Heat Transfer of Annulus
Authors: Ali Reza Tahavvor‚ Saeed Hosseini, Behnam Amiri
Abstract:
Concentric and eccentric annuli are used frequently in technical and industrial applications such as nuclear reactors and thermal storage systems. In this paper, computational fluid dynamics (CFD) is used to investigate two-dimensional free convection of laminar flow in an annulus with isothermal cylinder surfaces and a cooler inner surface. The problem was studied in thirty different cases. Due to natural convection, the continuity and momentum equations are coupled and must be solved simultaneously. The finite volume method is used for solving the governing equations. The purpose was to obtain the effect of eccentricity on the Nusselt number at different Rayleigh numbers, so streamlines and temperature fields had to be determined. Results show that the highest Nusselt number values occur at an upward eccentricity of 0.5 for the inner cylinder and an upward eccentricity of 0.3 for the outer cylinder. Side eccentricity reduces the outer cylinder Nusselt number but increases the inner cylinder Nusselt number. The trend in the variation of Nusselt number with eccentricity remains similar at different Rayleigh numbers. Correlations are included to calculate the Nusselt number of the cylinders.
Keywords: natural convection, concentric, eccentric, Nusselt number, annulus
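The Rayleigh number governing such free-convection problems is conventionally defined as Ra = gβΔT L³/(να). A short sketch of that definition with illustrative, assumed fluid properties for an air-filled gap (the specific values are not taken from the paper):

```python
def rayleigh_number(g, beta, delta_T, L, nu, alpha):
    """Ra = g * beta * dT * L^3 / (nu * alpha) for buoyancy-driven
    natural convection over a characteristic length L."""
    return g * beta * delta_T * L**3 / (nu * alpha)

# Assumed air-like properties: g = 9.81 m/s^2, beta ~ 1/300 K^-1,
# dT = 10 K across a 1 cm gap, nu = 1.5e-5 m^2/s, alpha = 2.1e-5 m^2/s
Ra = rayleigh_number(9.81, 1 / 300, 10.0, 0.01, 1.5e-5, 2.1e-5)
```

In a parametric study like the one described, Ra would be varied (e.g., by changing the gap width or temperature difference) to map the Nusselt number response.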
Procedia PDF Downloads 378
8750 Stochastic Pi Calculus in Financial Markets: An Alternate Approach to High Frequency Trading
Authors: Jerome Joshi
Abstract:
The paper presents the modelling of financial markets using the Stochastic Pi Calculus model. The Stochastic Pi Calculus model is mainly used for biological applications; however, the features of this model promote its use in financial markets, most prominently in high frequency trading. The trading system can be broadly classified into the exchange, market makers or intermediary traders, and fundamental traders. The exchange is where the action of the trade is executed, and the two types of traders act as market participants in the exchange. High frequency trading, with its complex networks and numerous market participants (intermediary and fundamental traders), poses a difficulty for modelling. It involves participants seeking the advantage of complex trading algorithms and high execution speeds to carry out large volumes of trades. To earn profits from each trade, the trader must be at the top of the order book quite frequently, executing or processing multiple trades simultaneously. This requires highly automated systems as well as the right sentiment to outperform other traders. However, always being at the top of the book is also not best for the trader, since it was the reason for the outbreak of the ‘Hot-Potato Effect,’ which in turn demands a better and more efficient model. The characteristics of the model should be such that it is flexible and has diverse applications. Therefore, a model should be chosen which has proven itself in a similar field characterized by such difficulty. It should also be flexible in its simulation so that it can be further extended and adapted for future research, and equipped with tools that allow it to be used effectively in the field of finance. In this case, the Stochastic Pi Calculus model seems an ideal fit for financial applications, owing to its track record in the field of biology.
It is an extension of the original Pi Calculus model and acts as a solution and an alternative to the previously flawed algorithm, provided its application is further extended. This model focuses on solving the problem which led to the ‘Flash Crash,’ namely the ‘Hot-Potato Effect.’ The model consists of small sub-systems which can be integrated to form a large system. It is designed in such a way that the behavior of ‘noise traders’ is treated as a random process, or noise, in the system. While modelling, to get a better understanding of the problem, a broader picture is taken into consideration, covering the trader, the system, and the market participants. The paper goes on to explain trading in exchanges, types of traders, high frequency trading, the ‘Flash Crash,’ the ‘Hot-Potato Effect,’ evaluation of orders, and time delay in further detail. In the future, there is a need to focus on the calibration of the modules so that they interact correctly with one another. This model, with its application extended, would provide a basis for further research in the fields of finance and computing.
Keywords: concurrent computing, high frequency trading, financial markets, stochastic pi calculus
Procedia PDF Downloads 86
8749 Analysis of Histamine Content in Selected Food Products from the Serbian Market
Authors: Brizita Djordjevic, Bojana Vidovic, Milica Zrnic, Uros Cakar, Ivan Stankovic, Davor Korcok, Sladjana Sobajic
Abstract:
Histamine is a biogenic amine formed by enzymatic decarboxylation of the amino acid histidine. It can be found in foods such as fish and fish products, meat and fermented meat products, cheese, wine, and beer. The presence of histamine in these foods can indicate microbiological spoilage or poor manufacturing processes. The consumption of food containing large amounts of histamine can have toxicological consequences. In 62 food products (31 canned fish products, 19 wines, and 12 cheeses) from the Serbian market, the histamine content was determined using an enzyme-linked immunosorbent assay (ELISA) test kit according to the manufacturer's instructions (Immunolab GmbH, Kassel, Germany). The detection limits of this assay were 20 µg/kg for fish and cheese and 4 µg/L for wine. The concentration of histamine varied between 0.16-207 mg/kg in canned fish products, 0.03-1.47 mg/kg in cheeses, and 0.01-0.18 mg/L in wines. In all analyzed canned fish products, the results obtained for histamine were below the limits set by European and national legislation, so they can be considered acceptable and safe for consumers' health. The levels of histamine in the analyzed cheeses and wines were very low and did not pose safety concerns.
Keywords: cheese, enzyme-linked immunosorbent assay, histamine, fish products, wine
Procedia PDF Downloads 450
8748 Providing a Suitable Model for Launching New Home Appliances Products to the Market
Authors: Ebrahim Sabermaash Eshghi, Donna Sandsmark
Abstract:
In the changing economic conditions of the modern world, one of the most important issues facing managers of firms is increasing sales and profitability through the sale of newly developed products, while reducing unnecessary costs is one of the most essential programs of smart managers adapting to current business conditions. Uncertainty now dominates all industries. Accordingly, in this research, the influence of different aspects of introducing products to the market is investigated. This study uses a quantitative-qualitative (interviews and questionnaire) approach. In total, 103 informed managers and experts of the Pars-Khazar Company were examined through a census. The validity of the measurement tools was approved through expert judgment. The reliability of the tools was established through a Cronbach's alpha coefficient of 0.930, and overall the validity and reliability of the tools were confirmed. Results of the regression test revealed that all aspects of product introduction had a positive and significant influence on product performance. In addition, the influence on product performance of two new factors raised in the interviews, namely human resource management and management of the product's pre-test, was confirmed.
Keywords: introducing products, performance, home appliances, price, advertisement, production
Procedia PDF Downloads 213
8747 A Historical Analysis of The Concept of Equivalence from Different Theoretical Perspectives in Translation Studies
Authors: Amenador Kate Benedicta, Wang Zhiwei
Abstract:
Since the latter part of the 20th century, the notion of equivalence has remained a central and critical concept in the development of translation theory. After decades of arguments over word-for-word and free translation methods, scholars attempting to develop more systematic and efficient translation theories began to focus on fundamental translation concepts such as equivalence. Although the concept of equivalence has piqued the interest of many scholars, its definition, scope, and applicability have sparked contentious arguments within the discipline. As a result, several distinct theories and explanations of the concept of equivalence have been put forward over the last half-century. Thus, this study explores and discusses the evolution of this critical concept in translation studies through a bibliometric investigation of print and digital books and articles, analyzing different scholars' key contributions and limitations on equivalence from various theoretical perspectives. Throughout the analysis, emphasis is placed on the innovations that each theory has brought to the comprehension of equivalence. To achieve the aim of the study, the article begins by discussing the contributions of linguistically motivated theories to the notion of equivalence in translation, followed by functionalist-oriented contributions, before moving on to more recent advancements in translation studies on the concept. Because equivalence is such a broad notion, it is impossible to discuss each researcher in depth. As a result, the most well-known names and their theories of equivalence are compared and contrasted in this research. The study emphasizes the developmental progression in our comprehension of the equivalence concept and equivalent effect. It concludes that the contributions of the various theoretical perspectives to the notion of equivalence complement and make up for each other's limitations.
The study also highlighted how troublesome the concept of equivalence can be in identifying the nature of translation, and how central and unavoidable the concept is in every act of translation, despite its limitations. The significance of the study lies in its synthesis of the contributions and limitations of the various theories offered by scholars on the notion of equivalence, providing literature for both students and scholars in the field and insight for future theoretical development.
Keywords: equivalence, functionalist translation theories, linguistic translation approaches, translation theories, Skopos
Procedia PDF Downloads 117
8746 Integrating Renewable Energy Forecasting Systems with HEMS and Developing It with a Bottom-Up Approach
Authors: Punit Gandhi, J. C. Brezet, Tim Gorter, Uchechi Obinna
Abstract:
This paper introduces how weather forecasting can enable more efficient energy management for smart homes through the use of Home Energy Management Systems (HEMS). The paper also focuses on educating consumers and helping them make more informed decisions while using the HEMS. A combined technical and user perspective has been adopted to develop a novel HEMS product-service combination in a comprehensive manner. The current HEMS switches energy-intensive appliances on/off based on fluctuating electricity tariffs, but with weather forecasting it is possible to shift the time of use of energy-intensive appliances to the hours of maximum electricity production from the renewable energy system installed in the house. It is also possible to estimate the heating/cooling load of the house for the day-ahead demand. Hence, relevant insight is gained into the expected energy production and consumption load for the next day, facilitating better (more efficient, peak-shaved, cheaper, etc.) energy management practices for smart homes. In the literature, on the user side, it has been observed that consumers lose interest in using a HEMS after three to four months. Therefore, to further support better energy management practices, the new system had to be designed so that consumers would sustain their interaction with it on a structural basis. It is hypothesized that if consumers feel more comfortable using such a system, this will lead to prolonged usage, including more energy savings and hence financial savings. To test the hypothesis, a survey on the HEMS was conducted, for which 59 valid responses were recorded. Analysis of the survey helped in designing a system which imparts better information about energy production and consumption to consumers. It was also found from the survey that consumers like a variety of options and do not like constant reminders of what they should do.
Hence, the final system is designed to encourage consumers to make informed decisions about their energy usage, with a wide variety of behavioral options available. It is envisaged that the new system will be tested in several pioneering smart energy grid projects in both the Netherlands and India, with a continued ‘design thinking’ approach combining the technical and user perspectives as the basis for further improvements.
Keywords: weather forecasting, smart grid, renewable energy forecasting, user defined HEMS
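The load-shifting idea described above can be illustrated with a toy scheduler that picks the start hour maximizing forecast PV output over an appliance's run time. Everything here (the hourly forecast profile, the exhaustive window search) is an illustrative assumption, not the authors' actual system:

```python
def best_start_hour(pv_forecast_kw, duration_h):
    """Return the start hour that maximizes total forecast PV output
    over a contiguous appliance run of duration_h hours."""
    best_start, best_sum = 0, float("-inf")
    for start in range(len(pv_forecast_kw) - duration_h + 1):
        window_sum = sum(pv_forecast_kw[start:start + duration_h])
        if window_sum > best_sum:
            best_start, best_sum = start, window_sum
    return best_start

# Assumed day-ahead PV forecast (kW per hour, 24 values), peaking at midday:
forecast = [0, 0, 0, 0, 0, 0, 0.2, 0.8, 1.5, 2.4, 3.1, 3.4,
            3.3, 2.9, 2.1, 1.2, 0.5, 0.1, 0, 0, 0, 0, 0, 0]
start = best_start_hour(forecast, 2)  # e.g., a 2-hour dishwasher cycle
```

A real HEMS would combine such a forecast-driven schedule with tariff data and the estimated heating/cooling load, but the core shift-to-peak-production logic is this simple window search.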
Procedia PDF Downloads 235
8745 C₅₉Pd: A Heterogeneous Catalytic Material for Heck Coupling Reaction
Authors: Manjusha C. Padole, Parag A. Deshpande
Abstract:
Density functional theory calculations were carried out to identify an active heterogeneous catalyst for the Heck coupling reaction, which is of pharmaceutical importance. One class of carbonaceous nanomaterials, heterofullerenes, was designed for the reaction. The stability and reactivity of the proposed heterofullerenes (C59M, M = Pd/Ni) were established with insights into the metal-carbon bond, electron affinity, and chemical potential. The adsorption potential of both heterofullerenes was examined through an adsorption study of four halobenzenes (C6H5F, C6H5Cl, C6H5Br, and C6H5I). The oxidative addition activity of all four halobenzenes was investigated by developing free energy landscapes over both heterofullerenes for the rate-determining step (oxidative addition). C6H5I showed good catalytic activity for the rate-determining step. Thus, C6H5I was proposed as a suitable halobenzene, and complete free energy landscapes for the Heck coupling reaction were developed over C59Pd and C59Ni. The smaller activation barriers observed over C59Pd compared with C59Ni lead us to propose C59Pd as an efficient heterofullerene for carrying out the Heck coupling reaction.
Keywords: metal-substituted fullerene, density functional theory, electron affinity, oxidative addition, Heck coupling reaction
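Activation barriers from such DFT free energy landscapes are commonly converted to rate constants via the Eyring equation of transition-state theory, k = (k_B T / h) exp(-ΔG‡ / RT), which makes the "smaller barrier means faster step" comparison quantitative. A sketch of that conversion (the barrier values below are hypothetical, not the paper's computed results):

```python
import math

def eyring_rate(delta_g_kj_per_mol, temperature_k=298.15):
    """Transition-state-theory rate constant (1/s) from an activation
    free energy in kJ/mol: k = (kB*T/h) * exp(-dG / (R*T))."""
    k_b = 1.380649e-23     # Boltzmann constant, J/K
    h = 6.62607015e-34     # Planck constant, J*s
    r = 8.314462618        # gas constant, J/(mol*K)
    prefactor = k_b * temperature_k / h
    return prefactor * math.exp(-delta_g_kj_per_mol * 1e3 / (r * temperature_k))

# Hypothetical barriers: a lower oxidative-addition barrier over C59Pd
# than over C59Ni translates into a much larger rate constant.
k_pd = eyring_rate(80.0)
k_ni = eyring_rate(100.0)
```

The exponential dependence is why even a modest barrier difference between the two heterofullerenes implies a large difference in catalytic activity.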
Procedia PDF Downloads 229
8744 Researching and Interpreting Art: Analyzing Whose Voice Matters
Authors: Donna L. Roberts
Abstract:
Beyond the fundamental question of what is (and what isn't) art, one then moves to the question of what about art, or a specific artwork, matters. If there is agreement that something is art, the next step is to answer the obvious: 'So what? What does it mean?' In answering these questions, one must decide how to focus the proverbial microscope, that is, what level of perspective is relevant as a point of view for this analysis: the artwork itself, the artist's intention, the viewer's interpretation, the artwork's reflection of the larger artistic movement, or the social, political, and historical context of art? One must determine what product and what contexts are meaningful when experiencing and interpreting art. Is beauty really in the eye of the beholder? Or is what the creator was trying to say more important than what the critic or observer heard? The fact that so many artists, from Rembrandt to Van Gogh to Picasso, include among their works at least one self-portrait seems to scream their point: I matter. But is a piece more impactful because of the persona behind it? Or does that persona impose limits and close one's mind to the possibilities of interpretation? In the popular art text Visual Culture, Richard Howells argues against a biographical focus on the artist in the analysis of art. Similarly, the abstract expressionist Mark Rothko, along with several of his contemporaries of the genre, often did not title his paintings for the express purpose of not imposing a specific meaning or interpretation on the piece. And yet, he once said, 'The people who weep before my pictures are having the same religious experience I had when I painted them,' thus alluding to a desire for a shared connection and revelation. This research analyzes the arguments for differing levels of interpretation and points of view when considering a work of art and/or the artist who created it.
Keywords: art analysis, art interpretation, art theory, artistic perspective
Procedia PDF Downloads 154
8743 Effect of Media Reputation on Financial Performance and Abnormal Returns of Corporate Social Responsibility Winner
Authors: Yu-Chen Wei, Dan-Leng Wang
Abstract:
This study examines whether reputation in the media press affects financial performance and market abnormal returns around the announcement of a corporate social responsibility (CSR) award in the Taiwan stock market. This study differs from prior literature in that the media reputation measures, media coverage and net optimism, are constructed using content analysis. The empirical results show that corporations which won CSR awards improved their financial performance the following year. The media coverage and net optimism related to CSR winners are significantly higher than those of non-CSR companies both before and after the CSR award is announced, but the difference decreases as the announcement day approaches. We propose that non-CSR companies may try to manipulate the media press to increase the coverage and positive image received by investors compared to the CSR winners. The cumulative real returns and abnormal returns of CSR winners were not significantly higher than those of the non-CSR samples; however, the leading returns of CSR winners became higher two months after the award announcement. The comparison of performance between CSR and non-CSR companies could inform portfolio management for mutual funds and investors.
Keywords: corporate social responsibility, financial performance, abnormal returns, media, reputation management
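The abnormal-return comparison rests on standard event-study quantities: the daily abnormal return AR_t = R_t - R_m,t (here in its simplest market-adjusted form; the paper does not specify its expected-return model) and their sum over the event window, the cumulative abnormal return (CAR). A minimal sketch with made-up returns:

```python
def abnormal_returns(stock_returns, market_returns):
    """Market-adjusted daily abnormal returns: AR_t = R_t - R_m,t."""
    return [r - m for r, m in zip(stock_returns, market_returns)]

def cumulative_abnormal_return(stock_returns, market_returns):
    """CAR over the event window: the sum of daily abnormal returns."""
    return sum(abnormal_returns(stock_returns, market_returns))

# Hypothetical three-day event window around an award announcement:
car = cumulative_abnormal_return([0.012, -0.004, 0.009],
                                 [0.010, -0.006, 0.003])
```

In the study's setting, such CARs would be computed for both CSR winners and matched non-CSR firms and then compared across the two groups.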
Procedia PDF Downloads 440
8742 Physics-Informed Neural Network for Predicting Strain Demand in Inelastic Pipes under Ground Movement with Geometric and Soil Resistance Nonlinearities
Authors: Pouya Taraghi, Yong Li, Nader Yoosef-Ghodsi, Muntaseer Kainat, Samer Adeeb
Abstract:
Buried pipelines play a crucial role in the transportation of energy products such as oil, gas, and various chemical fluids, ensuring their efficient and safe distribution. However, these pipelines are often susceptible to ground movements caused by geohazards like landslides, fault movements, lateral spreading, and more. Such ground movements can lead to strain-induced failures in pipes, resulting in leaks or explosions, leading to fires, financial losses, environmental contamination, and even loss of human life. Therefore, it is essential to study how buried pipelines respond when traversing geohazard-prone areas to assess the potential impact of ground movement on pipeline design. As such, this study introduces an approach called the Physics-Informed Neural Network (PINN) to predict the strain demand in inelastic pipes subjected to permanent ground displacement (PGD). This method uses a deep learning framework that does not require training data and makes it feasible to consider more realistic assumptions regarding existing nonlinearities. It leverages the underlying physics described by differential equations to approximate the solution. The study analyzes various scenarios involving different geohazard types, PGD values, and crossing angles, comparing the predictions with results obtained from finite element methods. The findings demonstrate a good agreement between the results of the proposed method and the finite element method, highlighting its potential as a simulation-free, data-free, and meshless alternative. This study paves the way for further advancements, such as the simulation-free reliability assessment of pipes subjected to PGD, as part of ongoing research that leverages the proposed method.
Keywords: strain demand, inelastic pipe, permanent ground displacement, machine learning, physics-informed neural network
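The core of the PINN idea is to penalize the residual of the governing differential equation at collocation points rather than assemble a mesh. As a rough illustration only: the paper's pipe-soil model is inelastic and nonlinear, but the same residual construction can be shown on a linear beam on a Winkler foundation, EI w'''' + k w = q, with a finite-difference residual standing in for the automatic differentiation a real PINN would use. All parameter values below are assumptions chosen for the sanity check:

```python
def beam_residual(w, ei, k, q, dx):
    """Residual of EI*w'''' + k*w - q at interior nodes, using a
    central finite-difference fourth derivative. In a PINN, this
    residual (obtained via autodiff on the network output) is squared
    and minimized as the physics loss instead of fitting any data."""
    res = []
    for i in range(2, len(w) - 2):
        d4 = (w[i - 2] - 4 * w[i - 1] + 6 * w[i] - 4 * w[i + 1]
              + w[i + 2]) / dx**4
        res.append(ei * d4 + k * w[i] - q)
    return res

# Sanity check: under a uniform load the exact solution w = q/k is
# constant, so the residual (and hence the physics loss) vanishes.
# Powers of two are used so the check is exact in floating point.
EI, K, Q, DX = 2.0e3, 65536.0, 1024.0, 0.1
w_exact = [Q / K] * 11
loss = sum(r * r for r in beam_residual(w_exact, EI, K, Q, DX))
```

A trained PINN drives exactly this kind of loss toward zero over the domain, which is what makes the approach simulation-free and meshless.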
Procedia PDF Downloads 64
8741 Transmission of Food Wisdom for Salaya Community
Authors: Supranee Wattanasin
Abstract:
The objectives of this research are to find and collect knowledge in order to transmit the food wisdom of the Salaya community. The research used qualitative tools to gather the data. Phase 1: Collect and analyze the related literature on food wisdom, including documents about the Salaya community, to form a clear picture of the community context. Phase 2: Conduct action research and stage a people's forum to exchange knowledge on the food wisdom of the Salaya community. A learning stage on the cooking, types, and benefits of the community's food wisdom was also set up, as well as a people's forum to find ways to transmit and add value to that food wisdom. The results show that the Salaya old market community was once a marketplace located by the Mahasawat canal. The old market became sluggish due to the growing development of land transportation, which affected ways of food consumption. Residents in the community chose three menus that represent the community's unique food: chicken green curry, desserts in syrup, and Khanom Sai-Sai (steamed flour with coconut filling). The researcher had the local residents train the team on how to make these dishes. It was found that people in the community transmit the wisdom to the next generation by teaching and telling, from parents to children. 'Learning through the back door' is one of the learning methods that the community used and still does.
Keywords: transmission, food wisdom, Salaya, cooking
Procedia PDF Downloads 402
8740 Exploring Empathy Through Patients’ Eyes: A Thematic Narrative Analysis of Patient Narratives in the UK
Authors: Qudsiya Baig
Abstract:
Empathy yields unparalleled therapeutic value within patient-physician interactions. Medical research is inundated with evidence supporting that a physician's ability to empathise with patients leads to a greater willingness to report symptoms, an improvement in diagnostic accuracy and safety, and better adherence to and satisfaction with treatment plans. Furthermore, the Institute of Medicine states that empathy leads to more patient-centred care, which is one of the six main goals of a 21st century health system. However, there is a paradox between the theoretical significance of empathy and its presence, or lack thereof, in clinical practice. Recent studies have reported that empathy declines amongst students and physicians over time. The three most impactful contributors to this decline are: (1) disagreements over the definitions of empathy, making it difficult to implement in practice; (2) poor consideration or regulation of empathy, leading to burnout and thus abandonment altogether; and (3) the lack of diversity in the curriculum and the influence of medical culture, which prioritises science over patient experience, limiting some physicians from using 'too much' empathy for fear of losing clinical objectivity. These issues were investigated by conducting a fully inductive thematic narrative analysis of patient narratives in the UK to evaluate the behaviours and attitudes that patients associate with empathy. The principal enquiries underpinning this study included uncovering the factors that affected the experience of empathy within provider-patient interactions and analysing their effects on patient care. This research contributes uniquely to this discourse by examining the phenomenon of empathy directly from patients' experiences, which were systematically extracted from a repository of online patient narratives of care titled 'CareOpinion UK'.
Narrative analysis was specifically chosen as the methodology in order to examine narratives from a phenomenological lens, focusing on the particularity and context of each story. By enquiring beyond the superficial who-what-where, the study of narratives ascribed meaning to illness by highlighting the everyday reality of patients who face the exigent life circumstances created by suffering, disability, and the threat of life. The following six themes were found to be the most impactful in influencing the experience of empathy: dismissive behaviours, judgmental attitudes, undermining of patients' pain or concerns, holistic care, and failures and successes of communication or language. For each theme, there were overarching themes relating either to a failure to understand the patient's perspective or to a success in taking a person-centred approach. An in-depth analysis revealed that a lack of empathy was greatly associated with an emotive-cognitive imbalance, which disengaged physicians from their patients' emotions. This study hereby concludes that competent providers require a combination of knowledge, skills, and, more importantly, empathic attitudes to help create a context for effective care. The crucial elements of that context involve (a) identifying empathy clues within interactions to engage with patients' situations, (b) attributing a perspective to the patient through perspective-taking, and (c) adapting behaviour and communication according to the patient's individual needs. Empathy underpins that context, as does an appreciation of narrative, and the two are interrelated.
Keywords: empathy, narratives, person-centred, perspective, perspective-taking
Procedia PDF Downloads 142
8739 Quantification of Learned Non-Use of the Upper-Limb After a Stroke
Authors: K. K. A. Bakhti, D. Mottet, J. Froger, I. Laffont
Abstract:
Background: After a cerebrovascular accident (or stroke), many patients use excessive trunk movements to move their paretic hand towards a target (while the elbow is kept flexed), even though they can use the upper limb when the trunk is restrained. This phenomenon is labelled learned non-use and is known to be detrimental to neuroplasticity and recovery. Objective: The aim of this study is to quantify learned non-use of the paretic upper limb during a hand-reaching task using 3D movement analysis. Methods: Thirty-four participants post supratentorial stroke were asked to reach a cone placed in front of them at 80% of their arm length. The reaching movement was repeated 5 times with the paretic hand, and then 5 times with the less-impaired hand. This sequence was first performed with the trunk free, then with the trunk restrained. Learned non-use of the upper limb (LNUUL) was obtained as the difference in the amount of trunk compensation between the free-trunk condition and the restrained-trunk condition. Results: LNUUL was significantly higher for the paretic hand, with individual values ranging from 1% to 43% and one-half of the patients showing an LNUUL higher than 15%. Conclusions: Quantification of LNUUL can be used to objectively diagnose patients who need trunk rehabilitation; it can also be used to monitor rehabilitation progress. Quantification of LNUUL may guide upper-limb rehabilitation towards more optimal motor recovery, avoiding maladaptive trunk compensation and its consequences for neuroplasticity.
Keywords: learned non-use, rehabilitation, stroke, upper limb
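As an illustration of the quantification described above, a minimal sketch of LNUUL computed from reach displacements; the variable names, the percentage formulation, and the numbers are assumptions for illustration, not the authors' implementation:

```python
def trunk_compensation(trunk_disp, hand_disp):
    """Trunk contribution to the reach, as a percentage of hand displacement."""
    return 100.0 * trunk_disp / hand_disp

def lnuul(trunk_free, hand_free, trunk_restrained, hand_restrained):
    """Learned non-use of the upper limb, in percentage points: the excess
    trunk compensation observed when the trunk is left free."""
    return (trunk_compensation(trunk_free, hand_free)
            - trunk_compensation(trunk_restrained, hand_restrained))

# Hypothetical mean displacements (cm) over 5 paretic-hand reaches
print(lnuul(trunk_free=12.0, hand_free=40.0,
            trunk_restrained=2.0, hand_restrained=40.0))  # 25.0
```

A patient with a large LNUUL value would, on this sketch, be one whose trunk does much of the reaching only when it is allowed to.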
Procedia PDF Downloads 241
8738 Flexible Design Solutions for Complex Free-Form Geometries Aimed at Optimizing Performance and Resource Consumption
Authors: Vlad Andrei Raducanu, Mariana Lucia Angelescu, Ion Cinca, Vasile Danut Cojocaru, Doina Raducanu
Abstract:
By using smart digital tools, such as generative design (GD) and digital fabrication (DF), pressing problems concerning resource optimization (materials, energy, time) can be solved and free-form applications or products can be created. In the new digital technology, materials are active, designed in response to a set of performance requirements, which imposes a total rethinking of old material practices. The article presents the key steps of the design procedure for a free-form architectural object, a column-type object with connections forming an adaptive 3D surface, using the parametric design methodology and exploiting the properties of conventional metallic materials. In parametric design, the form of the created object or space is shaped by varying the parameter values, and relationships between the forms are described by mathematical equations. Digital parametric design is based on specific procedures, such as shape grammars, Lindenmayer systems (L-systems), cellular automata, genetic algorithms, or swarm intelligence, each of these procedures having limitations which make them applicable only in certain cases. In the paper, the design process stages and the shape-grammar-type algorithm are presented. The generative design process relies on two basic principles: the modeling principle and the generative principle. The generative method is based on a form-finding process that creates many 3D spatial forms, using an algorithm conceived to apply its generating logic onto different input geometry. Once the algorithm is realized, it can be applied repeatedly to generate the geometry for a number of different input surfaces. The generated configurations are then analyzed through a technical or aesthetic selection criterion, and finally the optimal solution is selected. 
The endless generative capacity of the codes and algorithms used in digital design offers diverse conceptual possibilities and optimal solutions for the increasing technical and environmental demands of the building industry and architecture. Constructions or spaces generated by parametric design can be specifically tuned in order to meet certain technical or aesthetic requirements. The proposed approach has direct applicability in sustainable architecture, offering important potential economic advantages, a flexible design (which can be changed until the end of the design process), and unique geometric models of high performance.
Keywords: parametric design, algorithmic procedures, free-form architectural object, sustainable architecture
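Of the algorithmic procedures listed above, the Lindenmayer system is the simplest to sketch; below is a minimal rewriting loop, where the axiom and production rule are arbitrary illustrative choices, not the algorithm used in the paper:

```python
def l_system(axiom, rules, generations):
    """Generate a string by repeatedly rewriting each symbol via its production rule."""
    s = axiom
    for _ in range(generations):
        s = "".join(rules.get(symbol, symbol) for symbol in s)
    return s

# Hypothetical branching rule; '[' and ']' would mark branch push/pop
# when the string is later interpreted as turtle-graphics geometry.
rules = {"F": "F[+F]F[-F]F"}
print(l_system("F", rules, 1))  # F[+F]F[-F]F
```

Each further generation rewrites every "F" again, so the string (and the geometry it encodes) grows recursively, which is what gives such procedures their generative capacity.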
Procedia PDF Downloads 381
8737 Toward Destigmatizing the Autism Label: Conceptualizing Celebratory Technologies
Authors: LouAnne Boyd
Abstract:
From the perspective of self-advocates, the biggest unaddressed problem is not the symptoms of an autism spectrum diagnosis but the social stigma that accompanies autism. This societal perspective stands in contrast to the focus of the majority of interventions. Autism interventions, and consequently most innovative technologies for autism, aim to improve deficits that occur within the person. For example, the most common Human-Computer Interaction research projects in assistive technology for autism target social skills from a normative perspective. The premise of these autism technologies is that difficulties occur inside the body; hence, the medical model focuses on ways to improve the ailment within the person. However, other technological approaches to supporting people with autism do exist. In the realm of Human-Computer Interaction, there are other modes of research that provide a critique of the medical model. For example, critical design, whose intended audience is industry or other HCI researchers, produces artifacts that are the opposite of interventionist work in order to draw attention to the misalignment between the lived experience and the societal perception of autism. Parodies of interventionist work exist to provoke change, such as a recent project called Facesavr, a face covering that helps allistic adults be more independent in their emotional processing. Additionally, from a critical disability studies perspective, assistive technologies perpetuate harmful normalizing behaviors. However, these critical approaches can feel far from the front line in terms of taking direct action to positively impact end users. 
From a critical yet more pragmatic perspective, projects such as Counterventions list ways to reduce the likelihood of perpetuating ableism in interventionist work by reflectively analyzing a series of evolving assistive technology projects through a societal lens, thus leveraging the momentum of the evolving ecology of technologies for autism. Therefore, all current paradigms fall short of addressing the largest need: the negative impact of social stigma. The current work introduces a new paradigm for technologies for autism, borrowing from a paradigm introduced two decades ago around changing the narrative related to eating disorders, namely the shift from reprimanding poor habits to celebrating positive aspects of eating. This work repurposes Celebratory Technology for neurodiversity and is intended to reduce social stigma by targeting the public at large. This presentation will review how requirements were derived from current research on autism social stigma as well as from design sessions with autistic adults. Congruence between these two sources revealed three key design implications for technology: provide awareness of the autistic experience; generate acceptance of neurodivergence; and cultivate an appreciation for the talents and accomplishments of neurodivergent people. The current pilot work in Celebratory Technology offers a new paradigm for supporting autism by shifting the burden of change from the person with autism to society's biases at large. Shifting the focus of research outside of the autistic body creates a new space for a design that extends beyond the bodies of a few and calls on all to embrace humanity as a whole.
Keywords: neurodiversity, social stigma, accessibility, inclusion, celebratory technology
Procedia PDF Downloads 77
8736 Transferable Knowledge: Expressing Lessons Learnt from Failure to Outsiders
Authors: Stijn Horck
Abstract:
Background: The value of lessons learned from failure increases when these insights can be put to use by those who did not experience the failure. While learning from others has mostly been researched between individuals or teams within the same environment, transferring knowledge from the person who experienced the failure to an outsider comes with extra challenges. As sense-making of failure is an individual process leading to different learning experiences, the potential of lessons learned from failure is highly variable depending on who is transferring them. Using an integrated framework of linguistic aspects related to attributional egotism, this study aims to offer a fuller explanation of the challenges in transferring lessons learned from failures experienced by others. Method: A case study was used of a failed foundation established to address the information needs of GPs in times of COVID-19. An overview of failure causes and lessons learned was made through a preliminary analysis of data collected in two phases, with metaphoric examples of failure types. This was followed by individual narrative interviews with the board members, who had all experienced the same events, to analyse the individual variance of lessons learned through discourse analysis. This research design uses the researcher-as-instrument approach, since the recipient of these lessons learned is the author himself. Results: Thirteen causes were given for why the foundation failed, and nine lessons were formulated. Based on the individually emphasized events, the explanations of failure events mentioned by all or three respondents contained more linguistic aspects related to attributional egotism than those of failure events mentioned by only one or two. Moreover, the learning events mentioned by all or three respondents involved lessons learned based on changed insight, while the lessons expressed by only one or two were based more on direct value. 
Retrospectively, the lessons expressed as a group in the first data-collection phase seem to have captured some, but not all, of the direct-value lessons. Conclusion: Individual variance in expressing lessons learned to outsiders can be reduced using metaphoric or analogical explanations from a third party. In line with attributional egotism theory, individuals separated from a group that has experienced the same failure are more likely to refer to the failure causes that are least likely to be contradicted. Lastly, this study contributes to the academic literature by demonstrating that linguistic analysis is suitable for investigating the knowledge transfer of lessons learned after failure.
Keywords: failure, discourse analysis, knowledge transfer, attributional egotism
Procedia PDF Downloads 119
8735 A Concept for Flexible Battery Cell Manufacturing from Low to Medium Volumes
Authors: Tim Giesen, Raphael Adamietz, Pablo Mayer, Philipp Stiefel, Patrick Alle, Dirk Schlenker
Abstract:
The competitiveness and success of new electrical energy storage devices such as battery cells are significantly dependent on a short time-to-market. Producers who decide to supply new battery cells to the market need manufacturing that is easily adaptable to early customers’ needs in terms of cell size, materials, delivery time, and quantity. In the initial state, the required output rates allow the producers neither to run a fully automated manufacturing line nor to supply handmade battery cells. Until now, there has been no solution for manufacturing battery cells in low to medium volumes in a reproducible way. Thus, a concept for the flexible assembly of battery cells, in terms of cell format and output quantity, was developed by the Fraunhofer Institute for Manufacturing Engineering and Automation. Based on clustered processes, the modular system platform can be modified, enlarged, or retrofitted in a short time frame according to the ordered product. The paper shows the analysis of the production steps of a conventional battery cell assembly line. Process solutions were found by using I/O analysis, functional structures, and morphological boxes. The identified elementary functions were subsequently clustered by functional coherences into automation solutions, and thus the individual process clusters were generated. The result presented in this paper enables different cell products to be manufactured on the same production system using seven process clusters. The paper shows the solution for batch-wise flexible battery cell production using advanced process control. Further, the tests performed and the benefits of using the process clusters as cyber-physical systems for an integrated production and value chain are discussed. The solution lowers the hurdles for SMEs to launch innovative cell products on the global market.
Keywords: automation, battery production, carrier, advanced process control, cyber-physical system
Procedia PDF Downloads 343
8734 A History of Taiwan’s Secret Nuclear Program
Authors: Hsiao-ting Lin
Abstract:
This paper analyzes the history of Taiwan’s secret program to develop nuclear weapons during the Cold War. In July 1971, US President Richard Nixon shocked the world when he announced that his national security adviser, Henry Kissinger, had made a secret trip to China and that he himself had accepted an invitation to travel to Beijing. This huge breakthrough in the US-PRC relationship was followed by Taipei’s loss of political legitimacy and international credibility as a result of its UN debacle in the fall of that year. Confronted with the Nixon White House’s opening to the PRC, leaders in Taiwan felt betrayed and abandoned, and they were obliged to take countermeasures for the sake of national interest and regime survival. Taipei’s endeavor to create an effective nuclear program, including the possible development of nuclear weapons capabilities, fully demonstrates the government’s resolve to pursue its own national policy, even if such a policy was guaranteed to undermine its relations with the United States. With hindsight, Taiwan’s attempt to develop its own nuclear weapons did not succeed in sabotaging the warming of US-PRC relations. Worse, it was forced to come to a full stop when, in early 1988, the US government pressured Taipei to close the related facilities and programs on the island. However, Taiwan’s abortive attempt to develop its nuclear capability did influence Washington’s and Beijing’s handling of their new relationship: there developed a recognition of a common American and PRC interest in avoiding a nuclearized Taiwan. From this perspective, Beijing’s interests would best be served by allowing the island to remain under loose and relatively benign American influence. 
As for the top leaders on Taiwan, such a policy choice demonstrates how they perceived the shifting dynamics of international politics in the 1960s and 1970s, and how they struggled to break free and pursue their own independent national policy within the rigid framework of the US-Taiwan alliance during the Cold War.
Keywords: Taiwan, Richard Nixon, nuclear program, Chiang Kai-shek, Chiang Ching-kuo
Procedia PDF Downloads 136
8733 Qualitative Analysis of Emotional Thoughts from the Perspective of Nurses with Past Working Experience in a Pediatric Hematology-Oncology Unit
Authors: Sevil Inal, Leman Yantiri, Meral Kelleci
Abstract:
Aim: In this study, it was aimed to qualitatively analyze the feelings, thoughts, and meanings of nurses who had past working experience in pediatric hematology. Method: In this qualitative study, in-depth interviews were conducted with 15 nurses between 29 and 53 years of age who had previously worked in a pediatric hematology-oncology unit. Interviews were conducted with a semi-structured interview form, and each lasted 20-30 minutes. Some of the questions were: ‘What do you experience when you think back on the period when you worked in the hematology-oncology service?’ and ‘Can you explain why you have these feelings?’ The data were analyzed with QSR NVivo 7 software. Results: From the perspective of the nurses who had past working experience in the pediatric hematology-oncology service, five main themes and their sub-themes related to emotions and thoughts about this experience were identified: 1) positive and negative emotions: (a) fear and anxiety, (b) desperation, pity, guilt, (c) burnout, (d) longing; 2) coping; 3) professional implications; 4) meaning of life; 5) unmet needs and suggestions. Conclusions: Working in hematology should be viewed as a multidimensional situation that affects the way nurses view their profession and life, leading to a wide range of emotional experiences. Data obtained from this study can be used to support hematology nurses.
Keywords: cancer, child, care, hematology, nursing
Procedia PDF Downloads 241
8732 Powering Profits: A Dynamic Approach to Sales Marketing and Electronics
Authors: Muhammad Awais Kiani, Maryam Kiani
Abstract:
This abstract explores the confluence of these two domains and highlights the key factors driving success in sales marketing for electronics. The abstract begins by delving into the ever-evolving landscape of consumer electronics, emphasizing how technological advancements and the growth of smart devices have revolutionized the way people interact with electronics. This paradigm shift has created tremendous opportunities for sales and marketing professionals to engage with consumers on various platforms and channels. Next, the abstract discusses the pivotal role of effective sales marketing strategies in the electronics industry. It highlights the importance of understanding consumer behavior, market trends, and competitive landscapes, and how this knowledge enables businesses to tailor their marketing efforts to specific target audiences. Furthermore, the abstract explores the significance of leveraging digital marketing techniques, such as social media advertising, search engine optimization, and influencer partnerships, to establish brand identity and drive sales in the electronics market. It emphasizes the power of storytelling and of creating captivating content to engage tech-savvy consumers. Additionally, the abstract emphasizes the role of customer relationship management (CRM) systems and data analytics in optimizing sales marketing efforts, highlighting the importance of leveraging customer insights and analyzing data to personalize marketing campaigns, enhance customer experience, and ultimately drive sales growth. Lastly, the abstract concludes by underlining the importance of adapting to the ever-changing landscape of the electronics industry. It encourages businesses to embrace innovation, stay informed about emerging technologies, and continuously evolve their sales marketing strategies to meet the evolving needs and expectations of consumers. 
Overall, this abstract sheds light on the captivating realm of sales marketing in the electronics industry, emphasizing the need for creativity, adaptability, and a deep understanding of consumers to succeed in this rapidly evolving market.
Keywords: marketing industry, electronics, sales impact, e-commerce
Procedia PDF Downloads 77
8731 Color Conversion Films with CuInS2/ZnS Quantum Dots Embedded in Polystyrene Nanofibers by an Electrospinning Process
Authors: Wonkyung Na, Namhun Kim, Heeyeop Chae
Abstract:
Quantum dots (QDs) are attracting attention due to their excellent optical properties in display, solar cell, biomolecule detection, and lighting applications. Their energy band gap can be easily tuned by controlling their size, which makes QDs especially suitable for light-emitting diode (LED) and lighting applications. Typically, cadmium (Cd)-containing QDs show a narrow photoluminescence (PL) spectrum and high quantum yield. However, Cd is classified as a hazardous material, and its use is being tightly regulated below the 100 ppm level in many countries. InP and CuInS2 (CIS) are being investigated as Cd-free QD materials, and it has recently been demonstrated that the performance of these Cd-free QDs is comparable to that of their Cd-based rivals. Due to their broad emission spectrum, CuInS2 QDs are also well suited to white LEDs. For lighting applications, the QDs should be made into color conversion films, and various film processes with QDs in polymer matrices have been reported. In this work, we synthesized CuInS2 (CIS) QDs, and QD-embedded polystyrene color conversion films were fabricated for white emission by an electrospinning process. As a result, blue light from a blue LED is converted by the color conversion films to white light with a high color rendering index (CRI) of 72.
Keywords: CuInS2/ZnS, electrospinning, color conversion films, white light-emitting diodes
Procedia PDF Downloads 813
8730 Predicting Mass-School-Shootings: Relevance of the FBI’s ‘Threat Assessment Perspective’ Two Decades Later
Authors: Frazer G. Thompson
Abstract:
The 1990s in America ended with a mass school shooting (at least four killed by gunfire, excluding the perpetrator(s)) at Columbine High School in Littleton, Colorado. Post-event, many demanded that government and civilian experts develop a ‘profile’ of the potential school shooter in order to identify and preempt likely future acts of violence. This grounded-theory research study seeks to explore the validity of the original hypotheses proposed by the Federal Bureau of Investigation (FBI) in 2000, as they relate to the commonality of disclosure by perpetrators of mass school shootings, by evaluating fourteen mass-school-shooting events between 2000 and 2019 at locations around the United States. Methods: The strategy of inquiry investigates case files, public records, witness accounts, and available psychological profiles of the shooters. The research methodology includes one-on-one interviews with members of the FBI’s Critical Incident Response Group, seeking perspective on commonalities between individuals, specifically disclosure of intent pre-event. Results: The research determined that school shooters do not ‘unfailingly’ notify others of their plans. However, in nine of the fourteen mass-school-shooting events analyzed, the perpetrator did inform a third party of their intent pre-event in some form of written, oral, or electronic communication. In the remaining five instances, the so-called ‘red-flag’ indicators of the potential for an event to occur were profound and, in themselves, might be interpreted as notification to others of an imminent deadly threat. Conclusion: The data indicate that the conclusions drawn in the FBI’s threat assessment perspective published in 2000 remain relevant and current. 
There is evidence that, despite potential ‘red-flag’ indicators, which may or may not include a variety of other characteristics, perpetrators of mass-school-shooting events are likely to share their intentions with others through some form of direct or indirect communication. More significantly, the implications of this research suggest that society is often informed of potential danger pre-event but lacks any equitable means by which to disseminate, prevent, intervene, or otherwise act in a meaningful way on said revelation.
Keywords: Columbine, FBI profiling, guns, mass shooting, mental health, school violence
Procedia PDF Downloads 122
8729 Neo-Liberalism and Theoretical Explanation of Poverty in Africa: The Nigerian Perspective
Authors: Omotoyosi Bilikies Ilori, Adekunle Saheed Ajisebiyawo
Abstract:
After the Second World War, there was an emergence of a new stage of capitalist globalization with its Neo-liberal ideology. Global economic and political restructurings followed that affected third-world countries like Nigeria. Neo-liberalism is the driving force of globalization, the latest manifestation of imperialism, and it engenders endemic poverty in Nigeria. Poverty is severe and widespread in Nigeria: it entails a situation where a person lives on less than one dollar per day and has no access to the basic necessities of life. Poverty is inhuman and a breach of human rights. The Nigerian government initiated some strategies in the past to help in poverty reduction. Neo-liberalism manifested in Third World countries such as Nigeria through the privatization of public enterprises, trade liberalization, and the rollback of state investment in important social services. These main ideas of Neo-liberalism produced poverty in Nigeria and also encouraged the abandonment of the social contract between the government and the people. There is thus a gap in the provision of social services and subsidies for the masses, all of which Neo-liberal ideological positions contradict. This paper is a qualitative study which draws data from secondary sources. The theoretical framework is anchored in the market theory of capitalist globalization and public choice theory. The objectives of this study are to (i) examine the impacts of Neo-liberalism on poverty in Nigeria as a typical example of a Third World country and (ii) find out the effects of Neo-liberalism on the provision of social services, subsidies, and employment. The findings from this study reveal that (i) the adoption of the Neo-liberal ideology by the Nigerian government has led to increased poverty and poor provision of social services and employment in Nigeria; and (ii) there is an increase in foreign debts, which compounds the poverty situation in Nigeria. 
This study makes the following recommendations: (i) the government should adopt pro-poor strategies to eradicate poverty; (ii) the trade unions and the masses should develop strategies to challenge Neo-liberalism and reject the Neo-liberal ideology.
Keywords: neo-liberalism, poverty, employment, poverty reduction, structural adjustment programme
Procedia PDF Downloads 91
8728 Multisource (RF and Solar) Energy Harvesting for Internet of Things (IoT)
Authors: Emmanuel Ekwueme, Anwar Ali
Abstract:
As the Internet of Things (IoT) continues to expand, the demand for battery-free devices is increasing, which is crucial for the efficiency of 5G networks and eco-friendly industrial systems. The ideal solution is a device that operates indefinitely, requires no maintenance, and has no negative impact on the ambient environment. One promising approach to achieve this is energy harvesting, which involves capturing energy from the ambient environment and using it to power devices. This method can revolutionize industries such as manufacturing, agriculture, and healthcare by enabling real-time data collection and analysis, reducing maintenance costs, improving efficiency, and contributing to a future with lower carbon emissions. This research explores various energy harvesting techniques, focusing on radio frequency (RF) and multiple energy sources. It examines RF-based and solar methods for powering battery-free sensors, low-power circuits, and IoT devices. The study investigates a hybrid RF-solar harvesting circuit designed for remote sensing devices. The proposed system includes distinct RF and solar energy harvester circuits, with the RF harvester operating at 2.45 GHz and the solar harvester utilizing a maximum power point tracking (MPPT) algorithm to maximize efficiency.
Keywords: radio frequency, energy harvesting, Internet of Things (IoT), multisource, solar energy
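The abstract does not specify which MPPT algorithm the solar harvester uses; a common choice is perturb-and-observe, sketched below. The panel curve, step size, and iteration count are illustrative assumptions, not the authors' circuit:

```python
def perturb_and_observe(power_at, v_start, step=0.1, iterations=50):
    """Hill-climb the panel's P-V curve: keep perturbing the operating
    voltage in the direction that last increased harvested power."""
    v = v_start
    p_prev = power_at(v)
    direction = 1.0
    for _ in range(iterations):
        v += direction * step
        p = power_at(v)
        if p < p_prev:  # power fell, so reverse the perturbation direction
            direction = -direction
        p_prev = p
    return v

# Hypothetical P-V curve with its maximum power point at 3.0 V
v_mpp = perturb_and_observe(lambda v: 9.0 - (v - 3.0) ** 2, v_start=1.0)
print(round(v_mpp, 1))
```

In steady state the operating point oscillates around the maximum power point within one step of it, which is the usual trade-off of perturb-and-observe between tracking speed and ripple.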
Procedia PDF Downloads 19
8727 Weakly Non-Linear Stability Analysis of Newtonian Liquids and Nanoliquids in Shallow, Square and Tall High-Porosity Enclosures
Authors: Pradeep G. Siddheshwar, K. M. Lakshmi
Abstract:
The present study deals with a weakly non-linear stability analysis of Rayleigh-Benard-Brinkman convection in nanoliquid-saturated porous enclosures. The modified Buongiorno-Brinkman model (MBBM) is used for the conservation of linear momentum in a nanoliquid-saturated porous medium under the Boussinesq approximation. Thermal equilibrium is imposed between the base liquid and the nanoparticles. The thermophysical properties of the nanoliquid are modeled using phenomenological laws and mixture theory. The fifth-order Lorenz model is derived for the problem and is then reduced to the first-order Ginzburg-Landau equation (GLE) using the multi-scale method. The analytical solution of the GLE for the amplitude is then used to quantify the heat transport in closed form, in terms of the Nusselt number. It is found that the addition of a dilute concentration of nanoparticles significantly enhances the heat transport, the dominant reason being the high thermal conductivity of the nanoliquid in comparison with that of the base liquid. This aspect of nanoliquids helps in the speedy removal of heat. The porous medium serves the purpose of retaining energy in the system due to its low thermal conductivity. The present model enables a unified study covering the base liquid, the nanoliquid, the base-liquid-saturated porous medium, and the nanoliquid-saturated porous medium. Three different types of enclosure are considered by taking different values of the aspect ratio, and it is observed that heat transport is maximum in the tall porous enclosure and least in the shallow one. A detailed discussion is also given of the estimation of heat transport for different volume fractions of nanoparticles. The results of the single-phase model are shown to be a limiting case of the present study. 
The study is made for three boundary combinations, viz., free-free, rigid-rigid, and rigid-free.
Keywords: Buongiorno model, Ginzburg-Landau equation, Lorenz equations, porous medium
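For orientation, a first-order cubic Ginzburg-Landau equation of the kind referred to above, and the way the Nusselt number follows from its amplitude, can be written generically as follows; the coefficients Q1, Q2, and c stand in for the paper's derived expressions, which are not given in the abstract:

```latex
% Generic first-order cubic Ginzburg-Landau equation for the amplitude A(\tau)
\frac{dA}{d\tau} = Q_1 A - Q_2 A^{3},
\qquad A^{2}\big|_{\text{steady}} = \frac{Q_1}{Q_2},
% with the heat transport quantified schematically through the amplitude as
\mathrm{Nu} = 1 + c\,A^{2}.
```

The closed-form steady amplitude is what allows the Nusselt number to be evaluated analytically rather than by integrating the full fifth-order Lorenz system.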
Procedia PDF Downloads 324
8726 In vitro Assessment of Tomato (Lycopersicon esculentum) and Cauliflower (Brassica oleracea) Seedling Growth and Proline Production under Salt Stress
Authors: Amir Wahid, Fazal Hadi, Amin Ullah Jan
Abstract:
Tomato and cauliflower seedlings were grown in vitro under a range of salt concentrations (0, 2, 4, 6, 8, and 10 dSm-1) with the objectives of investigating: (1) the effect of salinity on seedling growth and free proline production, (2) the correlation between seedling growth and proline content, and (3) the comparative salt tolerance of both species. The different salt concentrations showed a considerable effect on percent (%) germination of seeds, on the length and biomass of shoot and root, and on the percent water content of the seedlings of both plants. The germination rate in cauliflower was two times higher than in tomato, even at the highest salt concentration (10 dSm-1). Seedling growth of both species was little affected at low salt concentrations (2 and 4 dSm-1), but at high concentrations (6 and 8 dSm-1) it was significantly decreased; the tomato root, in particular, was highly significantly reduced. The proline level increased linearly in both species with increasing salt concentration up to 4 dSm-1 and then declined. Cauliflower showed a higher free proline level than tomato under all salt treatments. Overall, the cauliflower seedlings showed a better growth response, along with higher proline contents, in comparison with the tomato seedlings.
Keywords: NaCl (sodium chloride), EC (electrical conductivity), MS (Murashige and Skoog), ANOVA (analysis of variance), LSD (least significant difference)
Procedia PDF Downloads 559
8725 Comparative Analysis of Various Waste Oils for Biodiesel Production
Authors: Olusegun Ayodeji Olagunju, Christine Tyreesa Pillay
Abstract:
Biodiesel from waste sources is regarded as an economical and highly viable alternative to depleting fossil fuels. In this work, biodiesel was produced by the transesterification method from three different sources of vegetable-based waste cooking oil collected from cafeterias. The free fatty acid contents (% FFA) of the feedstocks were determined by titration; the results for sources 1, 2, and 3 were 0.86%, 0.54%, and 0.20%, respectively. The three process variables considered were temperature, reaction time, and catalyst concentration, within the following ranges: 50-70 °C, 30-90 min, and 0.5-1.5% catalyst. The produced biodiesel was characterized using ASTM standard methods for biodiesel property testing to determine the fuel properties, including kinematic viscosity, specific gravity, flash point, pour point, cloud point, and acid number. The results indicate that the biodiesel yield from source 3 was greater than that from the other sources, and all fuel properties of the produced biodiesel are within the standard biodiesel fuel specification ASTM D6751. The optimum yields of 98.76%, 96.4%, and 94.53% were obtained from source 3, source 2, and source 1, respectively, at the optimum operating conditions of 65 °C, 90 minutes reaction time, and 0.5 wt% potassium hydroxide catalyst.
Keywords: waste cooking oil, biodiesel, free fatty acid content, potassium hydroxide catalyst, optimization analysis
Procedia PDF Downloads 81