Search results for: conventional turning
3103 A Methodology for Seismic Performance Enhancement of RC Structures Equipped with Friction Energy Dissipation Devices
Authors: Neda Nabid
Abstract:
Friction-based supplemental devices have been extensively used for seismic protection and strengthening of structures; however, the conventional use of these dampers may not necessarily lead to an efficient structural performance. Conventionally designed friction dampers follow a uniform height-wise distribution pattern of slip load values for practical simplicity. This can localize structural damage in certain story levels, while the other stories accommodate a negligible amount of relative displacement demand. A practical performance-based optimization methodology is developed to tackle structural damage localization in RC frame buildings with friction energy dissipation devices under severe earthquakes. The proposed methodology is based on the concept of uniform damage distribution theory. According to this theory, the slip load values of the friction dampers redistribute and shift from stories with lower relative displacement demand to the stories with higher inter-story drifts, to narrow the discrepancy between the structural damage levels in different stories. In this study, the efficacy of the proposed design methodology is evaluated through the seismic performance of five different low- to high-rise RC frames equipped with friction wall dampers under six real spectrum-compatible design earthquakes. The results indicate that, compared to the conventional design, using the suggested methodology to design friction wall systems can lead, on average, to up to a 40% reduction in maximum inter-story drift and a remarkably more uniform height-wise distribution of relative displacement demands under the design earthquakes.
Keywords: friction damper, nonlinear dynamic analysis, RC structures, seismic performance, structural damage
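Below is a minimal sketch of the kind of slip-load redistribution rule that the uniform damage distribution concept implies, shifting damper slip load toward stories with above-average drift demand while keeping the total slip load constant. The update exponent, tolerance, and the toy drift model are illustrative assumptions, not values or procedures from the paper; in the study itself the drift demands come from nonlinear response-history analysis of the RC frames.

```python
import numpy as np

def redistribute_slip_loads(slip, drift_fn, alpha=0.2, tol=0.02, max_iter=50):
    """Shift damper slip load toward stories with above-average drift demand.

    slip     : per-story slip loads (kN)
    drift_fn : returns per-story inter-story drift ratios for a slip-load pattern
               (in the paper this would come from nonlinear response-history analysis)
    alpha    : assumed exponent controlling how aggressively load is shifted
    """
    slip = np.asarray(slip, dtype=float)
    for _ in range(max_iter):
        d = drift_fn(slip)
        if np.max(np.abs(d - d.mean())) / d.mean() < tol:   # drifts nearly uniform
            break
        total = slip.sum()
        slip = slip * (d / d.mean()) ** alpha                # raise slip load where drift is high
        slip *= total / slip.sum()                           # keep the total slip load constant
    return slip

# purely hypothetical drift model for demonstration: drift falls as local slip load rises
base_demand = np.array([0.8, 1.0, 1.3, 1.6, 1.1])
toy_drift = lambda s: base_demand / (s / s.mean()) ** 0.5
print(redistribute_slip_loads(np.full(5, 100.0), toy_drift).round(1))
```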
Procedia PDF Downloads 226
3102 Seismicity and Ground Response Analysis for MP Tourism Office in Indore, India
Authors: Deepshikha Shukla, C. H. Solanki, Mayank Desai
Abstract:
In the last few years, earthquakes have proved to be a growing threat for scientists across the world. With a large number of earthquakes occurring in day-to-day life, the threat to life and property has increased manifold, which calls for the urgent attention of researchers globally to carry out research in the field of Earthquake Engineering. Any hazard related to earthquakes and seismicity is considered a seismic hazard. The common forms of seismic hazards are Ground Shaking, Structure Damage, Structural Hazards, Liquefaction, Landslides, and Tsunami, to name a few. Among all the natural hazards, the most devastating and damaging is the earthquake, as all the other hazards are triggered only after the occurrence of an earthquake. In order to quantify and estimate the seismicity and seismic hazards, many methods and approaches have been proposed in the past few years. Such approaches are Mathematical, Conventional, and Computational. Convex Set Theory and the Empirical Green’s Function are some of the Mathematical Approaches, whereas the Deterministic and Probabilistic Approaches are the Conventional Approaches for the estimation of seismic hazards. The ground response and ground shaking of a particular area or region play an important role in the damage caused by an earthquake. In this paper, a seismic study using the Deterministic Approach and 1D Ground Response Analysis has been carried out for the Madhya Pradesh Tourism Office in Indore, Madhya Pradesh, in Central India. Indore lies in seismic zone III (IS: 1893, 2002) of the Seismic Zoning map of India. There are various faults and lineaments in this area, and the Narmada-Son Fault and the Gavilgadh Fault are the active sources of earthquakes in the study area. Deepsoil v6.1.7 has been used to perform the 1D Linear Ground Response Analysis for the study area. The Peak Ground Acceleration (PGA) of the city ranges from 0.1g to 0.56g.
Keywords: seismicity, seismic hazards, deterministic, probabilistic methods, ground response analysis
Procedia PDF Downloads 165
3101 Performance Comparison of Droop Control Methods for Parallel Inverters in Microgrid
Authors: Ahmed Ismail, Mustafa Baysal
Abstract:
Although the world's energy supply is still mainly based on fossil fuels today, there is a need for alternative energy generation systems, which are more economic and environmentally friendly, due to the continuously increasing demand for electric energy and limited power resources and networks. Distributed Energy Resources (DERs) such as fuel cells, wind, and solar power have recently become widespread as alternative generation. In order to solve several problems that might be encountered when integrating DERs into the power system, the microgrid concept has been proposed. A microgrid can operate in both grid-connected and island modes, to the benefit of both the utility and customers. For most distributed energy resources (DERs) connected in parallel in the LV grid, such as micro-turbines, wind plants, fuel cells, and PV cells, electrical power is generated as direct current (DC) and converted to alternating current (AC) by inverters. The inverters are therefore considered primary components in a microgrid. There are many control techniques for parallel inverters to manage active and reactive power sharing of the loads. Some of them are based on the droop method. In the literature, studies usually focus on improving the transient performance of inverters. In this study, the performance of two different controllers based on the droop control method is compared for inverters operated in parallel without any communication feedback. For this aim, a microgrid in which inverters are controlled by a conventional droop controller and by a modified droop controller is designed. The modified controller is obtained by adding a PID term to the conventional droop control. The active and reactive power sharing performance and the voltage and frequency responses of these control methods are measured in several operational cases. The study cases have been simulated in MATLAB-Simulink.
Keywords: active and reactive power sharing, distributed generation, droop control, microgrid
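For readers unfamiliar with the two controllers being compared, here is a hedged sketch of a conventional P-f/Q-V droop law and a PID-augmented variant; all gains, set points, and the exact placement of the PID correction are illustrative assumptions rather than the controllers implemented in the study.

```python
def conventional_droop(P, Q, f0=50.0, V0=1.0, m=1e-5, n=5e-4):
    """Conventional droop: frequency and voltage sag linearly with P and Q."""
    return f0 - m * P, V0 - n * Q

class ModifiedDroop:
    """Droop law whose frequency reference is corrected by a PID term acting on
    the active-power sharing error (an assumed structure, for illustration)."""
    def __init__(self, m=1e-5, n=5e-4, kp=0.5, ki=5.0, kd=0.0, dt=1e-3):
        self.m, self.n = m, n
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.int_err, self.prev_err = 0.0, 0.0

    def step(self, P, Q, P_ref, f0=50.0, V0=1.0):
        err = P_ref - P                                  # power-sharing error
        self.int_err += err * self.dt
        derr = (err - self.prev_err) / self.dt
        self.prev_err = err
        pid = self.kp * err + self.ki * self.int_err + self.kd * derr
        return f0 - self.m * (P - pid), V0 - self.n * Q  # PID reshapes the droop reference

ctrl = ModifiedDroop()
print(conventional_droop(P=2000.0, Q=500.0))
print(ctrl.step(P=2000.0, Q=500.0, P_ref=2500.0))
```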
Procedia PDF Downloads 592
3100 Computational Investigation of Secondary Flow Losses in Linear Turbine Cascade by Modified Leading Edge Fence
Authors: K. N. Kiran, S. Anish
Abstract:
It is well known that secondary flow losses account for about one third of the total loss in any axial turbine. Modern gas turbine blades have smaller heights and longer chord lengths, which might lead to an increase in secondary flow. In order to improve the efficiency of the turbine, it is important to understand the behavior of secondary flow and devise mechanisms to curtail these losses. The objective of the present work is to understand the effect of a streamwise end-wall fence on the aerodynamics of a linear turbine cascade. The study is carried out computationally by using the commercial software ANSYS CFX. The effects of the end-wall fence on the flow field are calculated from RANS simulations using the SST transition turbulence model. The Durham cascade, which is similar to a high-pressure axial flow turbine, is used for the simulation. The aim of fencing in the blade passage is to get the maximum benefit from flow deviation and from destroying the passage vortex in terms of loss reduction. It is observed that, for the present analysis, the fence in the blade passage helps reduce the strength of the horseshoe vortex and is capable of restraining the flow along the blade passage. The fence in the blade passage helps in reducing the underturning by 70 in comparison with the base case. The fence on the end-wall is effective in preventing the movement of the pressure-side leg of the horseshoe vortex and helps in breaking the passage vortex. Computations are carried out for different fence heights whose curvature is different from the blade camber. The optimum fence geometry and location reduce the loss coefficient by 15.6% in comparison with the base case.
Keywords: boundary layer fence, horseshoe vortex, linear cascade, passage vortex, secondary flow
Procedia PDF Downloads 349
3099 Machine Learning Techniques in Seismic Risk Assessment of Structures
Authors: Farid Khosravikia, Patricia Clayton
Abstract:
The main objective of this work is to evaluate the advantages and disadvantages of various machine learning techniques in two key steps of seismic hazard and risk assessment of different types of structures. The first step is the development of ground-motion models, which are used for forecasting ground-motion intensity measures (IMs) given source characteristics, source-to-site distance, and local site conditions for future events. IMs such as peak ground acceleration and velocity (PGA and PGV, respectively), as well as 5% damped elastic pseudospectral accelerations at different periods (PSA), are indicators of the strength of shaking at the ground surface. Typically, linear regression-based models, with pre-defined equations and coefficients, are used in ground motion prediction. However, due to the restrictions of linear regression methods, such models may not capture the more complex nonlinear behaviors that exist in the data. Thus, this study comparatively investigates the potential benefits of employing other machine learning techniques as statistical methods in ground motion prediction, such as Artificial Neural Networks, Random Forests, and Support Vector Machines. The results indicate that the algorithms satisfy some physically sound characteristics, such as magnitude scaling and distance dependency, without requiring pre-defined equations or coefficients. Moreover, it is shown that, when sufficient data are available, all the alternative algorithms tend to provide more accurate estimates compared to the conventional linear regression-based method and, in particular, Random Forest outperforms the other algorithms. However, the conventional method is a better tool when limited data are available. Second, it is investigated how machine learning techniques could be beneficial for developing probabilistic seismic demand models (PSDMs), which provide the relationship between the structural demand responses (e.g., component deformations, accelerations, internal forces, etc.) and the ground motion IMs. In the risk framework, such models are used to develop fragility curves estimating the probability of exceeding pre-defined damage limit states and, therefore, control the reliability of the predictions in the risk assessment. In this study, machine learning algorithms such as artificial neural networks, random forests, and support vector machines are adopted and trained on the demand parameters to derive PSDMs. It is observed that such models can provide more accurate estimates in a relatively shorter amount of time compared to conventional methods. Moreover, they can be used for sensitivity analysis of fragility curves with respect to many modeling parameters without necessarily requiring more intensive numerical response-history analysis.
Keywords: artificial neural network, machine learning, random forest, seismic risk analysis, seismic hazard analysis, support vector machine
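The following is a small, self-contained comparison of a linear regression-based ground-motion model against a Random Forest, in the spirit of the study; the feature set (magnitude, distance, Vs30) and the synthetic ln(PGA) relation are assumptions made only so the snippet runs, not the study's dataset or functional form.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000
M = rng.uniform(4.0, 7.5, n)            # magnitude
R = rng.uniform(5.0, 200.0, n)          # source-to-site distance (km)
vs30 = rng.uniform(180.0, 760.0, n)     # local site condition proxy
# synthetic ln(PGA): magnitude scaling, geometric spreading, site term, plus scatter
ln_pga = -1.5 + 1.2 * M - 1.6 * np.log(R + 10) - 0.4 * np.log(vs30 / 760) + rng.normal(0, 0.5, n)

X = np.column_stack([M, np.log(R + 10), np.log(vs30)])
X_tr, X_te, y_tr, y_te = train_test_split(X, ln_pga, test_size=0.25, random_state=0)

for name, model in [("linear regression", LinearRegression()),
                    ("random forest", RandomForestRegressor(n_estimators=300, random_state=0))]:
    model.fit(X_tr, y_tr)
    rmse = np.sqrt(mean_squared_error(y_te, model.predict(X_te)))
    print(f"{name}: test RMSE of ln(PGA) = {rmse:.3f}")
```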
Procedia PDF Downloads 106
3098 Assumption of Cognitive Goals in Science Learning
Authors: Mihail Calalb
Abstract:
The aim of this research is to identify ways for achieving sustainable conceptual understanding within science lessons. For this purpose, a set of teaching and learning strategies, parts of the theory of visible teaching and learning (VTL), is studied. As a result, a new didactic approach named "learning by being" is proposed and its correlation with educational paradigms existing nowadays in science teaching domain is analysed. In the context of VTL the author describes the main strategies of "learning by being" such as guided self-scaffolding, structuring of information, and recurrent use of previous knowledge or help seeking. Due to the synergy effect of these learning strategies applied simultaneously in class, the impact factor of learning by being on cognitive achievement of students is up to 93 % (the benchmark level is equal to 40% when an experienced teacher applies permanently the same conventional strategy during two academic years). The key idea in "learning by being" is the assumption by the student of cognitive goals. From this perspective, the article discusses the role of student’s personal learning effort within several teaching strategies employed in VTL. The research results emphasize that three mandatory student – related moments are present in each constructivist teaching approach: a) students’ personal learning effort, b) student – teacher mutual feedback and c) metacognition. Thus, a successful educational strategy will target to achieve an involvement degree of students into the class process as high as possible in order to make them not only know the learning objectives but also to assume them. In this way, we come to the ownership of cognitive goals or students’ deep intrinsic motivation. A series of approaches are inherent to the students’ ownership of cognitive goals: independent research (with an impact factor on cognitive achievement equal to 83% according to the results of VTL); knowledge of success criteria (impact factor – 113%); ability to reveal similarities and patterns (impact factor – 132%). Although it is generally accepted that the school is a public service, nonetheless it does not belong to entertainment industry and in most of cases the education declared as student – centered actually hides the central role of the teacher. Even if there is a proliferation of constructivist concepts, mainly at the level of science education research, we have to underline that conventional or frontal teaching, would never disappear. Research results show that no modern method can replace an experienced teacher with strong pedagogical content knowledge. Such a teacher will inspire and motivate his/her students to love and learn physics. The teacher is precisely the condensation point for an efficient didactic strategy – be it constructivist or conventional. In this way, we could speak about "hybridized teaching" where both the student and the teacher have their share of responsibility. In conclusion, the core of "learning by being" approach is guided learning effort that corresponds to the notion of teacher–student harmonic oscillator, when both things – guidance from teacher and student’s effort – are equally important.Keywords: conceptual understanding, learning by being, ownership of cognitive goals, science learning
Procedia PDF Downloads 167
3097 Isolation and Characterization of an Ethanol Resistant Bacterium from Sap of Saccharum officinarum for Efficient Fermentation
Authors: Rukshika S Hewawasam, Sisira K. Weliwegamage, Sanath Rajapakse, Subramanium Sotheeswaran
Abstract:
Biofuel is one of the emerging industries around the world due to the rising crisis in petroleum fuel. Fermentation is a cost-effective and eco-friendly process in the production of biofuel, so innovations in microbes, substrates, and fermentation technologies lead to new modifications of the process. One major problem in microbial ethanol fermentation is the low resistance of conventional microorganisms to high ethanol concentrations, which ultimately leads to a decrease in the efficiency of the process. In the present investigation, an ethanol-resistant bacterium was isolated from the sap of Saccharum officinarum (sugar cane). The optimal culture conditions, such as pH, temperature, and incubation period, and the microbiological, morphological, and biochemical characteristics, ethanol tolerance, sugar tolerance, and growth curve were investigated. The isolated microorganism tolerated an ethanol concentration of 18% (v/v) and a glucose concentration of 40% (v/v) in the medium. Biochemical characterization revealed it as Gram-negative, non-motile, and negative for the Indole test, Methyl Red test, Voges-Proskauer's test, Citrate Utilization test, and Urease test. A positive result for the Oxidase test was shown by the isolated bacterium. Sucrose, Glucose, Fructose, Maltose, Dextrose, Arabinose, Raffinose, Lactose, and Saccharose can be utilized by this particular bacterium, which is a significant feature for effective fermentation. The fermentation process was carried out in a glucose medium under optimum conditions: pH 4, temperature 30˚C, and incubation for 72 hours. The maximum ethanol production was recorded as 12.0±0.6% (v/v). Methanol was not detected in the final product of the fermentation process. This bacterium is especially useful in biofuel production due to its high ethanol tolerance; it can be used to enhance the fermentation process over conventional microorganisms. Investigations are currently being conducted to establish the identity of the bacterium.
Keywords: bacterium, bio-fuel, ethanol tolerance, fermentation
Procedia PDF Downloads 340
3096 A Comparative Study of the Use of Medicinal Plants and Conventional Medicine for the Treatment of Hepatitis B Virus in Ibadan Metropolis
Authors: Julius Adebayo John
Abstract:
The objective of this study is to compare the use of medicinal plants and Conventional medicine intervention in the management of HBV among Ibadan populace. A purposive sampling technique was used to administer questionnaires at 2 places, namely, the University College Hospital and Total Healthcare Diagnostic Centre, Ibadan, where viral loads are carried out. A EuroQol (EQ – 5D) was adopted to collect data. Descriptive and inferential analyses were performed. Also, ANOVA, Correlation, charts, and tables were used. Findings revealed a high prevalence of HBV among female respondents and sample between ages 26years to 50years. Results showed that the majority discovered their health status through free HBV tests. Analysis indicated that the use of medicinal plant extract is cost-effective in 73% of cases. Rank order utility derived from medicinal plants is higher than other interventions. Correlation analysis performed for the current health status of respondents were significant at P<0.01 against the intervention management adopted (0.046), cost of treatment (0.549), utility (0.407) at P<0.00, duration of the treatment (0.604) at P<0.01; viral load before treatment (-0.142) not significant at P<0.01, the R2 (72.2%) showed the statistical variance in respondents current health status as explained by the independent variables. Respondents gained quality-adjusted life-years (QALYs) of between 1year to 3years. Suggestions were made for a public-private partnership effort against HBV with emphasis on periodic screening, viral load test subsidy, and free vaccination of people with –HBV status. Promoting phytomedicine through intensive research with strong regulation of herbal practitioners will go a long way in alleviating the burdens of the disease in society.Keywords: medicinal plant, HBV management interventions, utility, QALYs, ibadan metropolis
Procedia PDF Downloads 155
3095 Effect of Perioperative Multimodal Analgesia on Postoperative Opioid Consumption and Complications in Elderly Traumatic Hip Fracture Patients: A Systematic Review of Randomised Controlled Trials
Authors: Raheel Shakoor Siddiqui, Shahbaz Malik, Manikandar Srinivas Cheruvu, Sanjay Narayana Murthy, Livio DiMascio
Abstract:
Background: elderly traumatic hip fracture patients frequently present to trauma services globally. Rising low energy falls amongst an osteoporotic aging population is the commonest cause for injury. Hip fractures in this population are a major cause for severe pain, morbidity and mortality. The term hip fracture is interchangeable with neck of femur fracture, fractured neck of femur or proximal femur fracture. Hip fracture pain management protocols and guidelines suggest conventional analgesia, nerve block and opioid based treatment as rescue analgesia. There is a current global opioid crisis with overuse, abuse and dependence. Adverse opioid related complications in vulnerable elderly patients further adds to morbidity and mortality. Systematic reviews in literature have evidenced superiority of multimodal analgesia in osteoarthritic primary joint replacements compared to opioids however, this has not yet been conducted for elderly traumatic hip fracture patients. Aims: The primary aim of this systematic review is to provide standardised evidence following Cochrane and PRISMA guidance in determining advantages of perioperative multimodal analgesia over conventional opioid based treatments in elderly traumatic hip fractures. Methods: 5 databases were searched from January 2000-2023 which identified 8 randomised controlled trials and 446 total participants. These trials met defined PICOS eligibility criteria of patient mean age ≥ 65 years presenting with a unilateral traumatic fractured neck of femur for operative intervention. Analgesic intervention with perioperative multimodal analgesia has been compared to conventional opioid based analgesia. Outcomes of interest include, primarily, the change in postoperative opioid consumption within a 0-30 postoperative period and secondarily, the change in postoperative adverse events and complications. A qualitative synthesis has been performed due to clinical heterogenicity and variance amongst trials. Results: GRADE evidence of moderate quality supports perioperative multimodal analgesia leads to a reduction in postoperative opioid consumption however, low quality evidence supports a reduction of adverse effects and complications. Conclusion: Perioperative multimodal analgesia whether used preoperative, intraoperative and/or postoperative leads to a reduction in postoperative opioid consumption for elderly traumatic hip fracture patients. This review recommends the use of perioperative multimodal analgesia as part of hip fracture pain protocols however, caution and clinical judgement should be used as the risk of adverse effects may not be lower.Keywords: trauma, orthopaedics, hip, fracture, neck of femur fracture, analgesia, multimodal analgesia, opioid
Procedia PDF Downloads 97
3094 Egyptian Soil Isolate Shows Promise as a Source of a New Broad-spectrum Antimicrobial Agent Against Multidrug-resistant Pathogens
Authors: Norhan H. Mahdally, Bathini Thissera Riham A. ElShiekh, Noha M. Elhosseiny, Mona T. Kashef, Ali M. El Halawany, Mostafa E. Rateb, Ahmed S. Attia
Abstract:
Multidrug-resistant (MDR) pathogens pose a global threat to healthcare settings. The exhaustion of the current antibiotic arsenal and the scarcity of new antimicrobials in the pipeline aggravate this threat and necessitate a prompt and effective response. This study focused on two major pathogens that can cause serious infections: carbapenem-resistant Acinetobacter baumannii (CRAB) and methicillin-resistant Staphylococcus aureus (MRSA). Multiple soil isolates were collected from several locations throughout Egypt and screened for their conventional and non-conventional antimicrobial activities against MDR pathogens. One isolate exhibited potent antimicrobial activity and was subjected to multiple rounds of fractionation. After fermentation and bio-guided fractionation, we identified pure microbial secondary metabolites with two scaffolds that exhibited promising effects against CRAB and MRSA. Scaling up and chemical synthesis of derivatives of the identified metabolite resulted in obtaining a more potent derivative, which we designated as 2HP. Cytotoxicity studies indicated that 2HP is well-tolerated by human cells. Ongoing work is focusing on formulating the new compound into a nano-formulation to enhance its delivery. Also, to have a better idea about how this compound works, a proteomic approach is currently underway. Our findings suggest that 2HP is a potential new broad-spectrum antimicrobial agent. Further studies are needed to confirm these findings and to develop 2HP into a safe and effective treatment for MDR infections.Keywords: broad-spectrum antimicrobials, carbapenem-resistant acinetobacter baumannii, drug discovery, methicillin-resistant staphylococcus aureus, multidrug-resistant, natural products
Procedia PDF Downloads 80
3093 Strategic Asset Allocation Optimization: Enhancing Portfolio Performance Through PCA-Driven Multi-Objective Modeling
Authors: Ghita Benayad
Abstract:
Asset allocation, which affects the long-term profitability of portfolios by distributing assets to fulfill a range of investment objectives, is the cornerstone of investment management in the dynamic and complicated world of financial markets. This paper offers a technique for optimizing strategic asset allocation with the goal of improving portfolio performance by addressing the inherent complexity and uncertainty of the market through the use of Principal Component Analysis (PCA) in a multi-objective modeling framework. The first section of the study starts with a critical evaluation of conventional asset allocation techniques, highlighting how poorly they are able to capture the intricate relationships between assets and the volatile nature of the market. In order to overcome these challenges, the project suggests a PCA-driven methodology that isolates important characteristics influencing asset returns by decreasing the dimensionality of the investment universe. This reduction provides a stronger basis for asset allocation decisions by facilitating a clearer understanding of market structures and behaviors. Using a multi-objective optimization model, the project builds on this foundation by taking into account a number of performance metrics at once, including risk minimization, return maximization, and the accomplishment of predetermined investment goals such as regulatory compliance or sustainability standards. This model provides a more comprehensive understanding of investor preferences and portfolio performance in comparison to conventional single-objective optimization techniques. The PCA-driven multi-objective optimization model is then applied to historical market data, aiming to construct portfolios that perform better under different market situations. Compared to portfolios produced by conventional asset allocation methodologies, the results show that portfolios optimized using the proposed method display improved risk-adjusted returns, more resilience to market downturns, and better alignment with specified investment objectives. The study also looks at the implications of this PCA technique for portfolio management, including the prospect that it might give investors a more advanced framework for navigating financial markets. The findings suggest that by combining PCA with multi-objective optimization, investors may obtain a more strategic and informed asset allocation that is responsive to both market conditions and individual investment preferences. In conclusion, this capstone project advances the field of financial engineering by creating a sophisticated asset allocation optimization model that integrates PCA with multi-objective optimization. In addition to raising questions about the current state of asset allocation, the proposed method of portfolio management opens up new avenues for research and application in the area of investment techniques.
Keywords: asset allocation, portfolio optimization, principal component analysis, multi-objective modelling, financial market
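As a rough illustration of the pipeline described above, the sketch below runs PCA on a matrix of historical returns and then solves a weighted-sum scalarization of the two objectives (maximize expected return, minimize variance). The number of components, the idiosyncratic-variance term, the risk-aversion weight, the position bounds, and the random data are all assumptions for demonstration, not the authors' model.

```python
import numpy as np
from scipy.optimize import minimize
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
returns = rng.normal(0.0005, 0.01, size=(1000, 20))   # stand-in for T x N historical returns

pca = PCA(n_components=5)                  # keep the dominant market factors
factors = pca.fit_transform(returns)
B = pca.components_                        # factor loadings, 5 x 20
factor_cov = np.cov(factors, rowvar=False)
# reconstructed covariance plus a simple (assumed) idiosyncratic term
cov = B.T @ factor_cov @ B + np.diag(returns.var(axis=0) * 0.05)
mu = returns.mean(axis=0)

def objective(w, lam=10.0):
    # weighted-sum scalarization: trade expected return against portfolio variance
    return -(w @ mu) + lam * (w @ cov @ w)

n_assets = len(mu)
res = minimize(objective, np.full(n_assets, 1.0 / n_assets),
               bounds=[(0.0, 0.2)] * n_assets,
               constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
print("optimal weights:", np.round(res.x, 3))
```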
Procedia PDF Downloads 47
3092 Calculation of Secondary Neutron Dose Equivalent in Proton Therapy of Thyroid Gland Using FLUKA Code
Authors: M. R. Akbari, M. Sadeghi, R. Faghihi, M. A. Mosleh-Shirazi, A. R. Khorrami-Moghadam
Abstract:
Proton radiotherapy (PRT) is becoming an established treatment modality for cancer. The localized tumors, the same as undifferentiated thyroid tumors are insufficiently handled by conventional radiotherapy, while protons would propose the prospect of increasing the tumor dose without exceeding the tolerance of the surrounding healthy tissues. In spite of relatively high advantages in giving localized radiation dose to the tumor region, in proton therapy, secondary neutron production can have significant contribution on integral dose and lessen advantages of this modality contrast to conventional radiotherapy techniques. Furthermore, neutrons have high quality factor, therefore, even a small physical dose can cause considerable biological effects. Measuring of this neutron dose is a very critical step in prediction of secondary cancer incidence. It has been found that FLUKA Monte Carlo code simulations have been used to evaluate dose due to secondaries in proton therapy. In this study, first, by validating simulated proton beam range in water phantom with CSDA range from NIST for the studied proton energy range (34-54 MeV), a proton therapy in thyroid gland cancer was simulated using FLUKA code. Secondary neutron dose equivalent of some organs and tissues after the target volume caused by 34 and 54 MeV proton interactions were calculated in order to evaluate secondary cancer incidence. A multilayer cylindrical neck phantom considering all the layers of neck tissues and a proton beam impinging normally on the phantom were also simulated. Trachea (accompanied by Larynx) had the greatest dose equivalent (1.24×10-1 and 1.45 pSv per primary 34 and 54 MeV protons, respectively) among the simulated tissues after the target volume in the neck region.Keywords: FLUKA code, neutron dose equivalent, proton therapy, thyroid gland
Procedia PDF Downloads 425
3091 Efficiency of Virtual Reality Exercises with Nintendo Wii System on Balance and Independence in Motor Functions in Hemiparetic Patients: A Randomized Controlled Study
Authors: Ayça Utkan Karasu, Elif Balevi Batur, Gülçin Kaymak Karataş
Abstract:
The aim of this study was to examine the efficiency of virtual reality exercises with Nintendo Wii system on balance and independence in motor functions. This randomized controlled assessor-blinded study included 23 stroke inpatients with hemiparesis all within 12 months poststroke. Patients were randomly assigned to control group (n=11) or experimental group (n=12) via block randomization method. Control group participated in a conventional balance rehabilitation programme. Study group received a four-week balance training programme five times per week with a session duration of 20 minutes in addition to the conventional balance rehabilitation programme. Balance was assessed by the Berg’s balance scale, the functional reach test, the timed up and go test, the postural assessment scale for stroke, the static balance index. Also, displacement of centre of pressure sway and centre of pressure displacement during weight shifting was calculated by Emed-SX system. Independence in motor functions was assessed by The Functional Independence Measure (FIM) ambulation and FIM transfer subscales. The outcome measures were evaluated at baseline, 4th week (posttreatment), 8th week (follow-up). Repeated measures analysis of variance was performed for each of the outcome measure. Significant group time interaction was detected in the scores of the Berg’s balance scale, the functional reach test, eyes open anteroposterior and mediolateral center of pressure sway distance, eyes closed anteroposterior center of pressure sway distance, center of pressure displacement during weight shifting to effected side, unaffected side and total centre of pressure displacement during weight shifting (p < 0.05). Time effect was statistically significant in the scores of the Berg’s balance scale, the functional reach test, the timed up and go test, the postural assessment scale for stroke, the static balance index, eyes open anteroposterior and mediolateral center of pressure sway distance, eyes closed mediolateral center of pressure sway distance, the center of pressure displacement during weight shifting to effected side, the functional independence measure ambulation and transfer scores (p < 0.05). Virtual reality exercises with Nintendo Wii system combined with a conventional balance rehabilitation programme enhances balance performance and independence in motor functions in stroke patients.Keywords: balance, hemiplegia, stroke rehabilitation, virtual reality
Procedia PDF Downloads 221
3090 Narratives of Self-Renewal: Looking for A Middle Earth In-Between Psychoanalysis and the Search for Consciousness
Authors: Marilena Fatigante
Abstract:
Contemporary psychoanalysis is increasingly acknowledging the existential demands of clients in psychotherapy. A significant aspect of the personal crises that patients face today is often rooted in the difficulty of finding meaning in their own existence, even after working through or resolving traumatic memories and experiences. Tracing back to the correspondence between Freud and Romain Rolland (1927), psychoanalysis could not ignore that investigation of the psyche also encompasses the encounter with deep, psycho-sensory experiences, which involve a sense of "being one with the external world as a whole", the well-known “oceanic feeling”, as Rolland posed it. Despite the recognition of Non-ordinary States of Consciousness (NSC) as catalysts for transformation in clinical practice, highlighted by neuroscience and by results from psychedelic-assisted therapies, there is little research on how psychoanalytic knowledge can be integrated with other treatment traditions. These traditions, commonly rooted in non-Western, unconventional, and non-formal psychological knowledge, emphasize the individual’s innate tendency toward existential integrity and the transcendence of self-boundaries. Inspired by an autobiographical account, this paper examines the narratives of 12 individuals who engaged in psychoanalytic therapy and also underwent treatment involving a non-formal helping relationship with an expert guide in consciousness, which included experiences of this nature. The guide relies on 35 years of experience in psychological, multidisciplinary studies in the Human Sciences and Art, and demonstrates knowledge of many wisdom traditions, ranging from Eastern to Western philosophy, including Psychoanalysis and its development in cultural perspective (e.g., Ethnopsychiatry). Analyses focused primarily on two dimensions that research has identified as central in assessing the degree of treatment “success” in patients’ narrative accounts of their therapies: agency and coherence, defined respectively as the increase, expressed in language, of the client’s perceived ability to manage his/her own challenges, and the capacity, inherent in “narrative” itself as a resource for meaning making (Bruner, 1990), to provide the subject with a sense of unity, endowing his/her life experience with temporal and logical sequentiality. The present study reports that, in all the participants' narratives, agency and coherence are described differently than in “common” psychotherapy narratives. Although the participants consistently identified themselves as responsible, agentic subjects, the sense of agency derived from the non-conventional guidance pathway is never reduced to a personal, individual accomplishment. Rather, the more a new, fuller sense of “Life” (more than “Self”) develops out of the guidance pathway they engage in with the expert guide, the more they “surrender” their own sense of autonomy and self-containment, something which Safran (2016) also identified when talking about the sense of surrender and “grace” in psychoanalytic sessions. Secondly, the narratives of individuals engaging with the expert guide describe coherence not as repairing or enforcing continuity but as enhancing their ability to navigate dramatic discontinuities, falls, abrupt leaps, and passages marked by feelings of loss and bereavement. The paper ultimately explores whether valid criteria can be established to analyze experiences of non-conventional paths of self-evolution. These paths are not opposed or alternative to conventional ones and should not be simplistically dismissed as exotic or magical.
Keywords: oceanic feeling, non conventional guidance, consciousness, narratives, treatment outcomes
Procedia PDF Downloads 38
3089 Productivity, Phenolic Composition and Antioxidant Activity of Arrowroot (Maranta arundinacea)
Authors: Maira C. M. Fonseca, Maria Aparecida N. Sediyama, Rosana Goncalves R. das Dores, Sanzio Mollica Vidigal, Alberto C. P. Dias
Abstract:
Among Brazil's plant diversity, many species are used as food and considered minor crops (non-conventional plant foods, NCPF). Arrowroot (Maranta arundinacea) is an NCPF whose starch, extracted from the rhizome, does not contain gluten. Thus, arrowroot flour and starch can be consumed by celiac people. Additionally, some medicinal and functional properties are attributed to arrowroot leaves, which are currently underutilized. In Brazil, it is cultivated mainly by small-scale farmers, and there is no specific recommendation for fertilization. This work aimed to determine the best fertilization for rhizome production and to verify its influence on the phenolic composition and antioxidant activity of leaf extracts. Two arrowroot varieties, “Common” and “Seta”, were cultivated in an organic system in the state of Minas Gerais, Brazil, using cattle manure with three levels of nitrogen (N) (0, 300 and 900 kg N ha-1). The experimental design was a randomized block with four replicates. The highest production of rhizomes in both varieties, “Common” (38198.24 kg ha-1) and “Seta” (43567.71 kg ha-1), was obtained with the use of 300 kg N ha-1. With this fertilization, the total aerial part, petiole, and leaf production in the varieties were, respectively: “Common” (190.312 kg ha-1; 159.312 kg ha-1; 31.100 kg ha-1) and “Seta” (207.656 kg ha-1; 180.539 kg ha-1; 27.062 kg ha-1). Methanolic leaf extracts were analysed by HPLC-DAD. The major phenolic compounds found were caffeoylquinic acids, p-coumaric acid derivatives, and flavonoids. In general, the production of these compounds decreased significantly with the highest level of nitrogen (900 kg N ha-1). With 300 kg N ha-1, the phenolic production was similar to the control. The antioxidant activity was evaluated using the DPPH method, and around 60% radical scavenging was detected when 0.1 mg/mL of plant extract was used. We concluded that fertilization with 300 kg N ha-1 increased arrowroot rhizome production while maintaining the phenolic compound yield in the leaves.
Keywords: antioxidant activity, non-conventional plants, organic fertilization, phenolic compounds
Procedia PDF Downloads 204
3088 Learning-by-Heart vs. Learning by Thinking: Fostering Thinking in Foreign Language Learning A Comparison of Two Approaches
Authors: Danijela Vranješ, Nataša Vukajlović
Abstract:
Turning to learner-centered teaching instead of the teacher-centered approach brought a whole new perspective into the process of teaching and learning and set a new goal for improving the educational process itself. However, recently a tremendous decline in students’ performance on various standardized tests can be observed, above all on the PISA-test. The learner-centeredness on its own is not enough anymore: the students’ ability to think is deteriorating. Especially in foreign language learning, one can encounter a lot of learning by heart: whether it is grammar or vocabulary, teachers often seem to judge the students’ success merely on how well they can recall a specific word, phrase, or grammar rule, but they rarely aim to foster their ability to think. Convinced that foreign language teaching can do both, this research aims to discover how two different approaches to teaching foreign language foster the students’ ability to think as well as to what degree they help students get to the state-determined level of foreign language at the end of the semester as defined in the Common European Framework. For this purpose, two different curricula were developed: one is a traditional, learner-centered foreign language curriculum that aims at teaching the four competences as defined in the Common European Framework and serves as a control variable, whereas the second one has been enriched with various thinking routines and aims at teaching the foreign language as a means to communicate ideas and thoughts rather than reducing it to the four competences. Moreover, two types of tests were created for each approach, each based on the content taught during the semester. One aims to test the students’ competences as defined in the CER, and the other aims to test the ability of students to draw on the knowledge gained and come to their own conclusions based on the content taught during the semester. As it is an ongoing study, the results are yet to be interpreted.Keywords: common european framework of reference, foreign language learning, foreign language teaching, testing and assignment
Procedia PDF Downloads 106
3087 Islam and Democracy: A Paradoxical Study of Syed Maududi and Javed Ghamidi
Authors: Waseem Makai
Abstract:
The term ‘political Islam’ now seems to have gained centre stage in every discourse pertaining to Islamic legitimacy and compatibility in modern civilisations. A never-ceasing tradition of the philosophy of the caliphate, which has kept overriding the options of any alternative political institution in the Muslim world, still permeates a huge faction of believers. Fully accustomed to the proliferation of changes and developments in individual, social, and natural dispositions of the world, Islamic theologians responded to this flux through both conventional and modernist approaches. The so-called conventional approach was quintessentially represented by the interpretations put forth by Syed Maududi, with a new comprehensive, academic, and powerful vigour never seen before. He generated avant-garde scholarship which would bear testimony to his statements, made to uphold the political institution of Islam as supreme and noble. However, it was not his trait to challenge the established views but to codify them in such a bracket that a man of the 20th century would find captivating to his heart and satisfactory to his rationale. The delicate microcosms, like the selection of a caliph, the implementation of Islamic commandments (Sharia), interest-free banking sectors, imposing tax (Jazyah) on non-believers, waging the holy crusade (Jihad) for the expansion of Islamic boundaries, stoning for committing adultery, and capital punishment for apostates, were all there in his scholarship, which he spent the whole of his life defending in the best possible manner. What and where he went wrong with all this was to be pointed out later by his one-time disciple, Javed Ahmad Ghamidi. Ghamidi is accused of struggling between Scylla and Charybdis as he tries to remain steadfast to his basic Islamic tenets while modernising their interpretations to bring them in harmony with the Western ideals of democracy and liberty. His blatant acknowledgement of putting democracy on a high pedestal, calling the implementation of Sharia a non-mandatory task, and his refusal to bracket people into the categories of Zimmi and Kaafir fully vindicate his stance against conventional narratives like those of Syed Maududi. Ghamidi's fate is wretched, in that his allegedly insubstantial claims earned him enough hostility to have to leave his homeland when two of his close allies were brutally murdered. Syed Maududi and Javed Ghamidi both stand poles apart in their understanding of Islam and its political domain. Determining who has the appropriate methodology, scholarship, and execution in his mode of comprehension is an intriguing task, worth carrying out in detail.
Keywords: caliphate, democracy, ghamidi, maududi
Procedia PDF Downloads 200
3086 Microwave Sanitization of Polyester Fabrics
Authors: K. Haggag, M. Salama, H. El-Sayed
Abstract:
Polyester fabrics were sanitized by exposing them to vaporized water under the influence of conventional heating or microwave irradiation. Hydrogen peroxide was added to the humid sanitizing environment as a disinfectant. The said sanitization process was found to be effective against two types of bacteria, namely Escherichia coli ATCC 2666 (G –ve) and Staphylococcus aureus ATCC 6538 (G +ve). The effect of the sanitization process on some of the inherent properties of the polyester fabrics was monitored.
Keywords: polyester, fabric, sanitization, microwave, bacteria
Procedia PDF Downloads 374
3085 Review of Malaria Diagnosis Techniques
Authors: Lubabatu Sada Sodangu
Abstract:
Malaria is a major cause of death in tropical and subtropical nations. Malaria cases are continually rising as a result of a number of factors, despite the fact that the condition is now treatable using effective methods. In this situation, quick and effective diagnostic methods are essential for the management and control of malaria. Malaria diagnosis using conventional methods is still troublesome, hence new technologies have been created and implemented to get around the drawbacks. The review describes the currently known malaria diagnostic techniques, their strengths and shortcomings.Keywords: malaria, technique, diagnosis, Africa
Procedia PDF Downloads 55
3084 Review of Malaria Diagnosis Techniques
Authors: Lubabatu Sada Sodangi
Abstract:
Malaria is a major cause of death in tropical and subtropical nations. Malaria cases are continually rising as a result of a number of factors, despite the fact that the condition is now treatable using effective methods. In this situation, quick and effective diagnostic methods are essential for the management and control of malaria. Malaria diagnosis using conventional methods is still troublesome; hence, new technologies have been created and implemented to get around the drawbacks. The review describes the currently known malaria diagnostic techniques, their strengths, and shortcomings.Keywords: malaria, technique, diagnosis, Africa
Procedia PDF Downloads 60
3083 Assessment of Multi-Domain Energy Systems Modelling Methods
Authors: M. Stewart, Ameer Al-Khaykan, J. M. Counsell
Abstract:
Emissions are a consequence of electricity generation. A major option for low carbon generation, local energy systems featuring Combined Heat and Power with solar PV (CHPV) has significant potential to increase energy performance, increase resilience, and offer greater control of local energy prices while complementing the UK’s emissions standards and targets. Recent advances in dynamic modelling and simulation of buildings and clusters of buildings using the IDEAS framework have successfully validated a novel multi-vector (simultaneous control of both heat and electricity) approach to integrating the wide range of primary and secondary plant typical of local energy systems designs including CHP, solar PV, gas boilers, absorption chillers and thermal energy storage, and associated electrical and hot water networks, all operating under a single unified control strategy. Results from this work indicate through simulation that integrated control of thermal storage can have a pivotal role in optimizing system performance well beyond the present expectations. Environmental impact analysis and reporting of all energy systems including CHPV LES presently employ a static annual average carbon emissions intensity for grid supplied electricity. This paper focuses on establishing and validating CHPV environmental performance against conventional emissions values and assessment benchmarks to analyze emissions performance without and with an active thermal store in a notional group of non-domestic buildings. Results of this analysis are presented and discussed in context of performance validation and quantifying the reduced environmental impact of CHPV systems with active energy storage in comparison with conventional LES designs.Keywords: CHPV, thermal storage, control, dynamic simulation
Procedia PDF Downloads 240
3082 Probabilistic Crash Prediction and Prevention of Vehicle Crash
Authors: Lavanya Annadi, Fahimeh Jafari
Abstract:
Transportation brings immense benefits to society, but it also has its costs. These costs include the cost of infrastructure, personnel, and equipment, but also the loss of life and property in traffic accidents on the road, delays in travel due to traffic congestion, and various indirect costs in terms of air transport. Much research has been done to identify the various factors that affect road accidents, such as road infrastructure, traffic, sociodemographic characteristics, land use, and the environment. The aim of this research is to predict the crash probability of vehicles in the United States using machine learning, based on natural and structural causes and excluding spontaneous causes such as overspeeding. These factors range from weather-related factors, such as weather conditions, precipitation, visibility, wind speed, wind direction, temperature, pressure, and humidity, to man-made road structure factors such as bumps, roundabouts, no exits, turning loops, give-ways, etc. The probabilities are divided into ten different classes. All the predictions are based on multiclass classification techniques, which are supervised learning methods. This study considers all crashes, across all states, collected by the US government. To calculate the probability, the multinomial expected value was used and assigned as the classification label representing the crash probability. We applied three different classification models, namely multiclass Logistic Regression, Random Forest, and XGBoost. The numerical results show that XGBoost achieved a 75.2% accuracy rate, which indicates the role played by natural and structural causes in crashes. The paper provides in-depth insights through exploratory data analysis.
Keywords: road safety, crash prediction, exploratory analysis, machine learning
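A hedged sketch of the multiclass set-up described above, with ten probability classes and two of the three classifiers (multinomial logistic regression and random forest); the feature columns and synthetic labels are assumptions made purely so the snippet runs, and XGBoost would plug into the same loop through its scikit-learn wrapper.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.uniform(0, 1, n),    # visibility (normalized)
    rng.uniform(0, 1, n),    # precipitation
    rng.uniform(0, 1, n),    # wind speed
    rng.integers(0, 2, n),   # roundabout present
    rng.integers(0, 2, n),   # traffic-calming bump present
])
# ten ordinal crash-probability classes (0-9); synthetic labels for illustration only
y = np.clip((5 * X[:, 1] + 3 * (1 - X[:, 0]) + 2 * X[:, 2] + rng.normal(0, 1, n)).astype(int), 0, 9)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for name, clf in [("multinomial logistic regression", LogisticRegression(max_iter=1000)),
                  ("random forest", RandomForestClassifier(n_estimators=200, random_state=0))]:
    clf.fit(X_tr, y_tr)
    print(name, "accuracy:", round(accuracy_score(y_te, clf.predict(X_te)), 3))
```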
Procedia PDF Downloads 111
3081 High and Low Salinity Polymer in Omani Oil Field
Authors: Intisar Al Busaidi, Rashid Al Maamari, Daowoud Al Mahroqi, Mahvash Karimi
Abstract:
In recent years, some research studies have been performed on the hybrid application of polymer and low salinity water flooding (LSWF). Numerous technical and economic benefits of low salinity polymer flooding (LSPF) have been reported. However, as with any EOR technology, there are various risks involved in using LSPF. Ion exchange between the porous media and brine is one of the crude oil/brine/rock (COBR) reactions identified as a potential risk in LSPF. To the best of our knowledge, this conclusion was drawn based on bulk rheology measurements, and no explanation was provided of how the water chemistry changed in the presence of polymer. Therefore, this study aimed to understand rock/brine interactions with high and low salinity brine in the absence and presence of polymer, using Omani reservoir core plugs. Many single-core flooding experiments were performed with low and high salinity polymer solutions to investigate the influence of partially hydrolyzed polyacrylamide at different brine salinities on cation exchange reactions. Ion chromatography (IC), total organic carbon (TOC), rheological, and pH measurements were conducted on the produced aqueous phase. A higher increase in pH and lower polymer adsorption were observed in LSPF compared with conventional polymer flooding. In addition, IC measurements showed that all produced fluids, in the absence and presence of polymer, showed elevated Ca²⁺, Mg²⁺, K⁺, Cl⁻ and SO₄²⁻ ions compared to the injected fluids. However, the divalent cation levels, mainly Ca²⁺, were the highest and remained elevated for several pore volumes in the presence of LSP. The results are in line with the rheological measurements, where the highest viscosity reduction was recorded with the highest level of Ca²⁺ production. Despite the viscosity loss due to cation exchange reactions, LSP can be an attractive alternative to conventional polymer flooding in the Marmul field.
Keywords: polymer, ion exchange, recovery, low salinity
Procedia PDF Downloads 114
3080 Flexible Feedstock Concept in Gasification Process for Carbon-Negative Energy Technology: A Case Study in Malaysia
Authors: Zahrul Faizi M. S., Ali A., Norhuda A. M.
Abstract:
Emission of greenhouse gases (GHG) from solid waste treatment and dependency on fossil fuels to produce electricity are major concerns in Malaysia as well as globally. Innovation in downdraft gasification with combined heat and power (CHP) systems has the potential to minimize solid waste and reduce the emission of anthropogenic GHG from conventional fossil fuel power plants. However, the efficiency and capability of downdraft gasification to generate electricity from various alternative fuels, for instance, agricultural residues (e.g., woodchip, coconut shell) and municipal solid waste (MSW), are still controversial, on top of the toxicity level of the produced bottom ash. Thus, this study evaluates the adaptability and reliability of a 20 kW downdraft gasification system to generate electricity (while considering the environmental sustainability of the bottom ash) using flexible local feedstock at 20, 40, and 60% mixing ratios of MSW to agricultural residues. Feedstock properties such as feed particle size, moisture, and ash content are also analyzed to identify the optimal characteristics of the feedstock combination (feedstock flexibility) for maximum energy generation. Results show that the gasification system is capable of flexibly accommodating different feedstock compositions, subject to a specific particle size (less than 2 inches) and a moisture content between 15 and 20%. These values enhance gasifier performance and have a significant effect on the syngas composition utilized by the internal combustion engine, which reflects energy production. The results obtained in this study provide a new perspective on the transition of the conventional gasification system to a future reliable carbon-negative energy technology, subsequently promoting commercial scale-up of the downdraft gasification system.
Keywords: carbon-negative energy, feedstock flexibility, gasification, renewable energy
Procedia PDF Downloads 135
3079 Value Index, a Novel Decision Making Approach for Waste Load Allocation
Authors: E. Feizi Ashtiani, S. Jamshidi, M.H Niksokhan, A. Feizi Ashtiani
Abstract:
Waste load allocation (WLA) policies may use multi-objective optimization methods to find the most appropriate and sustainable solutions. These usually intend to simultaneously minimize two criteria, total abatement costs (TC) and environmental violations (EV). If other criteria, such as inequity, need to be minimized as well, more binary optimizations must be introduced through different scenarios. In order to reduce the calculation steps, this study presents the value index as an innovative decision-making approach. Since the value index contains both the environmental violations and the treatment costs, it can be maximized simultaneously with the equity index. This implies that the definition of different scenarios for environmental violations is no longer required. Furthermore, the solution is not necessarily the point with minimized total costs or environmental violations. This idea is tested for the Haraz River, in the north of Iran. Here, the dissolved oxygen (DO) level of the river is simulated by the Streeter-Phelps equation in MATLAB software. The WLA is determined for fish farms using multi-objective particle swarm optimization (MOPSO) in two scenarios. In the first, the trade-off curves of TC-EV and TC-Inequity are plotted separately, as in the conventional approach. In the second, the Value-Equity curve is derived. The comparative results show that the solutions are in a similar range of inequity with lower total costs. This is due to the freedom regarding environmental violations attained in the value index. As a result, the conventional approach can well be replaced by the value index, particularly for problems optimizing these objectives. This shortens the process of achieving the best solutions and may yield a better classification for scenario definition. It is also concluded that decision makers would do better to focus on the value index, weighting its contents to find the most sustainable alternatives based on their requirements.
Keywords: waste load allocation (WLA), value index, multi objective particle swarm optimization (MOPSO), Haraz River, equity
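For reference, a minimal sketch of the Streeter-Phelps oxygen-sag calculation that the abstract uses to simulate the river's DO profile; the rate constants, saturation DO, and travel times are illustrative values, not calibrated parameters for the Haraz River.

```python
import numpy as np

def streeter_phelps(D0, L0, kd, ka, t):
    """Dissolved-oxygen deficit D(t) downstream of a waste load.

    D0 : initial DO deficit (mg/L)     L0 : initial BOD (mg/L)
    kd : deoxygenation rate (1/day)    ka : reaeration rate (1/day)
    t  : travel time (days)
    """
    return (kd * L0 / (ka - kd)) * (np.exp(-kd * t) - np.exp(-ka * t)) + D0 * np.exp(-ka * t)

t = np.linspace(0, 10, 101)                      # illustrative travel times
deficit = streeter_phelps(D0=1.0, L0=20.0, kd=0.35, ka=0.7, t=t)
DO = 9.1 - deficit                               # assumed saturation DO of 9.1 mg/L
print("critical (minimum) DO along the reach:", round(DO.min(), 2), "mg/L")
```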
Procedia PDF Downloads 422
3078 Remote Sensing through Deep Neural Networks for Satellite Image Classification
Authors: Teja Sai Puligadda
Abstract:
Detailed satellite images can serve an important role in geographic study. The quantitative and qualitative information provided by satellite and remote sensing images minimizes the complexity of work and the time required. Data/images are captured at regular intervals by satellite remote sensing systems, and the amount of data collected is often enormous and expands rapidly as technology develops. Interpreting remote sensing images, geographic data mining, and researching distinct vegetation types such as agricultural land and forests are all part of satellite image categorization. One of the biggest challenges data scientists face while classifying satellite images is finding the most suitable classification algorithms among those available that are able to classify images with the utmost accuracy. In order to categorize satellite images, which is difficult due to the sheer volume of data, many academics are turning to deep learning algorithms. As the CNN algorithm gives high accuracy in image recognition problems and automatically detects the important features without any human supervision, and the ANN algorithm stores information across the entire network (Abhishek Gupta, 2020), these two deep learning algorithms have been used for satellite image classification. This project focuses on remote sensing through deep neural networks, i.e., ANN and CNN, with the DeepSat (SAT-4) Airborne dataset for classifying images. Thus, in this satellite image classification project, the ANN and CNN algorithms are implemented, evaluated, and compared, and their performance is analyzed through evaluation metrics such as accuracy and loss. Additionally, the neural network algorithm which gives the lowest bias and lowest variance in solving multi-class satellite image classification is analyzed.
Keywords: artificial neural network, convolutional neural network, remote sensing, accuracy, loss
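A minimal Keras sketch of a CNN sized for the DeepSat SAT-4 patches (28 x 28 pixels, four spectral bands, four land-cover classes); the architecture and hyper-parameters are illustrative assumptions, not the network evaluated in the project, and the ANN baseline would keep only the dense layers.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

model = models.Sequential([
    layers.Input(shape=(28, 28, 4)),            # R, G, B, NIR bands of a SAT-4 patch
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(4, activation="softmax"),      # four land-cover classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(x_train, y_train, epochs=10, validation_split=0.1)  # once the SAT-4 arrays are loaded
```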
Procedia PDF Downloads 159
3077 Geosynthetic Tubes in Coastal Structures a Better Substitute for Shorter Planning Horizon: A Case Study
Authors: A. Pietro Rimoldi, B. Anilkumar Gopinath, C. Minimol Korulla
Abstract:
Coastal engineering structures are conventionally designed for a short planning horizon, usually 20 years. These structures are subjected to different offshore climatic externalities, such as waves, tides, and tsunamis, during the design life period. The probability of occurrence of these different offshore climatic externalities varies. The impact frequently caused by these externalities on the structures is of concern because it has a significant bearing on the capital and operating costs of the project. These externalities can also recur at short intervals within the assumed planning horizon, causing heavy damage to conventional coastal structures, which are mainly made of rock. Replacing the damaged portion to prevent complete collapse is time consuming and expensive for hard rock structures. If coastal structures are instead made of geosynthetic containment systems, such replacement can be carried out quickly in the period between two successive occurrences. In order to better understand and predict these occurrences, this study estimates the risk of encountering the various externalities within the design life period based on the concept of the exponential distribution. This gives an idea of the frequency of occurrences, which in turn indicates whether replacement is necessary and, if so, at what time interval such replacements have to be effected. To validate this theoretical finding, a pilot project has been taken up in the field so that the impact of the externalities can be studied for both a hard rock and a geosynthetic tube structure. The paper brings out the salient features of a case study of a project in which geosynthetic tubes have been used for the reformation of a seawall adjacent to a conventional rock structure on the Alappuzha coast, Kerala, India. The effectiveness of the geosystem in combatting the impact of the short-term externalities is brought out. Keywords: climatic externalities, exponential distribution, geosystems, planning horizon
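The abstract invokes the exponential distribution for encounter risk but does not spell out the formula. A commonly used form, sketched below under the assumption of Poisson-type occurrences with mean return period T, gives the probability of at least one occurrence within a design life of n years as R = 1 - exp(-n/T); the return periods used here are hypothetical, not values from the Alappuzha project.

```python
# A minimal sketch of encounter risk over a design life, assuming exponentially
# distributed inter-arrival times (Poisson occurrences). Return periods are
# hypothetical placeholders.
import math

def encounter_risk(design_life_years, return_period_years):
    """Probability of at least one occurrence within the design life."""
    return 1.0 - math.exp(-design_life_years / return_period_years)

design_life = 20  # the short planning horizon mentioned in the abstract
for event, T in [("design wave", 10), ("severe cyclone", 50), ("tsunami", 200)]:
    print(f"{event:>14s} (T = {T:3d} yr): "
          f"risk over {design_life} yr = {encounter_risk(design_life, T):.1%}")
```

Risks of this kind indicate how often a geosynthetic tube section might have to be repaired or replaced within the planning horizon, compared with the cost of repairing an equivalent rock structure.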
Procedia PDF Downloads 227
3076 Green Extraction Technologies of Flavonoids Containing Pharmaceuticals
Authors: Lamzira Ebralidze, Aleksandre Tsertsvadze, Dali Berashvili, Aliosha Bakuridze
Abstract:
Nowadays, there is an increasing demand for biologically active substances from vegetable, animal, and mineral resources. The pharmaceutical, cosmetic, and nutrition industries have a strong interest in the use of natural compounds. The biggest drawback of conventional extraction methods is the need to use a large volume of organic extractants. Removal of the organic solvent is a multi-stage process; complete removal cannot be achieved, and solvent residues still appear in the final product as impurities. Large amounts of solvent-containing waste damage not only human health but also the environment. Accordingly, researchers are focused on improving extraction methods so as to minimize the use of organic solvents and energy, using alternative solvents and renewable raw materials. In this context, the principles of green extraction were formulated; green extraction is a concept that responds to the needs of today's environment and the challenges of the 21st century. The extraction of biologically active compounds based on green extraction principles is vital for preserving and maintaining biodiversity. Novel green extraction technologies are known as 'cold methods' because the extraction temperature is relatively low and does not negatively affect the stability of plant compounds. These technologies provide great opportunities to reduce or replace toxic organic solvents, increase process efficiency, enhance extraction yield, and improve the quality of the final product. The objective of the research is the development of green technologies for flavonoid-containing preparations. Methodology: At the first stage of the research, flavonoid-containing preparations (tincture of Herba Leonuri, flamine, rutin) were prepared by conventional extraction methods: maceration, bismaceration, percolation, and repercolation. In parallel, the same preparations were produced using green technologies: microwave-assisted and UV extraction methods. Product quality characteristics were evaluated by pharmacopoeial methods. At the next stage of the research, the technological and economic characteristics and cost efficiency of products prepared by conventional and novel technologies were determined. For the extraction of flavonoids, water is used as the extractant. Surface-active substances are used as co-solvents to reduce surface tension, which significantly increases the solubility of polyphenols in water. Different concentrations of water-glycerol mixtures, cyclodextrin, and ionic solvents were used for the extraction process. In vitro antioxidant activity will be studied by the spectrophotometric method, using the DPPH (2,2-diphenyl-1-picrylhydrazyl) assay. A further advantage of green extraction methods is the possibility of obtaining higher yields at low temperature and of limiting the extraction of undesirable compounds. This is especially important for the extraction of thermosensitive compounds and for maintaining their stability. Keywords: extraction, green technologies, natural resources, flavonoids
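The abstract mentions the DPPH spectrophotometric assay without giving the calculation; a commonly used form of the radical-scavenging computation is sketched below, with the absorbance values as hypothetical placeholders rather than measured data.

```python
# A small sketch of the DPPH radical-scavenging calculation typically used with
# the spectrophotometric assay; absorbance values are illustrative only.
def dpph_scavenging_percent(abs_control, abs_sample):
    """Percent inhibition of the DPPH radical (absorbance usually read at 517 nm)."""
    return (abs_control - abs_sample) / abs_control * 100.0

a_control = 0.820   # DPPH solution without extract
a_sample = 0.310    # DPPH solution with flavonoid extract
print(f"DPPH scavenging activity: {dpph_scavenging_percent(a_control, a_sample):.1f}%")
```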
Procedia PDF Downloads 129
3075 Changes in Textural Properties of Zucchini Slices Under Effects of Partial Predrying and Deep-Fat-Frying
Authors: E. Karacabey, Ş. G. Özçelik, M. S. Turan, C. Baltacıoğlu, E. Küçüköner
Abstract:
Changes in the textural properties of any food material during processing are significant for the consumer's subsequent evaluation and directly affect purchase decisions. Thus, any food material should be assessed in terms of textural properties after any process. In the present study, zucchini slices were partially predried to control and reduce the product's final oil content. A conventional oven was used for the partial dehydration of the zucchini slices, and frying was then carried out in an industrial fryer with a temperature controller. This study addressed the effect of the predrying process on the textural properties of fried zucchini slices. Texture profile analysis was performed; hardness, elasticity, chewiness, and cohesiveness were the studied texture parameters of the fried zucchini slices. Temperature and weight loss were the monitored parameters of the predrying process, whereas oil temperature and process time were controlled in frying. Optimization of the two successive processes was carried out by response surface methodology, one of the commonly used statistical tools for process optimization. The models developed for each texture parameter predicted its values well as a function of the studied process conditions. Process optimization was performed according to target values for each property, determined from the directly fried zucchini slices that received the highest score in sensory evaluation. The results indicated that the textural properties of predried and then fried zucchini slices could be controlled by well-established equations. This is thought to be significant for the fried-food industry, where controlling sensory properties is crucial for shaping consumer perception, and texture-related properties are the leading ones. This project (113R015) has been supported by TUBITAK. Keywords: optimization, response surface methodology, texture profile analysis, conventional oven, modelling
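The abstract reports response surface models for each texture parameter but not their form; the sketch below assumes the usual full quadratic (second-order) model for a single response, hardness, as a function of two illustrative process variables (predrying time and frying time). All data points and variable ranges are hypothetical placeholders, not the study's measurements.

```python
# A minimal sketch of fitting a second-order response surface model by least
# squares; the design points and responses are hypothetical placeholders.
import numpy as np

# hypothetical design points: (predrying time, min; frying time, s) -> hardness (N)
X = np.array([[0, 120], [0, 180], [10, 120], [10, 180], [5, 150],
              [5, 120], [5, 180], [0, 150], [10, 150]], dtype=float)
y = np.array([14.2, 10.8, 17.5, 13.1, 14.0, 16.0, 12.2, 12.5, 15.3])

x1, x2 = X[:, 0], X[:, 1]
# design matrix for: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(pre_min, fry_s):
    return coef @ np.array([1.0, pre_min, fry_s, pre_min * fry_s, pre_min**2, fry_s**2])

print("fitted coefficients:", np.round(coef, 4))
print("predicted hardness at 7 min predrying, 160 s frying:",
      round(float(predict(7, 160)), 2), "N")
```

In an RSM workflow of this kind, one such model would be fitted per texture parameter, and the process conditions would then be sought that bring all predicted responses closest to the sensory-derived target values.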
Procedia PDF Downloads 433
3074 Beyond Empathy: From Justice to Reconciliation
Authors: Nissim Avissar
Abstract:
This paper aims to question the practice of bringing together people belonging to groups in conflict with the aim of bridging differences through universal empathy and interpersonal connections. It is argued that, in cases where one group holds the power and the other is struggling to change the balance, assuming universal equality between the groups and encouraging empathic understanding is itself a non-empathic practice. Accordingly, a new concept is posited: justice-sensitive empathy, which conditions empathy in such situations on the acknowledgement of an imbalance of power and of injustice. With this reframing in mind, educational practices promoting social justice are discussed. In order to create conditions for justice-seeking or politically sensitive empathy, we need to go beyond the conventional definitions of empathy and offer other means and possibilities. Three possibilities are discussed. The first focuses on intra-group (as opposed to inter-group) processes within each group. It means temporary and tactical separation that may allow each group to focus on its own needs and values and perhaps to return to the dialogue more confidently. The second option emphasizes the notion of "constructive conflict," in which each side still aspires to promote its own interests but without demolishing the other side (which is a rival but also an unwanted and forced partner). Here, alongside the "obligation to resist" and to act to promote justice as we view and understand it, we have to take the other side into account. The third and last option relates to the practice of restorative justice. This practice originated in the Truth and Reconciliation Commission in South Africa, but it is now widely used in other contexts. That commission had the authority to punish (or pardon) people; however, its main purpose was to seek truth and, from there, to nourish reconciliation. This is the main idea of restorative justice: it seeks justice for the sake of restoring relationships. All the above options involve action and are aware of power relations (i.e., politics). They all seek justice. They may create conditions for the more conventional empathic practice to evolve, but, no less than that, they are examples of justice-seeking and politically sensitive empathic practice. Keywords: education, empathy, justice, reconciliation
Procedia PDF Downloads 97