Search results for: optimal formulation
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4223

893 A Strategy to Reduce Salt Intake: The Use of a Seasoning Obtained from Wine Pomace

Authors: María Luisa Gonzalez-SanJose, Javier Garcia-Lomillo, Raquel Del Pino, Miriam Ortega-Heras, Maria Dolores Rivero-Perez, Pilar Muñiz-Rodriguez

Abstract:

One of the most worrying problems related to the diet of Western societies is high salt intake. In Spain, salt intake is almost twice that recommended by the World Health Organization (WHO). Many negative health effects of high sodium intake have been described, hypertension and cardiovascular and coronary diseases being among the most important. For this reason, governments and other institutions are working on a gradual reduction of this consumption. Meat products have been identified as the main processed products bringing salt into the diet, followed by snacks and savory crackers. Fortunately, the food industry has also become aware of this problem and is working intensely on it: in recent years it has attempted to reduce the salt content of processed products and is developing special lines with low sodium content. It is important to note that processed foods are the main source of sodium in Western countries. One possible strategy to reduce the salt content of food is to find substitutes that emulate its taste properties without adding much sodium, or products that mask or replace salty sensations with other flavors and aromas. Multiple products have been proposed and used to date. Potassium salts produce similar salty sensations without contributing sodium; however, their intake should also be limited for health reasons, and some potassium salts show bitter notes. Other alternatives are flavor enhancers, spices, aromatic herbs, seaweed-derived products, etc. Wine pomace is rich in potassium salts and contains organic acids and other flavor-active substances; it could therefore be an interesting raw material from which to obtain derived products useful as alternative 'seasonings'. Considering the above, the main aim of this study was to evaluate the possible use of a natural seasoning, made from red wine pomace, in two different foods: crackers and burgers.
The seasoning was made in the food technology pilot plant of the University of Burgos, where the studied crackers and patties were also made. Different members of the University (students, teaching and administrative staff) tasted the products, and a trained panel evaluated salty intensity. In addition to potassium, the seasoning contains significant levels of dietary fiber and phenolic compounds, which also makes it interesting as a functional ingredient. Both the burgers and the crackers made with the seasoning tasted better than those made without salt. They obviously showed a lower sodium content than the normal formulation and were richer in potassium, antioxidants and fiber; consequently, they showed lower Na/K ratios. All these facts correlate with 'healthier' products, especially for people with hypertension and other coronary dysfunctions.

Keywords: healthy foods, low salt, seasoning, wine pomace

Procedia PDF Downloads 275
892 Unraveling the Complexity of Postpartum Distress: Examining the Influence of Alexithymia, Social Support, Partners' Support, and Birth Satisfaction on Postpartum Distress among Bulgarian Mothers

Authors: Stela Doncheva

Abstract:

Postpartum distress, encompassing depressive symptoms, obsessions, and anxiety, remains a subject of significant scientific interest due to its prevalence among individuals giving birth. This critical and transformative period presents a multitude of factors that impact women's health. On the one hand, variables such as social support, satisfaction in romantic relationships, shared newborn care, and birth satisfaction directly affect the mental well-being of new mothers. On the other hand, the interplay of hormonal changes, personality characteristics, emotional difficulties, and the profound life adjustments experienced by mothers can profoundly influence their self-esteem and overall physical and emotional well-being. This paper extensively explores the factors of alexithymia, social support, partners' support, and birth satisfaction to gain deeper insights into their impact on postpartum distress. Utilizing a qualitative survey consisting of six self-reflective questionnaires, this study collects valuable data regarding the individual postpartum experiences of Bulgarian mothers. The primary objective is to enrich our understanding of the complex factors involved in the development of postpartum distress during this crucial period. The results shed light on the intricate nature of the problem and highlight the significant influence of bio-psycho-social elements. By contributing to the existing knowledge in the field, this research provides valuable implications for the development of interventions and support systems tailored to the unique needs of mothers in the postpartum period. Ultimately, this study aims to improve the overall well-being of new mothers and promote optimal maternal health during the postpartum journey.

Keywords: maternal mental health, postpartum distress, postpartum depression, postnatal mothers

Procedia PDF Downloads 68
891 The Methanotrophic Activity in a Landfill Bio-Cover through a Subzero Winter

Authors: Parvin Berenjkar, Qiuyan Yuan, Richard Sparling, Stan Lozecznik

Abstract:

Landfills contribute substantially to anthropogenic global warming through CH₄ emissions. Landfills are usually capped with a conventional soil cover to control the migration of gases. Methane is consumed by CH₄-oxidizing microorganisms, known as methanotrophs, that naturally exist in the landfill soil cover. The growth of methanotrophs can be optimized in a bio-cover, which typically consists of a gas distribution layer (GDL) to homogenize landfill gas fluxes and an overlying oxidation layer composed of materials that support methanotrophic populations. Materials such as mature yard-waste composts provide an inexpensive and favourable porous support for the growth and activity of methanotrophs. In areas with seasonally cold climates, it is valuable to know whether methanotrophs in a bio-cover can survive the winter until the next spring, and at what depths in the bio-cover they remain active to mitigate CH₄. In this study, a pilot bio-cover was constructed in a closed landfill cell in Winnipeg, Canada, which has a very cold climate. The bio-cover has a surface area of 2.5 m x 3.5 m and a depth of 1.5 m, filled with 50 cm of gravel as a GDL and 70 cm of biosolids compost amended with yard- and leaf-waste compost. The in situ potential of methanotrophs for CH₄ oxidation was investigated from December 2016 to April 2017 and from November 2017 to April 2018, when the transition to surface frost and thawing happens in the bio-cover. Compost samples taken from different depths of the bio-cover were incubated in the laboratory under standardized conditions: an optimal air:methane atmosphere at 22 °C, but at in situ moisture content. Results showed that the methanotrophs were alive and oxidized methane without a lag, indicating a potential for methanotrophic activity at some depths of the bio-cover.

Keywords: bio-cover, global warming, landfill, methanotrophic activity

Procedia PDF Downloads 122
890 Optimization of Digestive Conditions of Opuntia ficus-indica var. Saboten using Food-Grade Enzymes

Authors: Byung Wook Yang, Sae Kyul Kim, Seung Il Ahn, Jae Hee Choi, Heejung Jung, Yejin Choi, Byung Yong Kim, Young Tae Hahm

Abstract:

Opuntia ficus-indica is a member of the Cactaceae family that is widely grown in semiarid countries throughout the world. Opuntia ficus-indica var. Saboten (OFS), commonly known as prickly pear cactus, is commercially cultivated as a dietary and medicinal foodstuff on Jeju Island, Korea. Owing to the high viscosity of the OFS pad, its application in the commercial field has been limited; a low-viscosity OFS pad would be useful for the manufacture of health foods in the related field. This study was performed to obtain the optimal digestion conditions for food-grade enzymes (Pectinex, Viscozyme and Celluclast) with OFS stem powder. In addition, the water-soluble dietary fiber (WSDF) content of the dried powder prepared by extraction of OFS stem was monitored and optimized using response surface methodology (RSM), which included 20 experimental points with 3 replicates for two independent variables (fermentation temperature and time). A central composite design was used to monitor the effect of fermentation temperature (30-90 °C, X1) and fermentation time (1-10 h, X2) on dependent variables such as viscosity (Y1), water-soluble dietary fiber (Y2) and dietary fiber yield (Y3). Estimated maximum values at the predicted optimum conditions were in agreement with experimental values. The optimum temperature and duration were 50 °C and 12 hours, respectively. The viscosity value reached 3.4 poise. Determination of the yield of water-soluble dietary fiber is in progress.
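As an illustration of the design described above, the coded points of a two-factor central composite design can be sketched as follows. This is a minimal sketch, not the paper's exact 20-run layout: a face-centred design (alpha = 1) is assumed so that every run stays within the stated 30-90 °C and 1-10 h ranges, and the number of centre replicates is illustrative.

```python
import itertools

def central_composite_design(n_center=4, alpha=1.0):
    """Coded CCD points for two factors: 4 factorial runs, 4 axial runs,
    and n_center replicated centre runs. alpha=1 gives a face-centred
    design whose runs all stay inside the coded [-1, 1] square."""
    factorial = list(itertools.product([-1.0, 1.0], repeat=2))
    axial = [(-alpha, 0.0), (alpha, 0.0), (0.0, -alpha), (0.0, alpha)]
    center = [(0.0, 0.0)] * n_center
    return factorial + axial + center

def decode(point, lo1=30.0, hi1=90.0, lo2=1.0, hi2=10.0):
    """Map coded levels back to actual fermentation temperature (C) and time (h)."""
    c1, c2 = point
    t = (lo1 + hi1) / 2 + c1 * (hi1 - lo1) / 2
    h = (lo2 + hi2) / 2 + c2 * (hi2 - lo2) / 2
    return (t, h)

design = central_composite_design()
runs = [decode(p) for p in design]  # e.g. the centre point decodes to (60.0, 5.5)
```

RSM then fits a second-order polynomial to the responses measured at these runs and locates the optimum on the fitted surface.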

Keywords: Opuntia ficus-indica var. saboten, enzymatic fermentation, response surface methodology, water-soluble dietary fiber, viscosity

Procedia PDF Downloads 347
889 Ultrasound-Assisted Sol – Gel Synthesis of Nano-Boehmite for Biomedical Purposes

Authors: Olga Shapovalova, Vladimir Vinogradov

Abstract:

Among the many different sol-gel matrices, only alumina can be successfully injected parenterally into the human body. This is not surprising, because boehmite (aluminium oxyhydroxide) is a metal oxide approved by the FDA and EMA for intravenous and intramuscular administration, and it has long been used as an adjuvant in the production of many modern vaccines. In our earlier studies, it was shown that the denaturation temperature of enzymes entrapped in a sol-gel boehmite matrix increases by 30-60 °C while the initial activity is preserved. This makes such matrices attractive for the long-term storage of unstable drugs. In the current work, we present an ultrasound-assisted sol-gel synthesis of nano-boehmite. This method provides a bio-friendly, very stable, highly homogeneous alumina sol using only water and aluminium isopropoxide as a precursor. Many parameters of the synthesis were studied in detail: ultrasound treatment time, ultrasound frequency, surface area, pore and nanoparticle size, zeta potential and others. Here we investigated the stability of the colloidal sols and the textural properties of the final composites as a function of ultrasonic treatment time, which was varied between 30 and 180 minutes. The surface area, average pore diameter and total pore volume of the final composites were measured with a Quantachrome Nova 1200 surface and pore size analyzer. Matrices obtained with an ultrasonic treatment time of 90 minutes had the largest surface area, 431 ± 24 m²/g; on the other hand, they were less stable than samples obtained with a treatment time of 120 minutes, which had a surface area of 390 ± 21 m²/g. Stable sols formed only after 120 minutes of ultrasonic treatment; otherwise, a white precipitate of boehmite formed. We conclude that the optimal ultrasonic treatment time is 120 minutes.

Keywords: boehmite matrix, stabilisation, ultrasound-assisted sol-gel synthesis

Procedia PDF Downloads 267
888 Parameters Identification and Sensitivity Study for Abrasive WaterJet Milling Model

Authors: Didier Auroux, Vladimir Groza

Abstract:

This work is part of the STEEP Marie Curie ITN project and focuses on the identification of unknown parameters of the proposed generic Abrasive WaterJet Milling (AWJM) PDE model, which appears as an ill-posed inverse problem. The necessity of studying this problem comes from industrial milling applications, where the ability to predict and model the final surface with high accuracy is one of the primary tasks in the absence of any knowledge of the model parameters that should be used. In this framework, we propose identifying the model parameters by minimizing a cost function that measures the difference between the experimental and numerical solutions. The adjoint approach, based on the corresponding Lagrangian, makes it possible to find the unknowns of the AWJM model and the optimal values that could be used to reproduce the required trench profile. Due to the complexity of the nonlinear problem and the large number of model parameters, we use an automatic differentiation software tool (TAPENADE) for the adjoint computations. By adding noise to the artificial data, we show that the parameter identification problem is in fact highly unstable and strongly depends on the input measurements. Regularization terms can be used effectively to deal with the presence of data noise and to improve the correctness of the identification. Based on this approach, we present 2D and 3D results for the identification of the model parameters and for surface prediction, both with self-generated data and with measurements obtained from real production. Considering different types of model and measurement errors allows us to obtain results acceptable for manufacturing and to expect the proper identification of the unknowns. This approach also allows us to extend the research to more complex cases, considering different types of model and measurement errors as well as a 3D time-dependent model with variations of the jet feed speed.
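The stabilising role of a regularization term can be sketched on a toy one-parameter analogue of such a cost function. This is not the AWJM PDE model: the Gaussian trench shape, the deterministic stand-in for measurement noise, and the prior value are all illustrative assumptions, and the model is linear in its single parameter so the minimiser has a closed form.

```python
import math

def identify(xs, data, lam, p_prior):
    """Minimise J(p) = sum_i (p*g(x_i) - d_i)^2 + lam*(p - p_prior)^2 for a
    model linear in the single parameter p; the normal equation gives the
    minimiser in closed form (no iterative descent needed for this toy)."""
    g = [math.exp(-x * x) for x in xs]            # assumed Gaussian trench shape
    num = sum(gi * di for gi, di in zip(g, data)) + lam * p_prior
    den = sum(gi * gi for gi in g) + lam
    return num / den

xs = [i / 10 for i in range(-20, 21)]
p_true = 2.0
noise = [0.05 * math.cos(7 * x) for x in xs]      # deterministic stand-in for noise
data = [p_true * math.exp(-x * x) + n for x, n in zip(xs, noise)]

p_unreg = identify(xs, data, lam=0.0, p_prior=0.0)  # plain least squares
p_reg = identify(xs, data, lam=1.0, p_prior=2.0)    # Tikhonov-regularised
```

The regularised estimate is pulled toward the prior, which damps the influence of the noisy measurements in the same spirit as the regularization terms used for the full AWJM problem.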

Keywords: Abrasive Waterjet Milling, inverse problem, model parameters identification, regularization

Procedia PDF Downloads 317
887 Search for APN Permutations in Rings ℤ_2×ℤ_2^k

Authors: Daniel Panario, Daniel Santana de Freitas, Brett Stevens

Abstract:

Almost Perfect Nonlinear (APN) permutations, which have optimal resistance against differential cryptanalysis, can be sought in several domains. The permutation used in the standard for symmetric cryptography (the AES), for example, is based on a special kind of inversion in GF(2⁸). Although very close to APN (2-uniform), this permutation still contains one value 4 in its differential spectrum, which means that, rigorously, it must be classified as 4-uniform. This fact motivates the search for fully APN permutations in other domains of definition. The extremely high complexity associated with this kind of problem means that an exhaustive search for an APN permutation with 256 elements cannot be performed without the support of a suitable mathematical structure. On the other hand, in principle, there is nothing to indicate which mathematically structured domains can effectively help the search, and it is necessary to test several domains. In this work, the search for APN permutations in rings ℤ₂×ℤ₂ᵏ is investigated. After a full exhaustive search with k=2 and k=3, all possible APN permutations in those rings were recorded, together with their differential profiles. Some very promising heuristics were collected from these cases, so that, when used as a basis to prune backtracking for the same search in ℤ₂×ℤ₈ (a search space of size 16! ≅ 2⁴⁴), just a few tenths of a second on a single CPU were enough to produce an APN permutation. These heuristics were empirically extrapolated so that they could be applied to a backtracking search for APNs over ℤ₂×ℤ₁₆ (a search space of size 32! ≅ 2¹¹⁷). The best permutations found in this search were further refined through simulated annealing, with a definition of neighbors suitable to this domain. The best result produced with this scheme was a 3-uniform permutation over ℤ₂×ℤ₁₆ with only 24 values equal to 3 in the differential spectrum (all the other 968 values were less than or equal to 2, as would be the case for an APN permutation).
Although far from fully APN, this result is technically better than a 4-uniform permutation and demanded only a few seconds on a single CPU. This is a strong indication that the use of mathematically structured domains, like the rings described in this work, together with heuristics based on smaller cases, can lead to dramatic cuts in the computational resources involved in the search for APN permutations in extremely large domains.
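For small cases, the differential uniformity that classifies a candidate permutation (2-uniform = APN, 3-uniform, 4-uniform, ...) can be checked by brute force. The sketch below covers only this verification step, not the backtracking or annealing search itself; the componentwise group operation and the identity-permutation sanity check are illustrative.

```python
from itertools import product

def differential_uniformity(perm, k):
    """Worst-case differential count of a permutation on Z_2 x Z_{2^k}.
    The group operation is componentwise: XOR on the Z_2 part and
    addition mod 2^k on the other part. An APN permutation scores 2."""
    n = 2 ** k
    G = list(product(range(2), range(n)))

    def add(u, v):
        return (u[0] ^ v[0], (u[1] + v[1]) % n)

    worst = 0
    for a in G:
        if a == (0, 0):
            continue
        counts = {}
        for x in G:
            sx, sxa = perm[x], perm[add(x, a)]
            b = (sx[0] ^ sxa[0], (sxa[1] - sx[1]) % n)  # output difference
            counts[b] = counts.get(b, 0) + 1
        worst = max(worst, max(counts.values()))
    return worst

# sanity check on Z_2 x Z_4: the identity maps every input difference a to
# the single output difference a, so its uniformity is maximal, |G| = 8
identity = {g: g for g in product(range(2), range(4))}
```

A search procedure would call such a check (or an incremental version of it) on each candidate to prune branches whose partial differential spectrum already exceeds the target uniformity.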

Keywords: APN permutations, heuristic searches, symmetric cryptography, S-box design

Procedia PDF Downloads 160
886 Decomposition of the Discount Function Into Impatience and Uncertainty Aversion. How Neurofinance Can Help to Understand Behavioral Anomalies

Authors: Roberta Martino, Viviana Ventre

Abstract:

Intertemporal choices are choices under conditions of uncertainty in which the consequences are distributed over time. The Discounted Utility Model is the essential reference for describing the individual in the context of intertemporal choice. The model is based on the idea that the individual selects the alternative with the highest utility, calculated by multiplying the cardinal utility of the outcome, as if its reception were instantaneous, by a discount function that decreases the utility value according to how far the actual reception of the outcome lies from the moment the choice is made. Initially, the discount function was assumed to have an exponential form, with a constant rate of decrease over time, in line with the profile of the rational investor described by classical economics. Empirical evidence instead called for the formulation of alternative, hyperbolic models that better represent the actual actions of investors. Attitudes that do not comply with the principles of classical rationality are termed anomalous, i.e., difficult to rationalize and describe through normative models. The development of behavioral finance, which describes investor behavior through cognitive psychology, has shown that deviations from rationality are due to the bounded rationality of human beings: when a choice is made in a difficult, information-rich environment, the brain strikes a compromise between the cognitive effort required and the selection of an alternative. Moreover, the evaluation and selection of an alternative, and the collection and processing of information, are conditioned by systematic distortions of the decision-making process: the behavioral biases involving the individual's emotional and cognitive systems. In this paper we present an original decomposition of the discount function to investigate the psychological principles of hyperbolic discounting.
It is possible to decompose the curve into two components: the first is responsible for the decrease in the value of the outcome as time increases and is related to the individual's impatience; the second relates to the change in direction of the tangent vector to the curve and indicates how strongly the individual perceives the indeterminacy of the future, i.e., his or her aversion to uncertainty. This decomposition allows interesting conclusions to be drawn about the concept of impatience and the emotional drives involved in decision-making. The contribution that neuroscience can make to decision theory and intertemporal choice theory is vast, as it would allow the decision-making process to be described as the relationship between the individual's emotional and cognitive factors. Neurofinance is a discipline that uses a multidisciplinary approach to investigate how the brain influences decision-making. Indeed, considering that the decision-making process is linked to the activity of the prefrontal cortex and amygdala, neurofinance can help determine the extent to which anomalous attitudes respect the principles of rationality.
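The qualitative difference between the exponential and hyperbolic discounting profiles discussed above can be sketched numerically (the discount rate k = 0.1 and the functional forms below are standard textbook choices, used here purely for illustration, not the paper's decomposition):

```python
import math

def exponential(t, k=0.1):
    """Classical discount function: a constant fraction of remaining
    value is lost in every period (constant impatience)."""
    return math.exp(-k * t)

def hyperbolic(t, k=0.1):
    """Simple hyperbolic discount function: the per-period decline
    shrinks as the delay grows (decreasing impatience)."""
    return 1.0 / (1.0 + k * t)

def one_period_decline(f, t):
    """Fraction of remaining value lost between t and t+1."""
    return 1.0 - f(t + 1) / f(t)
```

Comparing `one_period_decline` at early and late delays shows the anomaly the abstract refers to: the exponential rate is the same at every delay, while the hyperbolic rate is steep for near outcomes and flat for distant ones.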

Keywords: impatience, intertemporal choice, neurofinance, rationality, uncertainty

Procedia PDF Downloads 130
885 Topology Optimization of Heat and Mass Transfer for Two Fluids under Steady State Laminar Regime: Application on Heat Exchangers

Authors: Rony Tawk, Boutros Ghannam, Maroun Nemer

Abstract:

Topology optimization presents a potential tool for the design and optimization of structures involved in mass and heat transfer. The method starts from an initial intermediate domain and progressively distributes the solid and the two fluids exchanging heat. The multi-objective function of the problem takes into account minimization of total pressure loss and maximization of heat transfer between the solid and fluid subdomains. Existing methods account for the presence of only one fluid, while the present work extends the optimization to the distribution of a solid and two different fluids. This requires separating the channels of the two fluids and ensuring a minimum solid thickness between them, which is done by adding a third objective function to the multi-objective optimization problem. This article uses a density approach in which each cell holds two local design parameters ranging from 0 to 1, the combination of whose extremes defines the presence of solid, cold fluid or hot fluid in that cell. A finite volume method is used as the direct solver, coupled with a discrete adjoint approach for sensitivity analysis and the method of moving asymptotes for numerical optimization. Several examples are presented to show the ability of the method to find a trade-off between minimization of power dissipation and maximization of heat transfer, while ensuring the separation and continuity of each fluid's channel without crossing or mixing the fluids. The main conclusion is that it is possible to find an optimal bi-fluid domain using topology optimization, defining a fluid-to-fluid heat exchanger device.
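A minimal sketch of the two-parameter density approach follows. The phase convention (first parameter selects solid vs. fluid, second selects which fluid) and the RAMP-style Brinkman penalisation are common choices in the topology optimization literature, assumed here for illustration; they are not necessarily this paper's exact formulation.

```python
def phase(rho1, rho2, tol=1e-6):
    """Hypothetical mapping of the two per-cell design parameters to a phase:
    rho1 chooses solid vs fluid, rho2 chooses which fluid. Intermediate
    values, which appear during optimisation, belong to no pure phase."""
    if rho1 <= tol:
        return "solid"
    if rho1 >= 1.0 - tol:
        if rho2 <= tol:
            return "cold fluid"
        if rho2 >= 1.0 - tol:
            return "hot fluid"
    return "mixed"

def inverse_permeability(rho1, alpha_max=1e4, q=0.1):
    """RAMP-style Brinkman penalisation: large in solid cells (rho1 -> 0)
    to block flow, zero in fluid cells (rho1 -> 1)."""
    return alpha_max * (1.0 - rho1) / (1.0 + q * rho1)
```

During optimization the continuous densities are driven toward the extremes, so each cell converges to one of the three pure phases while the penalisation keeps flow out of solid regions.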

Keywords: topology optimization, density approach, bi-fluid domain, laminar steady state regime, fluid-to-fluid heat exchanger

Procedia PDF Downloads 400
884 Alcohol-Containing versus Aqueous-Based Solutions for Skin Preparation in Abdominal Surgery: A Systematic Review and Meta-Analysis

Authors: Dimitra V. Peristeri, Hussameldin M. Nour, Amiya Ahsan, Sameh Abogabal, Krishna K. Singh, Muhammad Shafique Sajid

Abstract:

Introduction: The use of optimal skin antiseptic agents for the prevention of surgical site infection (SSI) is of critical importance, especially during abdominal surgical procedures. Alcohol-based chlorhexidine gluconate (CHG) and aqueous-based povidone-iodine (PVI) are the two most commonly used skin antiseptics today. The objective of this article is to evaluate the effectiveness of alcohol-based CHG versus aqueous-based PVI for skin preparation before abdominal surgery in reducing SSIs. Methods: Standard medical databases such as MEDLINE, Embase, PubMed, and the Cochrane Library were searched for randomised controlled trials (RCTs) comparing alcohol-based CHG skin preparation with aqueous-based PVI in patients undergoing abdominal surgery. The combined outcomes for SSIs were calculated as an odds ratio (OR) with 95% confidence intervals (95% CI). All data were analysed using Review Manager (RevMan) software 5.4, and the meta-analysis was performed with a random-effects model. Results: A total of 11 studies, all RCTs, were included (n = 12072 participants), recruiting adult patients undergoing abdominal surgery. In the random-effects model analysis, the use of alcohol-based CHG in patients undergoing abdominal surgery was associated with a reduced risk of SSI compared with aqueous-based PVI (OR: 0.84; 95% CI [0.74, 0.96], z = 2.61, p = 0.009). Conclusion: Alcohol-based CHG may be more effective than aqueous-based PVI agents for preventing SSI in abdominal surgery. The conclusions of this meta-analysis may help reinforce current clinical practice guidelines.
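Random-effects pooling of odds ratios, as used above, can be sketched with the standard DerSimonian-Laird estimator. The three study odds ratios and confidence intervals in the example are made up for illustration only; they are not the trial data from this review.

```python
import math

def pooled_or_random_effects(ors, ci_los, ci_his):
    """DerSimonian-Laird random-effects pooling of odds ratios.
    The SE of each log-OR is recovered from its 95% CI as
    (ln(hi) - ln(lo)) / (2 * 1.96). Returns (pooled OR, 95% CI lo, hi)."""
    y = [math.log(o) for o in ors]
    se = [(math.log(h) - math.log(l)) / (2 * 1.96) for l, h in zip(ci_los, ci_his)]
    w = [1.0 / s ** 2 for s in se]                     # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))   # heterogeneity
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)            # between-study variance
    w_star = [1.0 / (s ** 2 + tau2) for s in se]       # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_star, y)) / sum(w_star)
    se_mu = math.sqrt(1.0 / sum(w_star))
    return math.exp(mu), math.exp(mu - 1.96 * se_mu), math.exp(mu + 1.96 * se_mu)

pooled, lo, hi = pooled_or_random_effects(
    [0.80, 0.95, 0.70], [0.65, 0.75, 0.50], [0.98, 1.20, 0.98])
```

RevMan's random-effects analysis follows this same inverse-variance logic, widening the pooled interval when between-study heterogeneity (tau²) is present.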

Keywords: skin preparation, surgical site infection, chlorhexidine, skin antiseptics

Procedia PDF Downloads 112
883 Phage Display-Derived Vaccine Candidates for Control of Bovine Anaplasmosis

Authors: Itzel Amaro-Estrada, Eduardo Vergara-Rivera, Virginia Juarez-Flores, Mayra Cobaxin-Cardenas, Rosa Estela Quiroz, Jesus F. Preciado, Sergio Rodriguez-Camarillo

Abstract:

Bovine anaplasmosis is an infectious tick-borne disease caused mainly by Anaplasma marginale; typical signs include anemia, fever, abortion, weight loss, decreased milk production, jaundice and, potentially, death. Sick cattle can recover when antibiotics are administered; however, they usually remain carriers for life, posing a risk of infection to susceptible cattle. Anaplasma marginale is an obligate intracellular Gram-negative bacterium whose genetic composition is highly diverse among geographical isolates. There are currently no fully effective vaccines against bovine anaplasmosis, and economic losses due to the disease therefore persist. Vaccine formulation is a hard task for several pathogens, such as Anaplasma marginale, but peptide-based vaccines are an interesting way to induce specific responses. Phage-displayed peptide libraries have proved to be one of the most powerful technologies for identifying specific ligands. Screening these peptide libraries is also a tool for studying interactions between proteins or peptides. It has allowed the identification of ligands recognized by polyclonal antisera and has been successful in the identification of relevant epitopes in chronic diseases and toxicological conditions. The protective immune response to bovine anaplasmosis includes high levels of immunoglobulins of subclass G2 (IgG2) but not of subclass IgG1. Therefore, IgG2 from the serum of protected cattle can be used to identify ligands, which can become part of an immunogen for cattle. In this work, the phage display random peptide library Ph.D.™-12 was incubated with IgG2 or with blood sera of bovines immunized against A. marginale as targets. After three rounds of biopanning, several candidates were selected for further analysis. Their reactivity with sera immunized against A. marginale, as well as with sera positive and negative for A. marginale, was then evaluated by immunoassays.
A collection of recognized peptides, tested by ELISA, was generated. More than three hundred phage-peptides were evaluated separately against the molecules used during panning. At least ten different peptide sequences were determined from their nucleotide composition. In this approach, three phage-peptides were selected for their binding and affinity properties. For the development of vaccines or diagnostic reagents, it is important to evaluate the immunogenic and antigenic properties of the peptides. The in vitro and in vivo immunogenic behavior of the peptides, both as synthetic peptides and as phage-peptides, will be assayed to determine their vaccine potential. Acknowledgment: This work was supported by grant SEP-CONACYT 252577 given to I. Amaro-Estrada.

Keywords: bovine anaplasmosis, peptides, phage display, veterinary vaccines

Procedia PDF Downloads 143
882 Optimization of Enzymatic Hydrolysis of Cooked Porcine Blood to Obtain Hydrolysates with Potential Biological Activities

Authors: Miguel Pereira, Lígia Pimentel, Manuela Pintado

Abstract:

Animal blood is a major by-product of slaughterhouses and still represents a cost and an environmental problem in some countries. To be eliminated, blood must be stabilised by cooking, and the slaughterhouses must then pay for its incineration. In order to reduce elimination costs and valorise the high protein content, the aim of this study was to optimize the hydrolysis conditions, in terms of enzyme ratio and time, so as to obtain hydrolysates with biological activity. Two enzymes were tested in this assay: pepsin and proteases from Cynara cardunculus (cardosins). The latter have the advantage of being widely used in the Portuguese dairy industry and of being inexpensive. The screening assays were carried out over 0 to 10 h using a ratio of enzyme to reaction volume between 0 and 5%. The assays were performed at the optimal pH and temperature for each enzyme: 55 °C at pH 5.2 for cardosins and 37 °C at pH 2.0 for pepsin. After the reaction, the hydrolysates were evaluated by FPLC (Fast Protein Liquid Chromatography) and tested for antioxidant activity by the ABTS method. The FPLC chromatograms showed different profiles for the enzymatic reactions compared with the control (no enzyme added): new peaks with lower molecular weight appeared that were not present in the control samples, demonstrating hydrolysis by both enzymes. Regarding antioxidant activity, the best results for both enzymes were obtained using an enzyme-to-reaction-volume ratio of 5% and 5 h of hydrolysis; however, extending the reaction did not significantly affect the antioxidant activity, which is industrially relevant with respect to process cost. In conclusion, enzymatic blood hydrolysis can be a better alternative to the current elimination process, allowing the industry to reuse an ingredient with biological properties and economic value.

Keywords: antioxidant activity, blood, by-products, enzymatic hydrolysis

Procedia PDF Downloads 510
881 Functional Ingredients from Potato By-Products: Innovative Biocatalytic Processes

Authors: Salwa Karboune, Amanda Waglay

Abstract:

Recent studies indicate that health-promoting functional ingredients and nutraceuticals can help support and improve overall public health, which is timely given the aging of the population and the increasing cost of health care. The development of novel 'natural' functional ingredients is increasingly challenging, and biocatalysis offers powerful approaches to achieve this goal. Our recent research has focused on the development of innovative biocatalytic approaches for the isolation of protein isolates from potato by-products and the generation of peptides. The potato is a vegetable whose high-quality proteins are underestimated. In addition to their high proportion of essential amino acids, potato proteins possess angiotensin-converting-enzyme-inhibitory potency and an ability to reduce plasma triglycerides, associated with a reduced risk of atherosclerosis, and they stimulate the release of the appetite-regulating hormone CCK. Potato proteins have long been considered economically unfeasible because of the low protein content (27% of dry matter) found in the tuber (Solanum tuberosum); however, potatoes rank as the second-largest protein-supplying crop per hectare, after wheat. Potato proteins include patatin (40-45 kDa), protease inhibitors (5-25 kDa) and various high-molecular-weight proteins. Non-destructive techniques for extracting proteins from potato pulp and for generating peptides are needed in order to minimize functional losses and enhance quality. A promising approach for isolating the potato proteins was developed, involving multi-enzymatic systems containing selected glycosyl hydrolase enzymes that work synergistically to open the plant cell wall network. This enzymatic approach is advantageous because of (1) the use of milder reaction conditions, (2) the high selectivity and specificity of enzymes, (3) the low cost, and (4) the ability to market natural ingredients.
Another major benefit of this enzymatic approach is the elimination of a costly purification step; indeed, these multi-enzymatic systems can isolate proteins while fractionating them, owing to their specificity and selectivity and their minimal proteolytic activities. The isolated proteins were used for the enzymatic generation of active peptides. In addition, they were applied in a reduced-gluten cookie formulation, as consumer demand is high for ready-to-eat snack foods with high nutritional quality and little or no gluten. The addition of potato protein significantly improved the textural hardness of the reduced-gluten cookies, making them more comparable to those made with wheat flour alone. The presentation will focus on our recent 'proof-of-principle' results illustrating the feasibility and efficiency of new biocatalytic processes for the production, from potato by-products, of innovative functional food ingredients whose potential health benefits are increasingly being recognized.

Keywords: biocatalytic approaches, functional ingredients, potato proteins, peptides

Procedia PDF Downloads 380
880 Enhancing Wire Electric Discharge Machining Efficiency through ANOVA-Based Process Optimization

Authors: Rahul R. Gurpude, Pallvita Yadav, Amrut Mulay

Abstract:

In recent years, there has been a growing focus on advanced manufacturing processes, and one such emerging process is wire electric discharge machining (WEDM). WEDM is a precision machining process specifically designed for cutting electrically conductive materials with exceptional accuracy. It achieves material removal from the workpiece metal through spark erosion facilitated by electricity. Initially developed as a method for precision machining of hard materials, WEDM has witnessed significant advancements in recent times, with numerous studies and techniques based on electrical discharge phenomena being proposed. These research efforts and methods in the field of EDM encompass a wide range of applications, including mirror-like finish machining, surface modification of mold dies, machining of insulating materials, and manufacturing of micro products. WEDM has found particularly extensive use in the high-precision machining of complex workpieces that possess varying hardness and intricate shapes. During the cutting process, a fine wire with a diameter as small as 0.18 mm is employed. The evaluation of EDM performance typically revolves around two critical factors: material removal rate (MRR) and surface roughness (SR). To comprehensively assess the impact of machining parameters on the quality characteristics of EDM, an analysis of variance (ANOVA) was conducted. This statistical analysis aimed to determine the significance of the various machining parameters and their relative contributions in controlling the response of the EDM process. Through this analysis, optimal levels of the machining parameters were identified to achieve desirable material removal rates and surface roughness.
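As a minimal sketch of how such an ANOVA apportions a machining response among the levels of one parameter, the following computes a one-way F statistic and percentage contribution. The pulse-on-time levels and MRR values below are invented for illustration and are not the study's data.

```python
# Hypothetical MRR measurements (mm^3/min) grouped by pulse-on time level.
# Illustrative only: shows how ANOVA splits variance into between-level
# (factor) and within-level (error) components.
groups = {
    "Ton=105us": [8.2, 8.5, 8.1],
    "Ton=115us": [10.4, 10.9, 10.6],
    "Ton=125us": [12.8, 13.1, 12.6],
}

all_vals = [v for g in groups.values() for v in g]
grand_mean = sum(all_vals) / len(all_vals)

# Between-group (treatment) and within-group (error) sums of squares.
ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2
                 for g in groups.values())
ss_total = sum((v - grand_mean) ** 2 for v in all_vals)
ss_error = ss_total - ss_between

df_between = len(groups) - 1
df_error = len(all_vals) - len(groups)
f_stat = (ss_between / df_between) / (ss_error / df_error)
contribution = 100 * ss_between / ss_total  # percent contribution of the factor

print(f"F = {f_stat:.1f}, contribution = {contribution:.1f}%")
```

A large F and a high percentage contribution would mark the factor as significant in controlling MRR; the same decomposition, extended to several factors and their interactions, underlies the parameter ranking described above.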

Keywords: WEDM, MRR, optimization, surface roughness

Procedia PDF Downloads 76
879 A Deep Learning Model with Greedy Layer-Wise Pretraining Approach for Optimal Syngas Production by Dry Reforming of Methane

Authors: Maryam Zarabian, Hector Guzman, Pedro Pereira-Almao, Abraham Fapojuwo

Abstract:

Dry reforming of methane (DRM) has sparked significant industrial and scientific interest, not only as a viable alternative for addressing the environmental concerns of two main contributors to the greenhouse effect, i.e., carbon dioxide (CO₂) and methane (CH₄), but also because it produces syngas, i.e., a mixture of hydrogen (H₂) and carbon monoxide (CO) utilized by a wide range of downstream processes as a feedstock for other chemical productions. In this study, we develop an AI-enabled syngas production model to tackle the problem of achieving an equivalent H₂/CO ratio [1:1] with respect to the most efficient conversion. Firstly, the unsupervised density-based spatial clustering of applications with noise (DBSCAN) algorithm removes outlier data points from the original experimental dataset. Then, random forest (RF) and deep neural network (DNN) models employ the error-free dataset to predict the DRM results. DNN models inherently cannot obtain accurate predictions without a huge dataset. To cope with this limitation, we employ approaches that reuse pre-trained layers, such as transfer learning and greedy layer-wise pretraining. Compared to the other deep models (i.e., the pure deep model and the transferred deep model), the greedy layer-wise pre-trained deep model provides the most accurate prediction, as well as accuracy similar to the RF model, with R² values of 1.00, 0.999, 0.999, 0.999, 0.999, and 0.999 for the total outlet flow, H₂/CO ratio, H₂ yield, CO yield, CH₄ conversion, and CO₂ conversion outputs, respectively.
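The outlier-removal step can be illustrated with DBSCAN's core/border/noise rule. The following is a deliberately simplified one-dimensional sketch (the study's data are multi-dimensional, and the eps and min_pts values here are arbitrary choices, not the paper's settings): a point survives if it is a core point (enough neighbours within eps) or lies within eps of one; everything else is discarded as noise.

```python
# Simplified sketch of DBSCAN's noise criterion for stripping outliers
# from an experimental dataset before model fitting.
def dbscan_noise_filter(points, eps=1.0, min_pts=3):
    # Core points: at least min_pts points (including self) within eps.
    core = [p for p in points
            if sum(abs(p - q) <= eps for q in points) >= min_pts]
    # Keep core points and border points (within eps of a core point).
    return [p for p in points if any(abs(p - c) <= eps for c in core)]

data = [1.0, 1.2, 0.9, 1.1, 1.3, 9.5]   # 9.5 is an obvious outlier
clean = dbscan_noise_filter(data)
print(clean)  # the tight cluster survives; 9.5 is dropped as noise
```

In practice a library implementation (e.g., scikit-learn's DBSCAN, which labels noise points -1) would be used on the full feature space rather than this toy filter.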

Keywords: artificial intelligence, dry reforming of methane, artificial neural network, deep learning, machine learning, transfer learning, greedy layer-wise pretraining

Procedia PDF Downloads 88
878 Optimizing the Window Geometry Using Fractals

Authors: K. Geetha Ramesh, A. Ramachandraiah

Abstract:

In an internal building space, daylight becomes a powerful source of illumination. The challenge, therefore, is to develop means of utilizing both direct and diffuse natural light in buildings while maintaining and improving occupants' visual comfort, particularly at greater distances from the windows admitting daylight. The geometrical features of windows in a building have a significant effect on the provision of daylight. The main goal of this research is to develop an innovative window geometry which will effectively provide the daylight component adequately, together with the internal reflected component (IRC) and also the external reflected component (ERC), if any. This involves exploration of a light-redirecting system using fractal geometry for windows, in order to penetrate and distribute daylight more uniformly to greater depths, minimizing heat gain and glare, and also to reduce building energy use substantially. Of late, the creation of fractal window geometries and the daylight illuminance they produce have become an interesting area of study. The amount of daylight can change significantly based on the window geometry and sky conditions. This leads to (i) the exploration of various fractal patterns suitable for window designs, and (ii) the quantification of the effect of a chosen fractal window based on the relationship between the fractal pattern, size, orientation and glazing properties for optimizing daylighting. Many daylighting applications are able to predict the behaviour of light in a room through a traditional opening - a regular window. The conventional prediction methodology involves the evaluation of the daylight factor, the internal reflected component and the external reflected component. Having evaluated the daylight illuminance level for a conventional window, the technical performance of a fractal window for optimal daylighting is to be studied and compared with that of a regular window.
The methodologies involved are highlighted in this paper.
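One property that makes fractal geometries interesting for window edges can be sketched numerically: a Koch-type construction multiplies edge length at each iteration while the bounding size stays fixed, changing how much perimeter is available for light redirection. The square-window assumption and the Koch replacement rule below are illustrative choices, not the paper's specific patterns.

```python
# Perimeter growth of a Koch-like fractal window edge: each iteration
# replaces every segment with 4 segments of 1/3 the length, so the
# perimeter scales by (4/3)^n while the window's overall size is unchanged.
def koch_perimeter(side, iterations):
    return 4 * side * (4.0 / 3.0) ** iterations  # square window, 4 sides

for n in range(4):
    print(n, round(koch_perimeter(1.0, n), 3))
```

The unbounded perimeter growth (with fractal dimension log 4 / log 3 ≈ 1.26 for this rule) is what distinguishes a fractal opening from a regular window of equal area.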

Keywords: daylighting, fractal geometry, fractal window, optimization

Procedia PDF Downloads 301
877 Analysis of the Relationship between Micro-Regional Human Development and Brazil's Greenhouse Gases Emission

Authors: Geanderson Eduardo Ambrósio, Dênis Antônio Da Cunha, Marcel Viana Pires

Abstract:

Historically, human development has been based on economic gains associated with energy-intensive activities, which are often exhaustive in the emission of greenhouse gases (GHGs). This requires the establishment of GHG mitigation targets in order to dissociate human development from emissions and prevent further climate change. Brazil is one of the largest GHG emitters, and it is critically important to discuss such reductions in an intra-national framework with the objective of distributional equity, to explore the country's full mitigation potential without compromising the development of less developed societies. This research presents some incipient considerations about which of Brazil's micro-regions should reduce emissions, when the reductions should be initiated, and what their magnitude should be. We started with the methodological assumption that human development and GHG emissions will evolve in the future as their behavior was observed in the past. Furthermore, we assume that once a micro-region has become developed, it is able to maintain gains in human development without the need for ever-growing GHG emission rates. The Human Development Index and carbon dioxide equivalent emissions (CO2e) were extrapolated to the year 2050, which allowed us to calculate when the micro-regions will become developed and the mass of GHGs emitted. The results indicate that Brazil will emit 300 Gt CO2e into the atmosphere between 2011 and 2050, of which only 50 Gt will be emitted by micro-regions before they become developed and 250 Gt will be released after development. We also determined national mitigation targets and structured reduction schemes in which only the developed micro-regions would be required to reduce emissions. The micro-region of São Paulo, the most developed in the country, should also be the one that reduces emissions the most, emitting, in 2050, 90% less than the value observed in 2010.
On the other hand, less developed micro-regions will be responsible for less drastic reductions; for example, Vale do Ipanema will emit, in 2050, only 10% less than the value observed in 2010. Under this methodological assumption, the country would emit, in 2050, 56.5% less than observed in 2010, so that cumulative emissions between 2011 and 2050 would fall by 130 Gt CO2e relative to the initial projection. Linking the magnitude of the reductions to the micro-regions' level of human development encourages the adoption of policies that favor both variables, as the governmental planner will have to deal both with the increasing demand for higher standards of living and with the increasing magnitude of emission reductions. However, if economic agents do not act proactively at the local and national levels, the country is closer to the scenario in which it emits more than to the one in which it mitigates emissions. The research highlighted the importance of considering heterogeneity when determining individual mitigation targets and also ratified the theoretical and methodological feasibility of allocating a larger share of the contribution to those who historically emitted more. The proposals and discussions presented here should be considered in mitigation policy formulation in Brazil regardless of the reduction target adopted.
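The accounting behind the pre-/post-development split can be sketched as follows for a single hypothetical micro-region: extrapolate HDI and annual emissions year by year, find when the region crosses a "developed" threshold, and accumulate emissions on each side of that crossing. All numbers here (growth rates, threshold, starting values) are invented illustrative assumptions, not the study's data.

```python
# Toy sketch of the paper's emissions accounting for one micro-region.
hdi, emissions = 0.70, 10.0          # assumed 2010 HDI and annual Mt CO2e
hdi_gain, em_growth = 0.004, 0.02    # assumed yearly HDI gain and emission growth
threshold = 0.80                     # assumed HDI cutoff for "developed"

pre, post = 0.0, 0.0                 # cumulative emissions before/after development
for year in range(2011, 2051):
    hdi += hdi_gain
    emissions *= 1 + em_growth
    if hdi < threshold:
        pre += emissions             # emitted while still developing
    else:
        post += emissions            # emitted after becoming developed

print(f"pre-development: {pre:.0f} Mt, post-development: {post:.0f} Mt")
```

Summing such splits over all micro-regions yields the national totals (50 Gt pre-development, 250 Gt post-development in the study), and reduction obligations are then imposed only on the post-development share.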

Keywords: greenhouse gases, human development, mitigation, intensive energy activities

Procedia PDF Downloads 320
876 The Acute Effects of Higher Versus Lower Load Duration and Intensity on Morphological and Mechanical Properties of the Healthy Achilles Tendon: A Randomized Crossover Trial

Authors: Eman Merza, Stephen Pearson, Glen Lichtwark, Peter Malliaras

Abstract:

The Achilles tendon (AT) exhibits volume changes related to fluid flow under acute load, which may be linked to changes in stiffness. Fluid flow provides a mechanical signal for cellular activity and may be one mechanism that facilitates tendon adaptation. This study aimed to investigate whether an isometric intervention involving a high level of load duration and intensity could maximize the immediate reduction in AT volume and stiffness compared to interventions involving lower levels of load duration and intensity. Sixteen healthy participants (12 males, 4 females; age = 24.4 ± 9.4 years; body mass = 70.9 ± 16.1 kg; height = 1.7 ± 0.1 m) performed three isometric interventions of varying load duration (2 s and 8 s) and intensity (35% and 75% of maximal voluntary isometric contraction) over a 3-week period. Freehand 3D ultrasound was used to measure free AT volume (at rest) and length (at 35%, 55%, and 75% of maximum plantarflexion force) pre- and post-intervention. The slope of the force-elongation curve over these force levels represented individual stiffness (N/mm). Large reductions in free AT volume and stiffness occurred in response to long-duration, high-intensity loading, while smaller reductions occurred at the lower load intensity. In contrast, no change in free AT volume and a small increase in AT stiffness occurred with the lower load duration. These findings suggest that the load applied to the AT must be heavy and sustained for a long duration to maximize the immediate volume reduction, which might be an acute response that enables optimal long-term tendon adaptation via mechanotransduction pathways.
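The stiffness measure described here, the slope of the force-elongation curve over the three force levels, amounts to a least-squares line fit. A minimal sketch, with invented force and elongation values standing in for a participant's measurements:

```python
# Tendon stiffness as the least-squares slope of force vs. elongation,
# sampled at three force levels (e.g., 35%, 55%, 75% MVIC).
def slope(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
            / sum((xi - mx) ** 2 for xi in x))

elongation_mm = [4.0, 6.0, 8.0]      # illustrative AT elongation at each level
force_n = [700.0, 1100.0, 1500.0]    # illustrative plantarflexion force (N)

stiffness = slope(elongation_mm, force_n)  # N/mm
print(f"stiffness = {stiffness:.0f} N/mm")
```

A post-intervention drop in this slope, alongside the ultrasound-measured volume loss, is what the study interprets as the acute fluid-flow response.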

Keywords: Achilles tendon, volume, stiffness, free tendon, 3D ultrasound

Procedia PDF Downloads 102
875 Optimization of Process Parameters for Copper Extraction from Wastewater Treatment Sludge by Sulfuric Acid

Authors: Usarat Thawornchaisit, Kamalasiri Juthaisong, Kasama Parsongjeen, Phonsiri Phoengchan

Abstract:

In this study, sludge samples collected from the wastewater treatment plant of a printed circuit board manufacturing industry in Thailand were subjected to acid extraction using sulfuric acid as the chemical extracting agent. The effects of sulfuric acid concentration (A), the ratio of the volume of acid to the quantity of sludge (B), and extraction time (C) on the efficiency of copper extraction were investigated with the aim of finding the optimal conditions for maximum removal of copper from the wastewater treatment sludge. A factorial experimental design was employed to model the copper extraction process. The results were analyzed statistically using analysis of variance to identify the process variables that significantly affected the copper extraction efficiency. Results showed that all linear terms, and the interaction term between the acid-volume-to-sludge-quantity ratio and extraction time (BC), had a statistically significant influence on the efficiency of copper extraction under the tested conditions, in which the most significant effect was ascribed to the acid-volume-to-sludge-quantity ratio (B), followed by sulfuric acid concentration (A), extraction time (C), and the interaction term BC, respectively. The remaining two-way interaction terms (AB, AC) and the three-way interaction term (ABC) were not statistically significant at the significance level of 0.05. A model equation was derived for the copper extraction process, and the process was optimized using a multiple-response method called the desirability (D) function, targeting maximum removal. The optimum conditions, extracting 99% of the copper, were found to be a sulfuric acid concentration of 0.9 M and a ratio of the volume of acid (mL) to the quantity of sludge (g) of 100:1, with an extraction time of 80 min. Experiments under the optimized conditions were carried out to validate the accuracy of the model.
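The desirability approach mentioned here maps each response onto a 0-1 scale before optimization. For a "larger-is-better" response such as copper removal, a one-sided Derringer-style desirability can be sketched as below; the lower bound, target, and weight are illustrative choices, not the study's values.

```python
# One-sided desirability d(y) for a response to be maximized
# (here: copper removal %). d=0 below the lower bound, d=1 at or
# above the target, and a power ramp in between.
def desirability_max(y, low=80.0, target=99.0, weight=1.0):
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** weight

removals = [85.0, 95.0, 99.0]
print([round(desirability_max(r), 3) for r in removals])
```

With several responses, the individual desirabilities are combined into an overall D (typically their geometric mean), and the factor settings that maximize D are reported as the optimum, here 0.9 M acid, a 100:1 ratio, and 80 min.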

Keywords: acid treatment, chemical extraction, sludge, waste management

Procedia PDF Downloads 198
874 Optimizing Parallel Computing Systems: A Java-Based Approach to Modeling and Performance Analysis

Authors: Maher Ali Rusho, Sudipta Halder

Abstract:

The purpose of the study is to develop optimal solutions for models of parallel computing systems using the Java language. During the study, programmes were written for the examined models of parallel computing systems. The result of the parallel sorting code is a sorted array of random numbers. When processing data in parallel, the time spent on processing and the first elements of the list of squared numbers are displayed. When processing requests asynchronously, processing-completion messages are displayed for each task with a slight delay. The main results include the development of optimisation methods for algorithms and processes, such as the division of tasks into subtasks, the use of non-blocking algorithms, effective memory management, and load balancing, as well as the construction of diagrams and a comparison of these methods by their characteristics, including descriptions, implementation examples, and advantages. In addition, various specialised libraries were analysed to improve the performance and scalability of the models. The results of the work showed a substantial improvement in response time, bandwidth, and resource efficiency in parallel computing systems. Scalability and load assessments were conducted, demonstrating how the system responds to an increase in data volume or in the number of threads. Profiling tools were used to analyse performance in detail and identify bottlenecks in the models, which improved the architecture and implementation of parallel computing systems. The results obtained emphasise the importance of choosing the right methods and tools for optimising parallel computing systems, which can substantially improve their performance and efficiency.
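The study's programmes were written in Java; as a language-agnostic sketch, the central optimisation idea it describes (divide work into subtasks, process them concurrently via a thread pool, and merge the results) can be shown with Python's standard-library executor. The chunk size and worker count below are arbitrary illustrative choices.

```python
# Task-splitting sketch: divide data into subtasks, square each chunk
# concurrently, then merge. The pool handles load balancing across workers.
from concurrent.futures import ThreadPoolExecutor

def square_chunk(chunk):
    # Subtask: square every element of one slice of the input.
    return [x * x for x in chunk]

data = list(range(8))
chunks = [data[i:i + 2] for i in range(0, len(data), 2)]  # divide into subtasks

with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(square_chunk, chunks))        # process concurrently

squared = [x for part in results for x in part]           # merge results
print(squared)
```

The equivalent Java structure would use an ExecutorService with submitted Callables; either way, profiling then reveals whether the split granularity or the merge step is the bottleneck.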

Keywords: algorithm optimisation, memory management, load balancing, performance profiling, asynchronous programming

Procedia PDF Downloads 14
873 The Burmese Exodus of 1942: Towards Evolving Policy Protocols for a Refugee Archive

Authors: Vinod Balakrishnan, Chrisalice Ela Joseph

Abstract:

The Burmese Exodus of 1942, which left more than 400,000 (4 lakh) people as refugees and thousands dead, is one of the worst forced migrations in recorded history. Adding to the woes of the refugees is the lack of credible documentation of their lived experiences, trauma, and stories, and their erasure from recorded history. Media reports, national records, and mainstream narratives that have registered the exodus provide sanitized versions which have reduced the refugees to a nameless, faceless mass of travelers and obliterated their lived experiences, trauma, and sufferings. This attitudinal problem compels the need to stem the insensitivity that accompanies institutional memory by making a case for a more humanistically evolved policy that puts in place protocols for the way the humanities would voice the concern for the refugee. A definite step in this direction, and a far more relevant project in our times, is the need to build a comprehensive refugee archive that can be a repository of refugee experiences and perspectives. The paper draws on Hannah Arendt's position on the Jewish refugee crisis, Agamben's work on statelessness and citizenship, Foucault's notions of governmentality and biopolitics, Edward Said's concepts of exile, Fanon's work on the dispossessed, and Derrida's work on ‘the foreigner and hospitality’ in order to conceptualize the refugee condition; this forms the theoretical framework of the paper. It also refers to existing scholarship in the field of refugee studies, such as Roger Zetter's work on the ‘refugee label’, Philip Marfleet's work on ‘refugees and history’, and Lisa Malkki's research on the anthropological discourse of the refugee and refugee studies. The paper is also informed by the work that has been done by international organizations to address the refugee crisis.
The emphasis is on building a strong argument for the establishment of the refugee archive, which finds only a passing and not very convincing reference in refugee studies, in order to enable a multi-dimensional understanding of the refugee crisis. Some of the old questions cannot be dismissed as outdated, as the continuing travails of refugees in different parts of the world remind us that they remain largely unanswered. The questions are: What is the nature of a refugee archive? How is it different from the existing historical and political archives? What are the implications of the refugee archive? What is its contribution to refugee studies? The paper draws on Diana Taylor's concept of the archive and the repertoire to theorize the refugee archive as a repository that has the documentary function of the ‘archive’ and the ‘agency’ function of the repertoire. It then reads Ayya's Accounts, a memoir by Anand Pandian, in the light of Hannah Arendt's concepts of the ‘refugee as vanguard’ and ‘storytelling as political action’ to illustrate how the memoir contributes to a refugee archive that gives the refugee a place and agency in history. The paper argues for a refugee archive that has implications for the formulation of inclusive refugee policies.

Keywords: Ayya’s Accounts, Burmese Exodus, policy protocol, refugee archive

Procedia PDF Downloads 141
872 Attitudes Towards Homosexuality, Bisexuality and Transgenderism among Medical Students of a Sri Lankan University

Authors: Rajapaksha J. S. R. L., Rajapaksha R. G. D. T., Ranawaka A. U. R., Rangalla R. D. M. P., Ranwala R. D. E. B., Chandratilake M. N.

Abstract:

Introduction: Lesbian, gay, bisexual, and transgender (LGBT) patients experience discrimination, insensitivity, and ignorance about LGBT-specific health needs among healthcare providers. Developing the correct attitudes among medical students towards LGBT people may help them provide optimal healthcare. Objectives: This study aimed to assess the attitudes of medical students towards the LGBT community. Methodology: A cross-sectional descriptive study was conducted among all the medical students in the Faculty of Medicine, University of Kelaniya, Sri Lanka, using a validated online questionnaire. The questionnaire focused on eight areas. The data were analyzed descriptively, and the demographic groups were compared. Results: 358 students completed the survey, a response rate of 34.26%. Their attitudes on traditional gender roles and their comfort in interacting with LGBT people were moderate, and they disagreed with negative LGBT social beliefs. They knew little about the origin of the sexuality/gender of LGBT people. Although they accepted LGBT people as a part of diversity, they discouraged normalizing their social practices. Their acceptance of and association with LGBT people were moderately positive. A minority had encountered LGBT people in close social circles, and the majority of these were batch-mates. Although males' knowledge about the origin of LGBT was higher, they favoured traditional gender roles more. The religious groups showed no differences. The favourability of attitudes towards LGBT people reflected respondents' political ideology. Conclusion: Although medical students' knowledge of the sexuality/gender basis of LGBT is poor, they have moderately favourable attitudes towards LGBT people. They accept LGBT as a part of social diversity, but not their social practices. Poor knowledge, lack of encounters, cultural influences, and political ideology may have influenced their attitudes.

Keywords: medical students, attitude, LGBT, diversity

Procedia PDF Downloads 169
871 Climate Change Impact on Whitefly (Bemisia tabaci) Population Infesting Tomato (Lycopersicon esculentus) in Sub-Himalayan India and Their Sustainable Management Using Biopesticides

Authors: Sunil Kumar Ghosh

Abstract:

Tomato (Lycopersicon esculentus L.) is an annual vegetable crop grown in the sub-Himalayan region of north east India throughout the year, except the rainy season, under normal field cultivation. The crop is susceptible to various insect pests, of which whitefly (Bemisia tabaci Genn.) causes heavy damage. Thus, a study of its occurrence and sustainable management is needed for successful cultivation. The pest was active throughout the growing period. During the 38th to the 41st standard week, that is, from the 3rd week of September to the 2nd week of October, the minimum population was observed. The maximum population level was maintained during the 11th to the 18th standard week, that is, from the 2nd week of March to the 3rd week of March, with a peak population (0.47/leaf) recorded. Weekly population counts of whitefly showed a non-significant negative correlation (p=0.05) with temperature and weekly total rainfall, whereas a significant negative correlation with relative humidity was observed. Eight treatments were evaluated for management of the whitefly pest, including the botanical insecticide azadirachtin and botanical extracts of Spilanthes paniculata flower, Polygonum hydropiper L. flower, tobacco leaf and garlic, as well as a mixed formulation of neem and floral extract of Spilanthes; these were evaluated and compared with acetamiprid. The insecticide acetamiprid was found most lethal against whitefly, providing 76.59% suppression, closely followed by extracts of neem + Spilanthes, providing 62.39% suppression. Spectrophotometric scanning of the crude methanolic extract of Polygonum flower showed strong absorbance at wavelengths between 645 and 675 nm. Considering the absorbance peaks, the flower extract contains some important chemicals such as spirilloxanthin, quercetin diglycoside, quercetin 3-O-rutinoside, procyanidin B1 and isorhamnetin 3-O-rutinoside. These chemicals are responsible for pest control.
Spectrophotometric scanning of the crude methanolic extract of Spilanthes flower likewise showed strong absorbance at wavelengths between 645 and 675 nm. Considering the absorbance peaks, the flower extract contains some important chemicals, of which polysulphide compounds are important and responsible for pest control. Neem and Spilanthes individually did not produce good results, but when used as a mixture they gave better results. The highest yield (30.15 t/ha) was recorded from acetamiprid-treated plots, followed by neem + Spilanthes (27.55 t/ha). Azadirachtin and plant extracts are biopesticides with few or no hazardous effects on human health and the environment. Thus, they can be incorporated into IPM programmes and organic farming in vegetable cultivation.

Keywords: biopesticides, organic farming, seasonal fluctuation, vegetable IPM

Procedia PDF Downloads 309
870 Digital Environment as a Factor of the City's Competitiveness in Attracting Tourists: The Case of Yekaterinburg

Authors: Alexander S. Burnasov, Anatoly V. Stepanov, Maria Y. Ilyushkina

Abstract:

In the conditions of the transition to the digital economy, the digital environment of the city becomes one of the key factors in its attractiveness to tourists. A modern digital environment makes travelling more accessible and improves the quality of travel services and the attractiveness of many tourist destinations. The digitalization of the industry makes it possible to use resources more efficiently, simplify business processes, minimize risks, and improve travel safety. Promoting the city as a tourist destination in foreign markets becomes decisive in the digital environment. Information technologies are extremely important for the functioning not only of any tourist enterprise but also of the city as a whole. In addition to solving traditional problems, it is also possible to implement innovations from the tourism industry, such as the availability of city services in international ticket- and hotel-booking systems, the possibility of booking theater and museum tickets in advance, the possibility of non-cash payment with cards of international payment systems, and Internet access in the urban environment for travelers. The availability of the city's digital services makes it possible to reduce ordering costs, contributes to the optimal selection of tourist products that meet the tourist's requirements, and provides increased transparency of transactions. Users can compare the prices, features, services, and reviews of travel services. The ability to share impressions with friends thousands of miles away directly affects the image of the city. The city's image can be promoted in the digital environment not only through world-scale events (such as the 2018 World Cup, international summits, etc.) but also through the creation and management of digital services aimed at supporting tourism, which will help improve the city's positioning in the global tourism market.

Keywords: competitiveness, digital environment, travelling, Yekaterinburg

Procedia PDF Downloads 138
869 Analysis of Real Time Seismic Signal Dataset Using Machine Learning

Authors: Sujata Kulkarni, Udhav Bhosle, Vijaykumar T.

Abstract:

Because seismic signals closely resemble non-seismic signals, it is difficult to detect earthquakes using conventional methods. In order to distinguish between seismic and non-seismic events depending on their amplitude, our study processes data that come from seismic sensors. The authors propose a robust noise-suppression technique that makes use of a bandpass filter, an IIR Wiener filter, recursive short-term average/long-term average (STA/LTA), and Carl STA/LTA for event identification. The trigger ratio used in the proposed study to differentiate between seismic and non-seismic activity is determined. The proposed work focuses on significant feature extraction for machine learning-based seismic event detection. This serves as motivation for compiling a dataset of all features for the identification and forecasting of seismic signals. We place a focus on feature-vector dimension-reduction techniques due to the temporal complexity. The proposed notable features were experimentally tested using a machine learning model, and the results on unseen data are optimal. Finally, a demonstration using a hybrid dataset (captured by different sensors) shows how this model may also be employed in a real-time setting while lowering false alarm rates. The study is based on the examination of seismic signals obtained from both individual sensors and sensor networks (SN). The experimental dataset consists of wideband seismic signals from the BSVK and CUKG station sensors, located near Basavakalyan, Karnataka, and at the Central University of Karnataka, respectively.
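The STA/LTA trigger ratio at the heart of this pipeline can be sketched compactly: a short-term average of signal amplitude divided by a long-term average, with a ratio above some threshold flagging a candidate event. The window lengths and the toy signals below are illustrative, not the study's settings.

```python
# Minimal STA/LTA sketch: for each sample position, compute the ratio of a
# short-term amplitude average to a long-term one; a jump in amplitude
# pushes the ratio well above 1, flagging a candidate seismic event.
def sta_lta(signal, n_sta=3, n_lta=10):
    ratios = []
    for i in range(n_lta, len(signal) + 1):
        sta = sum(abs(s) for s in signal[i - n_sta:i]) / n_sta
        lta = sum(abs(s) for s in signal[i - n_lta:i]) / n_lta
        ratios.append(sta / lta)
    return ratios

quiet = [0.1] * 10                    # steady background noise
event = [0.1] * 7 + [2.0, 2.2, 1.8]   # amplitude jump at the end
print(max(sta_lta(quiet)), max(sta_lta(event)))
```

Production detectors (e.g., the recursive and Carl variants named above, as implemented in libraries such as ObsPy) refine this with recursive updates and noise-adaptive terms, but the trigger-ratio idea is the same.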

Keywords: Carl STA/LTA, features extraction, real time, dataset, machine learning, seismic detection

Procedia PDF Downloads 127
868 Determinants of Consultation Time at a Family Medicine Center

Authors: Ali Alshahrani, Adel Almaai, Saad Garni

Abstract:

Aim of the study: To explore the duration and determinants of consultation time at a family medicine center. Methodology: This study was conducted at the Family Medicine Center in Ahad Rafidah City, in the southwestern part of Saudi Arabia, on the working days of March 2013. Trained nurses helped fill in a checklist designed for this study. A total of 459 patients were included. The checklist included the patient's age, sex, diagnosis, type of visit, referral and its type, psychological problems, and additional work-up, in addition to the number of daily bookings, the physician's experience, and the consultation time. Results: More than half of the patients (58.39%) had consultations shorter than 10 minutes (mean±SD: 12.73±9.22 minutes). Patients treated by physicians with the shortest experience (i.e., ≤5 years) had the longest consultation time, while those treated by physicians with the longest experience (i.e., >10 years) had the shortest consultation time (13.94±10.99 versus 10.79±7.28, p=0.011). Regarding patients' diagnoses, those with chronic diseases had the longest consultation time (p<0.001). Patients who did not need referral had a significantly shorter consultation time compared with those who had routine or urgent referral (11.91±8.42, 14.60±9.03 and 22.42±14.81 minutes, respectively, p<0.001). Patients with associated psychological problems needed a significantly longer consultation time than those without (20.06±13.32 versus 12.45±8.93, p<0.001). Conclusions: The average consultation time at Ahad Rafidah Family Medicine Center is approximately 13 minutes. Less-experienced physicians tend to spend longer consultation times with patients. Referred patients, those with psychological problems, and those with chronic diseases tend to have longer consultation times. Recommendations: Family physicians should be encouraged to keep to their optimal consultation time.
Booking an adequate number of patients per shift would allow the family physician to provide enough consultation time for each patient.

Keywords: consultation, quality, medicine, clinics

Procedia PDF Downloads 288
867 The Construction Women Self in Law: A Case of Medico-Legal Jurisprudence Textbooks in Rape Cases

Authors: Rahul Ranjan

Abstract:

Using gender as a category to cull out historical analysis, feminist scholars have produced a plethora of literature on the sexual symbolics and carnal practices of modern European empires. At a symbolic level, the penetration and conquest of faraway lands was charged with sexual significance and intrigue. The white male's domination and possession of dark and fertile lands in Africa, Asia and the Americas offered, in Anne McClintock's words, ‘a fantastic magic lantern of the mind onto which Europe projected its forbidden sexual desires and fears’. The politics of rape were also symbolically significant to the politics of empire. To the colonized subject, rape was a fearsome factor, a language that spoke of the violent and voracious nature of imperial exploitation. The colonized often looked at rape as an act which the colonizers used as a tool of oppression. Rape as an act of violence was encoded into the legal structure under the helm of Lord Macaulay, in the so-called ‘Age of Reform’, with the Indian Penal Code (IPC) of 1860. Earlier, Lord Macaulay had formed the Indian Law Commission in 1837, in which he drafted a bill that defined ‘the crime of rape as sexual intercourse by a man with a woman against her will and without her consent, except in cases involving girls under nine years of age where consent was immaterial’. The modern English law of rape formulated under colonial rule brought two issues to the forefront. On the one hand, it deployed ‘technical experts’ who wrote textbooks of medical jurisprudence that were cited as credentials to make cases more ‘objective’; on the other hand, presumptions about barbaric subjects and the colonized woman's body, seen as docile and prone to adultery, were reflected in cases. The perceived untrustworthiness of native witnesses was a further reason for British jurists to place extra emphasis on such ‘objective’ and presumptive evidence.
This sort of formulation placed women at a disadvantage before justice, for it disadvantaged them doubly: through British legality and through British thinking about rape. Imperial morality, acting as the vanguard of women’s chastity, coincided with the post-Enlightenment language of science, which not only annulled non-conformist ideas but made itself a hegemonic language, and was often used as a tool in the encoding of law. The medico-legal understanding of rape in colonial India has left clear imprints on post-colonial legality. The onus placed on the rape victim was dictated for the longest time, and still is, by the widely cited idea that ‘there should be signs, marks of resistance on the body of the victim’, without which the act is likely to be considered consensual. Accordingly, this paper looks at the textual continuity that has prolonged the colonial construct of the woman’s body and self.

Keywords: body, politics, textual construct, phallocentric

Procedia PDF Downloads 378
866 A Knowledge-Based Development of Risk Management Approaches for Construction Projects

Authors: Masoud Ghahvechi Pour

Abstract:

Risk management is a systematic and regular process of identifying, analyzing and responding to risks throughout the project's life cycle in order to achieve the optimal level of elimination, reduction or control of risk. The purpose of project risk management is to increase the probability and effect of positive events and reduce the probability and effect of unpleasant events on the project. Risk management is one of the most fundamental parts of project management: unmanaged or untransferred risks can be a primary factor in a project's failure. Effective risk management does not mean simply avoiding risk, even though avoidance appears to be the cheapest option. The main problem with that option is economic: what is potentially profitable is by definition risky, and what poses no risk is economically uninteresting and brings no tangible benefit. Therefore, for a given project, effective risk management means finding a "middle ground". On the one hand, it protects against risk through accurate identification and classification, which leads to a comprehensive analysis; on the other hand, management should use mathematical and analytical tools to evaluate the maximum benefit of each decision. A detailed analysis, taking into account all aspects of the company, including stakeholder analysis, allows us to add to effective risk management what will become tangible benefits for our project in the future. Identifying project risk rests on determining which types of risk may affect the project, specifying their parameters, and estimating the probability of their occurrence in the project.
These conditions can be divided into three groups: certainty, uncertainty, and risk, which in turn correspond to three attitudes toward investment: risk preference, risk neutrality, and risk aversion, together with their measurement. The result of risk identification and project analysis is a list of events that indicates the cause and probability of each event, together with a final assessment of its impact on the environment.
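The identify-analyze-respond cycle described above is often reduced in practice to ranking identified risks by expected impact (probability of occurrence times estimated cost). A minimal sketch of that ranking, with illustrative event names and figures that are not from the paper:

```python
# Illustrative sketch: rank hypothetical project risks by
# expected monetary value (probability x cost impact).
# All events and numbers below are made-up placeholders.

risks = [
    # (event, probability of occurrence, estimated cost impact)
    ("supplier delay", 0.30, 50_000),
    ("design change", 0.10, 120_000),
    ("labor shortage", 0.20, 80_000),
]

# Expected monetary value (EMV) of each identified risk,
# highest-priority risk first.
ranked = sorted(
    ((event, p * cost) for event, p, cost in risks),
    key=lambda item: item[1],
    reverse=True,
)

for event, emv in ranked:
    print(f"{event}: expected impact {emv:,.0f}")
```

Such a list is the typical output of the identification stage: each entry names the event, its probability, and a final assessment of its impact, which then informs the choice of response (eliminate, reduce, transfer, or accept).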

Keywords: risk, management, knowledge, risk management

Procedia PDF Downloads 68
865 Blood Pressure Level, Targeted Blood Pressure Control Rate, and Factors Related to Blood Pressure Control in Post-Acute Ischemic Stroke Patients

Authors: Nannapus Saramad, Rewwadee Petsirasan, Jom Suwanno

Abstract:

Background: This retrospective study described average blood pressure, blood pressure levels, and the target blood pressure control rate in post-acute ischemic stroke patients in the year following discharge from Sichon Hospital, Sichon District, Nakhon Si Thammarat Province. A secondary data analysis was performed on the patients’ health records, supplemented by patient or caregiver interviews. A total of 232 eligible post-acute ischemic stroke patients discharged in the study year (2017-2018) were recruited. Methods: Relationships of single variables were first examined through univariate analyses (the Chi-square test and Fisher’s exact test); variables with a p-value < 0.2 were then entered into a binary logistic regression. Results: Most of the patients were men (61.6%), with an average age of 65.4 ± 14.8 years. Systolic blood pressure levels were in the grade 1-2 hypertension range, and diastolic pressure remained at optimal or normal levels at all times from initial treatment to the present. Blood pressure control was achieved by 25% of the group under the age of 60, 36.3% of the group older than 60 years, and 27.9% of the diabetic group. The multivariate analysis identified four significant variables: 1) receiving a calcium-channel blocker (p = .027); 2) adherence to antihypertensive medication (p = .024); 3) adherence to antiplatelet medication (p = .020); and 4) medication behavior (p = .010). Conclusion: Nurses and health care providers should promote medication adherence and related behaviors to improve patients’ blood pressure control.
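The two-stage analysis described above (univariate tests, then binary logistic regression restricted to variables with p < 0.2) can be sketched as a simple screening step. The variable names and p-values below are placeholders, not the study's data:

```python
# Sketch of the variable-screening step: univariate p-values
# (e.g. from Chi-square or Fisher's exact tests) are filtered
# at p < 0.2 before the variables enter the binary logistic
# regression. All names and values are illustrative only.

univariate_p = {
    "calcium_channel_blocker": 0.03,
    "antihypertensive_adherence": 0.11,
    "antiplatelet_adherence": 0.08,
    "age_group": 0.45,   # excluded: p >= 0.2
    "sex": 0.62,         # excluded: p >= 0.2
}

SCREENING_THRESHOLD = 0.2

# Only variables passing the liberal screen are modeled jointly.
candidates = [v for v, p in univariate_p.items() if p < SCREENING_THRESHOLD]
print("Variables entering logistic regression:", candidates)
```

The liberal 0.2 threshold is deliberate: it keeps variables that may only reach significance once other covariates are adjusted for in the multivariate model.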

Keywords: acute ischemic stroke, target blood pressure control, medication adherence, recurrence stroke

Procedia PDF Downloads 122
864 Dewatering of Brewery Sludge through the Use of Biopolymers

Authors: Audrey Smith, M. Saifur Rahaman

Abstract:

The waste crisis has become a global issue, forcing many industries to reconsider their disposal methods and environmental practices. Sludge is a form of waste created in many fields, including water and wastewater treatment, pulp and paper, and brewing. The composition of this sludge differs between sources and can, therefore, call for different disposal methods or future applications. The brewery industry in particular produces a significant amount of sludge with a high water content. To avoid landfilling, this waste can be processed further into a valuable material. Specifically, the sludge must undergo dewatering, a process which typically involves the addition of coagulants such as aluminum sulfate or ferric chloride. These chemicals, however, limit the potential uses of the sludge, since it will contain traces of metals. In this case, the desired outcome for the brewery sludge is animal feed, and these conventional coagulants would add a toxic component. Biopolymers such as chitosan, which acts as a coagulant, can be used to dewater brewery sludge while leaving it safe for animal consumption. Chitosan is itself a by-product of the shellfish processing industry, so its use reduces the environmental footprint: the waste from one industry treats the waste of another. To demonstrate the effectiveness of this biopolymer, jar-test experiments are used to determine the optimal dosages and conditions, while levels of contaminants such as ammonium are also monitored. The efficiency of chitosan can also be compared with that of other polysaccharides to determine which is best suited to this waste. Overall, a significant separation between the solid and liquid content of the waste has been achieved during the coagulation-flocculation process when applying chitosan.
This biopolymer can, therefore, be used to dewater brewery sludge so that it can be repurposed as animal feed. Biopolymers can also be applied to treat sludge from other industries, reducing the amount of waste produced and allowing more diverse options for reuse.
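The jar-test procedure mentioned above reduces, in its simplest form, to picking the coagulant dose that leaves the clearest supernatant. A minimal sketch with hypothetical dose-turbidity readings (not measured values from this study):

```python
# Hypothetical jar-test results: chitosan dose (mg/L) mapped to
# residual turbidity (NTU) of the supernatant after settling.
# The "optimal" dose is the one giving the clearest supernatant.
# All numbers are illustrative, not data from the study.

jar_test = {
    # dose_mg_per_L: residual_turbidity_NTU
    10: 220.0,
    25: 95.0,
    50: 18.0,
    75: 24.0,    # overdosing can restabilize particles
    100: 61.0,
}

# Select the dose minimizing residual turbidity.
optimal_dose = min(jar_test, key=jar_test.get)
print(f"Optimal chitosan dose: {optimal_dose} mg/L "
      f"({jar_test[optimal_dose]} NTU residual)")
```

In practice the same series would also track pH, mixing speed, and contaminants such as ammonium, since the dose that minimizes turbidity is not necessarily optimal on every criterion.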

Keywords: animal feed, biopolymer, brewery sludge, chitosan

Procedia PDF Downloads 160