Search results for: real time mode
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 21765

6975 Wet Extraction of Lutein and Lipids from Microalga by Quantitative Determination of Polarity

Authors: Mengyue Gong, Xinyi Li, Amarjeet Bassi

Abstract:

Harvesting by-products while recovering biodiesel is considered a potentially valuable approach to increase the market feasibility of the microalgae industry. Lutein is a possible by-product from microalgae that promotes eye health. The extraction efficiency and the expensive drying process of wet algae represent the major challenges for the utilization of microalgae biomass as a feedstock for lipids, proteins, and carotenoids. A wet extraction method was developed to extract lipids and lutein from the microalga Chlorella vulgaris. To evaluate different solvents and solvent mixtures for the extraction, a quantitative analysis was established based on the polarity of the solvents, using Nile Red as the polarity (ETN) indicator. By choosing a binary solvent system and then adding a proper amount of water to achieve phase separation, lipids and lutein can be extracted simultaneously. Other parameters for lipid and lutein production were also studied, including saponification time, temperature, choice of alkali, and pre-treatment methods. The extraction efficiency with wet algae was compared with that of dried algae and showed better pigment recovery. The results indicated that the product pattern in each extracted phase was polarity dependent: lutein and β-carotene were the main carotenoids extracted with ethanol, while lipids were extracted with hexane.

Keywords: biodiesel, Chlorella vulgaris, extraction, lutein

Procedia PDF Downloads 329
6974 Narrative Psychology and Its Role in Illuminating the Experience of Suffering

Authors: Maureen Gibney

Abstract:

The examination of narrative in psychology has a long tradition, starting with psychoanalytic theory and embracing over time cognitive, social, and personality psychology, among others. Narrative use has been richly detailed as well in medicine, nursing, and social service. One aspect of narrative that has ready utility in higher education and in clinical work is the exploration of suffering and its meaning. Because it is such a densely examined topic, suffering provides a window into identity, sense of purpose, and views of humanity and of the divine. Storytelling analysis permits an exploration of a host of specific manifestations of suffering such as pain and illness, moral injury, and the impact of prolonged suffering on love and relationships. This presentation will review the origins and current understandings of narrative theory in general, and will draw from psychology, medicine, ethics, nursing, and social service in exploring the topic of suffering in particular. It is suggested that the use of narrative themes such as meaning making, agency and communion, generativity, and loss and redemption allows for a finely grained analysis of common and more atypical sources of suffering, their resolution, and the acceptance of their continuation when resolution is not possible. Such analysis, used in professional work and in higher education, can enrich one’s empathy and one’s sense of both the fragility and strength of everyday life.

Keywords: meaning making, narrative theory, suffering, teaching

Procedia PDF Downloads 256
6973 Tea Club (Singapore): Learning to Navigate the Social World without Fear, Adapted from PEERS® for Young Adults

Authors: Janice Cheong, Tan Seying

Abstract:

The growing years of adolescence are often a tumultuous time for both the individual and the family; this is especially so for individuals with Autism Spectrum Disorder (ASD) and Social Communication Disorder (SCD). Tea Club, which is adapted from PEERS® for Young Adults, seeks to address some of the social challenges faced by Singaporean adolescents with ASD/SCD while navigating social situations. Tea Club (hybrid) consists of face-to-face sessions and virtual sessions. These sessions work with both the adolescents and their parents to tackle the individuals' difficulties with social skills, empathy, and loneliness. Prior to the group intervention, participants and their parents completed the Test of Adolescent Social Skills Knowledge (TASSK) and the Autism Spectrum Quotient (AQ), respectively. The sessions were spread across four months. At the end of the group-based intervention, participants' and parents' scores were collected again and compared. Feedback on the programme and on participants' confidence in socialization was also gathered from both participants and their parents and analysed thematically. The findings highlight some of the challenges faced by teens with ASD in Singapore and the benefits of the intervention. Parental sentiments are also examined and discussed.

Keywords: adolescence autism, group intervention, social communication disorder, social skills

Procedia PDF Downloads 131
6972 Randomness in Cybertext: A Study on Computer-Generated Poetry from the Perspective of Semiotics

Authors: Hongliang Zhang

Abstract:

The use of chance procedures and randomizers in poetry writing can be traced back to surrealist works, which, by appealing to Sigmund Freud's theories, were still logocentric. In the 1960s, random permutation and combination were used extensively by the Oulipo, John Cage, and Jackson Mac Low, which further deconstructed the metaphysical presence of writing. Today, randomly generated digital poetry has emerged as a genre of cybertext that is co-authored by its readers. At the same time, the classical theories have been updated by cybernetics and media theories. N. Katherine Hayles reworked Jacques Lacan's concept of 'floating signifiers' into 'flickering signifiers', arguing that the technology itself has become a part of textual production. This paper makes a historical review of computer-generated poetry from the perspective of semiotics, emphasizing that randomly generated digital poetry, which hands the dual tasks of interpretation and writing over to the readers, demonstrates the intervention of media technology in literature. With the participation of computerized algorithms and programming languages, poems randomly generated by computers have not only blurred the boundary between encoder and decoder but also raised the question of the human-machine relationship. It is also a significant feature of cybertext that the productive process of the text is full of randomness.
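The kind of combinatorial, chance-driven text production the abstract describes can be illustrated with a minimal sketch; the word pools and the line template below are invented for illustration and are not drawn from any actual poetry generator discussed in the paper:

```python
import random

def generate_poem(lines=3, seed=None):
    """Produce a list of randomly combined lines from fixed word pools.

    A seeded random source makes the 'chance procedure' reproducible,
    which is how reader input can steer a cybertext generator.
    """
    # Hypothetical word pools; any corpus could be substituted.
    subjects = ["the machine", "a reader", "the signal", "an echo"]
    verbs = ["rewrites", "scatters", "decodes", "forgets"]
    objects = ["the floating signifier", "its own author", "a random text", "the screen"]
    rng = random.Random(seed)
    return [f"{rng.choice(subjects)} {rng.choice(verbs)} {rng.choice(objects)}"
            for _ in range(lines)]

poem = generate_poem(lines=3, seed=42)
```

Each run with a different seed yields a different permutation, so the "author" of any particular poem is split between the programmer, the algorithm, and whoever supplies the seed.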

Keywords: cybertext, digital poetry, poetry generator, semiotics

Procedia PDF Downloads 165
6971 Wear Measuring and Wear Modelling Based On Archard, ASTM, and Neural Network Models

Authors: A. Shebani, C. Pislaru

Abstract:

Wear of materials is an everyday experience and has been observed and studied for a long time. The prediction of wear is a fundamental problem in industry, mainly in connection with the planning of maintenance interventions and their economics. The pin-on-disc test is the most common test used to study wear behaviour. In this paper, a pin-on-disc rig (AEROTECH UNIDEX 11) is used to investigate the effects of normal load and material hardness on wear under dry sliding conditions. In the rig, two specimens were used: a pin made of steel with a tip, positioned perpendicular to the disc, and a disc made of aluminium. The pin wear and disc wear were measured using three instruments: a Talysurf profilometer, a digital microscope, and an Alicona instrument. The Talysurf profilometer was used to measure the depth of the pin and disc wear scars, and the Alicona was used to measure the volume loss of the pin and disc. The Archard model, the American Society for Testing and Materials (ASTM) model, and a neural network model were then used for pin and disc wear modelling, and the simulations were implemented in MATLAB. This paper focuses on how the Alicona can serve as a powerful tool for wear measurement and how a neural network can be an effective algorithm for wear estimation.
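The Archard model mentioned above predicts wear volume as proportional to the normal load and sliding distance and inversely proportional to hardness. A minimal sketch follows; the numerical values are illustrative, not the paper's measured data:

```python
def archard_wear_volume(wear_coefficient, load_n, sliding_distance_m, hardness_pa):
    """Archard wear law: V = K * W * s / H.

    wear_coefficient    K, dimensionless
    load_n              normal load W in newtons
    sliding_distance_m  sliding distance s in metres
    hardness_pa         hardness H of the softer surface in pascals
    Returns the worn volume V in cubic metres.
    """
    return wear_coefficient * load_n * sliding_distance_m / hardness_pa

# Illustrative values only: K = 1e-3, 10 N load, 100 m of sliding,
# hardness ~ 1 GPa -> 1e-9 m^3 (1 mm^3) of worn material.
volume = archard_wear_volume(1e-3, 10.0, 100.0, 1e9)
```

The model's linearity in load and distance is what makes it a convenient baseline against which the ASTM and neural network predictions can be compared.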

Keywords: wear modelling, Archard model, ASTM model, neural network model, pin-on-disc test, Talysurf, digital microscope, Alicona

Procedia PDF Downloads 439
6970 Hydroxyapatite from Biowaste for the Reinforcement of Polymer

Authors: John O. Akindoyo, M. D. H. Beg, Suriati Binti Ghazali, Nitthiyah Jeyaratnam

Abstract:

Regeneration of bone is fast becoming indispensable, owing to the many health challenges arising from traumatic bone loss, bone tumours, and other bone infections. Over time, several approaches have been undertaken to mitigate this challenge. These include, but are not limited to, xenografts, allografts, and autografts, as well as artificial substitutes such as bioceramics, synthetic cements, and metals. However, most of these techniques come with peculiar limitations and problems, such as morbidity, limited availability, disease transmission, collateral site damage, or outright rejection by the body. Hydroxyapatite (HA) is very biocompatible and suitable for this application. However, most of the common methods for HA synthesis are expensive and environmentally unfriendly. Extraction of HA from bio-waste is perceived to be not only cost-effective but also environment-friendly. In this research, HA was produced from a bio-waste, namely bovine bones, through a combination of hydrothermal chemical processes and ordinary calcination techniques. The structure and properties of the HA were examined through different characterization techniques (TGA, FTIR, DSC, XRD, and BET). The synthesized HA was found to possess properties similar to stoichiometric HA, with highly desirable thermal, degradation, structural, and porous properties. This material is unique for its potentially minimal cost, environmental friendliness, and property controllability. It is also perceived to be suitable for tissue and bone engineering applications.

Keywords: biomaterial, biopolymer, bone, hydroxyapatite

Procedia PDF Downloads 308
6969 Electron Microscopical Analysis of Arterial Line Filters During Cardiopulmonary Bypass

Authors: Won-Gon Kim

Abstract:

Introduction: The clinical value of arterial line filters is still a controversial issue. Proponents of arterial line filtration argue that filters remove particulate matter and undissolved gas from the circulation, while opponents point to the absence of conclusive clinical data. We conducted scanning electron microscope (SEM) studies of arterial line filters used clinically in CPB circuits during adult cardiac surgery and analyzed the types and characteristics of the materials entrapped in them. Material and Methods: Twelve arterial line filters were obtained during routine hypothermic cardiopulmonary bypass in 12 adult cardiac patients. The arterial line filter was a screen type with a pore size of 40 µm (Baxter Healthcare Corporation, Bentley Division, Irvine, CA, U.S.A.). After opening the housing, the woven polyester strands were examined with SEM. Results and Conclusion: All segments examined (120 segments, each 2.5 x 2.5 cm in size) contained no embolic particles larger in cross-sectional area than the pore size of the filter (40 µm). The embolic particulates originated mostly from environmental foreign bodies. This may suggest a need for more aggressive filtration of smaller particulates than is generally carried out at present.

Keywords: arterial line filter, tubing wear, scanning electron microscopy, SEM

Procedia PDF Downloads 437
6968 Diet-Induced Epigenetic Transgenerational Inheritance

Authors: Gaby Fahmy

Abstract:

The last decades have seen a rise in metabolic disorders like diabetes, obesity, and fatty liver disease around the world. Environmental factors, especially nutrition, have contributed to this increase. Additionally, pre-conceptional parental nutritional choices have been shown to result in epigenetic modifications affecting gene expression during the developmental process in-utero. These epigenetic modifications have also been seen to extend to the following offspring in a trans-generational effect. This further highlights the significance and relevance of epigenetics and epigenetic tags, which were previously thought to be stripped in newly formed embryos. Suitable prenatal nutrition may partially counteract adverse outcomes caused by exposures to environmental contaminants, ultimately resulting in improved metabolic profiles like body weight and glucose homeostasis. This was seen in patients who were given dietary interventions like restrictive caloric intake, intermittent fasting, and time-restricted feeding. Changes in nutrition are pivotal in the regulation of epigenetic modifications that are transgenerational. For example, dietary choices such as fatty foods vs. vegetables and nuts in fathers were shown to significantly affect sperm motility and volume. This was pivotal in understanding the importance of paternal inheritance. Further research in the field is needed as it remains unclear how many generations are affected by these changes.

Keywords: epigenetics, transgenerational, diet, fasting

Procedia PDF Downloads 85
6967 Managing Truck Drivers’ Fatigue: A Critical Review of the Literature and Recommended Remedies

Authors: Mozhgan Aliakbari, Sara Moridpour

Abstract:

In recent years, much attention has been given to truck drivers’ fatigue management. Long working hours negatively influence truck drivers’ physiology, health, and safety. However, there is little empirical research in the heavy vehicle transport sector in Australia to identify the influence of working hours’ management on drivers’ fatigue and consequently, on the risk of crashes and injuries. There is no national legislation regulating the number of hours or kilometres travelled by truck drivers. Consequently, it is almost impossible to define a standard number of hours or kilometres for truck drivers in a safety management system. This paper reviews the existing studies concerning safe system interventions such as tachographs in relation to fatigue caused by long working hours. This paper also reviews the literature to identify the influence of frequency of rest breaks on the reduction of work-related road transport accidents involving trucks. A framework is presented to manage truck drivers’ fatigue, which may result in the reduction of injuries and fatalities involving heavy vehicles.

Keywords: fatigue, time management, trucks, traffic safety

Procedia PDF Downloads 269
6966 Efficiency Improvement for Conventional Rectangular Horn Antenna by Using EBG Technique

Authors: S. Kampeephat, P. Krachodnok, R. Wongsan

Abstract:

The conventional rectangular horn has been used as a microwave antenna for a long time. Its gain can be increased by enlarging the horn so that it flares exponentially. This paper presents a study of shaped woodpile Electromagnetic Band Gap (EBG) structures to improve the gain of a conventional horn without enlarging its construction. A gain enhancement synthesis method, in which the electromagnetic fields from the aperture of the horn antenna are transferred through the woodpile EBG, is presented for a variety of shaped woodpile EBGs, including planar, triangular, quadratic, circular, Gaussian, cosine, and squared cosine structures. The proposed technique has the advantages of low profile, low fabrication cost, and light weight. The antenna characteristics, such as reflection coefficient (S11), radiation patterns, and gain, are simulated using Computer Simulation Technology (CST) software. Based on the proposed concept, an antenna prototype was fabricated and tested. The S11 and radiation patterns obtained from measurements show good impedance matching and a gain enhancement for the proposed antenna. The gain at the dominant frequency of 10 GHz, suitable for X- and Ku-band radar applications, is 25.6 dB, around 8 dB higher than that of the basic rectangular horn antenna, achieved by adding only one appropriate EBG structure.

Keywords: conventional rectangular horn antenna, electromagnetic band gap, gain enhancement, X- and Ku-band radar

Procedia PDF Downloads 260
6965 Deep Learning Based Polarimetric SAR Images Restoration

Authors: Hossein Aghababaei, Sergio Vitale, Giampaolo Ferraioli

Abstract:

In the context of Synthetic Aperture Radar (SAR) data, polarization is an important source of information for monitoring the Earth's surface. SAR systems are often constrained to transmit only one polarization. This constraint leads to either single or dual polarimetric SAR imaging modalities. Single polarimetric systems operate with a fixed single polarization of both the transmitted and received electromagnetic (EM) waves, resulting in a single acquisition channel. Dual polarimetric systems, on the other hand, transmit in one fixed polarization and receive in two orthogonal polarizations, resulting in two acquisition channels. Dual polarimetric systems are obviously more informative than single polarimetric systems and are increasingly being used for a variety of remote sensing applications. In dual polarimetric systems, the choice of polarizations for the transmitter and the receiver is open. The choice of circular transmit polarization and coherent dual linear receive polarizations forms a special dual polarimetric system called hybrid polarimetry, which brings rotational invariance to the geometrical orientations of features in the scene and optimizes the design of the radar in terms of reliability, mass, and power constraints. The complete characterization of target scattering, however, requires fully polarimetric data, which can only be acquired with systems that transmit two orthogonal polarizations. This adds further complexity to data acquisition and shortens the coverage area, or swath, of fully polarimetric images compared to that of dual or hybrid polarimetric images. The search for solutions that augment dual polarimetric data to full polarimetric data therefore aims at full characterization and exploitation of the backscattered field over a wider coverage with less system complexity. Several methods for reconstructing fully polarimetric images from hybrid polarimetric data can be found in the literature.
Although the improvements achieved by the reconstruction techniques investigated so far are undeniable, the existing methods are mostly based upon model assumptions (especially the assumption of reflection symmetry), which may limit their reliability and applicability in vegetation and forest scenarios. To overcome these problems, this paper proposes a new framework for reconstructing fully polarimetric information from hybrid polarimetric data. The framework uses deep learning to augment hybrid polarimetric data without relying on model assumptions. A convolutional neural network (CNN) with a specific architecture and loss function is defined for this augmentation problem, focusing on different scattering properties of the polarimetric data. In particular, the method controls the CNN training process with respect to several characteristic features of polarimetric images, defined by combining different terms in the cost, or loss, function. The proposed method is experimentally validated on real data sets and compared with a well-known standard approach from the literature. In the experiments, the reconstruction performance of the proposed framework is superior to that of conventional reconstruction methods. The pseudo fully polarimetric data reconstructed by the proposed method also agree well with the actual fully polarimetric images acquired by radar systems, confirming the reliability and efficiency of the proposed method.
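The channel-expanding convolution at the core of such an augmentation network, mapping two dual-pol input channels to four pseudo full-pol output channels, can be sketched in plain Python. The layer below is a naive, untrained stand-in for the paper's CNN, with hypothetical shapes; it only shows the shape transformation, not the training with the composite loss the abstract describes:

```python
import random

def conv2d(inputs, kernels, bias):
    """Naive valid-mode 2D convolution followed by ReLU.

    inputs:  [c_in][h][w] nested lists
    kernels: [c_out][c_in][k][k]
    bias:    [c_out]
    returns: [c_out][h-k+1][w-k+1]
    """
    c_out, c_in = len(kernels), len(inputs)
    k = len(kernels[0][0])
    h, w = len(inputs[0]), len(inputs[0][0])
    out = []
    for o in range(c_out):
        plane = []
        for i in range(h - k + 1):
            row = []
            for j in range(w - k + 1):
                s = bias[o]
                for c in range(c_in):
                    for di in range(k):
                        for dj in range(k):
                            s += kernels[o][c][di][dj] * inputs[c][i + di][j + dj]
                row.append(max(s, 0.0))  # ReLU activation
            plane.append(row)
        out.append(plane)
    return out

# Two hypothetical dual-pol channels in, four pseudo full-pol channels out.
rng = random.Random(0)
dual_pol = [[[rng.random() for _ in range(5)] for _ in range(5)] for _ in range(2)]
weights = [[[[rng.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
            for _ in range(2)] for _ in range(4)]
pseudo_full_pol = conv2d(dual_pol, weights, [0.0] * 4)
```

A practical implementation would stack many such layers in a deep learning framework and learn the weights from co-registered hybrid and fully polarimetric acquisitions.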

Keywords: SAR image, deep learning, convolutional neural network, deep neural network, SAR polarimetry

Procedia PDF Downloads 74
6964 Scalable Systolic Multiplier over Binary Extension Fields Based on Two-Level Karatsuba Decomposition

Authors: Chiou-Yng Lee, Wen-Yo Lee, Chieh-Tsai Wu, Cheng-Chen Yang

Abstract:

Shifted polynomial basis (SPB) is a variation of the polynomial basis representation. SPB has potential for efficient bit-level and digit-level implementations of multiplication over binary extension fields with subquadratic space complexity. For efficient implementation of pairing computation with large finite fields, this paper presents a new SPB multiplication algorithm based on Karatsuba schemes and uses it to derive a novel scalable multiplier architecture. Analytical results show that the proposed multiplier provides a trade-off between space and time complexities. Our proposed multiplier is modular, regular, and suitable for very-large-scale integration (VLSI) implementation. It involves less area complexity than multipliers based on traditional decomposition methods. It is, therefore, more suitable for efficient hardware implementation of pairing-based cryptography and elliptic curve cryptography (ECC) in constraint-driven applications.

Keywords: digit-serial systolic multiplier, elliptic curve cryptography (ECC), Karatsuba algorithm (KA), shifted polynomial basis (SPB), pairing computation

Procedia PDF Downloads 352
6963 Thermal Behaviors of the Strong Form Factors of Charmonium and Charmed Beauty Mesons from Three Point Sum Rules

Authors: E. Yazıcı, H. Sundu, E. Veli Veliev

Abstract:

In order to understand the nature of strong interactions and the QCD vacuum, the investigation of meson coupling constants plays an important role. Knowledge of the temperature dependence of the form factors is very important for the interpretation of heavy-ion collision experiments, and more accurate determination of these coupling constants plays a crucial role in understanding hadronic decays. With the increasing CM energies of the experiments, research on meson interactions has become one of the more interesting problems of hadronic physics. In this study, we analyze the temperature dependence of the strong form factor of the BcBcJ/ψ vertex using the three-point QCD sum rules method. Here, we assume that the sum rules for the observables remain valid when the vacuum condensates and the continuum threshold are replaced by their thermal versions. In the calculations, we take into account the additional operators that appear in the Wilson expansion at finite temperature. We also investigated the momentum dependence of the form factor at T = 0, fitted it to an analytic function, and extrapolated it into the deep time-like region in order to obtain the strong coupling constant of the vertex. Our results are consistent with those existing in the literature.

Keywords: QCD sum rules, thermal QCD, heavy mesons, strong coupling constants

Procedia PDF Downloads 181
6962 Investigation of Flow Structure over X-45 Type Non-Slender Delta Wing Planform

Authors: B. Yanıktepe, C. Özalp, B. Şahin

Abstract:

The delta wing planform is an essential aerodynamic configuration that can be used effectively at higher angles of attack than conventional wings in subsonic flow conditions. The flow over delta wings is characterized by a pair of leading edge vortices (LEVs) emanating from the wing apex. These vortical structures are formed by the rolling up of the viscous flow sheet after boundary layer separation, which occurs because of the angle of attack and the sharp leading edges of the delta wing. Consequently, increasingly complex and varied planform designs have been proposed to achieve the best performance under such flow conditions. The present experimental study investigates the near-surface flow structure and aerodynamic flow characteristics of an X-45 type non-slender delta wing planform using dye visualization and stereoscopic particle image velocimetry (stereo-PIV). Instantaneous images are acquired in the plan-view plane within 5°≤α≤20° to calculate the time-averaged flow data. It can be concluded that vortical flow with a pair of well-defined LEVs develops over the X-45 at very low angles of attack; secondary vortices are also evident and form close to the wing surface, similar to delta and lambda planforms. The stall occurs at an angle of attack of α=32°.

Keywords: aerodynamic, delta wing, PIV, vortex breakdown

Procedia PDF Downloads 409
6961 Effect of Naphtha on the Composition of a Heavy Crude, in Addition to a Cycle Steam Stimulation Process

Authors: A. Guerrero, A. Leon, S. Munoz, M. Sandoval

Abstract:

Solvent is added to cyclic steam stimulation in order to reduce the solvent-vapor ratio at late stages of the process, the moment at which this ratio increases significantly. Studies of the use of naphtha in addition to cyclic steam stimulation have mainly been oriented toward the incremental recovery it achieves compared to the application of steam alone. However, the effect of naphtha on the reactivity of crude oil components under cyclic steam stimulation conditions, or whether its effect is merely dilution, has not yet been considered, to the authors' best knowledge. The present study aims to evaluate and understand the effect of naphtha and the conditions of cyclic steam stimulation on the remaining composition of the upgraded oil, as well as the main mechanisms present in the heavy crude-naphtha interaction. Tests were carried out on the solvent (naphtha)-oil (12.5° API, 4216 cP @ 40 °C)-steam system in a batch micro-reactor under cyclic steam stimulation conditions (250-300 °C, 400 psi). The characterization of the samples obtained was carried out by MALDI-TOF MS (matrix-assisted laser desorption/ionization time-of-flight mass spectrometry) and NMR (nuclear magnetic resonance) techniques. The results indicate a rearrangement of the microstructure of the asphaltenes, resulting in a decrease in asphaltenes and an increase in lighter components such as resins.

Keywords: composition change, cyclic steam stimulation, interaction mechanism, naphtha

Procedia PDF Downloads 126
6960 Evaluating Portfolio Performance by Highlighting Network Property and the Sharpe Ratio in the Stock Market

Authors: Zahra Hatami, Hesham Ali, David Volkman

Abstract:

Selecting a portfolio for investment is a crucial decision for individuals and legal entities. In the last two decades, with economic globalization, a stream of financial innovations has rushed to the aid of financial institutions. Selecting stocks for a portfolio is always a challenging task for investors. This study aims to create a financial network to identify optimal portfolios using network centrality metrics. The research presents a community detection technique for identifying superior stocks that can form an optimal portfolio for investors. Using the advantages of a network and the properties of its extracted communities, a group of stocks was selected for each of several time periods. The performance of the optimal portfolios was compared to a well-known index, and their Sharpe ratios were calculated for each period to evaluate their profitability for decision making. The analysis shows that portfolios selected from stocks with low centrality scores can outperform the market; however, they have a lower Sharpe ratio than stocks with high centrality scores. In other words, stocks with low centralities can outperform the S&P 500 yet have a lower Sharpe ratio than highly central stocks.
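The Sharpe ratio used to score the portfolios is the mean excess return divided by the standard deviation of excess returns, usually annualized. A minimal sketch with made-up daily returns (not the study's data):

```python
from statistics import mean, stdev

def sharpe_ratio(returns, risk_free_rate=0.0, periods_per_year=252):
    """Annualized Sharpe ratio of a series of periodic (e.g. daily) returns.

    risk_free_rate is an annual rate, spread evenly across periods.
    """
    rf_per_period = risk_free_rate / periods_per_year
    excess = [r - rf_per_period for r in returns]
    return mean(excess) / stdev(excess) * periods_per_year ** 0.5

# Hypothetical daily portfolio returns.
daily = [0.010, -0.005, 0.007, 0.002, -0.001, 0.004]
ratio = sharpe_ratio(daily)
```

Because the ratio normalizes return by volatility, a portfolio can beat the index in raw return (as the low-centrality portfolios do here) while still scoring a lower Sharpe ratio if its returns are more volatile.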

Keywords: portfolio management performance, network analysis, centrality measurements, Sharpe ratio

Procedia PDF Downloads 138
6959 Fiction and Reality in Animation: Taking Final Flight of the Osiris as an Example

Authors: Syong-Yang Chung, Xin-An Chen

Abstract:

This study explores the less well-known animation 'Final Flight of the Osiris', beginning with an initial examination of the film's color, storyline, and the simulacrum meanings of its roles, which leads to a further exploration of the light-shadow contrast and the psychological images presented by the screen colors and the characters. The research is based on a literature review, and all data were compiled for an analysis of the evolution of the characters' visual vocabulary. Structurally, a relational study of the animation and its historical background came first, including the impact of the Wachowskis and Andy Jones on the cinematographic and animated versions of 'The Matrix'. Through the literature review, the film's color, meaning, and related points were clarified. This research found that 'Final Flight of the Osiris' separates the realistic and virtual spaces by changing the color tones; the 'self' of the audience gradually dissolves into the 'virtual' in the simulacra world, and 'The Animatrix' has become a virtual field through which the audience comes to understand 'existence' and the 'self'.

Keywords: the matrix, the final flight of Osiris, Wachowski brothers, simulacra

Procedia PDF Downloads 218
6958 Tuning for a Small Engine with a Supercharger

Authors: Shinji Kajiwara, Tadamasa Fukuoka

Abstract:

The formula project of Kinki University has been involved in the student Formula SAE of Japan (JSAE) since the second year the competition was held. The vehicle developed in the project for the team's eighth entry in the JSAE competition uses a ZX-6R engine manufactured by Kawasaki Heavy Industries. The limited performance of the concept vehicle was improved through the development of its power train; the supercharger loading, engine dry sump, and engine cooling management of the vehicle were enhanced. The supercharger enabled the vehicle to achieve a maximum output of 59.6 kW (80.6 PS) at 9000 rpm and a maximum torque of 70.6 Nm (7.2 kgf m) at 8000 rpm. We successfully achieved 90% of the engine's torque band (4000-10000 rpm) with 50% of the revolutions in regular engine use (2000-12000 rpm). Using a dry sump system, we continuously managed hydraulic pressure during engine operation, and a system that stops the engine when hydraulic pressure falls was also constructed. The dry sump system lowered the engine by 80 mm, reducing the required engine load and the vehicle's center of gravity. Even when the engine was stopped, the electrically driven water pump kept the cooling water circulating. These findings enabled us to create a cooling system in accordance with the requirements of the competition.

Keywords: engine, combustion, cooling system, numerical simulation, power, torque, mechanical super charger

Procedia PDF Downloads 289
6957 Study of the Toxic Activity of the Entomopathogenic Fungus Beauveria bassiana on the Wistar Rat Rattus norvegicus

Authors: F. Haddadj, S. Hamdi, A. Milla, S. Zenia, A. Smai, H. Saadi, F. Marniche, B. Doumandji-Mitiche

Abstract:

The use of a biopesticide based on a microorganism requires particular care, including safety for the beneficial auxiliary fauna and for mammals, humans among them. Owing to its persistence in soil and its apparent human and animal safety, Beauveria bassiana is a cryptogam used for controlling pest organisms, particularly locusts, against which its effectiveness has been proven. This fungus also offers greater respect for biotic communities and the environment. Indeed, biopesticides have several environmental benefits: they are biodegradable, their activity and selectivity decrease unintended effects on non-target species, and resistance to some of them is reduced. It is in this sense that we contribute by presenting our work on the safety of B. bassiana with respect to mammals. For this purpose, we conducted a toxicological study of this fungal strain on Wistar rats, Rattus norvegicus, first examining its effect on weight gain, and then performing a histological study of the target organ, the liver. After 20 days of treatment, the results of the toxicological study showed that B. bassiana caused no change in the physiological state of the rats, their weight gain, behavior, or diet. Histological sections of the liver revealed no disturbance of the organ.

Keywords: B. bassiana, entomopathogenic fungus, histology, Rattus norvegicus

Procedia PDF Downloads 229
6956 Cultivation and Production of Insects, Especially Mealworms, and Investigation of Their Potential as Food for Animals and Even Humans

Authors: Marzieh Eshaghi Koupaei

Abstract:

By cultivating mealworms, we reduce greenhouse gases, avoid the use of transgenic products such as soybeans, provide food resources rich in protein, amino acids, and minerals for humans and animals, and create employment and entrepreneurship. We also serve the environment by producing oil from mealworms for the cosmetic industry, using their waste as organic fertilizer and their powder in bodybuilding, and by breaking down plastic with mealworms. The production and breeding of mealworms require very little infrastructure and very little food, and involve little trouble; the insects reproduce easily and quickly, and a mealworm production workshop is noiseless, odorless, and pollution-free, with very low costs. Third-grade and unsalable fruits from farmers can be used to feed the mealworms, which is completely economical and cost-effective. Mealworms can break down plastic in their intestines and turn it into carbon dioxide. This process took only 16 days, a very short time compared to the several centuries plastic needs to decompose. By producing mealworms, we help preserve the environment and provide a source of protein needed by humans and animals. This industrial insect has commercialization value, creates employment, and helps the economy of society.

Keywords: breeding, production of insects, mealworms, research, animal feed, human feed

Procedia PDF Downloads 40
6955 Long Term Evolution Multiple-Input Multiple-Output Network in Unmanned Air Vehicles Platform

Authors: Ashagrie Getnet Flattie

Abstract:

Line-of-sight (LOS) availability, data rates, quality, and flexible network service are limited by the fact that, for the duration of any given connection, links experience severe variation in signal strength due to fading and path loss. Wireless systems face major challenges in achieving wide coverage and capacity without degrading system performance while giving access to data everywhere, all the time. In this paper, the cell coverage and edge rate of different multiple-input multiple-output (MIMO) schemes in a 20 MHz Long Term Evolution (LTE) system on an Unmanned Air Vehicle (UAV) platform are investigated. After some background on the enormous potential of UAVs, MIMO, and LTE in wireless links, the paper presents a system model that attempts to realize the various benefits of incorporating MIMO into a UAV platform. The performance of three MIMO LTE schemes is compared with that of a 4x4 MIMO LTE UAV scheme to evaluate the improvement in cell radius, bit error rate (BER), and data throughput of the system in different morphologies. The results show that significant performance gains in BER, data rate, and coverage can be achieved using the presented scenario.
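The throughput gain that motivates moving from single-antenna links to 4x4 MIMO can be illustrated with a short Monte-Carlo sketch (our own illustrative example, assuming an i.i.d. Rayleigh channel and equal power allocation; it is not the paper's LTE-specific system model):

```python
import numpy as np

def mimo_ergodic_capacity(nt, nr, snr_db, trials=2000, seed=None):
    """Monte-Carlo estimate of ergodic MIMO capacity (bits/s/Hz) over an
    i.i.d. Rayleigh channel with equal power split across the nt transmit
    antennas: C = E[log2 det(I + (SNR/nt) H H^H)]."""
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    caps = []
    for _ in range(trials):
        # Complex Gaussian channel matrix with unit average power per entry.
        h = (rng.standard_normal((nr, nt)) +
             1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
        m = np.eye(nr) + (snr / nt) * h @ h.conj().T
        caps.append(np.log2(np.linalg.det(m)).real)
    return float(np.mean(caps))

c_siso = mimo_ergodic_capacity(1, 1, snr_db=10, seed=0)
c_2x2 = mimo_ergodic_capacity(2, 2, snr_db=10, seed=0)
c_4x4 = mimo_ergodic_capacity(4, 4, snr_db=10, seed=0)
print(c_siso, c_2x2, c_4x4)
```

Under these assumptions, capacity grows roughly linearly with min(nt, nr), which is the scaling behavior that makes the 4x4 scheme attractive on a UAV platform.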

Keywords: LTE, MIMO, path loss, UAV

Procedia PDF Downloads 265
6954 Inference for Compound Truncated Poisson Lognormal Model with Application to Maximum Precipitation Data

Authors: M. Z. Raqab, Debasis Kundu, M. A. Meraou

Abstract:

In this paper, we have analyzed maximum precipitation data during a particular period of time obtained from different stations in the Global Historical Climatology Network of the USA. One important point to mention is that some stations are shut down on certain days for one reason or another. Hence, the maximum values are recorded by excluding those readings. It is assumed that the number of stations that operate follows a zero-truncated Poisson random variable, and the daily precipitation follows a lognormal random variable. We call this model a compound truncated Poisson lognormal model. The proposed model has three unknown parameters, and it can take a variety of shapes. The maximum likelihood estimators can be obtained quite conveniently using the Expectation-Maximization (EM) algorithm. Approximate maximum likelihood estimators are also derived. The associated confidence intervals can also be obtained from the observed Fisher information matrix. Simulations have been performed to check the performance of the EM algorithm, and it is observed that the EM algorithm works quite well in this case. When we analyze the precipitation data set using the proposed model, it is observed that the proposed model provides a better fit than some of the existing models.
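The generative model described above can be sketched in a few lines (an illustrative simulation under assumed parameter values lam, mu, sigma; this is not the authors' data or estimation code):

```python
import numpy as np

def sample_max_precip(lam, mu, sigma, size=5000, seed=None):
    """Draw recorded daily maxima under the compound model sketched in the
    abstract: the number of operating stations N follows a zero-truncated
    Poisson(lam), each station's precipitation is lognormal(mu, sigma), and
    the recorded value is the maximum over the N operating stations."""
    rng = np.random.default_rng(seed)
    out = np.empty(size)
    for i in range(size):
        # Rejection sampling for the zero-truncated Poisson: redraw until N >= 1.
        n = 0
        while n == 0:
            n = rng.poisson(lam)
        out[i] = rng.lognormal(mu, sigma, n).max()
    return out

x = sample_max_precip(lam=5.0, mu=1.0, sigma=0.5, seed=1)
# Taking the maximum over several stations shifts the distribution upward,
# so the sample mean exceeds the single-station lognormal mean exp(mu + sigma**2/2).
print(x.mean(), np.exp(1.0 + 0.5**2 / 2))
```

Fitting the three parameters (lam, mu, sigma) to such samples is what the EM algorithm in the paper addresses.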

Keywords: compound Poisson lognormal distribution, EM algorithm, maximum likelihood estimation, approximate maximum likelihood estimation, Fisher information, skew distribution

Procedia PDF Downloads 99
6953 Librarian Liaisons: Facilitating Multi-Disciplinary Research for Academic Advancement

Authors: Tracey Woods

Abstract:

In the ever-evolving landscape of academia, the traditional role of the librarian has undergone a remarkable transformation. Once considered as custodians of books and gatekeepers of information, librarians have the potential to take on the vital role of facilitators of cross and inter-disciplinary projects. This shift is driven by the growing recognition of the value of interdisciplinary collaboration in addressing complex research questions in pursuit of novel solutions to real-world problems. This paper shall explore the potential of the academic librarian’s role in facilitating innovative, multi-disciplinary projects, both recognising and validating the vital role that the librarian plays in a somewhat underplayed profession. Academic libraries support teaching, the strengthening of knowledge discourse, and, potentially, the development of innovative practices. As the role of the library gradually morphs from a quiet repository of books to a community-based information hub, a potential opportunity arises. The academic librarian’s role is to build knowledge across a wide span of topics, from the advancement of AI to subject-specific information, and, whilst librarians are generally not offered the research opportunities and funding that the traditional academic disciplines enjoy, they are often invited to help build research in support of the academic. This identifies that one of the primary skills of any 21st-century librarian must be the ability to collaborate and facilitate multi-disciplinary projects. In universities seeking to develop research diversity and academic performance, there is an increasing awareness of the need for collaboration between faculties to enable novel directions and advancements. This idea has been documented and discussed by several researchers; however, there is not a great deal of literature available from recent studies. 
Having a team based in the library that is adept at creating effective collaborative partnerships is valuable for any academic institution. This paper outlines the development of such a project, initiated within and around an identified library-specific need: the replication of fragile special collections for object-based learning. The research was developed as a multi-disciplinary project involving the faculties of engineering (digital twins lab), architecture, design, and education. Centred around methods for developing a fragile archive into a series of tactile objects, the project furthers knowledge and understanding of the role of the library as a facilitator of projects, chairing and supporting them, alongside contributing to the research process and generating ideas through the bank of knowledge found among the staff and their liaising capabilities. This paper shall present the method of project development, from the initiation of ideas to the development of prototypes and the dissemination of the objects to teaching departments for analysis. The exact replication of artefacts is also balanced against the adaptation and evolutionary speculations initiated by the design team when the method is adapted for the teaching studio. The dynamic response required from the library to generate and facilitate these multi-disciplinary projects highlights the information expertise and liaison skills that the librarian possesses. As academia embraces this evolution, the potential for groundbreaking discoveries and innovative solutions across disciplines becomes increasingly attainable.

Keywords: liaison librarian, multi-disciplinary collaborations, library innovations, librarian stakeholders

Procedia PDF Downloads 52
6952 Empirical Green’s Function Technique for Accelerogram Synthesis: The Problem of the Use for Marine Seismic Hazard Assessment

Authors: Artem A. Krylov

Abstract:

Instrumental seismological research in water areas is complicated and expensive, which leads to a lack of strong motion records in most offshore regions. At the same time, the number of offshore industrial infrastructure objects, such as oil rigs and subsea pipelines, is constantly increasing. The empirical Green's function technique has proved to be very effective for accelerogram synthesis under the conditions of a poorly described seismic wave propagation medium. However, the selection of a suitable small-earthquake record as an empirical Green's function is a problem in offshore regions, because seafloor instrumental seismological investigations are usually short and typically yield only weak micro-earthquake recordings. An approach based on moving-average smoothing in the frequency domain is presented for the preliminary processing of weak micro-earthquake records before using them as empirical Green's functions. The method results in a significant waveform correction for the modeled event. The case study of the 2009 L'Aquila earthquake was used to demonstrate the suitability of the method. This work was supported by the Russian Foundation for Basic Research (project № 18-35-00474 mol_a).
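The pre-processing idea, moving-average smoothing in the frequency domain, can be sketched as follows (a hypothetical minimal version that smooths the amplitude spectrum while keeping the original phase; the authors' exact smoothing scheme may differ):

```python
import numpy as np

def smooth_spectrum(signal, window=9):
    """Smooth the amplitude spectrum of a weak micro-earthquake record with a
    moving average, keeping the original phase, before using the record as an
    empirical Green's function. Illustrative sketch, not the paper's code."""
    spec = np.fft.rfft(signal)
    amp, phase = np.abs(spec), np.angle(spec)
    kernel = np.ones(window) / window
    amp_smooth = np.convolve(amp, kernel, mode="same")
    # Recombine smoothed amplitude with the unmodified phase.
    return np.fft.irfft(amp_smooth * np.exp(1j * phase), n=len(signal))

rng = np.random.default_rng(0)
rec = rng.standard_normal(1024)          # stand-in for a noisy weak recording
smoothed = smooth_spectrum(rec, window=15)
print(len(smoothed) == len(rec))
```

Keeping the phase untouched preserves the arrival-time structure of the record while the averaging suppresses spurious spectral peaks caused by noise.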

Keywords: accelerogram synthesis, empirical Green's function, marine seismology, microearthquakes

Procedia PDF Downloads 311
6951 Modern Proteomics and the Application of Machine Learning Analyses in Proteomic Studies of Chronic Kidney Disease of Unknown Etiology

Authors: Dulanjali Ranasinghe, Isuru Supasan, Kaushalya Premachandra, Ranjan Dissanayake, Ajith Rajapaksha, Eustace Fernando

Abstract:

Proteomics studies of organisms are considered significantly more information-rich than their genomic counterparts, because the proteome of an organism represents the expressed state of all of its proteins at a given time. In modern top-down and bottom-up proteomics workflows, the primary analysis methods employed are gel-based methods, such as two-dimensional (2D) electrophoresis, and mass spectrometry-based methods. Machine learning (ML) and artificial intelligence (AI) have been used increasingly in modern biological data analyses. In particular, the fields of genomics, DNA sequencing, and bioinformatics have seen an increasing trend in the usage of ML and AI techniques in recent years. The use of the aforesaid techniques in the field of proteomics is only beginning to materialise now. Although there is a wealth of information available in the scientific literature pertaining to proteomics workflows, no comprehensive review addresses the various aspects of the combined use of proteomics and machine learning. The objective of this review is to provide a comprehensive outlook on the application of machine learning to known proteomics workflows in order to extract more meaningful information that could be useful in a plethora of applications such as medicine, agriculture, and biotechnology.

Keywords: proteomics, machine learning, gel-based proteomics, mass spectrometry

Procedia PDF Downloads 138
6950 Plant Cell Culture to Produce Valuable Natural Products

Authors: Jehad Dumireih, Malak Dmirieh, Michael Wink

Abstract:

The present work aims to use plant cell suspension cultures of Crataegus monogyna for the biosynthesis of valuable natural products, using quercetin as an inexpensive precursor. Suspension cell cultures of C. monogyna were established using Murashige and Skoog (MS) medium supplemented with 1 mg/L 2,4-dichlorophenoxyacetic acid and 1 mg/L kinetin. Cells were harvested from the cultures and extracted using methanol and ethyl acetate; the extracts were then used for the identification of isoquercetin by HPLC and by mass spectrometry. Incubation of the cells with 0.24 mM quercetin for one week resulted in a 16-fold increase in isoquercetin biosynthesis, and the growth rate of the cells increased by 20%. Moreover, the biosynthesis of isoquercetin was enhanced by 40% when the added quercetin was divided into three portions, each at a concentration of 0.12 mM, supplied at 3-day intervals. In addition, we did not find any positive effect of adding different concentrations of the precursors phenylalanine (0.2 mM) and galactose to the cell cultures. In conclusion, the efficiency of the biotransformation of quercetin into isoquercetin depended on the concentration of quercetin, its incubation time, and the way of its administration. The results of the present work suggest that biotechnological methods such as cell suspension cultures could be successfully used to obtain highly valuable natural products starting from an inexpensive compound.

Keywords: biosynthesis, biotransformation, Crataegus, isoquercetin

Procedia PDF Downloads 484
6949 Nonlinear Homogenized Continuum Approach for Determining Peak Horizontal Floor Acceleration of Old Masonry Buildings

Authors: Andreas Rudisch, Ralf Lampert, Andreas Kolbitsch

Abstract:

It is a well-known fact among the engineering community that earthquakes with comparatively low magnitudes can cause serious damage to nonstructural components (NSCs) of buildings, even when the supporting structure performs relatively well. Past research works focused mainly on NSCs of nuclear power plants and industrial plants. Particular attention should also be given to architectural façade elements of old masonry buildings (e.g. ornamental figures, balustrades, vases), which are very vulnerable under seismic excitation. Large numbers of these historical nonstructural components (HiNSCs) can be found in highly frequented historical city centers, and in the event of failure, they pose a significant danger to persons. In order to estimate the vulnerability of acceleration-sensitive HiNSCs, the peak horizontal floor acceleration (PHFA) is used. The PHFA depends on the dynamic characteristics of the building, the ground excitation, and induced nonlinearities. Consequently, the PHFA cannot be generalized as a simple function of height. In the present research work, an extensive case study was conducted to investigate the influence of induced nonlinearity on the PHFA for old masonry buildings. Probabilistic nonlinear FE time-history analyses considering three different hazard levels were performed. A set of eighteen synthetically generated ground motions was used as input to the structure models. An elastoplastic macro-model (multiPlas) for nonlinear homogenized continuum FE calculation was calibrated at multiple scales and applied, taking specific failure mechanisms of masonry into account. The macro-model was calibrated according to the results of specific laboratory and cyclic in situ shear tests. The nonlinear macro-model is based on the concept of multi-surface rate-independent plasticity. Material damage or crack formation is detected by reducing the initial strength after failure due to shear or tensile stress.
As a result, shear forces can only be transmitted to a limited extent by friction when the cracking begins. The tensile strength is reduced to zero. The first goal of the calibration was the consistency of the load-displacement curves between experiment and simulation. The calibrated macro-model matches well with regard to the initial stiffness and the maximum horizontal load. Another goal was the correct reproduction of the observed crack image and the plastic strain activities. Again the macro-model proved to work well in this case and shows very good correlation. The results of the case study show that there is significant scatter in the absolute distribution of the PHFA between the applied ground excitations. An absolute distribution along the normalized building height was determined in the framework of probability theory. It can be observed that the extent of nonlinear behavior varies for the three hazard levels. Due to the detailed scope of the present research work, a robust comparison with code-recommendations and simplified PHFA distributions are possible. The chosen methodology offers a chance to determine the distribution of PHFA along the building height of old masonry structures. This permits a proper hazard assessment of HiNSCs under seismic loads.

Keywords: nonlinear macro-model, nonstructural components, time-history analysis, unreinforced masonry

Procedia PDF Downloads 154
6948 Identifying the Factors Affecting Employment and Prioritizing Economic-Sector Jobs for Increased Employment: An MADM Approach Using SAW, TOPSIS, and POSET at the Ministry of Cooperatives and Social Welfare, Varamin City

Authors: Mina Rahmani Pour

Abstract:

The negative consequences of unemployment, such as increasing age at marriage, addiction, depression, drug trafficking, divorce, elite emigration, frustration, delinquency, theft, and murder, have made addressing the issue of employment important to economic planners and public authorities under the different economic conditions of different countries and times. All countries are faced with the problem of unemployment. By identifying the factors influencing employment and employing existing strengths, basic steps can be taken to reduce unemployment. In this study, 12 variables covering the most significant factors affecting employment were identified on the basis of interviews, and their effect on increasing employment in three main business sectors is discussed. In the next step, a questionnaire was distributed to 8 experts of the ministry; the Shannon entropy technique was used to weight the criteria, and the SAW and TOPSIS methods were used for ranking. Because the results of these methods were not consistent with each other, an integration technique (POSET) involving the average, Borda, and Copeland methods was used to reach a general consensus on the rankings. Ultimately, no significant difference was found between the economic-sector jobs in terms of increased employment.
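For readers unfamiliar with the ranking step, a minimal TOPSIS sketch looks like the following (hypothetical alternatives, weights, and criteria for illustration only; the study's actual expert scores and Shannon-entropy weights are not reproduced here):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Minimal TOPSIS sketch: rank alternatives (rows) against criteria
    (columns) by relative closeness to the ideal solution. benefit[j] is
    True for criteria to maximize, False for criteria to minimize."""
    m = np.asarray(matrix, dtype=float)
    # Vector-normalize each criterion column, then apply the weights.
    v = m / np.linalg.norm(m, axis=0) * np.asarray(weights, dtype=float)
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)   # distance to ideal
    d_neg = np.linalg.norm(v - anti, axis=1)    # distance to anti-ideal
    return d_neg / (d_pos + d_neg)              # closeness in [0, 1]

# Three hypothetical jobs scored on three criteria (two benefit, one cost).
scores = topsis([[7, 9, 3], [8, 7, 5], [9, 6, 8]],
                weights=[0.5, 0.3, 0.2],
                benefit=[True, True, False])
print(scores.argsort()[::-1])   # job indices ordered from best to worst
```

SAW differs only in aggregating the weighted normalized scores directly, which is why the two methods can disagree and motivate a consensus step such as POSET.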

Keywords: employment, effective techniques, SAW, TOPSIS

Procedia PDF Downloads 220
6947 A System Dynamics Approach to Exploring Personality Traits in Young Children

Authors: Misagh Faezipour

Abstract:

System dynamics is a systems engineering approach that can help address complex challenges in different systems. Little is known about how the brain represents other people in order to predict behavior. This work examines how the brain simulates different personal behaviors and responds to them in the case of young children ages one to five. As we know, children's minds are as clear as crystal, and over time, through their surroundings, families, and education centers, they grow to develop different kinds of behavior towards the world and the society they live in. Hence, this work aims to identify how young children respond to various personality behaviors and to observe their reactions from a system dynamics perspective. We explore the Big Five personality traits in young children. A causal model is developed in support of the system dynamics approach. The model graphically presents the factors and factor relationships that contribute to the Big Five personality traits and provides a better understanding of the entire behavior model. A simulator will be developed that includes the set of causal model factors and factor relationships. The simulator models the behavior of the different factors related to personality traits and their impacts, and can help make more informed decisions in a risk-free environment.

Keywords: personality traits, systems engineering, system dynamics, causal model, behavior model

Procedia PDF Downloads 83
6946 A Study on Multidimensional Locus of Control and the Procrastinating Behavior in Employees

Authors: Richa Mishra, Sonia Munjal

Abstract:

In this increasingly hectic and competitive climate, employees are expected to manage the resources available to them to perform their work. However, many are wasting the most precious and scarce resource at their disposal, time, by procrastinating on tasks, thereby costing themselves and their organizations. As timely performance is a requirement of most jobs, procrastination is particularly problematic in the workplace. Evidence suggests that procrastination and poor performance go hand in hand: procrastinators miss more deadlines than non-procrastinators, and they make more errors and work at a slower speed than non-procrastinators when performing timed tasks. This research is hence an effort to add a little to the sparse knowledge base. It is an effort to throw light on the relationship of Levenson's multiple dimensions of locus of control to procrastination, and to identify whether locus of control is one of the causes of employee procrastination, which has not been explored earlier. The study also explores the effect and relationship of multidimensional locus of control and various levels of stress on procrastination. The results of the research have ascertained that there is a significant impact of LOC dimensions on the procrastinating behavior of employees. One of the major findings to emerge from the current research, that managers with 'powerful others' as their dominant LOC dimension procrastinated the least, contradicts previous research results that externals procrastinate more than internals.

Keywords: multidimensional locus of control, workplace procrastination, employee behaviour, manufacturing industry

Procedia PDF Downloads 230