Search results for: animal artificial insemination
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3157

1387 Subacute Toxicity Study of Total Alkaloids of Seeds of Peganum harmala in Female Rat

Authors: Mahdeb Nadia, Ghadjati Nadhra, Bettihi Sara, Daamouche Z. El Youm, Bouzidi Abdelouahab

Abstract:

The effects of subacute administration of the total alkaloids of the seeds of Peganum harmala were studied in female Albino-Wistar rats. After intraperitoneal administration of the total alkaloids of the seeds of Peganum harmala at doses of 50 mg/kg for 10 days and 40 mg/kg for 7 days (the treatment lasted 17 days in total), there were remarkable changes in general appearance, and deaths occurred in the experimental group. After 17 days, a significant reduction was observed in the surviving animals treated with the total alkaloids of the seeds. Red blood cells (RBC), hematocrit (HCT), hemoglobin (HGB), and white blood cells (WBC) showed a significant reduction in the treated groups. No statistical differences in glutamic-oxaloacetic transaminase (GOT), glutamic-pyruvic transaminase (GPT), alkaline phosphatase (ALP), total protein, glucose, or creatinine were observed between groups. However, urea was significantly higher in the treated female rats than in the control group. Histological examination of the liver showed no histopathological changes. The alkaloids of Peganum harmala showed significant toxicity in female rats.

Keywords: Peganum harmala, rat, liver, kidney, alkaloids, toxicity

Procedia PDF Downloads 430
1386 Optimization of Strategies and Models Review for Optimal Technologies-Based on Fuzzy Schemes for Green Architecture

Authors: Ghada Elshafei, A. Elazim Negm

Abstract:

Recently, green architecture has become a significant pathway to a sustainable future. Green building design involves finding the balance between comfortable homebuilding and a sustainable environment. Moreover, new technologies such as artificial intelligence techniques are used to complement current practices in creating greener structures to keep the built environment more sustainable. The most common objective is that green buildings should be designed to minimize the overall impact of the built environment on ecosystems in general, and particularly on human health and on the natural environment. This leads to protecting occupant health, improving employee productivity, reducing pollution, and sustaining the environment. In green building design, multiple parameters, which may be interrelated, contradictory, vague, and of a qualitative or quantitative nature, must be considered. This paper presents a comprehensive, critical state-of-the-art review of current practices based on fuzzy techniques and their combinations. It also presents how green architecture and buildings can be improved using the technologies applied in the analysis, seeking optimal green strategies and models that assist in making the best possible decision among different alternatives.
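
As a purely illustrative companion to the fuzzy approaches reviewed above, the following Python sketch scores one hypothetical design alternative with triangular membership functions and a weighted aggregation; the criteria, ranges, and weights are assumptions made for the example, not values from any reviewed study.

```python
# A minimal sketch of fuzzy scoring for green-building criteria, assuming simple
# triangular membership functions; criteria, ranges, and weights are illustrative only.
def tri(x, a, b, c):
    """Triangular membership: rises from a to b, falls from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical measurements for one design alternative.
design = {"daylight_factor": 3.2, "energy_use_kwh_m2": 95.0, "indoor_temp_c": 24.5}

# Membership of each criterion in the fuzzy set "good for a green building".
memberships = {
    "daylight_factor": tri(design["daylight_factor"], 1.0, 4.0, 7.0),
    "energy_use_kwh_m2": tri(design["energy_use_kwh_m2"], 40.0, 80.0, 150.0),
    "indoor_temp_c": tri(design["indoor_temp_c"], 20.0, 23.0, 27.0),
}

# Weighted aggregation into a single "greenness" score used to rank alternatives.
weights = {"daylight_factor": 0.3, "energy_use_kwh_m2": 0.5, "indoor_temp_c": 0.2}
score = sum(weights[k] * memberships[k] for k in weights)
print(memberships, round(score, 3))
```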

Keywords: green architecture/building, technologies, optimization, strategies, fuzzy techniques, models

Procedia PDF Downloads 464
1385 Regulatory and Economic Challenges of AI Integration in Cyber Insurance

Authors: Shreyas Kumar, Mili Shangari

Abstract:

Integrating artificial intelligence (AI) in the cyber insurance sector represents a significant advancement, offering the potential to revolutionize risk assessment, fraud detection, and claims processing. However, this integration introduces a range of regulatory and economic challenges that must be addressed to ensure responsible and effective deployment of AI technologies. This paper examines the multifaceted regulatory landscape governing AI in cyber insurance and explores the economic implications of compliance, innovation, and market dynamics. AI's capabilities in processing vast amounts of data and identifying patterns make it an invaluable tool for insurers in managing cyber risks. Yet, the application of AI in this domain is subject to stringent regulatory scrutiny aimed at safeguarding data privacy, ensuring algorithmic transparency, and preventing biases. Regulatory bodies, such as the European Union with its General Data Protection Regulation (GDPR), mandate strict compliance requirements that can significantly impact the deployment of AI systems. These regulations necessitate robust data protection measures, ethical AI practices, and clear accountability frameworks, all of which entail substantial compliance costs for insurers. The economic implications of these regulatory requirements are profound. Insurers must invest heavily in upgrading their IT infrastructure, implementing robust data governance frameworks, and training personnel to handle AI systems ethically and effectively. These investments, while essential for regulatory compliance, can strain financial resources, particularly for smaller insurers, potentially leading to market consolidation. Furthermore, the cost of regulatory compliance can translate into higher premiums for policyholders, affecting the overall affordability and accessibility of cyber insurance. Despite these challenges, the potential economic benefits of AI integration in cyber insurance are significant. AI-enhanced risk assessment models can provide more accurate pricing, reduce the incidence of fraudulent claims, and expedite claims processing, leading to overall cost savings and increased efficiency. These efficiencies can improve the competitiveness of insurers and drive innovation in product offerings. However, balancing these benefits with regulatory compliance is crucial to avoid legal penalties and reputational damage. The paper also explores the potential risks associated with AI integration, such as algorithmic biases that could lead to unfair discrimination in policy underwriting and claims adjudication. Regulatory frameworks need to evolve to address these issues, promoting fairness and transparency in AI applications. Policymakers play a critical role in creating a balanced regulatory environment that fosters innovation while protecting consumer rights and ensuring market stability. In conclusion, the integration of AI in cyber insurance presents both regulatory and economic challenges that require a coordinated approach involving regulators, insurers, and other stakeholders. By navigating these challenges effectively, the industry can harness the transformative potential of AI, driving advancements in risk management and enhancing the resilience of the cyber insurance market. This paper provides insights and recommendations for policymakers and industry leaders to achieve a balanced and sustainable integration of AI technologies in cyber insurance.

Keywords: artificial intelligence (AI), cyber insurance, regulatory compliance, economic impact, risk assessment, fraud detection, cyber liability insurance, risk management, ransomware

Procedia PDF Downloads 22
1384 Application of Data Mining Techniques for Tourism Knowledge Discovery

Authors: Teklu Urgessa, Wookjae Maeng, Joong Seek Lee

Abstract:

Five implementations of three data mining classification techniques were applied to extract important insights from tourism data. The aim was to find the best-performing algorithm among those compared for tourism knowledge discovery. The knowledge discovery from data process was used as the process model, and the 10-fold cross-validation method was used for testing. Various data preprocessing activities were performed to obtain the final dataset for model building. Classification models of the selected algorithms were built under different scenarios on the preprocessed dataset. The best-performing algorithm on the tourism dataset was Random Forest (76%) before applying information-gain-based attribute selection, and J48 (C4.5) (75%) after selecting the attributes most relevant to the class (target) attribute. In terms of model-building time, attribute selection improved the efficiency of all algorithms, with the Artificial Neural Network (multilayer perceptron) showing the highest improvement (90%). The rules extracted from the decision tree model are presented; they reveal intricate, non-trivial knowledge and insight that would not otherwise be discovered by simple statistical analysis, even though the accuracy achieved by the classification algorithms was only moderate.
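
A minimal sketch of this kind of evaluation setup is given below, assuming a generic tabular tourism dataset; the file name, the column names, and the use of scikit-learn estimators as stand-ins for Random Forest, J48 (a C4.5-style tree), and the multilayer perceptron are illustrative assumptions rather than the study's actual tooling.

```python
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.feature_selection import SelectKBest, mutual_info_classif
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

df = pd.read_csv("tourism.csv")                       # hypothetical preprocessed dataset
X, y = df.drop(columns=["target"]), df["target"]      # "target" is an assumed class column

models = {
    "RandomForest": RandomForestClassifier(n_estimators=100, random_state=0),
    "J48-like tree": DecisionTreeClassifier(criterion="entropy", random_state=0),
    "MLP": MLPClassifier(max_iter=1000, random_state=0),
}

for name, model in models.items():
    # 10-fold cross-validation without attribute selection.
    acc = cross_val_score(model, X, y, cv=10).mean()
    # Information-gain-style attribute selection (mutual information, top 10 attributes).
    selected = make_pipeline(SelectKBest(mutual_info_classif, k=10), model)
    acc_sel = cross_val_score(selected, X, y, cv=10).mean()
    print(f"{name}: 10-fold accuracy {acc:.2%} -> {acc_sel:.2%} after attribute selection")
```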

Keywords: classification algorithms, data mining, knowledge discovery, tourism

Procedia PDF Downloads 286
1383 A Literature Review of the Trend towards Indoor Dynamic Thermal Comfort

Authors: James Katungyi

Abstract:

The steady-state thermal comfort model, which dominates thermal comfort practice and posits ideal thermal conditions within a narrow range, does not deliver the expected comfort levels among occupants. Furthermore, the buildings where this model is applied consume considerable energy for conditioning. This paper reviews significant literature on thermal comfort in dynamic indoor conditions, including the adaptive thermal comfort model and alliesthesia. A major finding of the paper is that the adaptive thermal comfort model is part of a trend from static to dynamic indoor environments in aspects such as lighting, views, sounds, and ventilation. Alliesthesia, or thermal delight, is consistent with this trend towards dynamic thermal conditions. It is within this trend that the twofold goal of increased thermal comfort and reduced energy consumption lies. At the heart of this trend is a rediscovery of the link between the natural environment and human well-being, a link that was partially severed by over-reliance on mechanically dominated artificial indoor environments. The paper concludes by advocating thermal conditioning solutions that integrate mechanical with natural thermal conditioning in a balanced manner in order to meet occupant thermal needs without endangering the environment.

Keywords: adaptive thermal comfort, alliesthesia, energy, natural environment

Procedia PDF Downloads 211
1382 In-silico Analysis of Plumbagin against Cancer Receptors

Authors: Arpita Roy, Navneeta Bharadvaja

Abstract:

Cancer is an uncontrolled growth of abnormal cells in the body. It is one of the most serious diseases, and extensive research on it is ongoing all over the world. Structure-based drug design is a computational approach that helps identify potential leads that can be used for the development of a drug. Plumbagin is a naphthoquinone derivative from Plumbago zeylanica roots and belongs to one of the largest and most diverse groups of plant metabolites. Anticancer and antiproliferative activities of plumbagin have been observed in animal models as well as in cell cultures. Plumbagin shows inhibitory effects on multiple cancer-signaling proteins; however, the binding mode and the molecular interactions have not yet been elucidated for most of these protein targets. In this investigation, molecular docking was used in an attempt to provide structural insights into the binding mode of plumbagin against four cancer receptors. Plumbagin showed minimal binding energy against the targeted cancer receptors, suggesting its stability and potential against different cancers. The lowest binding energies of plumbagin with COX-2, TACE, and CDK6 were -5.39, -4.93, and -4.81 kcal/mol, respectively. Comparative studies of plumbagin with the different receptors indicate that it is a promising compound for cancer treatment. It was also found that plumbagin obeys Lipinski's Rule of Five, and the computed ADMET properties showed drug-likeness and good bioavailability. Since plumbagin is from a natural source, it is expected to have reduced side effects, and these results would be useful for cancer treatment.
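
For illustration only, the sketch below shows how a Lipinski Rule of Five check can be expressed in code; the descriptor values used for plumbagin are approximate figures assumed for the example, not the ADMET values computed in the study.

```python
# A minimal sketch of a Lipinski Rule of Five check; the plumbagin descriptors below
# are approximate, illustrative values (C11H8O3, MW ~188 Da), not the study's data.
def lipinski_pass(mol_weight, logp, h_donors, h_acceptors):
    """Return True if at most one Rule of Five criterion is violated."""
    violations = sum([
        mol_weight > 500,    # molecular weight <= 500 Da
        logp > 5,            # octanol-water partition coefficient (logP) <= 5
        h_donors > 5,        # hydrogen-bond donors <= 5
        h_acceptors > 10,    # hydrogen-bond acceptors <= 10
    ])
    return violations <= 1

print(lipinski_pass(mol_weight=188.2, logp=2.3, h_donors=1, h_acceptors=3))  # True
```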

Keywords: cancer, receptor, plumbagin, docking

Procedia PDF Downloads 136
1381 A Real Time Monitoring System of the Supply Chain Conditions, Products and Means of Transport

Authors: Dimitris E. Kontaxis, George Litainas, Dimitris P. Ptochos

Abstract:

Real-time monitoring of supply chain conditions and procedures is a critical element for the optimal coordination and safety of deliveries, as well as for the minimization of delivery time and cost. Real-time monitoring requires IoT data streams, which are related to the conditions of the products and the means of transport (e.g., location, temperature/humidity conditions, kinematic state, ambient light conditions, etc.). These streams are generated by battery-based IoT tracking devices, equipped with appropriate sensors, and are transmitted to a cloud-based back-end system. Proper handling and processing of the IoT data streams, using predictive and artificial intelligence algorithms, can provide significant and useful results, which can be exploited by the supply chain stakeholders in order to enhance their financial benefits, as well as the efficiency, security, transparency, coordination, and sustainability of the supply chain procedures. The technology, the features, and the characteristics of a complete, proprietary system, including hardware, firmware, and software tools (developed in the context of a co-funded R&D programme), are addressed and presented in this paper.
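
As a small illustration of the kind of stream handling described above, the sketch below applies simple range checks to a single tracking-device reading before it would be passed on to cloud-side analytics; the field names and thresholds are assumptions made for the example, not the proprietary system's actual schema.

```python
# A minimal sketch of rule-based screening of IoT tracking-device readings; field
# names and thresholds are illustrative assumptions only.
from datetime import datetime, timezone

THRESHOLDS = {"temperature_c": (2.0, 8.0),     # e.g. a cold-chain temperature range
              "humidity_pct": (30.0, 70.0)}

def check_reading(reading: dict) -> list[str]:
    """Return alert messages for values outside their allowed range."""
    alerts = []
    for field, (low, high) in THRESHOLDS.items():
        value = reading.get(field)
        if value is not None and not (low <= value <= high):
            alerts.append(f"{field}={value} outside [{low}, {high}]")
    return alerts

reading = {"device_id": "truck-17",
           "timestamp": datetime.now(timezone.utc).isoformat(),
           "temperature_c": 9.4, "humidity_pct": 55.0, "lat": 37.98, "lon": 23.73}
print(check_reading(reading))   # ['temperature_c=9.4 outside [2.0, 8.0]']
```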

Keywords: IoT embedded electronics, real-time monitoring, tracking device, sensor platform

Procedia PDF Downloads 171
1380 Intelligent Tutor Using Adaptive Learning to Partial Discharges with Virtual Reality Systems

Authors: Hernández Yasmín, Ochoa Alberto, Hurtado Diego

Abstract:

The aim of this study is to develop an intelligent tutoring system for training electrical operators with virtual reality systems at LAPEM, the laboratory center for partial discharges. The electrical domain requires efficient and well-trained personnel; because of the danger involved in the field of partial discharges, qualified electricians are required. This paper presents an overview of the design of the intelligent tutor with adaptive learning and its VR user interface. We propose constructing a domain model of a subset of partial discharges that enables adaptive training through a trainee model representing the affective and knowledge states of trainees. If the intelligent tutoring system with VR is successful, it is also hypothesized that trainees will be able to learn about partial discharges in electrical installations and gain knowledge more efficiently than trainees taught with traditional methods, without running any risk of danger; traditional methods make training lengthy, costly, and dangerous.

Keywords: intelligent tutoring system, artificial intelligence, virtual reality, partial discharges, adaptive learning

Procedia PDF Downloads 311
1379 A Study of Traditional Mode in the Framework of Sustainable Urban Transportation

Authors: Juanita, B. Kombaitan, Iwan Pratoyo Kusumantoro

Abstract:

The traditional mode is a non-motorized vehicle powered by human or animal power. The objective of the study was to define a strategy for using traditional modes within the framework of sustainable urban transport in support of urban tourism activities. The study does not include modified modes using engine power, such as the motorized tricycles often called 'bentor' in Indonesia. The use of non-motorized traditional modes in Indonesia has begun to shift, and their use has started to be eliminated by the change to engine propulsion. One effort to revive the use of traditional modes is through tourism activities. Strategies for the use of traditional modes within the framework of sustainable urban transport are seen from three dimensions: social, economic, and environmental. The social dimension relates to accessibility and livability, the economic dimension relates to how traditional modes can promote products and tourist attractions, while the environmental dimension relates to the needs of users and groups with respect to safety and comfort. The traditional mode is rarely noticed by policy makers, and public opinion on its use needs attention. The involvement of stakeholders and the community in policy making is needed in the development of sustainable traditional-mode strategies in support of urban tourism activities.

Keywords: traditional mode, sustainable, urban, transportation

Procedia PDF Downloads 259
1378 Diabetes Diagnosis Model Using Rough Set and K- Nearest Neighbor Classifier

Authors: Usiobaifo Agharese Rosemary, Osaseri Roseline Oghogho

Abstract:

Diabetes is a complex group of diseases with a variety of causes; it is a disorder of body metabolism in the digestion of carbohydrate foods. The application of machine learning in the field of medical diagnosis has been the focus of many researchers, and the use of recognition and classification models as decision support tools has helped medical experts in the diagnosis of diseases. Considering the large volume of medical data, which requires special techniques, experience, and high diagnostic skill in the diagnosis of diseases, the application of an artificial intelligence system to assist medical personnel, in order to enhance their efficiency and accuracy in diagnosis, will be an invaluable tool. This study proposes a diabetes diagnosis model using rough sets and the K-nearest neighbor classifier algorithm. The system consists of two modules: the feature extraction module and the predictor module; the rough set is used to preprocess the attributes, while the K-nearest neighbor classifier is used to classify the given data. The dataset used for this model was taken from the University of Benin Teaching Hospital (UBTH) database. Half of the data was used for training while the other half was used for testing the system. The proposed model was able to achieve over 80% accuracy.
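
A minimal sketch of the predictor module is given below: a K-nearest neighbor classifier evaluated with a 50/50 train-test split, as in the study. The data file and column names are hypothetical stand-ins for the UBTH dataset, and the rough-set preprocessing step is not shown.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("diabetes_ubth.csv")                  # hypothetical dataset
X, y = df.drop(columns=["diagnosis"]), df["diagnosis"]  # assumed label column

# Half of the data for training, half for testing, as in the study.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.5, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)
print(f"accuracy: {accuracy_score(y_test, knn.predict(X_test)):.1%}")
```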

Keywords: classifier algorithm, diabetes, diagnostic model, machine learning

Procedia PDF Downloads 328
1377 Unconventional Strategies for Combating Multidrug Resistant Bacterial Biofilms

Authors: Soheir Mohamed Fathey

Abstract:

Biofilms are complex biological communities that are hard to eliminate by conventional antibiotic administration and are implicated in eighty percent of human infections. Green remedies have been used for centuries and have shown clear effects in hindering and combating microbial biofilm infections. Nowadays, there has been a growth in the number of studies on the anti-biofilm performance of natural agents such as plant essential oils (EOs) and propolis. In this study, we investigated the antibiofilm performance of various natural agents, including four essential oils (EOs), cinnamon (Cinnamomum cassia), tea tree (Melaleuca alternifolia), and clove (Syzygium aromaticum), as well as propolis, against the biofilms of the Gram-positive pathogenic bacterium Staphylococcus aureus and the Gram-negative pathogenic bacterium Pseudomonas aeruginosa, which are major human and animal pathogens posing a high risk due to their ability to develop biofilms. The antibiofilm activity of the tested agents was evaluated by the crystal violet staining assay and examined by scanning electron and fluorescence microscopy. The results showed a potent effect of the tested products against the tested bacterial biofilms.

Keywords: biofilm, essential oils, electron microscopy, fluorescent

Procedia PDF Downloads 90
1376 Comparison of Selected Behavioural Patterns of German Shepherd Puppies in Open-Field Test by Practical Assessment Report

Authors: Igor Miňo, Lenka Lešková

Abstract:

Over the past 80 years, the open-field method has evolved into a commonly used tool for the analysis of animal behaviour. The study was carried out using 50 kennel-reared purebred puppies of the German Shepherd dog breed. All dogs were tested in the 5th, 7th, and 9th weeks of age. For the purpose of behavioural analysis, an open-field evaluation report was designed prior to testing to ensure the most convenient, rapid, and suitable way to assess selected behavioural patterns in field conditions. Onset of vocalisation, intensity of vocalisation, level of physical activity, response to sound, and overall behaviour were monitored in the study. Correlations between measures of height, weight, and chest circumference and the behavioural characteristics in the 5th, 7th, and 9th weeks of age were not statistically significant. Onset of vocalisation, intensity of vocalisation, level of physical activity, and response to sound differed at a statistically significant level between the 5th, 7th, and 9th weeks of age. The results suggest that our practical assessment report may be used as an applicable method to evaluate the suitability of service dog puppies for future working roles.
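
The sketch below illustrates, with placeholder data, the two kinds of statistics implied by this design: a Pearson correlation between a body measure and a behavioural score, and a non-parametric repeated-measures comparison across the 5th, 7th, and 9th weeks. The variables and values are invented for the example, and the specific tests used in the study are not stated in the abstract.

```python
import numpy as np
from scipy.stats import pearsonr, friedmanchisquare

rng = np.random.default_rng(1)
n_puppies = 50

height = rng.normal(30, 2, n_puppies)                      # cm, illustrative
activity_week5 = rng.normal(5, 1, n_puppies)                # activity score, week 5
activity_week7 = activity_week5 + rng.normal(0.8, 0.5, n_puppies)
activity_week9 = activity_week7 + rng.normal(0.8, 0.5, n_puppies)

# Correlation between a physical measure and behaviour at one age.
r, p = pearsonr(height, activity_week5)
print(f"height vs week-5 activity: r={r:.2f}, p={p:.3f}")

# Non-parametric repeated-measures test for differences across the three ages.
stat, p = friedmanchisquare(activity_week5, activity_week7, activity_week9)
print(f"activity across weeks 5/7/9: chi2={stat:.1f}, p={p:.4f}")
```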

Keywords: dog, behaviour, open-field, testing

Procedia PDF Downloads 214
1375 Effect of Selenium Source on Meat Quality of Bonsmara Bull Calves

Authors: J. van Soest, B. Bruneel, J. Smit, N. Williams, P. Swiegers

Abstract:

Selenium (Se) is an essential trace mineral involved in reducing oxidative stress, enhancing immune status, improving reproduction, and regulating growth. During the finishing period, selenium supplementation can be applied to improve meat quality. Dietary selenium can be provided in inorganic or organic forms. Specifically, L-selenomethionine (organic selenium) allows selenium to be stored in animal protein, which supports the animal during periods of high oxidative stress. The objective of this study was to investigate the effects of synthetically produced, single-amino-acid L-selenomethionine (Excential Selenium 4000, Orffa Additives BV) on production parameters, health status, and meat quality of Bonsmara bull calves. Twenty-four calves, 7 months of age, completed a 60-day initial growing period at a commercial feedlot, after which they were transported to the research station Rumen-8 (Bethlehem, South Africa). After a ten-day adaptation period, the bulls were allocated to a control (n=12) or treatment (n=12) group. Each group was divided over 3 pens based on weight. Both groups received a Total Mixed Ration supplemented with 5.25 mg Se/head per day. The control group was supplemented with sodium selenite as the Se source, whilst the treatment group was supplemented with L-selenomethionine (Excential Selenium 4000, Orffa Additives BV). Animals were limited to 10 kg feed intake per head per day to ensure similar Se intake. The treatment period lasted 1.5 months. A beta-adrenergic agonist was included in the feed for the last 30 days. During the treatment period, average daily gain, average daily feed intake, and feed conversion ratio were recorded. Blood parameters were measured at day 1, day 25, and before slaughter (day 47). After slaughter, carcass weight, dressing percentage, grading, and meat quality (pH, tenderness, colour, odour, purge, proximate analyses, acid detergent fibre, and neutral detergent fibre) were determined. No differences between groups were found in performance. A higher number of animals with cortisol levels below the detection limit (27.6 nmol/l) was recorded for the treatment group. Other blood parameters showed no differences. No differences were found regarding carcass weight and dressing percentage. Important parameters of meat quality were significantly improved in the treatment group: instrumental tenderness at 14 days of ageing was 2.8 and 3.4 for treatment and control, respectively (P=0.010), and a 0.5% decrease in purge (of fresh samples) was shown, 1.5% and 2.0% for the treatment group and control, respectively (p=0.029). In addition, pH was numerically reduced in the treatment group. In summary, supplementation with L-selenomethionine as the selenium source improved meat quality compared to sodium selenite. Lower instrumental tenderness (Warner-Bratzler Shear Force, WBSF) was recorded for the treatment group, indicating less tough meat and higher consumer satisfaction. Regarding purge, the control was just below 2.0%, an important threshold for consumer acceptance; the treatment group scored 0.5% lower for purge than the control, indicating higher consumer satisfaction. The lower pH in the treatment group could be an indication of higher glycogen reserves in muscle, which could contribute to a reduced risk of Dark, Firm, Dry carcasses. More animals showed cortisol levels below the detection limit in the treatment group, indicating lower levels of stress when animals receive L-selenomethionine.

Keywords: calves, meat quality, nutrition, selenium

Procedia PDF Downloads 172
1374 Cross Matching: An Improved Method to Obtain Comprehensive and Consolidated Evidence

Authors: Tuula Heinonen, Wilhelm Gaus

Abstract:

At present, safety assessment starts with animal tests, although their predictivity is often poor. Even after extended human use, experimental data are often judged to be the core information for risk assessment. However, the best opportunity to generate true evidence is to match all available information. The cross matching methodology combines the different fields of knowledge and types of data (e.g., in-vitro and in-vivo experiments, clinical observations, clinical and epidemiological studies, and daily life observations) and gives adequate weight to individual findings. To achieve a consolidated outcome, the information from all available sources is analysed and compared. If single pieces of information fit together, a clear picture becomes visible. If pieces of information are inconsistent or contradictory, careful consideration is necessary. 'Cross' can be understood as 'orthogonal' in geometry or as 'independent' in mathematics. Results coming from different sources are independent and therefore contribute new information. Independent information makes a larger contribution to evidence than results coming repeatedly from the same source. A successful example of cross matching is the assessment of Ginkgo biloba, where we were able to come to the conclusive result: Ginkgo biloba leaf extract is well tolerated and safe for humans.

Keywords: cross-matching, human use, safety assessment, Ginkgo biloba leaf extract

Procedia PDF Downloads 278
1373 Daylightophil Approach towards High-Performance Architecture for Hybrid-Optimization of Visual Comfort and Daylight Factor in BSk

Authors: Mohammadjavad Mahdavinejad, Hadi Yazdi

Abstract:

The greatest influence we receive from the world is shaped through visual form; thus, light is an inseparable element of human life. The use of daylight in visual perception and environment readability is an important issue for users. With regard to the hazards of greenhouse gas emissions from fossil fuels, and in line with the attitude towards reducing energy consumption, the correct use of daylight results in lower levels of energy consumed by artificial lighting, heating, and cooling systems. Windows are usually the starting points for analysis and simulations to achieve visual comfort and energy optimization; therefore, attention should be paid to the orientation of buildings to minimize electrical energy and maximize the use of daylight. In this paper, using the DesignBuilder software, the effect of the orientation of an 18 m² (3 m × 6 m) room with a 3 m height in the city of Tehran has been investigated, considering the design constraints. In these simulations, the orientation of the building has been changed in one-degree steps, and the window is located on the smaller (3 m × 3 m) face of the building with an 80% ratio. The results indicate that the orientation of the building has a strong influence on energy efficiency in meeting high-performance architecture and planning goals and objectives.

Keywords: daylight, window, orientation, energy consumption, design builder

Procedia PDF Downloads 224
1372 Shaping Lexical Concept of 'Mage' through Image Schemas in Dragon Age 'Origins'

Authors: Dean Raiyasmi, Elvi Citraresmana, Sutiono Mahdi

Abstract:

Language shapes the human mind and its conception of things. With today's technology, and through image schemas, even AI (artificial intelligence) can conceptualize things in a way that reflects its creator's negativity or positivity. This is reflected in one of the best-selling games around the world in 2012, Dragon Age: Origins. The AI, in the form of NPCs (non-playable characters) inside the game, reflects the game creator's negativity or positivity toward the lexical concept of 'mage'. Through image schemas, shaping the lexical concept of 'mage' is deemed possible, and this reveals the negativity or positivity of the game's creator toward mages. This research analyses the cognitive-semantic process of image schemas and the shaping of the concept of 'mage' by describing the kinds of image schemas that exist in the Dragon Age: Origins game. It also aims to analyse these kinds of image schemas and to describe those that shape the concept of 'mage' itself. The methodology used in this research is qualitative; participative observation with five stages, together with documentation, is employed. The results show that four image schemas exist in the game and that those image schemas shape the lexical concept of 'mage'.

Keywords: cognitive semantic, image-schema, conceptual metaphor, video game

Procedia PDF Downloads 433
1371 Optimal Design Solution in "The Small Module" Within the Possibilities of Ecology, Environmental Science/Engineering, and Economics

Authors: Hassan Wajid

Abstract:

We recommend an environmentally friendly architectural proposal that is extremely common in form but whose features make it a sustainable space. In this experiment, the natural and artificial built space is proposed in a way that addresses environmental, ecological, and economic criteria under different climatic conditions. Moreover, the ecology-environment-economics criteria are reflected in the different modules, which are experimented with and analyzed by multiple research groups. The ecological, environmental, and economic services provided are used as units of production side by side, resulting in local job creation and resource savings, for instance, conservation of rainwater, soil formation or protection, lower energy consumption to achieve net zero, and a stable climate as a whole. The synthesized results from the collected data suggest several aspects to consider when designing buildings and when beginning the design process under the supervision of instructors/directors who are responsible for developing curricula and sustainability goals. Hence, the results of the research and the suggestions will benefit sustainable design through multiple results, heat analyses of the different small modules, and comparisons. Otherwise, resources are depleted, being either consumed or contaminated by pollution.

Keywords: optimization, ecology, environment, sustainable solution

Procedia PDF Downloads 64
1370 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperatures, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, even more so when the effects of the participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between simplicity and accuracy. Recently, solutions of complicated mathematical problems with statistical methods based on the randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media (generally gases such as CO₂, CO, and H₂O) present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated. The history of some randomly sampled photon bundles was recorded to train an Artificial Neural Network (ANN) back-propagation model. The flux was calculated using the standard quasi-PMC and was considered to be the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and with the PMC model using the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed for the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) compared to the other cases. There is great scope for machine learning models to help further reduce computational cost once trained successfully. Multiple ways of selecting the input data as well as various architectures will be tried so that the environment of interest can be fully represented in the ANN model. Better results can be achieved in this as-yet unexplored domain.
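
To make the pseudo-random versus quasi-random distinction concrete, the sketch below compares a plain Monte Carlo estimate with a scrambled-Sobol Quasi-Monte Carlo estimate of a simple one-dimensional integral using scipy.stats.qmc; the exponential integrand is only a stand-in for a transmissivity-like kernel, not the paper's spectral model.

```python
import numpy as np
from scipy.stats import qmc

# Illustrative integrand: exp(-kappa * x) averaged over [0, 1]; kappa is a placeholder.
def integrand(x, kappa=2.5):
    return np.exp(-kappa * x)

exact = (1.0 - np.exp(-2.5)) / 2.5      # analytical reference value
n = 2 ** 12                              # number of samples (power of 2 for Sobol)

rng = np.random.default_rng(0)
mc_estimate = integrand(rng.random(n)).mean()              # plain pseudo-random Monte Carlo

sobol = qmc.Sobol(d=1, scramble=True, seed=0)              # low-discrepancy (quasi-random) points
qmc_estimate = integrand(sobol.random(n).ravel()).mean()   # Quasi-Monte Carlo estimate

print(f"exact = {exact:.6f}")
print(f"MC    = {mc_estimate:.6f}  |error| = {abs(mc_estimate - exact):.2e}")
print(f"QMC   = {qmc_estimate:.6f}  |error| = {abs(qmc_estimate - exact):.2e}")
```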

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 217
1369 Examination of Woody Taxa in Urban Parks in the Context of Climate Change: Resat Oyal Kulturpark and Hudavendigar Urban Park Samples

Authors: Murat Zencirkıran, Elvan Ender

Abstract:

Climate change, which is now being felt on a global scale, is accompanied by an increase in negative conditions for human, plant, and animal life. These negative conditions (drought, warming, glowing, etc.) are felt especially quickly in urban life and affect the sustainability of green areas, which are of great importance in terms of comfort of life. In this context, the importance of the choice of woody taxa used in the design of green spaces in the city increases once more. Within the scope of this study, two of the four urban parks located in the city center of Bursa province were selected and evaluated for their woody taxa. The selected parks are the oldest and the newest urban parks in Bursa, and the aim was to emphasize the differences that may arise over time. It was determined that 54 woody taxa were present in Resat Oyal Kulturpark and 76 woody taxa in Hudavendigar Urban Park. These taxa were evaluated in terms of water consumption and ecological tolerances, taking climate change into account, and suggestions were developed for possible problems.

Keywords: ecological hardiness, urban park, water consumption, woody plants

Procedia PDF Downloads 293
1368 Heterogeneous Catalytic Hydroesterification of Soybean Oil to Develop a Biodiesel Formation

Authors: O. Mowla, E. Kennedy, M. Stockenhuber

Abstract:

Finding alternative renewable energy resources has attracted attention as a consequence of the limitations of traditional fossil fuel resources, increasing crude oil prices, and environmental concern over greenhouse gas emissions. Biodiesel, or Fatty Acid Methyl Esters (FAME), an alternative energy source, is synthesised from renewable sources such as vegetable oils and animal fats and can be produced from waste oils. FAME can be produced via the hydroesterification of oils. The process involves two stages. In the first stage of this process, fatty acids and glycerol are obtained by hydrolysis of the feedstock oil. In the second stage, the recovered fatty acids are esterified with an alcohol to form methyl esters. The presence of a catalyst accelerates the rate of the hydroesterification reaction of oils. The overarching aim of this study is to determine the effect of using zeolite as a catalyst in the heterogeneous hydroesterification of soybean oil. Both stages of the catalytic hydroesterification of soybean oil were conducted under atmospheric and high-pressure conditions using a reflux glass reactor and a Parr reactor, respectively. The effect of operating parameters such as temperature and reaction time on the overall yield of biodiesel formation was also investigated.

Keywords: biodiesel, heterogeneous catalytic hydroesterification, soybean oil, zeolite

Procedia PDF Downloads 423
1367 Hydro-Gravimetric ANN Model for Prediction of Groundwater Level

Authors: Jayanta Kumar Ghosh, Swastik Sunil Goriwale, Himangshu Sarkar

Abstract:

Groundwater is one of the most valuable natural resources that society consumes for its domestic, industrial, and agricultural water supply. Its bulk and indiscriminate consumption affects the groundwater resource. Often, it has been found that the groundwater recharge rate is much lower than its demand. Thus, to maintain water and food security, it is necessary to monitor and manage groundwater storage. However, it is challenging to estimate groundwater storage (GWS) using existing hydrological models. To overcome these difficulties, machine learning (ML) models are being introduced for the evaluation of groundwater level (GWL). Thus, the objective of this research work is to develop an ML-based model for the prediction of GWL. This objective has been realized through the development of an artificial neural network (ANN) model based on hydro-gravimetry. The model has been developed using training samples from field observations spread over 8 months. The developed model has been tested for the prediction of GWL in an observation well. The root mean square error (RMSE) for the test samples has been found to be 0.390 meters. Thus, it can be concluded that the hydro-gravimetry-based ANN model can be used for the prediction of GWL. However, to improve the accuracy, more hydro-gravimetric parameters may be considered and tested in the future.
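
A minimal sketch of such a hydro-gravimetry-based ANN regressor is shown below using scikit-learn; the data file, the predictor columns, and the network size are assumptions made for illustration and are not the configuration or observations used in the study.

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error

df = pd.read_csv("hydro_gravimetric_obs.csv")          # hypothetical field observations
X = df[["gravity_residual_microgal", "rainfall_mm"]]   # assumed hydro-gravimetric predictors
y = df["groundwater_level_m"]                          # observed GWL in the well

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

scaler = StandardScaler().fit(X_train)
model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_train), y_train)

rmse = np.sqrt(mean_squared_error(y_test, model.predict(scaler.transform(X_test))))
print(f"test RMSE: {rmse:.3f} m")
```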

Keywords: machine learning, hydro-gravimetry, ground water level, predictive model

Procedia PDF Downloads 120
1366 Intra-Erythrocytic Trace Elements Profile of EMU (Dromaius novaehollandiae) Le Souef 1907

Authors: Adebayo Adewumi

Abstract:

The emu (Dromaius novaehollandiae), the second largest bird in the world, began to be domesticated in the United States in the early 1980s, and the present trend in emu production in the U.S. can be compared with the cattle industry. As the populations of many wildlife species in Nigeria have declined due to the unsustainable harvest of bush meat, animals like snails, antelopes, ostrich, emu, and rodents have been domesticated. Although this has improved livestock production in Nigeria, the basic physiological parameters of these mini-livestock are not known, especially the intra-erythrocyte trace elements of domesticated emu. For this study, emu blood was obtained from Ajanla farms, Ibadan, Oyo State, Nigeria. There, 16 emus at 20 months of age were bled through the jugular vein in a semi-intensive system over a period of 12 months. The intra-erythrocyte trace elements Cu, Zn, and Mg in healthy emu were measured, and the influences of sex and age on these parameters were investigated. No age or sex differences were observed in intra-erythrocytic Cu levels. Intra-erythrocytic Zn and Mg levels were significantly higher (P<0.05) in young emu than in adults, while males had significantly (P<0.05) higher intra-erythrocytic Mg than females. The intra-erythrocyte trace elements Cu, Zn, and Mg are a good indicator of haemoglobin concentration, which reflects the state of wellness of an animal. The information from this work provides baseline data for further understanding of the erythrocyte biochemistry of emu in Nigeria.

Keywords: intra erythrocyte, trace elements, Emu, biochemistry

Procedia PDF Downloads 602
1365 Raising Test of English for International Communication (TOEIC) Scores through Purpose-Driven Vocabulary Acquisition

Authors: Edward Sarich, Jack Ryan

Abstract:

In contrast to learning new vocabulary incidentally in one’s first language, foreign language vocabulary is often acquired purposefully, because a lack of natural exposure requires it to be studied in an artificial environment. It follows then that foreign language vocabulary may be more efficiently acquired if it is purpose-driven, or linked to a clear and desirable outcome. The research described in this paper relates to the early stages of what is seen as a long-term effort to measure the effectiveness of a methodology for purpose-driven foreign language vocabulary instruction, specifically by analyzing whether directed studying from high-frequency vocabulary lists leads to an improvement in Test of English for International Communication (TOEIC) scores. The research was carried out in two sections of a first-year university English composition class at a small university in Japan. The results seem to indicate that purposeful study from relevant high-frequency vocabulary lists can contribute to raising TOEIC scores and that the test preparation methodology used in this study was thought by students to be beneficial in helping them to prepare to take this high-stakes test.

Keywords: corpus vocabulary, language assessment, second language vocabulary acquisition, TOEIC test preparation

Procedia PDF Downloads 143
1364 A Survey and Theory of the Effects of Various Hamlet Videos on Viewers’ Brains

Authors: Mark Pizzato

Abstract:

How do ideas, images, and emotions in stage-plays and videos affect us? Do they evoke a greater awareness (or cognitive reappraisal of emotions) through possible shifts between left-cortical, right-cortical, and subcortical networks? To address these questions, this presentation summarizes the research of various neuroscientists, especially Bernard Baars and others involved in Global Workspace Theory, Matthew Lieberman in social neuroscience, Iain McGilchrist on left and right cortical functions, and Jaak Panksepp on the subcortical circuits of primal emotions. Through such research, this presentation offers an ‘inner theatre’ model of the brain, regarding major hubs of neural networks and our animal ancestry. It also considers recent experiments, by Mario Beauregard, on the cognitive reappraisal of sad, erotic, and aversive film clips. Finally, it applies the inner-theatre model and related research to survey results of theatre students who read and then watched the ‘To be or not to be’ speech in 8 different video versions (from stage and screen productions) of William Shakespeare’s Hamlet. Findings show that students become aware of left-cortical, right-cortical, and subcortical brain functions—and shifts between them—through staging and movie-making choices in each of the different videos.

Keywords: cognitive reappraisal, Hamlet, neuroscience, Shakespeare, theatre

Procedia PDF Downloads 305
1363 Biochemical and Molecular Analysis of Staphylococcus aureus Various Isolates from Different Places

Authors: Kiran Fatima, Kashif Ali

Abstract:

Staphylococcus aureus is an opportunistic human and animal pathogen that causes a variety of diseases. A total of 70 staphylococcal isolates were obtained from soil, water, yogurt, and clinical samples. The presumptive staphylococcal isolates were identified phenotypically by different biochemical tests. Molecular identification was done by PCR using species-specific 16S rRNA primer pairs, and finally, 50 isolates were found to be positive as Staphylococcus aureus, S. sciuri, S. xylosus, and S. cohnii. The screened isolates were further analyzed by several microbiological diagnostic tests, including Gram staining, coagulase, capsule, and hemolysis tests, fermentation of glucose, lactose, maltose, and sucrose, and enzymatic reactions. It was found that 78%, 81%, and 51% of isolates were positive for gelatin hydrolysis, protease, and lipase activities, respectively. Antibiogram analysis of the isolated Staphylococcus aureus strains with respect to different antimicrobial agents revealed resistance patterns ranging from 57 to 96%. Our study also shows 70% of strains to be MRSA, 54.3% to be VRSA, and 54.3% to be both MRSA and VRSA. All the identified isolates were subjected to detection of the mecA, nuc, and hlb genes, and 70%, 84%, and 40% were found to harbour the mecA, nuc, and hlb genes, respectively. The current investigation is highly important and informative regarding high-level multidrug-resistant Staphylococcus aureus infections, including resistance to methicillin and vancomycin.

Keywords: MRSA, VRSA, mecA, MSSA

Procedia PDF Downloads 123
1362 Sardine Oil as a Source of Lipid in the Diet of Giant Freshwater Prawn (Macrobrachium rosenbergii)

Authors: A. T. Ramachandra Naik, H. Shivananda Murthy, H. n. Anjanayappa

Abstract:

The freshwater prawn Macrobrachium rosenbergii is a popular crustacean cultured widely in monoculture systems in India. It has high nutritional value in the human diet. Hence, understanding its enzymatic and body composition is important in order to judge its flesh quality. Fish oil, especially that derived from the Indian oil sardine, is a good source of highly unsaturated fatty acids and a lipid source in fish and prawn diets. A 35% crude protein diet was formulated with graded levels of sardine oil as the fat source, incorporated at four levels, viz. 2.07, 4.07, 6.07, and 8.07%, maintaining total lipid levels of the feed at 8.11, 10.24, 12.28, and 14.33%, respectively. A diet without sardine oil (6.05% total lipid) served as the basal treatment. The giant freshwater prawn, Macrobrachium rosenbergii, was used as the test animal, and the experiment lasted 112 days. A significantly higher weight gain was recorded in the treatment with 6.07% sardine oil incorporation, along with a higher specific growth rate, food conversion rate, and protein efficiency ratio. The 8.07% sardine oil diet produced the highest RNA:DNA ratio in the prawn muscle. Digestive enzyme analyses of the digestive tract and mid-gut gland showed the greatest activity in prawns fed the 8.07% diet.

Keywords: digestive enzyme, fish diet, Macrobrachium rosenbergii, sardine oil

Procedia PDF Downloads 320
1361 [Keynote Talk]: Analysis of Intelligent Based Fault Tolerant Capability System for Solar Photovoltaic Energy Conversion

Authors: Albert Alexander Stonier

Abstract:

Due to fossil fuel exhaustion and environmental pollution, renewable energy sources, especially solar photovoltaic systems, play a predominant role in providing energy to consumers. It has been estimated that by 2050 renewable energy sources will satisfy 50% of the total energy requirement of the world. In this context, faults in the conversion process, which are considered a major problem, require special attention. A fault that persists even for a few seconds will cause undesirable effects on the system. The presentation comprises the analysis, causes, effects, and mitigation methods of various faults occurring in the entire solar photovoltaic energy conversion process. In order to overcome faults in the system, intelligent methods based on artificial neural networks and fuzzy logic, which can significantly mitigate the faults, are proposed. Hence, the presentation intends to identify the problems in renewable energy conversion and to provide possible solutions to overcome them, supported by simulation and experimental results. The work was performed on a 3 kWp solar photovoltaic plant, and its results cite improvements in reliability, availability, power quality, and fault-tolerant ability.

Keywords: solar photovoltaic, power electronics, power quality, PWM

Procedia PDF Downloads 273
1360 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

Wireless sensor networks are among the most promising communication networks for monitoring remote environmental areas. In such a network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have the capability of sensing, data storage, and processing. The sensor nodes collect information and pass it through neighboring nodes to a particular node. Data collection and processing are done by data aggregation techniques. For data aggregation in the sensor network, a clustering technique is implemented by means of a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster head nodes. Information is aggregated at the cluster head nodes from the non-cluster-head nodes, and this information is then transferred to the base station (or sink nodes). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network. Clustered data are selected for transfer to the base station instead of the whole body of information aggregated at the cluster head nodes. This reduces battery consumption in managing the huge volume of data, and the network lifetime is enhanced to a great extent.
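
As an illustration of SOM-based clustering of sensor nodes, the sketch below maps hypothetical node feature vectors onto a small SOM grid and treats each grid unit as a cluster, picking a candidate cluster head per cluster. It assumes the third-party minisom package, and the feature set, grid size, and head-selection rule are simplifications for the example rather than the paper's protocol.

```python
import numpy as np
from minisom import MiniSom  # assumes the third-party 'minisom' package is installed

# Illustrative node features: (x-position, y-position, residual energy) for 100 sensor nodes.
rng = np.random.default_rng(42)
nodes = np.column_stack([rng.uniform(0, 100, 100),     # x coordinate
                         rng.uniform(0, 100, 100),     # y coordinate
                         rng.uniform(0.5, 1.0, 100)])  # normalized residual energy

# A 2x2 SOM grid -> up to 4 clusters; each grid unit plays the role of one cluster.
som = MiniSom(2, 2, input_len=3, sigma=0.8, learning_rate=0.5, random_seed=0)
som.random_weights_init(nodes)
som.train_random(nodes, num_iteration=500)

# Assign each node to the SOM unit that wins for it (its cluster).
clusters = {}
for idx, node in enumerate(nodes):
    clusters.setdefault(som.winner(node), []).append(idx)

# Per cluster, pick the node closest to the unit's weight vector as a candidate cluster head.
for unit, members in clusters.items():
    weights = som.get_weights()[unit]
    head = min(members, key=lambda i: np.linalg.norm(nodes[i] - weights))
    print(f"cluster {unit}: {len(members)} nodes, candidate head = node {head}")
```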

Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network

Procedia PDF Downloads 507
1359 Analysis of Cyber Activities of Potential Business Customers Using Neo4j Graph Databases

Authors: Suglo Tohari Luri

Abstract:

Data analysis is an important aspect of business performance. With the application of artificial intelligence within databases, selecting a suitable database engine for an application design is also crucial for business data analysis. The application of business intelligence (BI) software to graph databases such as Neo4j has proved highly effective for customer data analysis. Yet what remains of great concern is the fact that not all business organizations have the Neo4j business intelligence software applications to implement for customer data analysis. Further, those with the BI software lack personnel with the requisite expertise to use it effectively with the Neo4j database. The purpose of this research is to demonstrate how Neo4j program code alone can be applied for the analysis of e-commerce website customer visits. The Neo4j database engine is optimized for handling and managing data relationships, with the capability of building high-performance, scalable systems to handle connected data nodes. It therefore ensures that business owners who advertise their products on websites backed by Neo4j are able to determine the number of visitors and to know which products are visited at routine intervals, supporting the necessary decision making. It will also help identify the best customer segments in relation to specific goods, so that more emphasis can be placed on their advertisement on the said websites.
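
A minimal sketch of the kind of visit analysis described above is given below, using the official Neo4j Python driver and a Cypher query; the graph schema (Customer, VISITED, Product), the connection details, and the property names are hypothetical assumptions for illustration.

```python
from neo4j import GraphDatabase  # official Neo4j Python driver, assumed installed

# Hypothetical graph schema: (:Customer)-[:VISITED]->(:Product)
# Connection details are placeholders for illustration only.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

VISITS_PER_PRODUCT = """
MATCH (c:Customer)-[v:VISITED]->(p:Product)
RETURN p.name AS product, count(v) AS visits, count(DISTINCT c) AS unique_visitors
ORDER BY visits DESC
LIMIT 10
"""

with driver.session() as session:
    for record in session.run(VISITS_PER_PRODUCT):
        print(record["product"], record["visits"], record["unique_visitors"])

driver.close()
```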

Keywords: data, engine, intelligence, customer, neo4j, database

Procedia PDF Downloads 189
1358 A Neural Network Model to Simulate Urban Air Temperatures in Toulouse, France

Authors: Hiba Hamdi, Thomas Corpetti, Laure Roupioz, Xavier Briottet

Abstract:

Air temperatures are generally higher in cities than in their rural surroundings. The overheating of cities is a direct consequence of increasing urbanization, characterized by the artificial filling of soils, the release of anthropogenic heat, and the complexity of urban geometry. This phenomenon, referred to as urban heat island (UHI), is more prevalent during heat waves, which have increased in frequency and intensity in recent years. In the context of global warming and urban population growth, helping urban planners implement UHI mitigation and adaptation strategies is critical. In practice, the study of UHI requires air temperature information at the street canyon level, which is difficult to obtain. Many urban air temperature simulation models have been proposed (mostly based on physics or statistics), all of which require a variety of input parameters related to urban morphology, land use, material properties, or meteorological conditions. In this paper, we build and evaluate a neural network model based on Urban Weather Generator (UWG) model simulations and data from meteorological stations that simulate air temperature over Toulouse, France, on days favourable to UHI.

Keywords: air temperature, neural network model, urban heat island, urban weather generator

Procedia PDF Downloads 79