Search results for: open source software
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11292

8622 Using Printouts as Social Media Evidence and Its Authentication in the Courtroom

Authors: Chih-Ping Chang

Abstract:

Unlike traditional physical evidence, social media evidence has its own characteristics: it is easily tampered with, it is recoverable, and it cannot be read without another device (such as a computer). The authenticity of a simple screenshot taken from a social networking site is therefore open to question. When police search and seize digital information, a common practice is to print out the digital data obtained and have the parties present sign the printout, without taking the original digital data back. Beyond the question of original identity, this way of obtaining evidence can have two further consequences. First, it invites the allegation that the police tampered with or falsified the evidence in order to frame the suspect. Second, hidden information is difficult to discover: the core evidence associated with a crime may not appear in the visible contents of a file. By examining the original file, metadata such as the original producer, creation time, modification date, and even GPS location can be revealed. How to present this kind of evidence in the courtroom is therefore arguably the most important task in ruling on social media evidence. This article first introduces forensic software such as EnCase, TCT, and FTK and analyzes how these tools establish the identity of digital data. The second part then turns back to the court, discussing the legal standard for authenticating social media evidence and the application of such forensic software in the courtroom. In conclusion, the article offers a rethinking: what kind of authenticity is this rule of evidence pursuing? Does the legal system automatically transcribe scientific knowledge, or does it seek to render justice more fully, grounded not only in scientific fact but also in multifaceted debate?
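The abstract does not detail how forensic tools establish identity; a common mechanism (used by tools such as EnCase and FTK, though the specific workflow here is an assumption, not taken from the article) is a cryptographic hash of the original file, recorded at seizure and re-checked later. A minimal Python sketch:

```python
import hashlib

def sha256_of_file(path, chunk_size=65536):
    """Stream a file in chunks and return its SHA-256 hex digest.

    If a working copy's digest matches the digest recorded at the time
    of seizure, the copy is, with overwhelming probability, bit-for-bit
    identical to the original, which a printout alone cannot show.
    """
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Hypothetical usage: compare against the chain-of-custody record.
# recorded = "..."  # digest noted when the evidence was seized
# assert sha256_of_file("evidence.png") == recorded
```

The file name and recorded digest above are placeholders; the point is only that identity is proven by comparing digests, not by visual inspection of a printout.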

Keywords: federal rule of evidence, internet forensic, printouts as evidence, social media evidence, United States v. Vayner

Procedia PDF Downloads 278
8621 Study of the Biochemical Properties of a Milk-Clotting Protease Extracted from Sunflower Cake: Manufacturing Trial of Uncooked Pressed Cheeses and Analysis of Sensory Properties

Authors: Kahlouche Amal, Touzene F. Zohra, Betatache Fatiha, Nouani Abdelouahab

Abstract:

The growth of world cheese production in recent decades, together with increasing demand for inexpensive coagulants, has intensified the search for new rennet substitutes. Hence the interest in exploring plant biodiversity, an inexpensive source of many natural metabolites now praised by scientists (thistle, fig tree latex, cardoon, melon seeds). Substitutes of plant origin have attracted particular attention. The objective of this study is to demonstrate the possibility of extracting a milk-clotting protease from sunflower cake, a readily available raw material and a potential source of rennet substitutes. To this end, the proteolytic activity of the crude extracts was determined, the enzymatic preparations were purified and decolorized, and their coagulating properties were characterized by studying the effect of several factors (temperature, pH, CaCl2 concentration), all of which help to add value to milk, particularly that produced by the small ruminants of Algerian dairy farms. Plant coagulant extracts have already made it possible to develop traditional cheeses, notably on the Iberian Peninsula. Here, an uncooked pressed cheese manufacturing trial was carried out at semi-pilot scale using the enzymatic extract of sunflower (Helianthus annuus), which gave satisfactory results in terms of both yield and sensory quality, with no statistically significant difference between the cheeses studied. These results confirm that this coagulant could be used as a substitute for commercial rennet on an industrial scale.

Keywords: characterization, cheese, rennet, sunflower

Procedia PDF Downloads 336
8620 An Enhanced Support Vector Machine Based Approach for Sentiment Classification of Arabic Tweets of Different Dialects

Authors: Gehad S. Kaseb, Mona F. Ahmed

Abstract:

Arabic Sentiment Analysis (SA) is one of the most common research fields, with many open problems. Few studies apply SA to Arabic dialects. This paper proposes different pre-processing steps and a modified methodology to improve classification accuracy using a standard Support Vector Machine (SVM) classifier. The work uses two datasets, the Arabic Sentiment Tweets Dataset (ASTD) and the Extended Arabic Tweets Sentiment Dataset (Extended-AATSD), both publicly available for academic use. The results show that the classification accuracy approaches 86%.
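The abstract gives no code; as a rough illustration of the pipeline shape (normalize, tokenize, vectorize over bag-of-words, classify), here is a toy linear classifier in pure Python. It is a simple perceptron standing in for the authors' SVM, and the example tweets are invented, not taken from ASTD or Extended-AATSD:

```python
from collections import defaultdict

def tokenize(text):
    # Minimal normalization: lowercase and strip punctuation. Real Arabic
    # pipelines also remove diacritics and normalize letter variants.
    return [w.strip(".,!?") for w in text.lower().split() if w.strip(".,!?")]

def train_perceptron(samples, epochs=20):
    """Toy linear classifier over bag-of-words features (stand-in for SVM)."""
    w = defaultdict(float)
    for _ in range(epochs):
        for text, label in samples:      # label: +1 positive, -1 negative
            score = sum(w[t] for t in tokenize(text))
            if score * label <= 0:       # misclassified: nudge the weights
                for t in tokenize(text):
                    w[t] += label
    return w

def predict(w, text):
    return 1 if sum(w[t] for t in tokenize(text)) > 0 else -1

# Invented toy training tweets (positive/negative)
train = [("الفيلم جميل جدا", 1), ("الخدمة سيئة للغاية", -1),
         ("تجربة رائعة", 1), ("منتج سيء", -1)]
model = train_perceptron(train)
```

A real SVM replaces the perceptron update with a max-margin objective, but the surrounding pre-processing and feature extraction, which the paper identifies as the lever for accuracy, sits in the same place.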

Keywords: Arabic, classification, sentiment analysis, tweets

Procedia PDF Downloads 131
8619 Biomass Production Improvement of Beauveria bassiana at Laboratory Scale for a Biopesticide Development

Authors: G. Quiroga-Cubides, M. Cruz, E. Grijalba, J. Sanabria, A. Ceballos, L. García, M. Gómez

Abstract:

Beauveria sp. is an entomopathogenic microorganism used for biological control of various plant pests such as whitefly, thrips, aphids, and chrysomelids (including the species Cerotoma tingomariana), which affect soybean crops in Colombia's Altillanura region. A biopesticide prototype based on B. bassiana strain Bv060 was therefore developed at the Corpoica laboratories. For the production of B. bassiana conidia, a baseline fermentation was performed at laboratory scale in a solid medium using broken rice as substrate, at a temperature of 25±2 °C and a relative humidity of 60±10%. The experimental design was completely randomized, with three repetitions. These culture conditions yielded an average conidial concentration of 1.48x10^10 conidia/g, a yield of 13.07 g/kg dry substrate, and a productivity of 8.83x10^7 conidia/g*h. The objective of this study was therefore to evaluate the influence of reducing the rice particle size (<1 mm) and adding a complex nitrogen source on conidia production and efficiency parameters in solid-state fermentation, in a completely randomized experiment with three repetitions. To this end, the baseline temperature and humidity conditions were employed in a semisolid culture medium with powdered rice (10%) and a complex nitrogen source (8%). As a result, it was possible to increase the conidial concentration to 9.87x10^10 conidia/g, the yield to 87.07 g/kg dry substrate, and the productivity to 3.43x10^8 conidia/g*h. This indicates that conidial concentration and yield in semisolid fermentation increased almost 7-fold compared with the baseline, while productivity increased 4-fold. Finally, the system designed for semisolid-state fermentation allowed easy conidia recovery, reducing the time and cost of the production process.
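The reported improvement factors can be cross-checked directly from the figures quoted in the abstract; a small arithmetic sketch (values transcribed from the text):

```python
# Values transcribed from the abstract (baseline vs. improved fermentation)
baseline_conidia = 1.48e10       # conidia/g
improved_conidia = 9.87e10       # conidia/g
baseline_productivity = 8.83e7   # conidia/(g*h)
improved_productivity = 3.43e8   # conidia/(g*h)

conidia_gain = improved_conidia / baseline_conidia                 # ~6.7, "almost 7 times"
productivity_gain = improved_productivity / baseline_productivity  # ~3.9, "4 times"
```

Both ratios agree with the text's "almost 7 times" and "4 times" claims, and the yield ratio (87.07/13.07, about 6.7) matches the conidial-concentration gain.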

Keywords: Beauveria bassiana, biopesticide, solid state fermentation, semisolid medium culture

Procedia PDF Downloads 291
8618 Technical Option Brought Solution for Safe Waste Water Management in Urban Public Toilet and Improved Ground Water Table

Authors: Chandan Kumar

Abstract:

Background and context: population growth and rapid urbanization have resulted in nearly 2 lakh migrants, along with their families, moving to Delhi each year in search of jobs. Most of these poor migrant families end up living in slums, constituting an estimated population of 1.87 lakh every year. Further, more than half (52 per cent) of Delhi's population resides in places such as unauthorized and resettled colonies. The slum population depends entirely on public toilets. In public toilets, the manholes are connected either to a sewer line or to a septic tank. Public toilets connected to septic tanks face major challenges in disposing of wastewater: they have to discharge it into open drains outside, where it stagnates beside the public toilet complex and near the slum area. As a result, the stagnant wastewater causes outbreaks of diseases such as malaria, dengue, and chikungunya in the slum area. Save the Children carried out an intervention and innovation in 21 public toilet complexes in South Delhi and North Delhi. These complexes all faced the same wastewater disposal problem, each discharging a minimum of 1,800 liters of wastewater into open drains every day, which caused stagnant-water-borne diseases in the nearby community. Construction of soak wells: constructing soak wells in an urban context was an innovative approach that minimized the wastewater management problem and raised the water table of the existing borewell in each toilet complex. The technique provides a groundwater recharging system, and the additional water was used for vegetable gardening within the complex premises. Each soak well was constructed with multiple filter media, an inlet, and a protective bed on the surrounding surface. After construction, each soak well absorbed up to 2,000 liters of wastewater, raising the groundwater level through the different filter media. Finally, we brought about a change in the communities by constructing soak wells with a zero-maintenance system. These public toilet complexes now have a safe wastewater disposal mechanism, and stagnant-water-borne diseases have been reduced.

Keywords: diseases, ground water recharging system, soak well, toilet complex, waste water

Procedia PDF Downloads 535
8617 Preparation and Sealing of Polymer Microchannels Using EB Lithography and Laser Welding

Authors: Ian Jones, Jonathan Griffiths

Abstract:

Laser welding offers the potential for making very precise joints in plastic products, both in terms of joint location and the amount of heating applied. These methods have allowed the production of complex products such as microfluidic devices, where channel and structure resolution below 100 µm is regularly used. To date, however, the width of welds made using lasers has been limited by the achievable focus spot size of the laser source. Theoretically, the minimum spot size possible from a laser is comparable to the wavelength of the radiation emitted. Practically, with reasonable focal-length optics, the achievable spot size is a few times larger than this, and the melt zone in a plastic weld is larger still. The narrowest welds feasible to date have therefore been 10-20 µm wide, using a near-infrared laser source. The aim of this work was to prepare laser-absorber tracks and channels less than 10 µm wide in PMMA thermoplastic using EB lithography, and then to seal the channels by laser welding, producing welds with widths of the order of 1 µm, below the resolution limit of the near-infrared laser used. Welded joints with a width of 1 µm have been achieved, as well as channels with a width of 5 µm. The procedure was based on the principle of transmission laser welding, using a thin coating of infrared-absorbent material at the joint interface. The coating was patterned using electron-beam lithography to obtain the required resolution in a reproducible manner, and that resolution was retained after the transmission laser welding process. The joint strength was verified using larger-scale samples. The results demonstrate that plastic products could be made with a high density of structure at resolution below 1 µm, and that welding can be applied without excessively heating regions beyond the weld lines.
This may be applied to smaller scale sensor and analysis chips, micro-bio and chemical reactors and to microelectronic packaging.

Keywords: microchannels, polymer, EB lithography, laser welding

Procedia PDF Downloads 388
8616 Biomechanical Performance of the Synovial Capsule of the Glenohumeral Joint with a Bankart Lesion through Finite Element Analysis

Authors: Duvert A. Puentes T., Javier A. Maldonado E., Ivan Quintero, Diego F. Villegas

Abstract:

Computational mechanics is a powerful tool for studying the performance of complex models, and the structure of the human body is one example. This paper used several types of software to build a 3D model of the glenohumeral joint and apply finite element analysis. The main objective was to study how the biomechanical properties of the joint change when it presents an injury, specifically a Bankart lesion, which consists of the detachment of the anteroinferior labrum from the glenoid. The stress and strain distributions of the soft tissues were the focus of this study. First, a 3D model of a joint without any pathology was made as a control sample, using segmentation software for the bones, supported by medical imagery, and a cadaveric model to represent the soft tissue. The joint was built to simulate a compression and external rotation test, using CAD to prepare the model in the appropriate position. Once the healthy model was finished, it was submitted to finite element analysis, and the results were validated against experimental data. A mesh sensitivity study was then performed on the validated model to determine the best mesh size. Finally, the geometry of the 3D model was changed to imitate a Bankart lesion: the contact zone between the glenoid and the labrum was slightly separated, simulating tissue detachment. With this new geometry, the finite element analysis was applied again, and the results were compared with the control sample created initially. The data gathered in this study can be used to improve understanding of labrum tears. Nevertheless, it is important to remember that computational analyses are approximations and that the initial data were taken from an in vitro assay.

Keywords: biomechanics, computational model, finite elements, glenohumeral joint, Bankart lesion, labrum

Procedia PDF Downloads 146
8615 Interface Designer as Cultural Producer: A Dialectic Materialist Approach to the Role of Visual Designer in the Present Digital Era

Authors: Cagri Baris Kasap

Abstract:

In this study, the question of how interface designers can be viewed as producers of culture in the current era is interrogated from a critical theory perspective. Walter Benjamin was a German Jewish literary critical theorist who, during the 1930s, was engaged in opposing and criticizing the Nazi use of art and media. 'The Author as Producer' is an essay that Benjamin delivered at the Communist Institute for the Study of Fascism in Paris. In it, Benjamin relates directly to the dialectic between base and superstructure and argues that authors, normally placed within the superstructure, should consider how writing and publishing are production and thus directly related to the base. Through it, he discusses what it could mean to see the author as producer of his own text, as a producer of writing, understood as an ideological construct that rests on the apparatus of production and distribution. Benjamin concludes that the author must write in ways that relate to the conditions of production: he must do so in order to prepare his readers to become writers, even making this possible for them by engineering an 'improved apparatus', and must work toward turning consumers into producers and collaborators. In today's world, it has become a leading business model within the Web 2.0 services of multinational Internet technology and culture industries like Amazon, Apple, and Google to transform readers, spectators, consumers, or users into collaborators and co-producers through platforms such as Facebook, YouTube, and Amazon's CreateSpace and Kindle Direct Publishing print-on-demand, e-book, and publishing platforms. However, the way this transformation happens is tightly controlled and monitored by combinations of software and hardware. Within these global market monopolies, it has become increasingly difficult to gain insight into how one's writing and collaboration are used, captured, and capitalized as a user of Facebook or Google.

Through the lens of this study, it could be argued that this criticism should be considered by digital producers, and even by the mass of collaborators in contemporary social networking software. How do software and design incorporate users and their collaboration? Are users truly empowered; are they put in a position where they can understand the apparatus and how their collaboration is part of it? Or has the apparatus become a means against the producers? Thus, when using corporate systems like Google and Facebook, the iPhone and the Kindle, without any control over the means of production, which is closed off by opaque interfaces and licenses that limit our rights of use and ownership, we are already the collaborators that Benjamin calls for. The iPhone and the Kindle, for example, combine a specific use of technology to distribute the relations between the 'authors' and the 'prodUsers' in ways that secure their monopolistic business models by limiting the potential of the technology.

Keywords: interface designer, cultural producer, Walter Benjamin, materialist aesthetics, dialectical thinking

Procedia PDF Downloads 130
8614 Seismic Retrofits – A Catalyst for Minimizing the Building Sector’s Carbon Footprint

Authors: Juliane Spaak

Abstract:

A life-cycle assessment was performed, looking at seven retrofit projects in New Zealand using LCAQuickV3.5. The study found that retrofits save up to 80% of embodied carbon emissions for the structural elements compared to a new building. In other words, it is only a 20% carbon investment to transform and extend a building’s life. In addition, the systems were evaluated by looking at environmental impacts over the design life of these buildings and resilience using FEMA P58 and PACT software. With the increasing interest in Zero Carbon targets, significant changes in the building and construction sector are required. Emissions for buildings arise from both embodied carbon and operations. Based on the significant advancements in building energy technology, the focus is moving more toward embodied carbon, a large portion of which is associated with the structure. Since older buildings make up most of the real estate stock of our cities around the world, their reuse through structural retrofit and wider refurbishment plays an important role in extending the life of a building’s embodied carbon. New Zealand’s building owners and engineers have learned a lot about seismic issues following a decade of significant earthquakes. Recent earthquakes have brought to light the necessity to move away from constructing code-minimum structures that are designed for life safety but are frequently ‘disposable’ after a moderate earthquake event, especially in relation to a structure’s ability to minimize damage. This means weaker buildings sit as ‘carbon liabilities’, with considerably more carbon likely to be expended remediating damage after a shake. Renovating and retrofitting older assets plays a big part in reducing the carbon profile of the buildings sector, as breathing new life into a building’s structure is vastly more sustainable than the highest quality ‘green’ new builds, which are inherently more carbon-intensive. 
The demolition of viable older buildings (often including heritage buildings) is increasingly at odds with society's desire for a lower-carbon economy. Bringing seismic resilience and carbon best practice together in decision-making can open the door to commercially attractive outcomes, with retrofits that include structural and sustainability upgrades transforming the asset's revenue generation. Across the global real estate market, tenants are increasingly demanding that the buildings they occupy be resilient and aligned with their own climate targets. The relationship between seismic performance and 'sustainable design' has yet to fully mature, yet in a wider context it is of profound consequence. A whole-of-life carbon perspective on a building means designing for the likely natural hazards within the asset's expected lifespan, be that earthquakes, storms, bushfires, fires, and so on, with financial mitigation (e.g., insurance) part, but not all, of the picture.

Keywords: retrofit, sustainability, earthquake, reuse, carbon, resilient

Procedia PDF Downloads 58
8613 Lucilia sericata Netrin-A: Secreted by the Larval Salivary Gland as a Potential Aid to Neuroregeneration

Authors: Hamzeh Alipour, Masoumeh Bagheri, Tahereh Karamzadeh, Abbasali Raz, Kourosh Azizi

Abstract:

Netrin-A, a protein first identified for its role in guiding commissural axons, plays a similar role in angiogenesis. Studies have shown that one of the Netrin-A receptors is expressed in the growing cells of small capillaries. This group of molecules is worth studying because, through angiogenesis, its role in wound healing will become clearer in the future. Larvae of the greenbottle blowfly Lucilia sericata (L. sericata) are increasingly used in maggot therapy of chronic wounds. The aim of this study was to identify the molecular features of Netrin-A in L. sericata larvae. Larvae were reared under standard maggotarium conditions. The nucleic acid sequence of L. sericata Netrin-A (LSN-A) was identified using Rapid Amplification of cDNA Ends (RACE) and Rapid Amplification of Genomic Ends (RAGE). Parts of the Netrin-A gene, including the middle, 3′-, and 5′-ends, were identified, TA-cloned into the pTG19 plasmid, and transferred into DH5α Escherichia coli. Each part was sequenced and assembled using SeqMan software. The gene structure was further subjected to in silico analysis. The DNA of LSN-A was identified to be 2407 bp, while its mRNA sequence was determined to be 2115 bp using Oligo 0.7 software. It encodes a Netrin-A protein of 704 amino acid residues, with an estimated molecular weight of 78.6 kDa. The 3D structure of Netrin-A, modeled with SWISS-MODEL, revealed its similarity to human Netrin-1, with 66.8% identity. The LSN-A protein contributes to repair of the myelin membrane in neuronal cells. Ultimately, it could be an effective candidate for neural regeneration and wound healing. Our next step is to produce the recombinant protein for use in the medical sciences.
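The reported sequence figures are internally consistent, assuming the 2115 bp corresponds to the coding sequence including a stop codon (an assumption on our part; mRNAs normally also carry untranslated regions). A quick arithmetic check:

```python
# Figures transcribed from the abstract: 2115-nt sequence, 704-residue protein.
# 2115 / 3 = 705 codons; one is the stop codon, leaving 704 residues.
mrna_coding_len = 2115                    # nt, as reported
codons = mrna_coding_len // 3             # 705 codons
residues = codons - 1                     # 704 amino acids (minus the stop)

# Rough mass estimate at ~110 Da average residue mass (a common rule of thumb):
approx_mass_kda = residues * 110 / 1000   # ~77 kDa, near the reported 78.6 kDa
```

The residue count matches the reported 704 exactly, and the rule-of-thumb mass lands close to the reported 78.6 kDa, which supports the coding-sequence reading.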

Keywords: maggot therapy, netrin-A, RACE, RAGE, lucilia sericata

Procedia PDF Downloads 90
8612 Exploring Counting Methods for the Vertices of Certain Polyhedra with Uncertainties

Authors: Sammani Danwawu Abdullahi

Abstract:

Vertex enumeration algorithms explore methods and procedures for generating the vertices of general polyhedra formed by systems of equations or inequalities. These problems of enumerating the extreme points (vertices) of general polyhedra are shown to be NP-hard. This led to exploring how to count the vertices of general polyhedra without listing them, which is in turn shown to be #P-complete. Some fully polynomial randomized approximation schemes (fpras) for counting the vertices of special classes of polyhedra associated with down-sets, independent sets, 2-knapsack problems, and 2 x n transportation problems are presented, together with some newly discovered open problems.
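The abstract gives no algorithms; as a minimal illustration of why naive enumeration is expensive (this is the textbook brute force, not one of the approximate counting schemes discussed), one can intersect every pair of constraints in 2D, already O(m²) candidate points, and combinatorially many in higher dimensions:

```python
from fractions import Fraction
from itertools import combinations

def enumerate_vertices_2d(A, b):
    """Brute-force vertex enumeration for {x : A x <= b} in the plane.

    Each vertex is the intersection of two constraint lines that also
    satisfies every remaining inequality; Fractions keep it exact.
    """
    A = [tuple(Fraction(v) for v in row) for row in A]
    b = [Fraction(v) for v in b]
    verts = set()
    for i, j in combinations(range(len(A)), 2):
        (a1, b1), (a2, b2) = A[i], A[j]
        det = a1 * b2 - a2 * b1
        if det == 0:
            continue  # parallel constraints: no unique intersection point
        # Cramer's rule for the 2x2 system a*x + b*y = rhs
        x = (b[i] * b2 - b[j] * b1) / det
        y = (a1 * b[j] - a2 * b[i]) / det
        if all(r[0] * x + r[1] * y <= rhs for r, rhs in zip(A, b)):
            verts.add((x, y))
    return sorted(verts)

# Unit square: x >= 0, y >= 0, x <= 1, y <= 1  ->  four vertices
square = enumerate_vertices_2d([(-1, 0), (0, -1), (1, 0), (0, 1)],
                               [0, 0, 1, 1])
```

Listing like this is exactly what the hardness results rule out at scale, which motivates counting without listing and the approximation schemes the paper surveys.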

Keywords: counting with uncertainties, mathematical programming, optimization, vertex enumeration

Procedia PDF Downloads 337
8611 Low-Temperature Poly-Si Nanowire Junctionless Thin Film Transistors with Nickel Silicide

Authors: Yu-Hsien Lin, Yu-Ru Lin, Yung-Chun Wu

Abstract:

This work demonstrates ultra-thin polycrystalline silicon (poly-Si) nanowire junctionless thin-film transistors (NW JL-TFTs) with nickel silicide contacts. For the nickel silicide film, a two-step annealing process was designed to form an ultra-thin, uniform, low-sheet-resistance (Rs) Ni silicide film. The NW JL-TFT with nickel silicide contacts exhibits good electrical properties, including a high on/off current ratio (>10⁷), a subthreshold slope of 186 mV/dec, and low parasitic resistance. This work also compares the electrical characteristics of NW JL-TFTs with nickel silicide and non-silicide contacts. Nickel silicide techniques are widely used in high-performance devices as device dimensions scale, owing to the source/drain sheet resistance issue; the self-aligned silicide (salicide) technique is therefore used to reduce the series resistance of the device. Nickel silicide has several advantages, including a low-temperature process, low silicon consumption, no bridging failure, smaller mechanical stress, and smaller contact resistance. The junctionless thin-film transistor (JL-TFT) is fabricated simply by heavily doping the channel and source/drain (S/D) regions simultaneously. Owing to this special doping profile, the JL-TFT has several advantages: a lower thermal budget, which allows easier integration with high-k/metal-gate stacks than conventional MOSFETs (Metal Oxide Semiconductor Field-Effect Transistors); a longer effective channel length than conventional MOSFETs; and avoidance of complicated source/drain engineering. To solve the JL-TFT turn-off problem, an ultra-thin body (UTB) structure is needed so that the channel region is fully depleted in the off-state. On the other hand, the drive current (Iᴅ) declines as transistor features are scaled. This work therefore demonstrates ultra-thin poly-Si nanowire junctionless thin-film transistors with nickel silicide contacts.
The low-temperature formation of the nickel silicide layer was investigated by physical vapor deposition (PVD) of a 15 nm Ni layer on the poly-Si substrate. Notably, a two-step annealing process was used to form the ultra-thin, uniform, low-sheet-resistance (Rs) Ni silicide film. The first annealing step promoted Ni diffusion through a thin interfacial amorphous layer; the unreacted metal was then lifted off. The second annealing step lowered the sheet resistance and firmly merged the silicide phase. The resulting NW JL-TFT with nickel silicide contacts shows a high on/off current ratio (>10⁷), a subthreshold slope of 186 mV/dec, and low parasitic resistance. In short, the NW JL-TFT with nickel silicide contacts exhibits competitive short-channel behavior and improved drive current.

Keywords: poly-Si, nanowire, junctionless, thin-film transistors, nickel silicide

Procedia PDF Downloads 222
8610 Rest Behavior and Restoration: Searching for Patterns through a Textual Analysis

Authors: Sandra Christina Gressler

Abstract:

Resting is, in essence, physical and mental relaxation. Can behaviors that go beyond merely physical relaxation then be understood, to some extent, as restoration behaviors? Studies on restorative environments emphasize the physical, mental, and social benefits that some environments can provide, and suggest that activities in natural environments reduce the stress of daily life, promoting recovery from daily wear. These studies, though specific in their results, do not unify the different possibilities of restoration. Considering the importance of restorative environments in promoting well-being, this research aims to verify the applicability of the theory of restorative environments in a Brazilian context by inquiring into rest environments and behavior. The research sought to achieve its goals by: a) identifying everyday ways in which participants interact and connect with nature; b) identifying resting environments and behaviors; c) verifying whether rest strategies match the restorative environments suggested by restoration studies; and d) verifying different rest strategies in relation to time. Workers from different companies, whose functions require focused attention, and high school students from different schools participated in this study. Interviews were used to collect data. The data obtained were compared with studies of attention restoration theory and stress recovery, and analyzed through basic descriptive and inductive statistics and with the software ALCESTE® (Analyse Lexicale par Contexte d'un Ensemble de Segments de Texte). The open questions investigate perception of nature on a daily basis (analyzed with ALCESTE); rest periods, daily, on weekends, and on holidays (analyzed with ALCESTE tri-croisé); and resting environments and activities (analyzed with simple descriptive statistics).
According to the results, natural environments whose characteristics are compatible with personal desires (physical aspects and distance), and residential environments when they fulfill the characteristics of refuge, safety, and self-expression, characteristic of primary territory, meet the requirements for restoration. The analyses suggest that the perception of nature has a wide range that goes beyond nearby, touchable objects, extending to the observation and contemplation of details. The restoration processes described in attention restoration theory occur gradually (hierarchically), starting with being away, followed by compatibility, fascination, and extent. They are also associated with the time available for rest. A relation between rest behaviors and the bio-demographic characteristics of the participants was noted. This reinforces, in restoration studies, the need to investigate not only the physical characteristics of the environment but also behavior, social relationships, subjective reactions, distance, and available time. The complexity of the theme indicates the need for multi-method studies. As a practical contribution, the findings provide a basis for developing strategies to promote the well-being of the population.

Keywords: attention restoration theory, environmental psychology, rest behavior, restorative environments

Procedia PDF Downloads 168
8609 Virtual Reality and Avatars in Education

Authors: Michael Brazley

Abstract:

Virtual Reality (VR) and 3D videos are the latest generation of learning technology. They are now being used in professional offices and schools for marketing and education. Technology in the field of design has progressed from two-dimensional drawings to 3D models built with computers and sophisticated software. Virtual Reality is used as a collaborative medium that allows designers and others to meet and communicate inside models or VR platforms using avatars. This research proposes to teach students from different backgrounds how to take a digital model into a 3D video, then into VR, and finally into VR with multiple avatars communicating with each other in real time. The next step would be to develop the model so that people in three or more different locations can meet as avatars in real time, in the same model, and talk to each other. This research is longitudinal, studying the use of 3D videos in graduate design and Virtual Reality in XR (Extended Reality) courses. The research methodology combines quantitative and qualitative methods. The qualitative methods begin with the literature review and case studies; the quantitative methods draw on students' 3D videos, a survey, and Extended Reality (XR) coursework. The end product is a VR platform with multiple avatars able to communicate in real time. This research is important because it allows multiple users to enter a model or VR platform remotely from any location in the world and communicate effectively in real time. It will lead to improved learning and training using Virtual Reality and avatars, and it is generalizable because most colleges and universities, and many individuals, own VR equipment and computer labs. The research did produce a VR platform in which multiple avatars can move and speak to each other in real time.
Major implications of the research include, but are not limited to, improved learning, teaching, communication, marketing, designing, and planning. Both hardware and software played a major role in the project's success.

Keywords: virtual reality, avatars, education, XR

Procedia PDF Downloads 84
8608 Development of a Plant-Based Dietary Supplement to Address Critical Micronutrient Needs of Women of Child-Bearing Age in Europe

Authors: Sara D. Garduno-Diaz, Ramona Milcheva, Chanyu Xu

Abstract:

Women’s reproductive stages (pre-pregnancy, pregnancy, and lactation) represent a time of higher micronutrient needs. With a healthy food selection as the first path of choice to cover these increased needs, micronutrient supplementation is often required in tandem. Because pregnancy and lactation should be treated with care, all supplements consumed should be made of quality ingredients and manufactured through controlled processes. This work describes the process followed for the development of plant-based multiple-micronutrient supplements aimed at addressing the growing demand for natural ingredients of non-animal origin. A list of key nutrients for inclusion was prioritized, followed by the identification and selection of qualified raw ingredient providers. Nutrient absorption into the food matrix was carried out through natural processes. The outcome is a new line of products meeting the set criteria of being gluten- and lactose-free, suitable for vegans/vegetarians, and without artificial preservatives. In addition, each product provides the consumer with 10 vitamins, 6 inorganic nutrients, 1 source of essential fatty acids, and 1 source of phytonutrients (maca, moringa, and chlorella). Each raw material, as well as the final product, was subjected to three-fold microbiological control (in-house and external). The final micronutrient mix was then tested for human-factor contamination, pesticides, total aerobic microbial count, total yeast count, and total mold count. The product was created with the aim of meeting product standards for the European Union, as well as specific requirements of the German market in the food and pharma fields. The results presented here reach the point of introduction of the newly developed product to the market; acceptability and effectiveness results will be published at a later date.

Keywords: fertility, lactation, organic, pregnancy, vegetarian

Procedia PDF Downloads 131
8607 Production and Purification of Pectinase by Aspergillus niger

Authors: M. Umar Dahot, G. S. Mangrio

Abstract:

In this study, agro-industrial waste was used as a carbon source, being a low-cost substrate. Along with this, various sugars and molasses at 2.5% and 5% were investigated as substrates/carbon sources for the growth of A. niger and pectinase production. Different nitrogen sources were also used. An overview of the results shows that 5% sucrose, 5% molasses, and 0.4% (NH4)2SO4 were the best carbon and nitrogen sources for the production of pectinase by A. niger. The maximum production of pectinase (26.87 units/ml) was observed at pH 6.0 after 72 h of incubation. The optimum temperature for pectinase production was 35°C, at which the maximum production of 28.25 units/ml was obtained. The pectinase enzyme was purified by ammonium sulphate precipitation, and the dialyzed sample was finally applied to gel filtration chromatography (Sephadex G-100) and ion exchange chromatography (DEAE A-50). The enzyme was purified 2.5-fold by gel chromatography on Sephadex G-100, and four fractions were obtained; fractions 1, 2, and 4 showed a single band, while fraction 3 showed multiple bands on SDS-PAGE electrophoresis. Fraction 3 was pooled, dialyzed, and separated on Sephadex A-50, and the two resulting fractions, 3a and 3b, each showed a single band. The molecular weights of the purified fractions were in the range of 33000 ± 2000 and 38000 ± 2000 Daltons. The purified enzyme was most active with pure pectin, while pectin, lemon pectin, and orange peel gave lower activity compared to the control. The optimum pH and temperature for pectinase activity were between pH 5.0 and 6.0 and 40°-50°C, respectively. The enzyme was stable over the pH range 3.0-8.0. The thermostability was determined, and it was observed that pectinase activity is heat stable, retaining more than 40% activity when incubated at 90°C for 10 minutes. The pectinase activity of F3a and F3b was increased by different metal ions. The pectinase activity was stimulated by 10-30% in the presence of CaCl2. ZnSO4, MnSO4, and MgSO4 produced higher activity in fractions F3a and F3b, which indicates that the pectinase belongs to the metallo-enzymes. It is concluded that A. niger is capable of producing pH-stable and thermostable pectinase, which can be used for industrial purposes.

Keywords: pectinase, A. niger, production, purification, characterization

Procedia PDF Downloads 397
8606 Experimental Study of Boost Converter Based PV Energy System

Authors: T. Abdelkrim, K. Ben Seddik, B. Bezza, K. Benamrane, Aeh. Benkhelifa

Abstract:

This paper presents an implementation of a boost converter for a resistive load using photovoltaic energy as the source. The model of the photovoltaic cell and the operating principle of the boost converter are presented. A PIC microcontroller is used in the closed-loop control to generate the pulses that drive the converter circuit. To evaluate the performance of the boost converter, the output voltage of the PV panel is varied by shading one and two cells.
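The closed-loop duty-cycle regulation described above can be sketched as a toy simulation. This is only an illustration, not the authors' PIC firmware: the ideal continuous-conduction boost relation Vout = Vin/(1 − D) is standard, but the PI gains, setpoint, and step count are assumptions chosen for the sketch.

```python
def simulate_boost_pi(v_in=12.0, v_ref=24.0, kp=0.002, ki=0.001, steps=500):
    """Toy closed-loop boost converter: a PI controller adjusts the PWM
    duty cycle D so the ideal output v_in / (1 - D) tracks v_ref."""
    duty, integral = 0.0, 0.0
    v_out = v_in
    for _ in range(steps):
        error = v_ref - v_out
        integral += error
        # Clamp the duty cycle to a safe range, as firmware would.
        duty = min(max(kp * error + ki * integral, 0.0), 0.9)
        v_out = v_in / (1.0 - duty)  # ideal CCM boost relation
    return duty, v_out
```

With these illustrative gains the loop settles near the duty cycle of 0.5 expected for an ideal 12 V to 24 V conversion; a real controller would also model switching losses and sensor quantization.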

Keywords: boost converter, microcontroller, photovoltaic power generation, shading cells

Procedia PDF Downloads 859
8605 The Usage of Adobe in Historical Structures of Van City

Authors: Mustafa Gülen, Eylem Güzel, Soner Guler

Abstract:

Studies of the historical background of Van show that it has held a significant position as a settlement since ancient times and has hosted many civilizations throughout history. With the dominance of the Ottoman Empire in the 16th century, the region was re-constructed by building new walls on the southern side of Van Castle. These construction activities were mostly carried out using adobe, which had been a fundamental building material for thousands of years. As a result of natural disasters and battles, and following the relocation at the threshold of the 20th century to a new settlement 9 kilometers away, the ancient city of Van is now an open-air museum with the ruins of churches, mosques, and baths. In this study, the usage of adobe in the historical structures of the city of Van is evaluated in detail.

Keywords: historical structures, adobe, Van city

Procedia PDF Downloads 595
8604 A Study on Characteristics of Runoff Analysis Methods at the Time of Rainfall in Rural Area, Okinawa Prefecture Part 2: A Case of Kohatu River in South Central Part of Okinawa Pref

Authors: Kazuki Kohama, Hiroko Ono

Abstract:

According to the Japan Meteorological Agency and the Intergovernmental Panel on Climate Change Fifth Assessment Report, rainfall in Japan is gradually increasing every year. This means that the difference in rainfall between the rainy season and the rest of the year is increasing. In addition, a clear increasing trend in short, intense rainfall events appears. In recent years, natural disasters have caused enormous human injuries in various parts of Japan. Regarding water disasters, local heavy rain and floods of large rivers occur frequently, and a policy was adopted to promote both structural and non-structural emergency disaster prevention measures under the water disaster prevention awareness social reconstruction vision. Okinawa Prefecture, located in a subtropical region, experiences torrential rain and water disasters such as river floods several times a year, occurring in specific rivers among all 97 rivers. Rivers in Okinawa are also characterized by insufficient capacity and narrow width, which easily lead to flooding in heavy rain. This study focuses on the Kohatu River, one of these specific rivers. In fact, the water level rises above the river levee almost once a year, usually without damage to the surrounding buildings. In some cases, however, the water level has reached the ground-floor height of houses; this has happened nine times to date. The purpose of this research is to clarify the relationship between precipitation, surface outflow, and the total treated water quantity of the Kohatu River. Because full hydrological analysis is complicated and requires detailed data, the method mainly uses Geographic Information System (GIS) software and an outflow analysis system. First, we extract the watershed and divide it into 23 catchment areas to understand how much surface outflow reaches the runoff point in each 10-minute interval. Second, we create a unit hydrograph indicating the surface outflow as a function of contributing area and time. This index shows the maximum amount of surface outflow at 2400 to 3000 seconds. Lastly, we compare the values estimated from the unit hydrograph to measured values. We found that the measured value is usually lower than the estimated value because of evaporation and transpiration. In this study, hydrograph analysis was performed using GIS software and an outflow analysis system. Based on these, we could clarify the flood time and the amount of surface outflow.
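The unit-hydrograph step described above amounts to a discrete convolution of rainfall excess with the unit response. The sketch below is a generic illustration with invented ordinates and rainfall values, not the study's actual 23-catchment data; only the 10-minute time step and the 2400-3000 s peak window follow the abstract.

```python
import numpy as np

# Hypothetical unit hydrograph: runoff response (m^3/s per mm of rainfall
# excess) in 10-minute (600 s) steps, peaking around steps 4-5 (2400-3000 s).
unit_hydrograph = np.array([0.0, 0.1, 0.3, 0.6, 1.0, 0.9, 0.5, 0.2, 0.05])

# Hypothetical rainfall excess (mm) per 10-minute interval.
rainfall_excess = np.array([2.0, 5.0, 3.0, 1.0])

# Total runoff hydrograph = convolution of rainfall excess with the UH.
runoff = np.convolve(rainfall_excess, unit_hydrograph)

peak_step = int(np.argmax(runoff))
peak_time_s = peak_step * 600  # seconds after the storm begins
```

In a GIS workflow each catchment would get its own unit hydrograph, and the 23 convolved hydrographs would be summed at the runoff point.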

Keywords: disaster prevention, water disaster, river flood, GIS software

Procedia PDF Downloads 125
8603 The Routes of Human Suffering: How Point-Source and Destination-Source Mapping Can Help Victim Services Providers and Law Enforcement Agencies Effectively Combat Human Trafficking

Authors: Benjamin Thomas Greer, Grace Cotulla, Mandy Johnson

Abstract:

Human trafficking is one of the fastest growing international crimes and human rights violations in the world. The United States Department of State (State Department) estimates that some 800,000 to 900,000 people are trafficked across sovereign borders annually, with approximately 14,000 to 17,500 of these people coming into the United States. Today’s slavery is conducted by unscrupulous individuals who are often connected to organized criminal enterprises and transnational gangs, extracting huge monetary sums. According to the International Labour Organization (ILO), human traffickers collect approximately $32 billion worldwide annually. Surpassed only by narcotics dealing, trafficking of humans is tied with illegal arms sales as the second largest criminal industry in the world and is the fastest growing criminal field in the 21st century. Perpetrators of this heinous crime abound. They are not limited to single or “sole practitioners” of human trafficking, but often include Transnational Criminal Organizations (TCOs), domestic street gangs, labor contractors, and otherwise seemingly ordinary citizens. Monetary gain is being elevated over territorial disputes, and street gangs are increasingly operating in collaboration with TCOs to further disguise their criminal activity, utilizing their vast networks in an attempt to avoid detection. Traffickers rely on a network of clandestine routes to sell their commodities with impunity. As law enforcement agencies seek to retard the expansion of transnational criminal organizations’ entry into human trafficking, it is imperative that they develop reliable maps of known exploitative trafficking routes. In a recent report given to the Mexican Congress, the Procuraduría General de la República (PGR) disclosed that from 2008 to 2010 it had identified at least 47 unique criminal networking routes used to traffic victims, and that Mexico’s domestic victims are estimated at 800,000 adults and 20,000 children annually. Designing a reliable mapping system is a crucial step toward an effective law enforcement response and a successful victim support system. Creating this mapping analytic is exceedingly difficult. Traffickers are constantly changing the way they traffic and exploit their victims. They swiftly adapt to local environmental factors and react remarkably well to market demands, exploiting limitations in the prevailing laws. This article will highlight how human trafficking has become one of the fastest growing and highest-profile human rights violations in the world today; compile current efforts to map and illustrate trafficking routes; and demonstrate how the proprietary analytical mapping of point-source and destination-source routes can help local law enforcement, governmental agencies, and victim services providers respond effectively to the type and nature of trafficking in their specific geographical locale. Trafficking transcends state and international borders. It demands effective and consistent cooperation between local, state, and federal authorities. Each region of the world has different impact factors which create distinct challenges for law enforcement and victim services. Our mapping system lays the groundwork for a targeted anti-trafficking response.

Keywords: human trafficking, mapping, routes, law enforcement intelligence

Procedia PDF Downloads 364
8602 Developing a Model to Objectively Assess the Culture of Individuals and Teams in Order to Effectively and Efficiently Achieve Sustainability in the Manpower

Authors: Ahmed Mohamed Elnady Mohamed Elsafty

Abstract:

This paper presents an applied objective model to measure culture qualitatively and quantitatively, whether in individuals or in teams, in order to use culture correctly or modify it efficiently. The model provides precise measurements and consistent interpretations by being comprehensive, updateable, and protected from being misled by imitations. Methodically, the model divides culture into seven dimensions (43 cultural factors in total). The first dimension is outcome-orientation, which consists of five factors and should be highest in leaders. The second dimension is details-orientation, which consists of eight factors and should be highest in intelligence members. The third dimension is team-orientation, which consists of five factors and should be highest in instructors or coaches. The fourth dimension is change-orientation, which consists of five factors and should be highest in soldiers. The fifth dimension is people-orientation, which consists of eight factors and should be highest in media members. The sixth dimension is masculinity, which consists of seven factors and should be highest in hard workers. The last dimension is stability, which consists of seven factors and should be highest in soft workers. In this paper, the details of all cultural factors are explained. Practically, collecting information about each cultural factor in the targeted person or team is essential in order to calculate the degrees of all cultural factors using the suggested equation: multiplying 'the score of factor presence' by 'the score of factor strength'. In this paper, the details of how to build each score are explained. Based on the highest degrees, which identify the prominent cultural dimension, placing the tested individual or team in the supposedly right position at the right time provides a chance to use minimal effort to align everyone with the organization’s objectives. In other words, making everyone self-motivated by placing him/her at the right source of motivation is the most effective and efficient method to achieve high levels of competency, commitment, and sustainability. A team culture can be modified by excluding or including members with relatively high or low degrees in specific cultural factors. In conclusion, culture can be considered the software of human beings, and it is one of the major constraining factors on managerial discretion. It represents the behaviors, attitudes, and motivations of the human resources, which are vital to enhancing quality and safety, expanding market share, and defending against attacks from external environments. Thus, it is tremendously useful to apply such a comprehensive model to measure, use, and modify culture.
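The scoring rule described above (degree = presence score × strength score, summed per dimension) can be sketched as follows. The dimension names come from the abstract, but the example scores and the 1-5 scale are illustrative assumptions, not the paper's actual scoring instrument.

```python
def factor_degree(presence, strength):
    """Degree of one cultural factor = presence score x strength score."""
    return presence * strength

def dimension_degrees(observations):
    """Sum factor degrees per dimension for a person or team.

    observations maps dimension -> list of (presence, strength) pairs,
    one pair per cultural factor, each scored here on an assumed 1-5 scale.
    """
    return {dim: sum(factor_degree(p, s) for p, s in pairs)
            for dim, pairs in observations.items()}

def prominent_dimension(observations):
    """Return the dimension with the highest total degree."""
    degrees = dimension_degrees(observations)
    return max(degrees, key=degrees.get)
```

For example, a leader scored high on outcome-orientation factors would have that dimension dominate the totals, suggesting (per the model) the positions where minimal effort yields alignment.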

Keywords: culture dimensions, culture factors, culture measurement, cultural analysis, cultural modification, self-motivation, alignment to objectives, competency, sustainability

Procedia PDF Downloads 155
8601 Motivation, Legal Knowledge and Preference Investigation of Hungarian Law Students

Authors: Zsofia Patyi

Abstract:

While empirical studies under socialism in Hungary focused on the lawyer society as a whole, current research deals specifically with law students. The change of regime and the transformation of legal education have influenced the motivation, efficiency, social background, and self-concept of law students. This shift needs to be acknowledged, and the education system improved for students and together with students. A new law student society requires a different legal education system, different legal studies, or, at the minimum, a different approach to teaching law. This is to ensure the training of competitive lawyers who understand the constantly changing nature of the law and, as a result, can potentially transform or create legislation themselves. A number of developments can affect law students’ awareness of legal relations in a democratic state. In today’s Hungary, these decisive factors are, primarily, the new regulation of the financing of law students and, secondly, the new Hungarian constitution (henceforth: Alaptörvény), which has modified the basis of the Hungarian legal system. These circumstances necessitate a new, comprehensive, and empirical investigation of law students. To this end, our research team (comprising a professor, a Ph.D. student, and two law students) is conducting a new type of study in February 2017. The first stage of the research project uses the desktop method to explore the research antecedents. Afterward, a structured questionnaire draft will be designed and sent to the Head of the Department of Sociology and the Associate Professor of the Department of Constitutional Law at the University of Szeged to have the draft checked and amended. Next, an open workshop for students and teachers will be organized with the aim of discussing the draft and creating the final questionnaire. The research team will then contact each Hungarian university with a Faculty of Law to reach all 1st- and 4th-year law students. 1st-year students have not yet studied the Alaptörvény, while 4th-year students have. All students will be asked to fill in the questionnaire (in February). Results are expected at the end of February. In March, the research team will report the results and present the conclusions. In addition, the results will be compared to previous research. The outcome will help us answer the following research question: How should legal studies and legal education in Hungary be reformed in accordance with law students and the future lawyer society? The aim of the research is to (1) help create a new student- and career-centered teaching method for legal studies, (2) offer a new perspective on legal education, and (3) create a helpful and useful de lege ferenda proposal for the attorney general as regards legal education as part of higher education.

Keywords: change, constitution, investigation, law students, lawyer society, legal education, legal studies, motivation, reform

Procedia PDF Downloads 252
8600 The Determination of the Phosphorous Solubility in the Iron by the Function of the Other Components

Authors: Andras Dezső, Peter Baumli, George Kaptay

Abstract:

Phosphorus is an important component in steels because it changes the mechanical properties and may modify the structure. Phosphorus can form the Fe3P compound, which segregates at the ferrite grain boundaries at the nano- or microscale. This intermetallic compound degrades the mechanical properties; for example, it causes blue brittleness, i.e., the embrittlement produced by the segregated particles at 200-300°C. This work describes the phosphide solubility as affected by other alloying components. We performed calculations for the effects of Ni, Mo, Cu, S, V, C, Si, Mn, and Cr using the Thermo-Calc software, and we approximate the effects with fitted functions. The binary Fe-P system has a solubility line with a defining equation: ln w0 = -3.439 - 1903/T, where w0 is the maximum soluble concentration of phosphorus in weight percent, and T is the temperature in Kelvin. The equation shows that P becomes more soluble as the temperature increases. Nickel, molybdenum, vanadium, silicon, manganese, and chromium affect the maximum soluble concentration; these functions depend on the concentrations of the added elements, which are low in the steels considered. Copper, sulphur, and carbon have no effect on the phosphorus solubility. We predict that in all cases the maximum soluble concentration increases as the temperature rises. Between 473 K and 673 K, the phase diagrams of these systems contain mostly two- or three-phase eutectoid regions and single-phase ferritic intervals. In the eutectoid areas, the ferrite, the iron phosphide, and the metal(III) phosphide are in equilibrium. With this modelling, we predicted which elements help to avoid phosphide segregation. These data are important when designing or selecting steels in which phosphide segregation limits the possibilities.
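The solubility line can be evaluated numerically. This sketch assumes the coefficients read as ln w0 = -3.439 - 1903/T, interpreting the abstract's European decimal notation (comma as decimal point, dot as thousands separator); the reading should be checked against the original fit before reuse.

```python
import math

def max_soluble_p(temperature_k):
    """Maximum soluble phosphorus concentration w0 (wt%) in the binary
    Fe-P system, from the assumed solubility-line fit
    ln w0 = -3.439 - 1903/T (T in Kelvin)."""
    return math.exp(-3.439 - 1903.0 / temperature_k)

# Solubility increases with temperature, as the abstract states.
w_cold = max_soluble_p(500.0)
w_hot = max_soluble_p(1000.0)
```

The negative 1/T term makes the exponent grow toward -3.439 as T rises, so w0 increases monotonically with temperature, consistent with the text.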

Keywords: phosphorous, steel, segregation, thermo-calc software

Procedia PDF Downloads 612
8599 Impact of Fermentation Time and Microbial Source on Physicochemical Properties, Total Phenols and Antioxidant Activity of Finger Millet Malt Beverage

Authors: Henry O. Udeh, Kwaku G. Duodu, Afam I. O. Jideani

Abstract:

Finger millet (FM) [Eleusine coracana] is considered a potential ‘‘super grain’’ by the United States National Academies, as one of the most nutritious of all the major cereals. The regular consumption of FM-based diets has been associated with a reduced risk of diabetes, cataract, and gastrointestinal tract disorders. Hypoglycaemic, hypocholesterolaemic, anticataractogenic, and other health-improvement properties have been reported. This study examined the effect of fermentation time and microbial source on the physicochemical properties, phenolic compounds, and antioxidant activity of two FM malt flours. Sorghum was used as an external reference. The grains were malted, mashed, and fermented using the grain microflora and Lactobacillus fermentum. The phenolic compounds of the resulting beverage were identified and quantified using an ultra-performance liquid chromatography (UPLC) and mass spectrometer (MS) system. A fermentation-time-dependent decrease in the pH and viscosities of the beverages, with a corresponding increase in sugar content, was noted. The phenolic compounds found in the FM beverages were protocatechuic acid, catechin, and epicatechin. A decrease in the total phenolics of the beverages was observed with increased fermentation time. The beverages exhibited 2,2-diphenyl-1-picrylhydrazyl and 2,2′-azinobis-3-ethylbenzthiazoline-6-sulfonic acid radical scavenging action and iron-reducing activities, which were significantly (p < 0.05) reduced at 96 h of fermentation for both microbial sources. The 24 h fermented beverages retained a higher amount of total phenolics and had higher antioxidant activity compared to other fermentation periods. The study demonstrates that FM could be utilised as a functional grain in the production of a non-alcoholic beverage with important phenolic compounds for health promotion and wellness.

Keywords: antioxidant activity, eleusine coracana, fermentation, phenolic compounds

Procedia PDF Downloads 94
8598 [Keynote] Implementation of Quality Control Procedures in Radiotherapy CT Simulator

Authors: B. Petrović, L. Rutonjski, M. Baucal, M. Teodorović, O. Čudić, B. Basarić

Abstract:

Purpose/Objective: Radiotherapy treatment planning requires the use of a CT simulator in order to acquire CT images. The overall performance of the CT simulator determines the quality of the radiotherapy treatment plan and, in the end, the outcome of treatment for every single patient. Therefore, international recommendations strongly advise setting up quality control procedures for every machine involved in the radiotherapy treatment planning process, including the CT scanner/simulator. The overall process requires a number of tests, which are used on a daily, weekly, monthly, or yearly basis, depending on the feature tested. Materials/Methods: Two phantoms were used: a dedicated phantom, CIRS 062QA, and a QA phantom supplied with the CT simulator. The examined CT simulator was a Siemens Somatom Definition AS Open, dedicated to radiation therapy treatment planning. The CT simulator has built-in software which enables fast and simple evaluation of CT QA parameters using the phantom provided with the CT simulator. On the other hand, the recommendations contain additional tests, which were done with the CIRS phantom. Also, legislation on ionizing radiation protection requires CT testing at defined intervals. Taking into account the requirements of the law, the built-in tests of the CT simulator, and international recommendations, the institutional QC programme for the CT simulator was defined and implemented. Results: The CT simulator parameters evaluated in the study were the following: CT number accuracy, field uniformity, the complete CT to ED conversion curve, spatial and contrast resolution, image noise, slice thickness, and patient table stability. The following limits were established and implemented: CT number accuracy within ±5 HU of the value at commissioning; field uniformity within ±10 HU in selected ROIs; the complete CT to ED curve for each tube voltage must comply with the curve obtained at commissioning, with deviations of not more than 5%. Spatial and contrast resolution tests must comply with the results obtained at commissioning; otherwise, the machine requires service. The result of the image noise test must fall within 20% of the baseline value. Slice thickness must meet manufacturer specifications, and patient table stability with longitudinal transfer of the loaded table must not show more than 2 mm of vertical deviation. Conclusion: The implemented QA tests gave an overall basic understanding of the CT simulator's functionality and its clinical effectiveness in radiation treatment planning. The legal requirement for the clinic is to set up its own QA programme with minimum testing, but it remains the user's decision whether additional testing, as recommended by international organizations, will be implemented, so as to improve the overall quality of the radiation treatment planning procedure, since the quality of the CT images used for radiation treatment planning influences the delineation of the tumor, the calculation accuracy of the treatment planning system, and finally the delivery of radiation treatment to the patient.
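The numeric tolerances listed in the abstract can be collected into a small pass/fail routine. The limits (±5 HU CT number, ±10 HU uniformity, noise within 20% of baseline, table deviation under 2 mm) come from the text, while the function name and the measurement dictionary structure are illustrative assumptions.

```python
def check_ct_qa(measured, baseline):
    """Evaluate daily CT simulator QA values against the stated tolerances:
    CT number +/-5 HU of commissioning, uniformity +/-10 HU,
    noise within 20% of baseline, table deviation under 2 mm."""
    results = {
        "ct_number": abs(measured["ct_number_hu"] - baseline["ct_number_hu"]) <= 5.0,
        "uniformity": abs(measured["uniformity_hu"]) <= 10.0,
        "noise": abs(measured["noise"] - baseline["noise"]) <= 0.2 * baseline["noise"],
        "table_shift": measured["table_shift_mm"] <= 2.0,
    }
    results["pass"] = all(results.values())
    return results
```

A failing key points at the parameter that needs service or recalibration before the simulator is used for planning scans.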

Keywords: CT simulator, radiotherapy, quality control, QA programme

Procedia PDF Downloads 515
8597 Planning for Location and Distribution of Regional Facilities Using Central Place Theory and Location-Allocation Model

Authors: Danjuma Bawa

Abstract:

This paper aimed at exploring the capabilities of the Location-Allocation model in complementing existing physical planning models in the location and distribution of facilities for regional consumption. The paper is designed to provide a blueprint for the Nigerian government and donor agencies, especially the Fertilizer Distribution Initiative (FDI) of the federal government, for the revitalization of terrorism-ravaged regions. Theoretical underpinnings of central place theory related to spatial distribution, interrelationships, and threshold prerequisites were reviewed. The study showcased how the Location-Allocation Model (L-AM), alongside Central Place Theory (CPT), was applied in a Geographic Information System (GIS) environment to map and analyze the spatial distribution of settlements, exploit their physical and economic interrelationships, and explore their hierarchical and opportunistic influences. The study was purely spatial qualitative research that largely used secondary data such as the spatial location and distribution of settlements, settlement population figures, the network of roads linking them, and other landform features. These were sourced from government ministries and open source consortia. GIS was used as a tool for processing and analyzing these spatial features within the framework of CPT and L-AM to produce a comprehensive spatial digital plan for the equitable and judicious location and distribution of fertilizer depots in the study area in an optimal way. A population threshold was used as the yardstick for selecting suitable settlements that could serve as service centers for the surrounding hinterlands; this was accomplished using the query syntax in ArcMap™. The ArcGIS™ Network Analyst was used to conduct the location-allocation analysis, apportioning groups of settlements around such service centers within a given threshold distance. Most of the techniques and models used by utility planners have been centered on straight-line (Euclidean) distances to settlements. Such models neglect impedance cutoffs and the routing capabilities of networks. CPT and L-AM take into consideration both the influential characteristics of settlements and their routing connectivity. The study was undertaken in two terrorism-ravaged Local Government Areas of Adamawa State. Four (4) existing depots in the study area were identified, and 20 more depots in 20 villages were proposed using suitability analysis. Out of the 300 settlements mapped in the study area, about 280 were optimally grouped and allocated to the selected service centers within a 2 km impedance cutoff. This study complements the efforts of the federal government of Nigeria by providing a blueprint for ensuring the proper distribution of these public goods, in the spirit of bringing succor to the terrorism-ravaged populace. At the same time, it will help boost agricultural activities, thereby reducing food shortages and raising per capita income, as espoused by the government.
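The allocation step, assigning each settlement to its nearest service center by network distance (not straight-line distance) within an impedance cutoff, can be sketched with a small Dijkstra search over a road graph. The graph, edge lengths, and settlement names below are invented for illustration; only the 2 km cutoff mirrors the study.

```python
import heapq

def shortest_dists(graph, source):
    """Dijkstra: network distance (km) from source to every reachable node.
    graph maps node -> list of (neighbor, edge_length_km) pairs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue
        for nbr, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                heapq.heappush(heap, (nd, nbr))
    return dist

def allocate(graph, settlements, centers, cutoff_km=2.0):
    """Assign each settlement to its nearest center along the road network,
    dropping settlements beyond the impedance cutoff."""
    center_dists = {c: shortest_dists(graph, c) for c in centers}
    allocation = {}
    for s in settlements:
        best = min(centers, key=lambda c: center_dists[c].get(s, float("inf")))
        if center_dists[best].get(s, float("inf")) <= cutoff_km:
            allocation[s] = best
    return allocation
```

Settlements left unassigned (like the roughly 20 of 300 in the study) are exactly those whose network distance to every candidate depot exceeds the cutoff.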

Keywords: central place theory, GIS, location-allocation, network analysis, urban and regional planning, welfare economics

Procedia PDF Downloads 131
8596 The Combined Effect of Different Levels of Fe(III) in Diet and Cr(III) Supplementation on the Ca Status in Wistar

Authors: Staniek Halina

Abstract:

An inappropriate trace element supply, such as of iron(III) and chromium(III), may be a risk factor for many metabolic disorders (e.g., anemia and diabetes) and may also cause toxic effects. However, little is known about their mutual interactions and their impact on these disturbances. The effects of Cr(III) supplementation with a deficit or excess supply of Fe(III) under in vivo conditions are not yet known. The objective of the study was to investigate the combined effect of different Fe(III) levels in the diet and simultaneous Cr(III) supplementation on the Ca distribution in the organs of healthy rats. The assessment was based on a two-factor (3x3) experiment carried out on 54 female Wistar rats (Rattus norvegicus). The animals were randomly divided into 9 groups and, for 6 weeks, were fed semi-purified AIN-93 diets with three different Fe(III) levels as factor A [control (C) 45 mg/kg (100% of the Recommended Daily Allowance for rodents), deficient (D) 5 mg/kg (10% RDA), and oversupply (H) 180 mg/kg (400% RDA)]. The second factor (B) was simultaneous dietary supplementation with Cr(III) at doses of 1, 50, and 500 mg/kg of the diet. Iron(III) citrate was the source of Fe(III). The complex of Cr(III) with propionic acid, also called Cr3 or chromium(III) propionate (CrProp), was used as the source of Cr(III) in the diet. The Ca content of the analysed samples (liver, kidneys, spleen, heart, and femur) was determined by the Atomic Absorption Spectrometry (AAS) method. It was found that different dietary Fe(III) supply, as well as Cr(III) supplementation, independently and in combination influenced Ca metabolism in healthy rats. Regardless of the Cr(III) supplementation, the oversupply of Fe(III) (180 mg/kg) decreased the Ca content in the liver and kidneys, while it increased the Ca saturation of bone tissue. High Cr(III) doses lowered the hepatic Ca content. Moreover, Cr(III) tended to decrease the Ca content in the kidneys and heart, but this effect was not statistically significant. A combined effect of the experimental factors on the Ca content in the liver and the femur was observed. With an increase in the Fe(III) content of the diet, the Ca level in the liver decreased and bone saturation increased, and additional Cr(III) supplementation intensified those effects. The study proved that the different Fe(III) content of the diet, independently and in combination with Cr(III) supplementation, affected the Ca distribution in the organisms of healthy rats.

Keywords: calcium, chromium(III), iron(III), rats, supplementation

Procedia PDF Downloads 181
8595 Application of the Best Technique for Estimating the Rest-Activity Rhythm Period in Shift Workers

Authors: Rakesh Kumar Soni

Abstract:

Under free-living conditions, human biological clocks show a periodicity of about 24 hours for numerous physiological, behavioral, and biochemical variables. However, this observed period is not the endogenous period; rather, it merely exhibits synchronization with the solar clock. It is, therefore, most important to investigate the characteristics of the human circadian clock, especially in shift workers, who routinely confront conflicting social clocks. The aim of the present study was to investigate the rest-activity rhythm and to identify the best technique for computing its period in subjects randomly selected from different groups of shift workers. The rest-activity rhythm was studied in forty-eight shift workers from three different organizations, namely Newspaper Printing Press (NPP), Chhattisgarh State Electricity Board (CSEB), and Raipur Alloys (RA). Shift workers of NPP (N = 20) were working on a permanent night shift schedule (NS; 20:00-04:00). However, in CSEB (N = 14) and RA (N = 14), shift workers were working in a 3-shift system comprising rotations from night (NS; 22:00-06:00) to afternoon (AS; 14:00-22:00) and to morning shift (MS; 06:00-14:00). Each subject wore an Actiwatch (AW64, Mini Mitter Co. Inc., USA) for 7 and/or 21 consecutive days, after furnishing informed consent. A one-minute epoch length was chosen for the collection of wrist activity data. The period was determined using Actiware sleep software (Periodogram), the Lomb-Scargle Periodogram (LSP), and spectral analysis software (Spectre). Other statistical techniques, such as ANOVA and Duncan’s multiple-range test, were also used wherever required. A statistically significant circadian rhythm in rest-activity, gauged by cosinor analysis, was documented in all shift workers, irrespective of shift schedule. Results indicate that the efficiency of a technique in determining the period (τ) depended upon the clipping limits of the τs. The Spectre technique appears to be the more reliable.
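The period-estimation step described above can be sketched with a simple spectral approach. The following is an illustrative example only (not the authors' Actiware, LSP, or Spectre code): it estimates the dominant rest-activity period from synthetic 1-minute-epoch actigraphy data using a plain FFT periodogram; the signal parameters are assumptions chosen for demonstration.

```python
import numpy as np

def dominant_period_hours(activity, epoch_minutes=1):
    """Return the period (in hours) of the strongest spectral peak."""
    x = activity - activity.mean()                           # remove DC component
    power = np.abs(np.fft.rfft(x)) ** 2                      # one-sided periodogram
    freqs = np.fft.rfftfreq(len(x), d=epoch_minutes / 60.0)  # cycles per hour
    peak = np.argmax(power[1:]) + 1                          # skip the zero-frequency bin
    return 1.0 / freqs[peak]

# 7 days of 1-minute epochs with a 24 h activity rhythm plus noise
rng = np.random.default_rng(0)
t = np.arange(7 * 24 * 60)                                   # time in minutes
signal = 50 + 40 * np.sin(2 * np.pi * t / (24 * 60)) + rng.normal(0, 5, t.size)

print(round(dominant_period_hours(signal), 1))               # ~24.0 hours
```

With 7 full days of data, the 24-hour component falls exactly on a frequency bin, so the estimate is sharp; the Lomb-Scargle periodogram used in the study additionally handles unevenly sampled or gapped recordings.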

Keywords: biological clock, rest activity rhythm, spectre, periodogram

Procedia PDF Downloads 148
8594 Energy Performance Gaps in Residences: An Analysis of the Variables That Cause Energy Gaps and Their Impact

Authors: Amrutha Kishor

Abstract:

Today, with rising global warming and the depletion of resources, every industry is moving toward sustainability and energy efficiency. As part of this movement, architects are now expected to produce energy predictions for their designs. In many cases, however, these predictions do not reflect the actual energy consumption of newly built buildings in operation. These discrepancies are described as ‘Energy Performance Gaps’. This study aims to determine the underlying reasons for these gaps. Seven houses designed by Allan Joyce Architects, UK, from 1998 to 2019 were considered for this study. Data from the residents’ energy bills were cross-referenced with the predictions made with the software SefairaPro and with energy reports. Results indicated that the predictions did not match the actual energy usage. An account of how energy was used in these seven houses was compiled by means of personal interviews. The main factors considered in the study were occupancy patterns, heating systems and their usage, lighting profile and usage, and appliance profile and usage. The study found that the main reason for the energy gaps was the discrepancy between the predicted and actual patterns of occupant energy consumption. This study is particularly useful for energy-conscious architectural firms seeking to fine-tune their approach to designing houses and analysing their energy performance. As the findings reveal that energy usage in homes varies with how residents use the space, the study helps deduce the most efficient technological combinations. This information can be used to set guidelines for future policies and regulations related to energy consumption in homes. It can also be used by developers of simulation software to understand how architects use their product and to drive improvements in future versions.
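The cross-referencing of predicted against metered energy use can be expressed as a simple relative gap. The sketch below is purely illustrative; the house names and kWh figures are hypothetical and do not come from the study's data.

```python
# Energy performance gap: relative difference between predicted and
# metered annual energy use. Positive values mean the prediction
# underestimated actual consumption.

def performance_gap(predicted_kwh, actual_kwh):
    """Gap as a fraction of the prediction."""
    return (actual_kwh - predicted_kwh) / predicted_kwh

# (predicted, metered) kWh/year -- assumed values for illustration
houses = {
    "House A": (9500, 12350),
    "House B": (11000, 10450),
}

for name, (pred, actual) in houses.items():
    gap = performance_gap(pred, actual)
    print(f"{name}: {gap:+.0%}")   # House A: +30%, House B: -5%
```

A per-house gap computed this way makes it straightforward to correlate the size of the gap with the interview findings on occupancy and usage patterns.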

Keywords: architectural simulation, energy efficient design, energy performance gaps, environmental design

Procedia PDF Downloads 103
8593 Life Cycle Assessment of Residential Buildings: A Case Study in Canada

Authors: Venkatesh Kumar, Kasun Hewage, Rehan Sadiq

Abstract:

Residential buildings consume significant amounts of energy and produce a large amount of emissions and waste. However, there is substantial potential for energy savings in this sector, which needs to be evaluated over the life cycle of residential buildings. Life Cycle Assessment (LCA) methodology has been employed to study the primary energy use and associated environmental impacts of the different phases (i.e., product, construction, use, end of life, and beyond building life) of residential buildings. Four alternatives of residential buildings in Vancouver (BC, Canada) with a 50-year lifespan have been evaluated: High Rise Apartment (HRA), Low Rise Apartment (LRA), Single-family Attached House (SAH), and Single-family Detached House (SDH). The life-cycle performance of the buildings is evaluated for embodied energy, embodied environmental impacts, operational energy, operational environmental impacts, total life-cycle energy, and total life-cycle environmental impacts. Estimation of operational energy and LCA are performed using DesignBuilder software and the Athena Impact Estimator software, respectively. The study results revealed that, over the life span of the buildings, energy use and environmental impacts follow the same pattern. LRA was found to be the best alternative in terms of embodied energy use and embodied environmental impacts, while HRA showed the best life-cycle performance in terms of minimum energy use and environmental impacts. A sensitivity analysis has also been carried out to study the influence of building service lifespans of 50, 75, and 100 years on the relative significance of embodied energy in the total life-cycle energy. The life-cycle energy requirement of SDH was found to be a significant component among the four types of residential buildings. The overall results disclose that the operational phase of these buildings accounts for 90% of the total life-cycle energy, which far outweighs the minor differences in embodied effects between the buildings.
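The lifespan sensitivity analysis reduces to a simple accounting identity: total life-cycle energy is embodied energy plus annual operational energy scaled by the service lifespan. The sketch below uses assumed figures (not the study's results) to show how the embodied share shrinks as the lifespan extends from 50 to 100 years.

```python
# Total life-cycle energy = embodied + annual operational * lifespan.
# The embodied and operational values below are illustrative assumptions.

def life_cycle_energy(embodied_gj, annual_operational_gj, lifespan_years):
    """Total primary energy over the building's service life, in GJ."""
    return embodied_gj + annual_operational_gj * lifespan_years

embodied = 2500.0    # GJ, assumed embodied energy
annual_op = 450.0    # GJ/year, assumed operational energy

for years in (50, 75, 100):
    total = life_cycle_energy(embodied, annual_op, years)
    share = embodied / total
    print(f"{years} yr: total {total:.0f} GJ, embodied share {share:.1%}")
```

With these assumed numbers, the embodied share at 50 years is 10% (operational 90%, consistent with the abstract's finding), and it falls further as the assumed lifespan grows, which is why longer service lives make operational efficiency dominate the comparison between building types.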

Keywords: building simulation, environmental impacts, life cycle assessment, life cycle energy analysis, residential buildings

Procedia PDF Downloads 452