Search results for: virtual processing unit
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 6761

551 Comparison of Monte Carlo Simulations and Experimental Results for the Measurement of Complex DNA Damage Induced by Ionizing Radiations of Different Quality

Authors: Ifigeneia V. Mavragani, Zacharenia Nikitaki, George Kalantzis, George Iliakis, Alexandros G. Georgakilas

Abstract:

Complex DNA damage consisting of a combination of DNA lesions, such as Double Strand Breaks (DSBs) and non-DSB base lesions occurring in a small volume, is considered one of the most important biological endpoints regarding ionizing radiation (IR) exposure. Strong theoretical (Monte Carlo simulations) and experimental evidence suggests an increase in the complexity of DNA damage, and therefore in repair resistance, with increasing linear energy transfer (LET). Experimental detection of complex (clustered) DNA damage is often associated with technical deficiencies limiting its measurement, especially in cellular or tissue systems. Our groups have recently made significant improvements towards the identification of key parameters relating to the efficient detection of complex DSBs and non-DSBs in human cellular systems exposed to IR of varying quality (γ- and X-rays 0.3-1 keV/μm, α-particles 116 keV/μm and 36Ar ions 270 keV/μm). The induction and processing of DSB and non-DSB oxidative clusters were measured using adaptations of immunofluorescence (γH2AX or 53BP1 foci staining as DSB probes, and the human repair enzymes OGG1 or APE1 as probes for oxidized purines and abasic sites, respectively). In the current study, Relative Biological Effectiveness (RBE) values for DSB and non-DSB induction have been measured in different human normal (FEP18-11-T1) and cancerous cell lines (MCF7, HepG2, A549, MO59K/J). The experimental results are compared to simulation data obtained using a validated microdosimetric fast Monte Carlo DNA Damage Simulation code (MCDS). Moreover, this simulation approach is implemented in two realistic clinical cases, i.e. prostate cancer treatment using X-rays generated by a linear accelerator and a pediatric osteosarcoma case using a 200.6 MeV proton pencil beam. RBE values for complex DNA damage induction are calculated for the tumor areas. These results reveal a disparity between theory and experiment and underline the necessity of implementing highly precise and more efficient experimental and simulation approaches.
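For readers unfamiliar with how RBE values for damage induction such as those reported above are obtained, the sketch below illustrates one common definition: the damage yield per unit dose for the test radiation divided by that of a reference radiation (e.g., γ-rays). The helper function and the yield values are purely hypothetical and only illustrate the arithmetic; they are not taken from the MCDS code or the authors' measurements.

```python
# Hypothetical illustration of an RBE calculation for damage induction.
# Yields are damage sites per Gy (the values below are invented for the example).

def rbe_for_induction(yield_test: float, yield_reference: float) -> float:
    """RBE for damage induction: yield of the test radiation divided by
    the yield of the reference radiation at the same absorbed dose."""
    return yield_test / yield_reference

yield_gamma = 8.0    # reference low-LET radiation, e.g. gamma-rays (hypothetical)
yield_alpha = 22.0   # high-LET test radiation, e.g. alpha particles (hypothetical)

print(f"RBE (complex damage induction) = {rbe_for_induction(yield_alpha, yield_gamma):.2f}")
```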

Keywords: complex DNA damage, DNA damage simulation, protons, radiotherapy

Procedia PDF Downloads 312
550 Influence of Settlements and Human Activities on Beetle Diversity and Assemblage Structure at Small Islands of the Kepulauan Seribu Marine National Park and Nearby Java

Authors: Shinta Holdsworth, Jan Axmacher, Darren J. Mann

Abstract:

Beetles represent the most diverse insect taxon, and they contribute significantly to a wide range of vital ecological functions. Examples include decomposition by bark beetles, nitrogen recycling and dung processing by dung beetles, or pest control by predatory ground beetles. Nonetheless, research into the distribution patterns, species richness and functional diversity of beetles, particularly from tropical regions, remains extremely limited. In our research, we aim to investigate the distribution and diversity patterns of beetles and the roles they play in small tropical island ecosystems in the Kepulauan Seribu Marine National Park and on Java. Our research furthermore provides insights into the effects anthropogenic activities have on the assemblage composition and diversity of beetles on the small islands. We recorded a substantial number of highly abundant small island species, including a substantial number of unique small island species across the study area, highlighting these islands’ potential importance for the regional conservation of genetic resources. The highly varied patterns observed in relation to the use of different trapping types - pitfall traps and flight interception traps (FITs) - underscore the need for complementary trapping strategies that combine multiple methods for beetle community surveys on tropical islands. The significant impacts human activities have on the small island beetle faunas were also highlighted in our research. The larger number of island beetle species encountered in settlement than in forest areas shows a clear positive link between anthropogenic activities and overall beetle species richness. However, undisturbed forests harboured a high number of unique species, also in comparison to disturbed forests. Finally, our study suggests that, with regard to different feeding guilds, the diversity of herbivorous beetles on islands is strongly affected by the different levels of forest cover encountered.

Keywords: beetle diversity, forest disturbance, island biogeography, island settlement

Procedia PDF Downloads 213
549 Solar and Galactic Cosmic Ray Impacts on Ambient Dose Equivalent Considering a Flight Path Statistic Representative to World-Traffic

Authors: G. Hubert, S. Aubry

Abstract:

The Earth is constantly bombarded by cosmic rays that can be of either galactic or solar origin. Thus, humans are exposed to high levels of galactic radiation at aircraft altitudes. The typical total ambient dose equivalent for a transatlantic flight is about 50 μSv during quiet solar activity. On the contrary, estimations differ by one order of magnitude for the contribution induced by certain solar particle events. Indeed, during a Ground Level Enhancement (GLE) event, the Sun can emit particles of sufficient energy and intensity to raise radiation levels on Earth's surface. Analyses of the characteristics of GLEs occurring since 1942 showed that for the worst of them, the dose level is of the order of 1 mSv and more. The largest of these events was observed in February 1956, for which the ambient dose equivalent rate is of the order of 10 mSv/hr. The extra dose at aircraft altitudes for a flight during this event might have been about 20 mSv, i.e. comparable with the annual limit for aircrew. The most recent GLE occurred in September 2017, resulting from an X-class solar flare, and was measured on the surface of both the Earth and Mars using the Radiation Assessment Detector on the Mars Science Laboratory's Curiosity rover. Recently, Hubert et al. proposed a GLE model included in a particle transport platform (named ATMORAD) describing the extensive air shower characteristics and allowing the ambient dose equivalent to be assessed. In this approach, the GCR is based on the force-field approximation model. The physical description of the Solar Cosmic Ray (i.e. SCR) considers the primary differential rigidity spectrum and the distribution of primary particles at the top of the atmosphere. ATMORAD allows the spectral fluence rates of secondary particles induced by extensive showers to be determined, considering altitudes ranging from ground level to 45 km. The ambient dose equivalent can then be determined using fluence-to-ambient dose equivalent conversion coefficients. The objective of this paper is to analyze the GCR and SCR impacts on the ambient dose equivalent considering a statistically large number of world flight paths. Flight trajectories are based on the Eurocontrol Demand Data Repository (DDR) and consider realistic flight plans with and without regulations, or updated with radar data from the CFMU (Central Flow Management Unit). The final paper will present exhaustive analyses of solar impacts on the ambient dose equivalent level and will propose detailed analyses considering route and airplane characteristics (departure, arrival, continent, airplane type, etc.) and the phasing of the solar event. Preliminary results show an important impact of the flight path, particularly the latitude, which drives the cutoff rigidity variations. Moreover, dose values vary drastically during GLE events, on the one hand with the route path (latitude, longitude, altitude), and on the other hand with the phasing of the solar event. Considering the GLE that occurred on 23 February 1956, the average ambient dose equivalent evaluated for a Paris - New York flight is around 1.6 mSv, which is consistent with previous works. This point highlights the importance of monitoring these solar events and of developing semi-empirical and particle transport methods to obtain a reliable calculation of dose levels.
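As a rough, hypothetical sketch of the last step described above (folding secondary-particle spectral fluence rates with fluence-to-ambient-dose-equivalent conversion coefficients), the snippet below integrates a binned fluence-rate spectrum against conversion coefficients. All numerical values and bin choices are invented for illustration; they do not come from ATMORAD or the DDR data.

```python
import numpy as np

# Hypothetical binned spectrum: fluence rate (cm^-2 s^-1) per energy bin and
# fluence-to-ambient-dose-equivalent conversion coefficients h*(10) (pSv cm^2).
fluence_rate = np.array([0.12, 0.45, 0.30, 0.08])        # invented values
h_coefficients = np.array([10.0, 150.0, 400.0, 520.0])   # invented values

# Ambient dose equivalent rate: sum over bins of fluence rate x conversion coefficient.
dose_rate_pSv_per_s = np.sum(fluence_rate * h_coefficients)
dose_rate_uSv_per_h = dose_rate_pSv_per_s * 3600 / 1e6

print(f"H*(10) rate ~ {dose_rate_uSv_per_h:.2f} uSv/h (illustrative only)")
```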

Keywords: cosmic ray, human dose, solar flare, aviation

Procedia PDF Downloads 203
548 An Analysis System for Integrating High-Throughput Transcript Abundance Data with Metabolic Pathways in Green Algae

Authors: Han-Qin Zheng, Yi-Fan Chiang-Hsieh, Chia-Hung Chien, Wen-Chi Chang

Abstract:

As the most important non-vascular plants, algae have many research applications, owing to their high species diversity and their uses as biofuel sources, adsorbents of heavy metals and, following processing, health supplements. With the increasing availability of next-generation sequencing (NGS) data for algae genomes and transcriptomes, an integrated resource for retrieving gene expression data and metabolic pathways is essential for functional analysis and systems biology in algae. However, gene expression profiles and biological pathways are displayed separately in current resources, making it impossible to search current databases directly to identify the cellular response mechanisms. Therefore, this work develops a novel AlgaePath database to retrieve gene expression profiles efficiently under various conditions in numerous metabolic pathways. AlgaePath, a web-based database, integrates gene information, biological pathways, and next-generation sequencing (NGS) datasets in Chlamydomonas reinhardtii and Neodesmus sp. UTEX 2219-4. Users can identify gene expression profiles and pathway information by using five query pages (i.e. Gene Search, Pathway Search, Differentially Expressed Genes (DEGs) Search, Gene Group Analysis, and Co-Expression Analysis). The gene expression data of 45 and 4 samples can be obtained directly on pathway maps in C. reinhardtii and Neodesmus sp. UTEX 2219-4, respectively. Genes that are differentially expressed between two conditions can be identified in the Folds Search. Furthermore, the Gene Group Analysis of AlgaePath includes pathway enrichment analysis and can easily compare the gene expression profiles of functionally related genes in a map. Finally, Co-Expression Analysis provides co-expressed transcripts of a target gene. The analysis results provide a valuable reference for designing further experiments and elucidating critical mechanisms from high-throughput data. More than an effective interface to clarify the transcript response mechanisms in different metabolic pathways under various conditions, AlgaePath is also a data mining system to identify critical mechanisms based on high-throughput sequencing.
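Co-expression analyses of the kind offered by the query page described above are typically based on a correlation measure between transcript abundance profiles across samples. The following sketch is not taken from AlgaePath itself; it only shows how co-expressed transcripts of a target gene could be ranked with a Pearson correlation over an expression matrix, with made-up gene names and values.

```python
import numpy as np

# Hypothetical expression matrix: rows = genes, columns = samples (e.g. RPKM values).
genes = ["geneA", "geneB", "geneC", "geneD"]
expr = np.array([
    [1.0, 2.1, 3.9, 8.2],
    [0.9, 2.0, 4.2, 7.8],
    [5.0, 4.8, 5.1, 5.2],
    [9.0, 3.1, 1.0, 0.5],
])

def coexpressed_with(target: str, top_n: int = 3):
    """Rank the other genes by Pearson correlation with the target gene's profile."""
    t = expr[genes.index(target)]
    scores = [(g, np.corrcoef(t, row)[0, 1]) for g, row in zip(genes, expr) if g != target]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_n]

print(coexpressed_with("geneA"))   # geneB correlates strongly, geneD anti-correlates
```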

Keywords: next-generation sequencing (NGS), algae, transcriptome, metabolic pathway, co-expression

Procedia PDF Downloads 400
547 Application Research of Stilbene Crystal for the Measurement of Accelerator Neutron Sources

Authors: Zhao Kuo, Chen Liang, Zhang Zhongbing, Ruan Jinlu, He Shiyi, Xu Mengxuan

Abstract:

Stilbene, C₁₄H₁₂, is well known as one of the most useful organic scintillators for the pulse shape discrimination (PSD) technique because of its good scintillation properties. An on-line acquisition system, built with several CAMAC-standard plug-ins, NIM plug-ins and a neutron/γ discriminating plug-in named 2160A, and an off-line acquisition system, based on a digital oscilloscope with a high sampling rate, were developed; both used stilbene crystals coupled to photomultiplier tube (PMT) detectors for accelerator neutron source measurements carried out at the China Institute of Atomic Energy. Pulse amplitude spectra and charge amplitude spectra were recorded in real time after good neutron/γ discrimination, with best PSD figure-of-merits (FoMs) of 1.756 for the D-D accelerator neutron source and 1.393 for the D-T accelerator neutron source. The probability of neutron events among total events was 80% and the neutron detection efficiency was 5.21% for the D-D accelerator neutron source; the corresponding values were 50% and 1.44% for the D-T accelerator neutron source after subtracting the scattering background observed by the on-line acquisition system. Pulse waveform signals were acquired randomly by the off-line acquisition system while the on-line acquisition system was working. The PSD FoMs obtained by the off-line acquisition system were 2.158 for the D-D accelerator neutron source and 1.802 for the D-T accelerator neutron source after off-line processing of the digitized waveforms with the charge integration method, using just 1000 pulses. In addition, the probabilities of neutron events among total events obtained by the off-line acquisition system matched very well with those of the on-line acquisition system. The pulse information recorded by the off-line acquisition system could be used repeatedly to adjust the parameters or methods of the PSD analysis and to obtain neutron charge amplitude spectra or pulse amplitude spectra after digital analysis with a limited number of pulses. The off-line acquisition system showed equivalent or better measurement performance compared with the on-line system for a limited number of pulses, which indicates a feasible method based on stilbene crystal detectors for the measurement of pulsed neutron sources, such as accelerator neutron sources that emit a large number of neutrons in a short time.
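For orientation, the charge integration method mentioned above derives a PSD parameter from the ratio of the delayed (tail) charge to the total charge of each digitized pulse, and the figure of merit is defined from the separation of the neutron and γ peaks of that parameter relative to their widths. The snippet below is a generic sketch of both formulas with invented window indices and peak values; it is not the authors' acquisition code.

```python
import numpy as np

def psd_parameter(pulse: np.ndarray, tail_start: int) -> float:
    """Charge integration PSD parameter: tail charge / total charge of one pulse."""
    total = np.sum(pulse)
    tail = np.sum(pulse[tail_start:])
    return tail / total

def figure_of_merit(mu_n: float, fwhm_n: float, mu_g: float, fwhm_g: float) -> float:
    """FoM = separation of the neutron and gamma peaks / sum of their FWHMs."""
    return abs(mu_n - mu_g) / (fwhm_n + fwhm_g)

# Invented example: neutron and gamma PSD peaks at 0.30 and 0.12, both with FWHM 0.05
print(figure_of_merit(0.30, 0.05, 0.12, 0.05))   # ~1.8, the same order as the reported FoMs
```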

Keywords: stilbene crystal, accelerator neutron source, neutron / γ discrimination, figure-of-merits, CAMAC, waveform digitization

Procedia PDF Downloads 182
546 Treatment of High Concentration Cutting Fluid Wastewater by Ceramic Membrane Bioreactor

Authors: Kai-Shiang Chang, Shiao-Shing Chen, Saikat Sinha Ray, Hung-Te Hsu

Abstract:

In recent years, membrane bioreactors (MBRs) have been widely utilized as they can effectively replace the conventional activated sludge (CAS) process. The membrane bioreactor is found to be a more effective technology than either the conventional activated sludge process or advanced membrane separation techniques alone. Additionally, the MBR offers excellent control of sludge retention time (SRT) and hydraulic retention time (HRT) and is conducive to the retention of a high concentration of sludge biomass. The membrane bioreactor can effectively reduce the footprint in terms of area and omit the secondary processing procedures of the conventional activated sludge process. Among current membrane technologies, the ceramic membrane has strong resistance to acids and bases and is more suitable than a polymeric membrane for backwashing and chemical cleaning. This study is based on the treatment of cutting fluid wastewater; cutting fluids are widely used in cutting equipment, but cutting fluid wastewater is very difficult to treat. In this study, a ceramic membrane was combined with an MBR system to treat the cutting fluid wastewater. Different kinds of chemical coagulants were utilized for pretreatment in order to obtain the supernatant, and this wastewater (supernatant) was then treated by the MBR process. The ceramic membrane has three advantages: high mechanical strength, chemical resistance and reusability. During the experiment, backwashing was applied at 10-minute intervals in order to avoid fouling of the membrane. In this study, the Chemical Oxygen Demand (COD) removal efficiency during pretreatment was found to be 71-86% and the oil removal efficiency was 83-92%. This pretreatment study suggests that it is quite an effective methodology for reducing the COD and oil concentrations. Finally, in the MBR system, when the HRT was more than 7.5 hours, the COD removal efficiency was found to be 87-93% and the oil removal efficiency could reach 100%. The coagulation test series identified coagulants for the treatment of wastewater containing cutting oil with better oil and COD removal efficiency. The results also showed that the MBR system could reduce the oil content to less than 1 mg/L when the influent oil concentration was 126 mg/L. Therefore, in this paper, the performance of a membrane bioreactor utilizing a ceramic membrane has been demonstrated for the treatment of cutting fluid wastewater.
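The COD and oil removal efficiencies quoted above follow the usual definition, i.e. the fraction of the influent concentration removed. A minimal, generic calculation with invented influent and effluent values is shown below; it is only an illustration of the arithmetic, not the authors' data processing.

```python
def removal_efficiency(c_in: float, c_out: float) -> float:
    """Removal efficiency in percent from influent and effluent concentrations."""
    return (c_in - c_out) / c_in * 100.0

# Invented example close to the reported oil figures: 126 mg/L in, about 1 mg/L out
print(f"Oil removal: {removal_efficiency(126.0, 1.0):.1f} %")
```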

Keywords: membrane bioreactor, cutting fluid, oil, chemical oxygen demand

Procedia PDF Downloads 310
545 An Adaptive Conversational AI Approach for Self-Learning

Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo

Abstract:

In recent years, the focus of Natural Language Processing (NLP) development has been gradually shifting from the semantics-based approach to the deep learning one, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, due to its lack of semantic understanding, has difficulties in noticing and expressing a novel business case beyond a pre-defined scope. In order to meet the requirements of specific robotic services, the deep learning approach is very labor-intensive and time-consuming. It is very difficult to improve the capabilities of conversational AI in a short time, and it is even more difficult to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines both semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic and fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which a chat bot (conversational AI) delivers a given set of business cases, it is prompted to self-measure its performance and rethink every unknown dialog flow to improve the service by retraining with those new business cases. If the training process reaches a bottleneck and incurs difficulties, human personnel will be informed for further instructions; he or she may retrain the chat bot with newly configured programs or new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establish the necessary ontology for the new service. With the newly learned programs, it completes the understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We have developed a chat bot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture. The chat bot serves as a concierge with polite conversation for visitors. As a proof of concept, we have demonstrated the completion of 90% of reception services with limited self-learning capability.
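The cycle described above (deliver the services, self-measure, retrain on unknown dialog flows, escalate to a human when training stalls) can be summarized as a control loop. The toy sketch below is only a schematic of that loop under invented, simplified assumptions; the class, method names and coverage numbers are placeholders, not the authors' implementation.

```python
class DemoChatbot:
    """Toy stand-in for a conversational AI with self-measurement (placeholder logic)."""
    def __init__(self):
        self.coverage = 0.6                        # fraction of dialog flows it can handle
    def serve(self, cases):                        # deliver the current set of business cases
        pass
    def self_measure(self) -> float:               # here: simply report the simulated coverage
        return self.coverage
    def collect_unknown_dialog_flows(self):
        return ["unhandled flow"] if self.coverage < 1.0 else []
    def retrain(self, flows):                      # retraining raises coverage in this toy model
        self.coverage = min(1.0, self.coverage + 0.15 * len(flows))

def self_learning_cycle(bot, cases, threshold=0.9, max_cycles=5):
    """Schematic loop: serve, self-measure, retrain on unknown flows, escalate if stuck."""
    for _ in range(max_cycles):
        bot.serve(cases)
        if bot.self_measure() >= threshold:
            return bot.self_measure()
        bot.retrain(bot.collect_unknown_dialog_flows())
    print("Bottleneck reached: informing human personnel for new programs or dialog flows")
    return bot.self_measure()

print(self_learning_cycle(DemoChatbot(), ["reception", "concierge"]))
```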

Keywords: conversational AI, chatbot, dialog management, semantic analysis

Procedia PDF Downloads 130
544 Model of Community Management for Sustainable Utilization

Authors: Luedech Girdwichai, Withaya Mekhum

Abstract:

This research intended to develop a model of community management for sustainable utilization by investigating two population groups: the family heads and the community management teams. The former group consisted of family heads from 511 families in 12 areas, who completed the questionnaires, of which 479 sets were returned. The latter group consisted of the community management teams of the 12 areas, with one representative from each area giving an interview. The questionnaires for the family heads consisted of two main parts: the first covered general information, such as occupation, in the form of a checklist; the second dealt with data on self-reliant community development based on the 4P framework, i.e., People (human resource) development, Place (area) development, Product (economic and income source) development, and Plan (community plan) development, in the form of rating scales. Data from the first part were used to calculate frequencies and percentages, while those in the second part were analyzed to find the arithmetic mean and SD. Data from the second group, the community management teams, were derived from a focus group, to find factors influencing successful management, together with in-depth interviews, and were analyzed by descriptive statistics. The results showed that the 479 family heads rated the implementation of the community plan toward self-reliant community activities, based on the Sufficiency Economy Philosophy and the 4P framework, at an average of 3.28, or a moderate level. When considered in detail, the first aspect was area development, with a mean of 3.71 or a high level, followed by human resource development, with a mean of 3.44 or a moderate level, and then economic and income source development, with a mean of 3.09 or a moderate level. The last aspect was community plan development, with a mean of 2.89. The results from the small group discussion revealed some factors and guidelines for successful community management as follows: 1) on the People (human resource) development aspect, there was a project to support and develop community leaders; 2) on the Place (area) development aspect, there was development of conservation tourism areas; 3) on the Product (economic and income source) development aspect, the community leaders promoted the setting up of an occupational group, a savings group, and a product processing group; 4) on the Plan (community plan) development aspect, there was prioritization through public hearings.

Keywords: model of community management, sustainable utilization, family heads, community management team

Procedia PDF Downloads 331
543 CT Images Based Dense Facial Soft Tissue Thickness Measurement by Open-source Tools in Chinese Population

Authors: Ye Xue, Zhenhua Deng

Abstract:

Objectives: Facial soft tissue thickness (FSTT) data can be obtained from CT scans by measuring the face-to-skull distances at sparsely distributed anatomical landmarks manually located on the face and skull. However, automated measurement at dense points on 3D facial and skull models using open-source software has become a viable option due to the development of computer-assisted imaging technologies. By utilizing dense FSTT information, it becomes feasible to generate plausible automated facial approximations. Therefore, establishing a comprehensive, detailed and densely calculated FSTT database is crucial for enhancing the accuracy of facial approximation. Materials and methods: This study utilized head CT scans from 250 Chinese adults of Han ethnicity, with 170 participants originally born and residing in northern China and 80 participants in southern China. The age of the participants ranged from 14 to 82 years, and all samples were divided into five non-overlapping age groups. Additionally, samples were also divided into three categories based on BMI information. The 3D Slicer software was utilized to segment bone and soft tissue based on different Hounsfield Unit (HU) thresholds, and surface models of the face and skull were reconstructed for all samples from the CT data. The following procedures were performed using MeshLab: converting the face models into hollowed, cropped surface models and automatically measuring the Hausdorff distance (referred to as FSTT) between the skull and face models. Hausdorff point clouds were colorized based on depth value and exported as PLY files. A histogram of the depth distributions could be viewed and subdivided into smaller increments. All PLY files were visualized with the Hausdorff distance value of each vertex. Basic descriptive statistics (i.e., mean, maximum, minimum, standard deviation, etc.) and the distribution of FSTT were analyzed considering sex, age, BMI and birthplace. Statistical methods employed included multiple regression analysis, ANOVA and principal component analysis (PCA). Results: The distribution of FSTT is mainly influenced by BMI and sex, as further supported by the results of the PCA analysis. Additionally, FSTT values exceeding 30 mm were found to be more sensitive to sex. Birthplace-related differences were observed in regions such as the forehead, orbital, mandibular and zygoma regions. Specifically, there are distribution variances in the depth range of 20-30 mm, particularly in the mandibular region. Northern males exhibit thinner FSTT in the frontal region of the forehead compared to southern males, while females show fewer distribution differences between north and south, except for the zygoma region. The observed distribution variance in the orbital region could be attributed to differences in orbital size and shape. Discussion: This study provides a database of the distribution of FSTT in Chinese individuals and suggests that the open-source tools perform well for FSTT measurement. By incorporating birthplace as an influential factor in the distribution of FSTT, a greater level of detail can be achieved in facial approximation.
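For readers who want to reproduce the dense measurement step outside MeshLab: the per-vertex Hausdorff distance filter essentially reports, for each face vertex, the distance to the closest point on the skull surface. A minimal approximation of that step is sketched below, assuming the two surfaces are already available as vertex arrays (for example, exported from 3D Slicer). It uses closest vertices rather than closest surface points, so it slightly overestimates the true distances, and the toy data are invented.

```python
import numpy as np
from scipy.spatial import cKDTree

def face_to_skull_distances(face_vertices: np.ndarray, skull_vertices: np.ndarray) -> np.ndarray:
    """For each face vertex, distance (e.g. in mm) to the nearest skull vertex."""
    tree = cKDTree(skull_vertices)
    distances, _ = tree.query(face_vertices)
    return distances

# Invented toy data standing in for exported face/skull surface vertices (N x 3, in mm)
face = np.random.rand(1000, 3) * 100
skull = face - np.array([0.0, 0.0, 5.0])       # pretend the skull sits 5 mm "below" the face
d = face_to_skull_distances(face, skull)
print(d.mean(), d.max(), d.min(), d.std())     # basic descriptive FSTT statistics
```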

Keywords: forensic anthropology, forensic imaging, cranial facial reconstruction, facial soft tissue thickness, CT, open-source tool

Procedia PDF Downloads 56
542 Navigating through Organizational Change: TAM-Based Manual for Digital Skills and Safety Transitions

Authors: Margarida Porfírio Tomás, Paula Pereira, José Palma Oliveira

Abstract:

Robotic grasping is advancing rapidly, but transferring techniques from rigid to deformable objects remains a challenge. Deformable and flexible items, such as food containers, demand nuanced handling due to their changing shapes. Bridging this gap is crucial for applications in food processing, surgical robotics, and household assistance. AGILEHAND, a Horizon project, focuses on developing advanced technologies for sorting, handling, and packaging soft and deformable products autonomously. These technologies serve as strategic tools to enhance flexibility, agility, and reconfigurability within the production and logistics systems of European manufacturing companies. Key components include intelligent detection, self-adaptive handling, efficient sorting, and agile, rapid reconfiguration. The overarching goal is to optimize work environments and equipment, ensuring both efficiency and safety. As new technologies emerge in the food industry, there will be implications for the labour force, safety and the acceptance of the new technologies. To address these implications, AGILEHAND emphasizes the integration of the social sciences and humanities, for example, the application of the Technology Acceptance Model (TAM). The project aims to create a change management manual that will outline strategies for developing digital skills and managing health and safety transitions. It will also provide best practices and models for organizational change. Additionally, AGILEHAND will design effective training programs to enhance employee skills and knowledge. This information will be obtained through a combination of case studies, structured interviews, questionnaires, and a comprehensive literature review. The project will explore how organizations adapt during periods of change and identify factors influencing employee motivation and job satisfaction. This project received funding from the European Union’s Horizon 2020/Horizon Europe research and innovation programme under grant agreement No 101092043 (AGILEHAND).

Keywords: change management, technology acceptance model, organizational change, health and safety

Procedia PDF Downloads 38
541 Investigation of Poly P-Dioxanone as Promising Biodegradable Polymer for Short-Term Medical Application

Authors: Stefanie Ficht, Lukas Schübel, Magdalena Kleybolte, Markus Eblenkamp, Jana Steger, Dirk Wilhelm, Petra Mela

Abstract:

Although 3D printing as a transformative technology has become of increasing interest in the medical field and the demand for biodegradable polymers has grown considerably, there are only a few additively manufactured, biodegradable implants on the market. Additionally, the sterilization of such implants and its side effects on degradation have still not been sufficiently studied. Within this work, thermosensitive poly p-dioxanone (PPDO) samples were printed with fused filament fabrication (FFF) and investigated. Subsequently, H₂O₂ plasma and gamma radiation were used as low-temperature sterilization techniques and compared with each other and with the control group (no sterilization). In order to assess the effect of the different sterilization methods on the degradation behavior of PPDO, the samples were immersed in phosphate-buffered saline (PBS) over 28 days, and the surface morphology, thermal properties, molecular weight, inherent viscosity, and mechanical properties were examined at regular time intervals. The study demonstrates that PPDO was printed with great success and that the thermal properties, molecular weight (Mw), and inherent viscosity (IV) were not significantly affected by the printing process itself. H₂O₂ plasma sterilization did not significantly harm the thermosensitive polymer, while gamma radiation lowered IV and Mw statistically significantly compared to the control group (p < 0.001). During immersion in PBS, a decrease in Mw and mechanical strength occurred for all samples. However, gamma-sterilized samples were affected to a much higher extent than the two other sample groups, both in final values and over time. This was confirmed by scanning electron microscopy, which showed no changes in the surface morphology of the (non-sterilized) control samples, the first microcracks appearing on plasma-sterilized samples after two weeks, whereas they were present on gamma-sterilized samples immediately after irradiation and deteriorated further over the immersion period. To conclude, we demonstrated that FFF and H₂O₂ plasma sterilization are well suited for processing thermosensitive, biodegradable polymers used for the development of innovative short-term medical applications.

Keywords: additive manufacturing, sterilization, biodegradable, thermosensitive, medical application

Procedia PDF Downloads 117
540 The Regulation of Reputational Information in the Sharing Economy

Authors: Emre Bayamlıoğlu

Abstract:

This paper aims to provide an account of the legal and regulative aspects of algorithmic reputation systems, with a special emphasis on the sharing economy (i.e., Uber, Airbnb, Lyft) business model. The first section starts with an analysis of the legal and commercial nature of the tripartite relationship among the parties, namely, the host platform, the individual sharers/service providers and the consumers/users. The section further examines to what extent an algorithmic system of reputational information could serve as an alternative to legal regulation. Shortcomings are explained and analyzed with specific examples from the Airbnb platform, which is a pioneering success in the sharing economy. The following section focuses on the issue of governance and control of the reputational information. The section first analyzes the legal consequences of algorithmic filtering systems designed to detect undesired comments, and how a delicate balance could be struck between competing interests such as freedom of speech, privacy and the integrity of commercial reputation. The third section deals with the problem of manipulation by users. Indeed, many sharing economy businesses employ certain techniques of data mining and natural language processing to verify the consistency of the feedback. Software agents referred to as "bots" are employed by users to "produce" fake reputation values. Such automated techniques are deceptive, with significant negative effects in undermining the trust upon which the reputational system is built. The fourth section is devoted to exploring the concerns with regard to data mobility, data ownership, and privacy. Reputational information provided by consumers in the form of textual comments may be regarded as a writing that is eligible for copyright protection. Algorithmic reputational systems also contain personal data pertaining to both the individual entrepreneurs and the consumers. The final section starts with an overview of the notion of reputation as a communitarian and collective form of referential trust, and further provides an evaluation of the above legal arguments from the perspective of the public interest in the integrity of reputational information. The paper concludes with certain guidelines and design principles for algorithmic reputation systems to address the legal implications raised above.

Keywords: sharing economy, design principles of algorithmic regulation, reputational systems, personal data protection, privacy

Procedia PDF Downloads 463
539 Object-Scene: Deep Convolutional Representation for Scene Classification

Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang

Abstract:

Traditional image classification is based on an encoding scheme (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNNs) have richer information but lack geometric invariance. For scene classification, there are scattered objects of different size, category, layout, number and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method to take advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as the pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of the different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first and second order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories. Experimental results show that the scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which proves that the representation can be applied to other visual recognition tasks.
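The aggregation step described above can be sketched compactly. The snippet below shows a simplified Fisher Vector (first-order, mean-deviation term only) computed from dense local activations with a diagonal-covariance GMM, followed by the per-scale power and L2 normalization idea and a linear SVM. It is an illustrative reimplementation under simplifying assumptions with random placeholder data, not the authors' pipeline, and the array shapes are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.svm import LinearSVC

def fisher_vector(descriptors: np.ndarray, gmm: GaussianMixture) -> np.ndarray:
    """Simplified Fisher Vector using only first-order (mean) statistics."""
    q = gmm.predict_proba(descriptors)                                           # (N, K) soft assignments
    diffs = (descriptors[:, None, :] - gmm.means_) / np.sqrt(gmm.covariances_)   # (N, K, D)
    fv = (q[:, :, None] * diffs).sum(axis=0) / (len(descriptors) * np.sqrt(gmm.weights_)[:, None])
    fv = fv.ravel()
    fv = np.sign(fv) * np.sqrt(np.abs(fv))                                       # power normalization
    return fv / (np.linalg.norm(fv) + 1e-12)                                     # L2 (scale-wise) normalization

rng = np.random.default_rng(0)
K, D = 8, 64
# One GMM per scale, fitted on (random placeholder) local activations of that scale
gmms = [GaussianMixture(n_components=K, covariance_type="diag", random_state=0)
        .fit(rng.normal(size=(2000, D))) for _ in range(2)]

def image_representation(activations_per_scale):
    """Concatenate per-scale Fisher Vectors into a single image descriptor."""
    return np.concatenate([fisher_vector(a, g) for a, g in zip(activations_per_scale, gmms)])

X = np.stack([image_representation([rng.normal(size=(50, D)) for _ in range(2)]) for _ in range(20)])
y = rng.integers(0, 2, size=20)                          # dummy scene labels
clf = LinearSVC().fit(X, y)                              # final linear SVM classifier
print(clf.score(X, y))
```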

Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization

Procedia PDF Downloads 324
538 Limosilactobacillus Fermentum from Buffalo Milk Is Suitable for Potential Biotechnological Process Development

Authors: Sergio D’Ambrosio, Azza Dobous, Chiara Schiraldi, Donatella Cimini

Abstract:

Probiotics are living microorganisms that confer beneficial effects when consumed. Lactic acid bacteria and bifidobacteria are among the most representative strains assessed as probiotics and exploited as food supplements. Numerous studies have demonstrated their potential as therapeutic candidates for a variety of conditions (restoring gut flora, lowering cholesterol, enhancing the immune response, and anti-inflammatory and anti-oxidative activities). These beneficial actions are also due to biomolecules produced by probiotics, such as exopolysaccharides (EPSs), which demonstrate plenty of beneficial properties such as antimicrobial, antitumor, anti-biofilm, antiviral and immunomodulatory activities. Limosilactobacillus fermentum is a widely studied probiotic; however, few data are available on the development of fermentation and downstream processes for the large-scale production of viable biomass for industrial applications, or on purification processes for EPSs at an industrial scale. For this purpose, an L. fermentum strain was isolated from buffalo milk and used as a test example for biotechnological process development. The strain was able to produce up to 10⁹ CFU/mL on a (glucose-based) semi-defined medium deprived of animal-derived raw materials up to the pilot scale (150 L), demonstrating improved results compared to commonly used, although industrially unsuitable, media rich in casein and beef extract. Biomass concentration via microfiltration on hollow fibers and subsequent spray-drying allowed the recovery of about 5.7 × 10¹⁰ CFU/g of powder of viable cells, indicating the strain's resistance to harsh processing conditions. Overall, these data demonstrate the possibility of obtaining and maintaining adequate levels of viable L. fermentum cells by using a simple approach that is potentially suitable for industrial development. A downstream EPS purification protocol based on ultrafiltration, precipitation and activated charcoal treatments showed a purity of the recovered polysaccharides of about 70-80%.

Keywords: probiotics, fermentation, exopolysaccharides (EPSs), purification

Procedia PDF Downloads 75
537 The Application of Sensory Integration Techniques in Science Teaching Students with Autism

Authors: Joanna Estkowska

Abstract:

The sensory integration method is aimed primarily at children with learning disabilities. It can also be used as a complementary method in the treatment of children with cerebral palsy, autism, intellectual disability, blindness and deafness. Autism is a holistic developmental disorder that manifests itself in the specific functioning of a child. The most characteristic features are: disorders in communication, difficulties in social relations, rigid patterns of behavior and impairment in sensory processing. In addition to these disorders, abnormal intellectual development, attention deficit disorders, perceptual disorders and others may occur. This study focused on the application of sensory integration techniques in the science education of autistic students. The lack of proper sensory integration causes problems with complicated processes such as motor coordination, movement planning, visual or auditory perception, speech, writing, reading or counting. Good functioning and cooperation of the proprioceptive, tactile and vestibular senses affect the child’s mastery of skills that require coordination of both sides of the body and synchronization of the cerebral hemispheres. These include, for example, all sports activities and precise manual skills such as writing, as well as reading and counting skills. All this takes place in stages. Achieving the skills of the first stage determines the development of abilities at the next level. Any deficit within the first three stages can affect the development of new skills. This ultimately reflects on achievements at school and in further professional and personal life. After careful analysis, symptoms from the emotional and social spheres appear to be secondary to deficits of sensory integration. During our research, the students gained knowledge and skills in the classroom through experience, by learning biology, chemistry and physics with the application of sensory integration techniques. Sensory integration therapy aims to teach the child an adequate response to stimuli coming both from the outside world and from the body. Thanks to properly selected exercises, a child can improve perception and interpretation skills, motor skills, coordination of movements, attention and concentration, self-awareness, and social and emotional functioning.

Keywords: autism spectrum disorder, science education, sensory integration, special educational needs

Procedia PDF Downloads 176
536 Stretchable and Flexible Thermoelectric Polymer Composites for Self-Powered Volatile Organic Compound Vapors Detection

Authors: Petr Slobodian, Pavel Riha, Jiri Matyas, Robert Olejnik, Nuri Karakurt

Abstract:

Thermoelectric devices generate an electrical current when there is a temperature gradient between the hot and cold junctions of two dissimilar conductive materials, typically n-type and p-type semiconductors. Consequently, polymeric semiconductors composed of a polymeric matrix filled with different forms of carbon nanotubes with a proper structural hierarchy can also have thermoelectric properties that convert a temperature difference into electricity. In spite of the lower thermoelectric efficiency of polymeric thermoelectrics in terms of the figure of merit, properties such as stretchability, flexibility, light weight, low thermal conductivity, easy processing, and low manufacturing cost are advantages in many technological and ecological applications. Polyethylene-octene copolymer based highly elastic composites filled with multi-walled carbon nanotubes (MWCNTs) were prepared by sonication of a nanotube dispersion in a copolymer solution, followed by precipitation by pouring into a non-solvent. The electronic properties of the MWCNTs were moderated by different treatment techniques such as chemical oxidation, decoration with Ag clusters or the addition of low-molecular-weight dopants. In this concept, for example, the oxygenated functional groups attached to the MWCNT surface by HNO₃ oxidation increase the number of p-type charge carriers. The p-type charge carriers can be further increased by doping with molecules of triphenylphosphine. To partially alter p-type MWCNTs into less p-type ones, Ag nanoparticles were deposited on the MWCNT surface and then doped with 7,7,8,8-tetracyanoquinodimethane. The two types of MWCNTs with the highest difference in generated thermoelectric power were combined to manufacture a polymer-based thermoelectric module that generates a thermoelectric voltage when a temperature difference is applied between the hot and cold ends of the module. Moreover, it was found that the voltage generated by the thermoelectric module at a constant temperature gradient was significantly affected when the module was exposed to vapors of different volatile organic compounds, so that it represents a self-powered thermoelectric sensor for chemical vapor detection.
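The basic relation exploited by such a module is the Seebeck effect: to first order, the open-circuit voltage of one thermocouple pair is the difference between the two legs' Seebeck coefficients multiplied by the temperature difference, summed over the pairs connected electrically in series. The short sketch below uses invented coefficient values only to show the arithmetic; it does not reproduce the composites reported here.

```python
def module_voltage(s_leg1_uV_per_K: float, s_leg2_uV_per_K: float,
                   delta_T_K: float, n_pairs: int) -> float:
    """Open-circuit thermoelectric voltage (in mV) of n_pairs couples in series."""
    per_pair_uV = (s_leg1_uV_per_K - s_leg2_uV_per_K) * delta_T_K
    return n_pairs * per_pair_uV / 1000.0

# Invented values: strongly p-type leg +45 uV/K, weakly p-type leg +15 uV/K,
# 10 couples in series, temperature difference of 30 K
print(f"{module_voltage(45.0, 15.0, 30.0, 10):.1f} mV")
```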

Keywords: carbon nanotubes, polymer composites, thermoelectric materials, self-powered gas sensor

Procedia PDF Downloads 145
535 Representational Issues in Learning Solution Chemistry at Secondary School

Authors: Lam Pham, Peter Hubber, Russell Tytler

Abstract:

Students’ conceptual understanding of chemistry concepts and phenomena involves the capability to coordinate across the three levels of Johnstone’s triangle model. This triplet model is based on reasoning about chemical phenomena across the macro, sub-micro and symbolic levels. In chemistry education, there is a need for further examining inquiry-based approaches that enhance students’ conceptual learning and problem-solving skills. This research adopted a directed inquiry pedagogy, based on students constructing and coordinating representations, to investigate senior school students’ capability to flexibly move across Johnstone’s levels when learning dilution and molar concentration concepts. The participants comprised 50 grade 11 and 20 grade 10 students and 4 chemistry teachers, selected from 4 secondary schools located in metropolitan Melbourne, Victoria. This research into classroom practices used an ethnographic methodology and involved teachers working collaboratively with the research team to develop representational activities and lesson sequences for the instruction of a unit on solution chemistry. The representational activities included challenges (Representational Challenges, RCs) that used ‘representational tools’ to assist students to move across Johnstone’s three levels for dilution phenomena. In this report, the ‘representational tool’ called the ‘cross and portion’ model was developed and used in teaching and learning the molar concentration concept. Students’ conceptual understanding and problem-solving skills when learning with this model are analysed through group case studies of year 10 and 11 chemistry students. In learning dilution concepts, students in both group case studies actively conducted a practical experiment and used their own language and visualisation skills to represent dilution phenomena at the macroscopic level (RC1). At the sub-microscopic level, students generated and negotiated representations of the chemical interactions between solute and solvent underpinning the dilution process. At the symbolic level, students demonstrated their understanding of dilution concepts by drawing chemical structures and performing mathematical calculations. When learning molar concentration with the ‘cross and portion’ model (RC2), students coordinated across visual and symbolic representational forms and Johnstone’s levels to construct representations. The analysis showed that in RC1, year 10 students needed more ‘scaffolding’ when being introduced to representations, to make explicit the form and function of sub-microscopic representations. In RC2, year 11 students showed clarity in using visual representations (drawings) and linking them to mathematics to solve representational challenges about molar concentration. In contrast, year 10 students struggled to match up the two systems, the symbolic system of moles per litre (‘cross and portion’) and the visual representation (drawing). These conceptual problems do not lie in the students’ mathematical calculation capability but rather in their capability to align visual representations with the symbolic mathematical formulations. This research also found that students in both group case studies were able to coordinate representations when probed about the use of the ‘cross and portion’ model (in RC2) to demonstrate the molar concentration of the diluted solutions (in RC1). Students mostly succeeded in constructing ‘cross and portion’ models to represent the reduction of molar concentration across the concentration gradient.
In conclusion, this research demonstrated how the strategic introduction and coordination of chemical representations across modes and across the macro, sub-micro and symbolic levels supported student reasoning and problem solving in chemistry.
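For readers outside chemistry education, the two quantitative ideas the representational challenges target are simple: molar concentration is the amount of substance divided by the solution volume (c = n/V), and dilution conserves the amount of solute (C₁V₁ = C₂V₂). A worked numeric sketch with invented values follows; it only illustrates the calculations students are asked to coordinate with their drawings.

```python
def molar_concentration(moles: float, volume_L: float) -> float:
    """c = n / V, in mol/L."""
    return moles / volume_L

def diluted_concentration(c1: float, v1_L: float, v2_L: float) -> float:
    """C1*V1 = C2*V2  ->  C2 = C1*V1/V2 (the amount of solute is conserved on dilution)."""
    return c1 * v1_L / v2_L

# Invented example: 0.10 mol of solute in 0.50 L, then 100 mL of it diluted to 400 mL
c1 = molar_concentration(0.10, 0.50)                 # 0.20 mol/L
print(c1, diluted_concentration(c1, 0.100, 0.400))   # 0.20 mol/L -> 0.05 mol/L
```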

Keywords: cross and portion, dilution, Johnstone's triangle, molar concentration, representations

Procedia PDF Downloads 134
534 Targeting APP IRE mRNA to Combat Amyloid-β Protein Expression in Alzheimer’s Disease

Authors: Mateen A Khan, Taj Mohammad, Md. Imtaiyaz Hassan

Abstract:

Alzheimer’s disease is characterized by the accumulation of the processing products of the amyloid beta peptide cleaved from the amyloid precursor protein (APP). Iron increases the synthesis of amyloid beta peptides, which is why iron is present in the amyloid plaques of Alzheimer's disease patients. Iron misregulation in the brain is linked to the overexpression of the APP protein, which is directly related to amyloid-β aggregation in Alzheimer’s disease. The APP 5'-UTR region encodes a functional iron-responsive element (IRE) stem-loop that represents a potential target for modulating amyloid production. Targeted regulation of APP gene expression through the modulation of 5’-UTR sequence function represents a novel approach for the potential treatment of AD, because altering APP translation can both improve the protective brain iron balance and provide anti-amyloid efficacy. The molecular docking analysis of APP IRE RNA with eukaryotic translation initiation factors yields several models exhibiting substantial binding affinity. The findings revealed that the interaction involved a set of functionally active residues within the binding sites of eIF4F. Notably, the APP IRE RNA-eIF4F interaction was stabilized by multiple hydrogen bonds between residues of APP IRE RNA and eIF4F. It was evident that APP IRE RNA exhibited a structural complementarity that fit tightly within the binding pockets of eIF4F. The simulation studies further revealed the stability of the complexes formed between the RNA and eIF4F, which is crucial for assessing the strength of these interactions and their subsequent roles in the pathophysiology of Alzheimer’s disease. In addition, MD simulations would capture conformational changes in the IRE RNA and protein molecules during their interactions, illustrating the mechanism of interaction, conformational change and unbinding events, and how they may affect aggregation propensity and subsequent therapeutic implications. Our binding studies correlated well with the translation efficiency of APP mRNA. Overall, the outcome of this study suggests that genomic modification and/or inhibition of amyloid protein expression by targeting APP IRE RNA can be a viable strategy to identify potential therapeutic targets for AD, which can subsequently be exploited for developing novel therapeutic approaches.

Keywords: Alzheimer's disease, protein-RNA interaction analysis, molecular docking simulations, conformational dynamics, binding stability, binding kinetics, protein synthesis

Procedia PDF Downloads 51
533 Developing the Collaboration Model of Physical Education and Sport Sciences Faculties with the Service Section of the Sport Industry

Authors: Vahid Saatchian, Seyyed Farideh Hadavi

Abstract:

The main aim of this study was to develop a collaboration model between physical education and sport sciences faculties and the service section of the sport industry. The research method of this study was qualitative. After identifying a priority list of areas of collaboration between the faculties and the service section of the sport industry, and using purposive and snowball sampling, the researcher conducted in-depth interviews with 22 experts working around the field of the research topic. The interviews were analyzed through qualitative coding (open, axial and selective) into five categories: causal conditions, contextual conditions, intervening conditions, action/interaction and strategies. The findings showed that 10 labels emerged for the causal conditions; because of the heterogeneity of the labels, the researcher grouped them under a single overall theme. For the contextual conditions, 59 labels were identified in open coding and categorized into 14 general concepts. Furthermore, by combining these categories and the relationships between them, 5 final internal categories (culture, intelligence, marketing, environment and ultra-powers) emerged. The intervening conditions in the study comprised 5 overall scopes - social, economic, cultural, legal and political factors - collectively named the macro environment. For identifying strategies, 8 areas emerged that cover the internal and external challenges of managing the relationship between the two sides: knowledge and awareness, external view, human resources, building organizational culture, the parties’ thinking, a responsible unit/integrated management, laws and regulations, and marketing. Finally, the consequences, identified in line with the developmental strategies, include cultural, governmental, educational, scientific, infrastructure, international, social, economic, technological and political development, which is largely consistent with the strategies. The research findings could help sport managers to apply scientific collaboration management and realize its consequences in their sport institutions. With regard to the above results, an enduring and systematic relationship with long-term cooperation between the two sides requires strategic planning based on the cooperation of all stakeholders. Through this, in the turbulent and constantly changing current environment, a sustainable competitive advantage for university and industry can be obtained. There is no doubt that a lack of vision and strategic thinking about cooperation in the planning of the university and industry, instead of using their capabilities and opportunities, turns opportunities into problems.

Keywords: university and industry collaboration, sport industry, physical education and sport science college, service section of sport industry

Procedia PDF Downloads 375
532 A Multilingual Model in the Multicultural World

Authors: Marina Petrova

Abstract:

Language policy issues related to the preservation and development of the native languages of the Russian peoples and the state languages of the national republics are increasingly becoming the focus of attention of educators and parents, public and national figures. Is it legal to teach the national language or the mother tongue as the state language? Because of this dispute, language-phobic moods easily evolve into xenophobia among the population. However, a civilized, intelligent multicultural personality can only be formed if the country develops bilingualism and multilingualism, and languages, as a political tool, help to find ‘keys’ to relatively closed national communities, both within a poly-ethnic state and in the internal relations of multilingual countries. The purpose of this study is to design and theoretically substantiate an efficient model of language education in the innovatively developing Republic of Sakha. 800 participants from different educational institutions of Yakutia worked on developing a multilingual model of education. This investigation is of considerable practical importance because researchers could build a methodical system designed to create the conditions for the formation of a cultural language personality and the development of the multilingual communicative competence of Yakut youth, necessary for communication in native, Russian and foreign languages. The selected methodology of humane-personal and competence approaches is reliable and valid. Of special note are the application of theoretical and empirical research methods, the combination of academic analysis of the problem with experienced training, the positive results of the experimental work, the representative series, and the correct processing and statistical reliability of the obtained data. These ensure the validity of the investigation’s findings as well as their broad introduction into the practice of life-long language education.

Keywords: intercultural communication, language policy, multilingual and multicultural education, the Sakha Republic of Yakutia

Procedia PDF Downloads 219
531 Spectral Mapping of Hydrothermal Alteration Minerals for Geothermal Exploration Using Advanced Spaceborne Thermal Emission and Reflection Radiometer Short Wave Infrared Data

Authors: Aliyu J. Abubakar, Mazlan Hashim, Amin B. Pour

Abstract:

Exploiting geothermal resources for power, home heating, spas, greenhouses, industry or tourism requires an initial identification of suitable areas. This can be done cost-effectively using remote sensing satellite imagery, which has the synoptic capability of covering large areas in near real time, by identifying possible areas of hydrothermal alteration and minerals related to geothermal systems. Earth features and minerals are known to have unique diagnostic spectral reflectance characteristics that can be used to discriminate them. The focus of this paper is to investigate the applicability of mapping hydrothermal alteration in relation to geothermal systems (thermal springs) at Yankari Park, northeastern Nigeria, using Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) satellite data for resource exploration. The ASTER Short Wave Infrared (SWIR) bands are used to highlight and discriminate alteration areas by employing sophisticated digital image processing techniques, including image transformations and spectral mapping methods. Field verification was conducted at Yankari Park using a handheld Global Positioning System (GPS) Monterra unit to identify locations of hydrothermal alteration, and rock samples were obtained in the vicinity and surrounding areas of the ‘Mawulgo’ and ‘Wikki’ thermal springs. X-Ray Diffraction (XRD) results of rock samples obtained from the field validated the hydrothermal alteration by the presence of indicator minerals including dickite, kaolinite, hematite and quartz. The study indicated the applicability of mapping geothermal anomalies for resource exploration in an unmapped, sparsely vegetated savanna environment characterized by subtle surface manifestations such as thermal springs. The results could have implications for geothermal resource exploration, especially at the prefeasibility stages, by narrowing targets for comprehensive surveys, and in unexplored savanna regions where expensive airborne surveys are unaffordable.
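One of the simplest spectral-mapping enhancements commonly reported for ASTER SWIR data over Al-OH alteration minerals such as kaolinite and dickite is a band ratio (for example, band 4 / band 6), which highlights the characteristic absorption near 2.2 µm. The sketch below assumes the two co-registered SWIR bands have already been read into numpy arrays (e.g., with a remote-sensing or GIS package) and uses random placeholder pixels; it is only a generic illustration, not the processing chain used in this study.

```python
import numpy as np

def band_ratio(band_a: np.ndarray, band_b: np.ndarray) -> np.ndarray:
    """Element-wise ratio image; a small epsilon avoids division by zero."""
    return band_a.astype(float) / (band_b.astype(float) + 1e-6)

def alteration_mask(ratio: np.ndarray, percentile: float = 95.0) -> np.ndarray:
    """Flag pixels whose ratio falls in the top few percent as candidate alteration."""
    return ratio >= np.percentile(ratio, percentile)

# Invented 100 x 100 pixel reflectance arrays standing in for ASTER band 4 and band 6
rng = np.random.default_rng(1)
b4 = rng.uniform(0.1, 0.5, size=(100, 100))
b6 = rng.uniform(0.1, 0.5, size=(100, 100))
mask = alteration_mask(band_ratio(b4, b6))
print(f"{mask.sum()} candidate alteration pixels")
```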

Keywords: geothermal exploration, image enhancement, minerals, spectral mapping

Procedia PDF Downloads 355
530 Web-Based Tools to Increase Public Understanding of Nuclear Technology and Food Irradiation

Authors: Denise Levy, Anna Lucia C. H. Villavicencio

Abstract:

Food irradiation is a processing and preservation technique used to eliminate insects and parasites and to reduce disease-causing microorganisms. Moreover, the process helps to inhibit sprouting and delay ripening, extending the shelf-life of fresh fruits and vegetables. Nevertheless, most Brazilian consumers seem to misunderstand the difference between irradiated food and radioactive food, and the general public has major concerns about negative health effects and environmental contamination. Society’s judgment and decision-making are directly linked to perceived benefits and risks. The web-based project entitled ‘Scientific information about food irradiation: Internet as a tool to approach science and society’ was created by the Nuclear and Energetic Research Institute (IPEN) in order to offer an interdisciplinary approach to science education, integrating economic, ethical, social and political aspects of food irradiation. The project takes into account that misinformation and unfounded preconceived ideas weigh heavily on the acceptance of irradiated food and on the purchase intentions of Brazilian consumers. Taking advantage of the potential of the Internet to enhance communication and education among the general public, a study was carried out on the possibilities and trends of information and communication technologies among the Brazilian population. The content includes concepts, definitions and Frequently Asked Questions (FAQ) about the processes, safety, advantages, limitations and possibilities of food irradiation, including health issues as well as impacts on the environment. The project comprises eight self-instructional, interactive web courses that situate scientific content in relevant social contexts in order to encourage self-learning and further reflection. Communication is essential to improving public understanding of science. The use of information technology for high-quality science communication should contribute greatly to providing information throughout the country, reaching as many people as possible, minimizing geographic distances, and stimulating communication and development.

Keywords: food irradiation, multimedia learning tools, nuclear science, society and education

Procedia PDF Downloads 244
529 Extremophilic Amylases of Mycelial Fungi Strains Isolated in South Caucasus for Starch Processing

Authors: T. Urushadze, R. Khvedelidze, L. Kutateladze, M. Jobava, T. Burduli, T. Alexidze

Abstract:

There is increasing interest in reliable, waste-free, environmentally friendly technologies. About 40% of the enzymes produced worldwide are used for the production of syrups with a high concentration of glucose and fructose. One such technology involves obtaining the fermentable sugar glucose from starch-containing raw materials by means of amylases. In modern alcohol-producing factories this process is run in two steps involving two enzymes of different origin, a bacterial α-amylase and a fungal glucoamylase, as fungal amylases are generally less thermostable than bacterial amylases. Selection of an enzyme preparation that is stable and active at 70°C and above and possesses both α-amylase and glucoamylase activities would allow this process to be conducted in one step. The S. Durmishidze Institute of Biochemistry and Biotechnology holds a unique collection of mycelial fungi isolated from different ecological niches of the Caucasus. Screening of this collection revealed 39 amylase-producing strains, most of them belonging to the genus Aspergillus. The optimum temperatures of action of the selected amylases from three producers were established to be within the range 67-80°C. A. niger B-6 showed maximum α-amylase activity at 67°C and glucoamylase activity at 62°C; A. niger 6-12 showed maximum α-amylase activity at 72°C and glucoamylase activity at 65°C; and Aspergillus niger p8-3 showed maximum activities at 82°C and 70°C for α-amylase and glucoamylase, respectively. The exhaustive hydrolysis of starch solutions of different concentrations (3, 5, 15 and 30%) with the cultural liquid and the technical enzyme preparation of Aspergillus niger p8-3 was studied. At low concentrations exhaustive hydrolysis of starch lasts 40-60 minutes; at high concentrations hydrolysis takes longer. A 98.6% glucose yield can be reached after 12 hours of incubation with the enzyme cultural liquid or 8 hours with the technical preparation, with a gradual increase of temperature from 50°C to 82°C during the first 20 minutes and a subsequent decrease of temperature to 70°C. Setting a temperature regime that gives a high glucose yield and thorough hydrolysis (pasteurization) while remaining optimal for the activity of these strains is the prerequisite for carrying out the hydrolysis of starch to glucose in one step using a single strain, which would be economically justified.
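The temperature regime described above can be expressed as a simple setpoint schedule. The short sketch below is a minimal illustration of that profile in Python; the linear ramp shape and the immediate drop to the holding temperature are assumptions, since the abstract only specifies the start, peak and holding temperatures.

```python
def hydrolysis_setpoint(t_min: float) -> float:
    """Illustrative setpoint schedule: ramp from 50 C to 82 C over the first
    20 minutes, then hold the hydrolysis at 70 C for the remaining time."""
    if t_min <= 20.0:
        return 50.0 + (82.0 - 50.0) * t_min / 20.0   # assumed linear ramp
    # The rate of the decrease from 82 C to 70 C is not specified; it is
    # modelled here as an immediate step to the holding temperature.
    return 70.0

# e.g. hydrolysis_setpoint(10) -> 66.0 C; hydrolysis_setpoint(60) -> 70.0 C
```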

Keywords: amylase, glucose hydrolysis, stability, starch

Procedia PDF Downloads 344
528 Effects of Mild Heat Treatment on the Physical and Microbial Quality of Salak Apricot Cultivar

Authors: Bengi Hakguder Taze, Sevcan Unluturk

Abstract:

Şalak apricot (Prunus armeniaca L., cv. Şalak) is a specific variety grown in Igdir, Turkey. The fruit has distinctive properties that distinguish it from other cultivars, such as its unique size, color, taste and higher water content. Drying is the most widely used method for the preservation of apricots. However, Şalak apricot is preferred for fresh consumption rather than drying because of its low dry matter content. Its high water content and climacteric nature make the fruit prone to rapid quality loss during storage. Hence, alternative processing methods need to be introduced to extend the shelf life of the fresh produce. Mild heat (MH) treatment is of great interest as it can reduce the microbial load and inhibit enzymatic activity. Therefore, the aim of this study was to evaluate the impact of mild heat treatment on the natural microflora found on Şalak apricot surfaces and on physical quality parameters of the fruit such as color and firmness. For this purpose, apricot samples were treated at temperatures between 40 and 60 ℃ for periods ranging from 10 to 60 min using a temperature-controlled water bath. The natural flora on the fruit surfaces was examined using the standard plating technique both before and after treatment. Moreover, changes in the color and firmness of the fruit samples were monitored. Control samples initially contained 7.5 ± 0.32 log CFU/g of total aerobic plate count (TAPC), 5.8 ± 0.31 log CFU/g of yeast and mold count (YMC), and 5.17 ± 0.22 log CFU/g of coliforms. The highest log reductions in TAPC and YMC were 3.87-log and 5.8-log after treatment at 60 ℃ and 50 ℃, respectively. Nevertheless, the fruit lost its characteristic aroma at temperatures above 50 ℃. Furthermore, large color changes (ΔE ˃ 6) were observed and the firmness of the apricot samples was reduced under these conditions. On the other hand, MH treatment at 41 ℃ for 10 min resulted in 1.6-log and 0.91-log reductions in TAPC and YMC, respectively, with only slightly noticeable changes in color (ΔE ˂ 3). In conclusion, application of temperatures above 50 ℃ caused undesirable changes in the physical quality of Şalak apricots. Although higher microbial reductions were achieved at those temperatures, temperatures between 40 and 50 °C should be further investigated with regard to the fruit quality parameters. Another strategy may be the use of high temperatures for short periods not exceeding 1-5 min. Finally, MH treatment combined with UV-C light irradiation can also be considered as a hurdle strategy for better inactivation results.
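The two quantitative measures reported above, ΔE for color change and log reductions for microbial counts, follow standard definitions. A minimal sketch of both calculations is given below; ΔE is computed with the CIE76 formula, assumed here to be the metric used, and the L*a*b* readings and CFU counts in the usage comment are illustrative numbers, not data from the study.

```python
import math

def delta_e_cie76(lab_before, lab_after):
    """CIE76 color difference between two L*a*b* measurements."""
    return math.sqrt(sum((b - a) ** 2 for a, b in zip(lab_before, lab_after)))

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction of a microbial count (CFU/g)."""
    return math.log10(cfu_before) - math.log10(cfu_after)

# Illustrative numbers only (not measurements from the study):
# delta_e_cie76((65.2, 18.4, 40.1), (63.0, 17.1, 38.5))  -> about 3.0, a slightly noticeable change
# log_reduction(10**7.5, 10**3.6)                        -> about 3.9-log reduction in TAPC
```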

Keywords: color, firmness, mild heat, natural flora, physical quality, şalak apricot

Procedia PDF Downloads 130
527 Detecting Tomato Flowers in Greenhouses Using Computer Vision

Authors: Dor Oppenheim, Yael Edan, Guy Shani

Abstract:

This paper presents an image analysis algorithm to detect and count yellow tomato flowers in a greenhouse with uneven illumination, complex growth conditions and different flower sizes. The algorithm is designed to be employed on a drone that flies through greenhouses to accomplish several tasks, such as pollination and yield estimation. Detecting the flowers can provide useful information for the farmer, such as the number of flowers in a row and the number of flowers that have been pollinated since the last visit to the row. The developed algorithm is designed to handle the real-world difficulties of a greenhouse, which include varying lighting conditions, shadowing and occlusion, while respecting the computational limitations of the simple processor on the drone. The algorithm identifies flowers using an adaptive global threshold, segmentation in the HSV color space, and morphological cues. The adaptive threshold divides the images into darker and lighter images; segmentation on hue, saturation and value is then performed accordingly, and classification is done according to the size and location of the flowers. 1069 images of greenhouse tomato flowers were acquired in a commercial greenhouse in Israel using two different RGB cameras, an LG G4 smartphone and a Canon PowerShot A590. The images were acquired from multiple angles and distances and were sampled manually at various times of day to obtain varying lighting conditions. Ground truth was created by manually tagging approximately 25,000 individual flowers in the images. Sensitivity analyses on the acquisition angle of the images, time of day, camera and thresholding type were performed. Precision, recall and the derived F1 score were calculated. Results indicate better performance for the view angle facing the flowers than for any other angle. Acquiring images in the afternoon gave the best precision and recall. Applying a global adaptive threshold improved the median F1 score by 3%. Results showed no difference between the two cameras used. Using hue values of 0.12-0.18 in the segmentation process provided the best precision, recall and F1 score. The precision and recall averaged over all the images when using these values were 74% and 75%, respectively, with an F1 score of 0.73. Further analysis showed a 5% increase in precision and recall when analyzing images acquired in the afternoon and from the front viewpoint.
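A minimal sketch of the segmentation and counting steps described above is given below, using OpenCV in Python. The hue window of 0.12-0.18 (rescaled to OpenCV's 0-179 hue range) comes from the abstract, while the saturation/value bounds, the morphological kernel size and the minimum blob area are assumed placeholders rather than the authors' parameters, and the adaptive global thresholding stage is omitted.

```python
import cv2
import numpy as np

def detect_flower_mask(bgr_image, hue_range=(0.12, 0.18)):
    """Segment candidate yellow flowers in HSV space and clean up with morphology."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    lo = np.array([int(hue_range[0] * 179), 80, 80], dtype=np.uint8)   # S/V bounds are assumptions
    hi = np.array([int(hue_range[1] * 179), 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lo, hi)
    # Morphological opening/closing removes speckle and fills small holes (morphological cues).
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)
    return mask

def count_flowers(mask, min_area=100):
    """Count connected components above a minimum pixel area (size cue)."""
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    return sum(1 for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_area)

# Hypothetical usage:
# image = cv2.imread("greenhouse_row.jpg")
# n_flowers = count_flowers(detect_flower_mask(image))
```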

Keywords: agricultural engineering, image processing, computer vision, flower detection

Procedia PDF Downloads 319
526 Waste Management Option for Bioplastics Alongside Conventional Plastics

Authors: Dan Akesson, Gauthaman Kuzhanthaivelu, Martin Bohlen, Sunil K. Ramamoorthy

Abstract:

Bioplastics can be defined as polymers derived partly or completely from biomass. Bioplastics can be biodegradable, such as polylactic acid (PLA) and polyhydroxyalkanoates (PHA), or non-biodegradable, such as biobased polyethylene (bio-PE), polypropylene (bio-PP) and polyethylene terephthalate (bio-PET). The use of such bioplastics is expected to increase in the future due to the growing interest in sustainable materials. At the same time, these plastics become a new type of waste in the recycling stream. Most countries do not have a separate bioplastics collection that would allow them to be recycled or composted. After a brief introduction of bioplastics such as PLA in the UK, many establishments have once again replaced these plastics with conventional plastics due to the lack of commercial composting. Recycling companies fear contamination of conventional plastics in the recycling stream and state that they would have to invest in expensive new equipment to separate bioplastics and recycle them separately. This project studies what happens when bioplastics contaminate conventional plastics. Three commonly used conventional plastics were selected for this study: polyethylene (PE), polypropylene (PP) and polyethylene terephthalate (PET). To simulate contamination, two biopolymers, polyhydroxyalkanoate (PHA) or thermoplastic starch (TPS), were blended with the conventional polymers at levels of 1% or 5%. The blended plastics were processed again to examine the effect of degradation. The results show that the tensile strength and modulus of PE were almost unaffected by contamination, whereas the elongation was clearly reduced, indicating increased brittleness of the plastic. Generally, PP is slightly more sensitive to contamination than PE. This can be explained by the fact that the melting point of PP is higher than that of PE and, as a consequence, the biopolymer degrades more quickly. However, the reduction of the tensile properties of PP is relatively modest. Impact strength is generally a more sensitive test method with respect to contamination. Again, PE is relatively unaffected, but for PP there is a relatively large reduction in impact properties already at 1% contamination. PET is a polyester and is, by its very nature, more sensitive to degradation than PE and PP. PET also has a much higher melting point than PE and PP, so the biopolymer degrades quickly at the processing temperature of PET. PET can tolerate 1% contamination without any reduction in tensile strength. However, when the impact strength is examined, it is clear that already at 1% contamination there is a strong reduction in properties. The thermal properties show the change in crystallinity. The blends were also characterized by SEM. A biphasic morphology can be seen, as the two polymers are not truly miscible, which also contributes to the reduced mechanical properties. The study shows that PE is relatively robust against contamination, while polypropylene (PP) is sensitive and polyethylene terephthalate (PET) can be quite sensitive towards contamination.

Keywords: bioplastics, contamination, recycling, waste management

Procedia PDF Downloads 220
525 Enhancing the Implementation Strategy of Simultaneous Operations (SIMOPS) for the Major Turnaround at Pertamina Plaju Refinery

Authors: Fahrur Rozi, Daniswara Krisna Prabatha, Latief Zulfikar Chusaini

Abstract:

Amidst the backdrop of Pertamina Plaju Refinery, which stands as the oldest and historically least technologically advanced among Pertamina's refineries, lies a unique challenge. Originally integrating facilities established by Shell in 1904 and Stanvac (originally Standard Oil) in 1926, the primary challenge at Plaju Refinery does not solely revolve around complexity; instead, it lies in ensuring reliability, considering its operational history of more than a century. In all that time, Plaju Refinery has never undergone a comprehensive major turnaround encompassing all its units. The usual practice involves partial turnarounds conducted sequentially across its primary, secondary and tertiary (utilities and offsite) units. However, a significant shift is on the horizon: in the fourth quarter of 2023 the refinery embarks on its first-ever major turnaround since its establishment, a decision driven by the alignment of maintenance timelines across the various units. Plaju Refinery's major turnaround was scheduled for October-November 2023, spanning 45 calendar days, with the objective of enhancing the operational reliability of all refinery units. The extensive job list for this turnaround encompasses 1583 tasks across 18 units/areas, involving approximately 9000 contracted workers. In this context, the strategy for executing Simultaneous Operations (SIMOPS) emerges as a pivotal tool to optimize time efficiency and ensure safety. A Hazard Effect Management Process (HEMP) has been employed to assess the risk rating of each task within the turnaround; of the tasks assessed, 22 are deemed high-risk and necessitate mitigation. The SIMOPS approach serves as a preventive measure against potential incidents. It is noteworthy that every turnaround period at Pertamina Plaju Refinery involves SIMOPS-related tasks, so enhancing the implementation strategy of SIMOPS becomes imperative to minimize the occurrence of incidents. At least four improvements have been introduced in preparation for the major turnaround at the Plaju Refinery. The first involves conducting systematic risk assessments and potential hazard mitigation studies for SIMOPS tasks before task execution, as opposed to the previous on-site approach. The second is the completion of SIMOPS job mitigation and work matrix sheets, which was often neglected in the past. The third emphasizes comprehensive awareness among workers and contractors of potential hazards and mitigation strategies for SIMOPS tasks before and during the major turnaround. The final improvement is the introduction of a daily program for inspecting and observing work in progress on SIMOPS tasks; prior to this, there was no established program for monitoring ongoing SIMOPS activities during a turnaround. This study elucidates the steps taken to enhance SIMOPS within Pertamina, drawing on the experience of the Plaju Refinery as a guide, and an actual case study from our experience in the operational unit is provided. In conclusion, these efforts are essential for the success of the first-ever major turnaround at the Plaju Refinery, with the SIMOPS strategy serving as a central component. Based on these experiences, enhancements have been made to Pertamina's official internal guidelines for executing SIMOPS risk mitigation, benefiting all Pertamina units.

Keywords: process safety management, turnaround, oil refinery, risk assessment

Procedia PDF Downloads 62
524 GIS-Based Flash Flood Runoff Simulation Model of the Upper Teesta River Basin Using ASTER DEM and Meteorological Data

Authors: Abhisek Chakrabarty, Subhraprakash Mandal

Abstract:

Flash floods are one of the catastrophic natural hazards of the mountainous regions of India. The recent flood on the Mandakini River at Kedarnath (14-17 June 2013) is a classic example of a flash flood that devastated Uttarakhand, killing thousands of people. The disaster was the combined effect of high-intensity rainfall, the sudden breach of Chorabari Lake and very steep topography. Every year flash floods occur in the Himalayan region due to intense rainfall over a short period of time, cloud bursts, glacial lake outbursts and the collapse of artificial check dams, all of which cause high river flows. In the Sikkim-Darjeeling Himalaya, one of the probable flash flood occurrence zones is the Teesta watershed. The Teesta River is a right-bank tributary of the Brahmaputra draining a mountain area of approximately 8600 sq. km. It originates in the Pauhunri massif (7127 m), and the mountain section of the river is 182 km long. The Teesta is characterized by a complex hydrological regime: the river is fed not only by precipitation but also by melting glaciers and snow as well as groundwater. The present study describes an attempt to model surface runoff in the upper Teesta basin, which is directly related to catastrophic flood events, by creating a system based on GIS technology. The main objective was to construct a direct unit hydrograph for an excess rainfall by estimating the streamflow response at the outlet of the watershed. Specifically, the methodology was based on the creation of a spatial database in a GIS environment and on data editing. Rainfall time-series data were collected from the Indian Meteorological Department and processed in order to calculate flow time and runoff volume. Apart from the meteorological data, background data such as topography, drainage network, land cover and geological data were also collected. The watershed was clipped from the entire area, the stream network for the Teesta watershed was generated, and cross-sectional profiles were plotted across the river at various locations from ASTER DEM data using ERDAS IMAGINE 9.0 and ArcGIS 10.0. Different hydraulic models for detecting flash flood probability were analysed using HEC-RAS, Flow-2D and HEC-HMS software, which were of great importance for achieving the final result. With an input rainfall intensity above 400 mm per day for three days, the flood runoff simulation models show outbursts of lakes and check dams, individually or in combination with runoff, causing severe damage to downstream settlements. The model output shows that an area of 313 sq. km, including Melli, Jourthang, Chungthang and Lachung, is most vulnerable to flash floods, and 655 sq. km, including Rangpo, Yathang, Dambung, Bardang, Singtam, Teesta Bazar and the Thangu Valley, is moderately vulnerable. The model was validated with the rainfall data of a flood event that took place in August 1968; 78% of the actual flooded area was reflected in the model output. Lastly, preventive and curative measures were suggested to reduce the losses from probable flash flood events.
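The core of the runoff computation described above, turning excess rainfall into a direct runoff hydrograph at the outlet, can be sketched as a discrete convolution. The example below is a minimal illustration in Python; the SCS Curve Number step for estimating excess rainfall is one common choice, not necessarily the method used in this study, and all numbers in the usage comments are illustrative rather than calibrated Teesta-basin values.

```python
import numpy as np

def scs_excess(p_mm, cn):
    """SCS Curve Number runoff depth (mm) for a rainfall depth p_mm, with Ia = 0.2*S."""
    s = 25400.0 / cn - 254.0
    ia = 0.2 * s
    return 0.0 if p_mm <= ia else (p_mm - ia) ** 2 / (p_mm - ia + s)

def direct_runoff_hydrograph(excess_rain_mm, unit_hydrograph):
    """Direct runoff at the outlet: convolution of excess rainfall pulses (mm per
    time step) with a unit hydrograph (m^3/s response per mm of excess rain)."""
    return np.convolve(excess_rain_mm, unit_hydrograph)

# Illustrative values only:
# excess = [scs_excess(p, cn=75) for p in (120.0, 400.0, 250.0)]   # mm per daily step
# uh = [0.5, 2.0, 4.5, 3.0, 1.2, 0.4]                              # m^3/s per mm of excess rain
# q = direct_runoff_hydrograph(excess, uh)                         # flood hydrograph ordinates
```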

Keywords: flash flood, GIS, runoff, simulation model, Teesta river basin

Procedia PDF Downloads 308
523 Investigation of Dry-Blanching and Freezing Methods of Fruits

Authors: Epameinondas Xanthakis, Erik Kaunisto, Alain Le-Bail, Lilia Ahrné

Abstract:

Fruits and vegetables are perishable food matrices with a short shelf life, as several deterioration mechanisms are involved. Prior to common preservation methods such as freezing or canning, fruits and vegetables are blanched in order to inactivate deteriorative enzymes. Both conventional blanching pretreatments and conventional freezing methods have drawbacks alongside their beneficial impact on the preservation of these matrices. Conventional blanching may require long processing times and causes leaching of minerals and nutrients through contact with warm water, which in turn leads to effluent with a high biochemical oxygen demand (BOD). An important issue in freezing technologies is the size of the ice crystals formed, which is critical for the final quality of the frozen food, as large crystals can cause irreversible damage to the cellular structure and subsequently degrade the texture and colour of the product. Herein, the developed microwave blanching methodology and the results regarding quality aspects and enzyme inactivation are presented. Moreover, heat transfer phenomena, mass balance, temperature distribution and enzyme inactivation (such as Pectin Methyl Esterase and Ascorbic Acid Oxidase) in our microwave blanching approach are evaluated based on measurements and computer modelling. The present work is part of the COLDμWAVE project, which aims at the development of an innovative, environmentally sustainable process for blanching and freezing fruits and vegetables with improved textural and nutritional quality. In this context, COLDμWAVE will develop tailored equipment for MW blanching of vegetables that has very high energy efficiency and no water consumption. Furthermore, the next steps of the project regarding the development of innovative pathways in MW-assisted freezing to improve the quality of frozen vegetables, exploring in depth previous results acquired by the authors, are presented. The application of an MW-assisted freezing process to fruits and vegetables is expected to lead to improved quality characteristics compared to conventional freezing. Acknowledgments: COLDμWAVE has received funding from the European Union's Horizon 2020 research and innovation programme under Marie Sklodowska-Curie grant agreement No 660067.
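Enzyme inactivation during blanching is commonly described with first-order (D/z-value) kinetics. The sketch below illustrates that generic model in Python; it is not the computer model developed in COLDμWAVE, and the parameter values in the comment are placeholders, not measured values for Pectin Methyl Esterase or Ascorbic Acid Oxidase.

```python
def residual_activity(t_min, temp_c, d_ref_min, t_ref_c, z_c):
    """First-order thermal inactivation with D/z kinetics:
    D(T) = D_ref * 10^((T_ref - T)/z); residual activity fraction = 10^(-t / D(T))."""
    d_t = d_ref_min * 10 ** ((t_ref_c - temp_c) / z_c)
    return 10 ** (-t_min / d_t)

# Illustrative parameters only (not measured values for PME or AAO):
# residual_activity(t_min=3.0, temp_c=85.0, d_ref_min=5.0, t_ref_c=80.0, z_c=10.0)
# -> fraction of enzyme activity remaining after a 3 min treatment at 85 C
```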

Keywords: blanching, freezing, fruits, microwave blanching, microwave

Procedia PDF Downloads 263
522 AIR SAFE: an Internet of Things System for Air Quality Management Leveraging Artificial Intelligence Algorithms

Authors: Mariangela Viviani, Daniele Germano, Simone Colace, Agostino Forestiero, Giuseppe Papuzzo, Sara Laurita

Abstract:

Nowadays, people spend most of their time in closed environments, in offices or at home. Therefore, secure and highly livable environmental conditions are needed to reduce the probability of airborne viruses spreading. Also, to lower the human impact on the planet, it is important to reduce energy consumption. Heating, Ventilation and Air Conditioning (HVAC) systems account for the major part of energy consumption in buildings [1]. Devising systems to control and regulate airflow is therefore essential for energy efficiency. Moreover, an optimal setting for thermal comfort and air quality is essential for people's well-being at home or in offices and increases productivity. Thanks to the features of Artificial Intelligence (AI) tools and techniques, it is possible to design innovative systems with: (i) improved monitoring and prediction accuracy; (ii) enhanced decision-making and mitigation strategies; (iii) real-time air quality information; (iv) increased efficiency in data analysis and processing; (v) advanced early warning systems for air pollution events; (vi) an automated and cost-effective monitoring network; and (vii) a better understanding of air quality patterns and trends. We propose AIR SAFE, an IoT-based infrastructure designed to optimize air quality and thermal comfort in indoor environments leveraging AI tools. AIR SAFE employs a network of smart sensors collecting indoor and outdoor data to be analyzed in order to take corrective measures that ensure the occupants' wellness. The data are analyzed through AI algorithms able to predict future levels of temperature, relative humidity and CO₂ concentration [2]. Based on these predictions, AIR SAFE takes actions, such as opening or closing a window or switching the air conditioner, to guarantee a high level of thermal comfort and air quality in the environment. In this contribution, we present the results from the AI algorithm we have implemented on the first set of data collected in a real environment. The results were compared with other models from the literature to validate our approach.
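As a minimal illustration of the prediction-then-action loop described above, the sketch below fits a simple autoregressive model to a sensor time series and triggers a hypothetical ventilation action when the one-step-ahead CO₂ forecast exceeds a threshold. The model order, the 800 ppm threshold and the actuator call are assumptions made for illustration; the abstract does not describe the AIR SAFE algorithms at this level of detail.

```python
import numpy as np

def fit_ar_model(series, order=3):
    """Least-squares autoregressive model: x[t] ~ c + sum_i a_i * x[t-i]."""
    x = np.asarray(series, dtype=float)
    rows = [np.r_[1.0, x[t - order:t][::-1]] for t in range(order, len(x))]
    coeffs, *_ = np.linalg.lstsq(np.array(rows), x[order:], rcond=None)
    return coeffs

def forecast_next(series, coeffs, order=3):
    """One-step-ahead prediction from the last `order` observations."""
    recent = np.asarray(series, dtype=float)[-order:][::-1]
    return float(coeffs[0] + coeffs[1:] @ recent)

# Illustrative usage with a CO2 history in ppm (values are made up):
# co2 = [415, 430, 520, 610, 640, 655, 700, 720]
# c = fit_ar_model(co2, order=3)
# if forecast_next(co2, c, order=3) > 800:     # comfort threshold chosen arbitrarily
#     open_window_or_increase_ventilation()    # hypothetical actuator call
```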

Keywords: air quality, internet of things, artificial intelligence, smart home

Procedia PDF Downloads 86