Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on encoding schemes (e.g. Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g. SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. In scene classification, scenes contain scattered objects differing in size, category, layout, number and so on. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained separately on the ImageNet and Places datasets are used as pre-trained models to extract deep convolutional features at multiple scales, producing dense local activations. By analyzing the performance of the different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable since objects in a scene occur at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, which is then merged into a single vector by a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since different amounts of features are extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which indicates that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
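The scale-wise normalization step described in this abstract can be sketched in a few lines; this is a minimal illustration assuming one Fisher Vector per scale (the function name and toy vectors are hypothetical, not taken from the paper):

```python
import numpy as np

def merge_scales(fvs_per_scale):
    """Scale-wise normalization: L2-normalize the Fisher Vector of each
    scale independently, then average-pool across scales so that scales
    yielding more local features do not dominate the merged vector."""
    normed = []
    for fv in fvs_per_scale:
        norm = np.linalg.norm(fv)
        normed.append(fv / norm if norm > 0 else fv)
    return np.mean(normed, axis=0)

# Two toy "Fisher Vectors" from two different scales
merged = merge_scales([np.array([3.0, 4.0]), np.array([0.0, 2.0])])
```

The merged vector would then be fed to a linear SVM as described above.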
Procedia PDF Downloads 331
Identification and Characterization of In Vivo, In Vitro and Reactive Metabolites of Zorifertinib Using Liquid Chromatography Ion Trap Mass Spectrometry
Authors: Adnan A. Kadi, Nasser S. Al-Shakliah, Haitham Al-Rabiah
Abstract:
Zorifertinib is a novel, potent, oral small molecule used to treat non-small cell lung cancer (NSCLC). Zorifertinib is an Epidermal Growth Factor Receptor (EGFR) inhibitor and has good blood-brain barrier permeability for NSCLC patients with EGFR mutations. Zorifertinib is currently in phase II/III clinical trials. The current research reports the characterization and identification of in vitro, in vivo and reactive intermediates of zorifertinib. Prediction of susceptible sites of metabolism and reactivity pathways (cyanide and GSH) of zorifertinib was performed with the XenoSite web predictor tool. In vitro metabolism of zorifertinib was studied by incubation with rat liver microsomes (RLMs) and isolated perfused rat liver hepatocytes. Zorifertinib and its in vitro metabolites were extracted from the incubation mixtures by protein precipitation. In vivo metabolism was studied by giving a single oral dose of zorifertinib (10 mg/kg) to Sprague Dawley rats in metabolic cages by oral gavage. Urine was collected and filtered at specific time intervals (0, 6, 12, 18, 24, 48, 72, 96 and 120 hr) after zorifertinib dosing. A similar volume of ACN was added to each collected urine sample. Both layers (organic and aqueous) were injected into liquid chromatography ion trap mass spectrometry (LC-IT-MS) to detect in vivo zorifertinib metabolites. The N-methylpiperazine ring and quinazoline group of zorifertinib undergo metabolism forming an iminium ion and an electron-deficient conjugated system, respectively, which are very reactive toward nucleophilic macromolecules. Incubations of zorifertinib with RLMs in the presence of 1.0 mM KCN and 1.0 mM glutathione were performed to trap reactive metabolites, which are often responsible for the toxicities associated with this drug.
For in vitro metabolites, nine in vitro phase I metabolites, four in vitro phase II metabolites, and eleven reactive metabolites (three cyano adducts, five GSH conjugates, and three methoxy metabolites) of zorifertinib were detected by LC-IT-MS. For in vivo metabolites, eight in vivo phase I and ten in vivo phase II metabolites of zorifertinib were detected by LC-IT-MS. In vitro and in vivo phase I metabolic pathways were N-demethylation, O-demethylation, hydroxylation, reduction, defluorination, and dechlorination. The in vivo phase II metabolic reaction was direct conjugation of zorifertinib with glucuronic acid and sulphate.
Keywords: in vivo metabolites, in vitro metabolites, cyano adducts, GSH conjugate
Investigating the Use of Seaweed Extracts as Biopesticides
Authors: Emma O’ Keeffe, Helen Hughes, Peter McLoughlin, Shiau Pin Tan, Nick McCarthy
Abstract:
Biosecurity is emerging as one of the most important issues facing the agricultural and forestry community, a result of increased invasion by new pests and diseases, with synthetic pesticides being the main protocol for dealing with these species. However, these chemicals have been shown to exhibit negative effects on the environment. Seaweeds represent a vast untapped resource of bio-molecules with a broad range of biological activities, including pesticidal. This project investigated both the antifungal and antibacterial activity of seaweed species against two problematic root rot fungi, Armillaria mellea and Heterobasidion annosum, and ten quarantine bacterial plant pathogens including Xanthomonas arboricola, Xanthomonas fragariae, and Erwinia amylovora. Four seaweed species, including brown, red and green varieties, were harvested from the South-East coast of Ireland. The powdered seaweeds were extracted using four different solvents by liquid extraction. The poisoned food technique was employed to establish antifungal efficacy, and the standard disc diffusion assay was used to assess the antibacterial properties of the seaweed extracts. It was found that extracts of the green seaweed exhibited antifungal activity against H. annosum, with approximately 50% inhibition compared to the negative control. The protectant activities of the active extracts were evaluated on disks of Picea sitchensis, a plant species sensitive to infection by H. annosum, and compared to the standard chemical control product urea. The crude extracts exhibited very similar activity to the 10% and 20% w/v concentrations of urea, demonstrating the ability of seaweed extracts to compete with commercially available products. Antibacterial activity was exhibited by a number of seaweed extracts, with the red seaweed showing the strongest activity: a zone of inhibition of 15.83 ± 0.41 mm exhibited against X.
arboricola, whilst the positive control (10 μg/disk of chloramphenicol) had a zone of 26.5 ± 0.71 mm. These results highlight the potential application of seaweed extracts in the forestry and agricultural industries for use as biopesticides. Further work is now required to identify the bioactive molecules that are responsible for this antifungal and antibacterial activity in the seaweed extracts, including toxicity studies to ensure the extracts are non-toxic to plants and humans.
Keywords: antibacterial, antifungal, biopesticides, seaweeds
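The roughly 50% inhibition reported for the poisoned food technique is conventionally computed from colony diameters relative to the untreated control; a minimal sketch (the function and the diameters below are illustrative, not data from the study):

```python
def percent_inhibition(control_diameter_mm, treated_diameter_mm):
    """Mycelial growth inhibition in the poisoned food technique:
    reduction of colony diameter relative to the untreated control."""
    return (control_diameter_mm - treated_diameter_mm) / control_diameter_mm * 100

# e.g. a 40 mm colony on amended medium vs an 80 mm colony on the control
inhibition = percent_inhibition(80.0, 40.0)
```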
Transcriptomic Analysis for Differential Expression of Genes Involved in Secondary Metabolite Production in Narcissus Bulb and in vitro Callus
Authors: Aleya Ferdausi, Meriel Jones, Anthony Halls
Abstract:
The Amaryllidaceae genus Narcissus contains secondary metabolites that are important sources of bioactive compounds such as pharmaceuticals, indicating that their biological activity extends from the native plant to humans. Transcriptome analysis (RNA-seq) is an effective platform for the identification and functional characterization of candidate genes, as well as for identifying genes encoding uncharacterized enzymes. The biotechnological production of secondary metabolites in plant cell or organ cultures has become a tempting alternative to extraction from whole plant material. The biochemical pathways for the production of secondary metabolites require primary metabolites to undergo a series of modifications catalyzed by enzymes such as cytochrome P450s, methyltransferases, glycosyltransferases, and acyltransferases. Differential gene expression analysis of Narcissus was obtained from two conditions, i.e. field and in vitro callus. Callus was obtained on modified MS (Murashige and Skoog) media supplemented with growth regulators, using twin-scale explants from Narcissus cv. Carlton bulbs. A total of 2153 differentially expressed transcripts were detected between Narcissus bulb and in vitro callus, and 78.95% of those were annotated. Genes involved in the biosynthesis of alkaloids were expressed in both conditions, i.e. cytochrome P450s, O-methyltransferases (OMTs), NADP/NADPH dehydrogenases or reductases, SAM-synthetases or decarboxylases, 3-ketoacyl-CoA, acyl-CoA, cinnamoyl-CoA, cinnamate 4-hydroxylase, alcohol dehydrogenase, caffeic acid N-methyltransferase, and NADPH-cytochrome P450s. However, the cytochrome P450s and OMTs involved in the later stages of Amaryllidaceae alkaloid biosynthesis were mainly up-regulated in field samples, whereas the enzymes involved in the initial biosynthetic pathways, i.e. fructose bisphosphate aldolase, aminotransferases, dehydrogenases, hydroxymethylglutarate and glutamate synthase, leading to the biosynthesis of the precursors tyrosine, phenylalanine and tryptophan for secondary metabolites, were up-regulated in callus. Knowledge of the probable genes involved in secondary metabolism and their regulation in different tissues will provide insight into Narcissus plant biology related to alkaloid production.
Keywords: narcissus, callus, transcriptomics, secondary metabolites
Ultra-Sensitive Point-Of-Care Detection of PSA Using an Enzyme- and Equipment-Free Microfluidic Platform
Authors: Ying Li, Rui Hu, Shizhen Chen, Xin Zhou, Yunhuang Yang
Abstract:
Prostate cancer is one of the leading causes of cancer-related death among men. Prostate-specific antigen (PSA), a specific product of prostatic epithelial cells, is an important indicator of prostate cancer. Though PSA is not a specific serum biomarker for the screening of prostate cancer, it is recognized as an indicator of prostate cancer recurrence and response to therapy in patients post-prostatectomy. Since radical prostatectomy eliminates the source of PSA production, serum PSA levels fall below 50 pg/mL and may be below the detection limit of clinical immunoassays (the current clinical immunoassay lower limit of detection is around 10 pg/mL). Many clinical studies have shown that intervention at low PSA levels was able to improve patient outcomes significantly. Therefore, ultra-sensitive and precise assays that can accurately quantify extremely low levels of PSA (below 1-10 pg/mL) will facilitate the assessment of patients for the possibility of early adjuvant or salvage treatment. Currently, the commercially available ultra-sensitive ELISA kits (not used clinically) can only reach a detection limit of 3-10 pg/mL. Other platforms developed by different research groups can achieve a detection limit as low as 0.33 pg/mL, but they rely on sophisticated instruments to get the final readout. Herein we report a microfluidic platform for point-of-care (POC) detection of PSA with a detection limit of 0.5 pg/mL and without the assistance of any equipment. This platform is based on a previously reported volumetric-bar-chart chip (V-Chip), which applies platinum nanoparticles (PtNPs) as the ELISA probe to convert the biomarker concentration into a volume of oxygen gas that pushes red ink to form a visualized bar chart. The length of each bar is used to quantify the biomarker concentration of each sample. We devised a long-reading-channel V-Chip (LV-Chip) in this work to achieve a wide detection window.
In addition, LV-Chip employs a unique enzyme-free ELISA probe that enriches PtNPs significantly and offers 500-fold enhanced catalytic ability over that of the previous V-Chip, resulting in a significantly improved detection limit. LV-Chip is able to complete a PSA assay for five samples in 20 min. The device was applied to detect PSA in 50 patient serum samples, and the on-chip results demonstrated good correlation with a conventional immunoassay. In addition, the PSA levels in finger-prick whole blood samples from healthy volunteers were successfully measured on the device. This completely stand-alone LV-Chip platform enables convenient POC testing for patient follow-up in the physician's office and is also useful in resource-constrained settings.
Keywords: point-of-care detection, microfluidics, PSA, ultra-sensitive
Features of Normative and Pathological Realizations of Sibilant Sounds for Computer-Aided Pronunciation Evaluation in Children
Authors: Zuzanna Miodonska, Michal Krecichwost, Pawel Badura
Abstract:
Sigmatism (lisping) is a speech disorder in which sibilant consonants are mispronounced. The diagnosis of this phenomenon is usually based on auditory assessment. However, progress in speech analysis techniques creates the possibility of developing computer-aided sigmatism diagnosis tools. The aim of the study is to statistically verify whether specific acoustic features of sibilant sounds may be related to pronunciation correctness. Such knowledge can be of great importance when implementing classifiers and designing novel tools for automatic sibilant pronunciation evaluation. The study covers analysis of various speech signal measures, including features proposed in the literature for the description of normative sibilant realization. Amplitudes and frequencies of three fricative formants (FF) are extracted based on local spectral maxima of the friction noise. Skewness, kurtosis, four normalized spectral moments (SM) and 13 mel-frequency cepstral coefficients (MFCC) with their 1st and 2nd derivatives (13 Delta and 13 Delta-Delta MFCC) are included in the analysis as well. The resulting feature vector contains 51 measures. The experiments are performed on a speech corpus containing words with the selected sibilant sounds (/ʃ, ʒ/) pronounced by 60 preschool children with proper pronunciation or with natural pathologies. In total, 224 /ʃ/ segments and 191 /ʒ/ segments are employed in the study. The Mann-Whitney U test is employed for the comparison of sigmatism and normative pronunciation. Statistically significant differences between the two groups are obtained for most of the proposed features at p < 0.05. All spectral moments and fricative formants appear to be distinctive between pathological and proper pronunciation. These metrics describe the friction noise characteristic of sibilants, which makes them particularly promising for use in sibilant evaluation tools.
Correspondences found between phoneme feature values and an expert evaluation of pronunciation correctness encourage involving speech analysis tools in the diagnosis and therapy of sigmatism. The proposed feature extraction methods could be used in computer-assisted sigmatism diagnosis or therapy systems.
Keywords: computer-aided pronunciation evaluation, sigmatism diagnosis, speech signal analysis, statistical verification
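The group comparison above rests on the Mann-Whitney U statistic; a self-contained sketch of its computation follows (in practice one would use scipy.stats.mannwhitneyu, and the feature values below are invented for illustration):

```python
import numpy as np

def mann_whitney_u(x, y):
    """U statistic for sample x against sample y, using average ranks
    for ties; a small U indicates x tends to take lower values."""
    combined = np.concatenate([x, y])
    order = combined.argsort()
    ranks = np.empty(len(combined))
    ranks[order] = np.arange(1, len(combined) + 1)
    for v in np.unique(combined):        # average the ranks of tied values
        mask = combined == v
        ranks[mask] = ranks[mask].mean()
    r1 = ranks[: len(x)].sum()           # rank sum of the first sample
    return r1 - len(x) * (len(x) + 1) / 2

# Toy feature values for a "normative" and a "pathological" group
u = mann_whitney_u(np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0]))
```

The p-value for significance testing would then be derived from U and the two sample sizes.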
A Qualitative Assessment of the Internal Communication of the College of Communication: Basis for a Strategic Communication Plan
Authors: Edna T. Bernabe, Joshua Bilolo, Sheila Mae Artillero, Catlicia Joy Caseda, Liezel Once, Donne Ynah Grace Quirante
Abstract:
Internal communication is significant for an organization to function to its full extent. A strategic communication plan builds an organization's structure and makes it more systematic. Information is a vital part of communication inside the organization, as it underlies every possible outcome, be it positive or negative. It is, therefore, imperative to assess the communication structure of a particular organization in order to secure a better and more harmonious communication environment. Thus, this research was intended to identify the internal communication channels used in the Polytechnic University of the Philippines-College of Communication (PUP-COC) as an organization; to identify the flow of information, specifically in downward, upward, and horizontal communication; to assess the accuracy, consistency, and timeliness of its internal communication channels; and to come up with a proposed strategic communication plan for information dissemination to improve the existing communication flow in the college. The researchers formulated a framework from the Input-Throughput-Output-Feedback-Goal model of General System Theory and gathered data to assess the PUP-COC's internal communication. The communication model links the objectives of the study to knowledge of the internal organization of the college. A qualitative approach with case study as the tradition of inquiry was used to gain a deeper understanding of the internal organizational communication in PUP-COC, using interviews as the primary method for the study. This was supported by quantitative data gathered through a survey of the students of the college. The researchers interviewed 17 participants: the College dean, the 4 chairpersons of the college departments, 11 faculty members and staff, and the acting Student Council president. An interview guide and a standardized questionnaire were formulated as instruments to generate the data.
After a thorough analysis, it was found that a two-way communication flow exists in PUP-COC. The type of communication channel internal stakeholders use varies according to whom a particular person is communicating with. Members of the PUP-COC community also use different types of communication channels depending on the flow of communication being used. The most common types of internal communication are letters and memoranda for downward communication, while letters, text messages, and interpersonal communication are often used in upward communication. Various forms of social media were found to be in use for horizontal communication. Accuracy, consistency, and timeliness play a significant role in information dissemination within the college. However, some problems were also found in the communication system. The most common problems are delays in the dissemination of memoranda and letters and the uneven distribution of information and instructions to faculty, staff, and students. This has led the researchers to formulate a strategic communication plan which proposes strategies to solve the communication problems experienced by the internal stakeholders.
Keywords: communication plan, downward communication, internal communication, upward communication
Ideas for Musical Activities and Games in the Early Years (IMAGINE-Autism): A Case Study Approach
Authors: Tania Lisboa, Angela Voyajolu, Adam Ockelford
Abstract:
The positive impact of music on the development of children with autism is widely acknowledged: music offers a unique channel for communication, wellbeing and self-regulation, as well as access to culture and a means of creative engagement. Yet no coherent program exists for parents, carers and teachers to follow with their children in the early years, when the need for interventions is often most acute. Hence, research and the development of resources are urgently required. IMAGINE-Autism is a project with children on the autism spectrum. The project aims at promoting the participants' engagement with music through involvement in specially designed musical activities with parents and carers. The main goal of the research is to verify the effectiveness of newly designed resources and strategies, which are based on the Sounds of Intent in the Early Years (SoI-EY) framework of musical development. This is a pilot study comprising case studies of five children with autism in the early years. The data comprise semi-structured interviews, observations of videos, and feedback from parents on the resources. Interpretative Phenomenological Analysis was chosen to analyze the interviews. The video data were coded in relation to the SoI-EY framework. The feedback from parents was used to evaluate the resources (i.e. musical activity cards). The participants' wider development was also assessed through selected elements of the Early Years Foundation Stage (EYFS), a national assessment framework used in England: specifically, communication, language and social-emotional development. Five families of children on the autism spectrum (aged 4-8 years) participated in the pilot. The research team visited each family 4 times over a 3-month period, during which the children were observed and musical activities were suggested based on each child's assessed level of musical development.
Parents then trialed the activities, providing feedback and gathering further video observations of their child's musical engagement between visits. The results of one case study are featured in this paper, in which the evidence suggests that specifically tailored musical activity may promote communication and social engagement for a child with language difficulties on the autism spectrum. The resources were appropriate for the children's involvement in musical activities. Findings suggest that non-specialist musical engagement with family and carers can be a powerful means to foster communication. The case study featured in this paper illustrates this with a child of limited verbal ability. There is a need for further research and development of resources that can be made available to all those working with children on the autism spectrum.
Keywords: autism, development, music education, resources
Electrical Tortuosity across Electrokinetically Remediated Soils
Authors: Waddah S. Abdullah, Khaled F. Al-Omari
Abstract:
Electrokinetic remediation is one of the most effective methods for decontaminating polluted soils. Electroosmosis and electromigration are the processes of electrochemical extraction of contaminants from soils, and the driving force behind both is the voltage gradient. Therefore, the electric field distribution throughout the soil domain is extremely important to investigate, in order to determine the factors that help establish a uniform electric field and make the clean-up process work properly and efficiently. In this study, small passive electrodes (made of graphite) were placed at predetermined locations within the soil specimen, and the voltage drop between these passive electrodes was measured in order to observe the electrical distribution throughout the tested soil specimens. The electrokinetic test was conducted on two types of soil: a sandy soil and a clayey soil. The electrical distribution throughout the soil domain was examined under different test conditions, and the electric field distribution was observed in a three-dimensional pattern in order to establish the electrical distribution within the soil domain. The effects of density, applied voltage, and degree of saturation on the electrical distribution within the remediated soil were investigated. The distributions of the moisture content, the concentration of sodium ions, and the concentration of calcium ions were determined and established in a three-dimensional scheme. The study has shown that the electrical conductivity within the soil domain depends on the moisture content and the concentration of electrolytes present in the pore fluid. The distribution of the electric field in the saturated soil was found not to be affected by its density. The study has also shown that a high voltage gradient leads to a non-uniform electric field distribution within the electroremediated soil.
Very importantly, it was found that even when the electric field distribution is uniform globally (i.e. between the passive electrodes), local non-uniformity can be established within the remediated soil mass. Cracks or air gaps formed due to temperature rise (because of electric flow in low-conductivity regions) promote electrical tortuosity. Thus, fracturing or cracking formed in the remediated soil mass causes disconnection of the electric current, and hence no removal of contaminants occurs within these areas.
Keywords: contaminant removal, electrical tortuosity, electromigration, electroosmosis, voltage distribution
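The local electric-field observations described above amount to computing voltage gradients between adjacent passive electrodes; a minimal sketch with invented readings (positions and voltages are illustrative, not data from the study):

```python
import numpy as np

# Voltages measured at passive electrodes along the specimen axis
positions = np.array([0.0, 2.0, 4.0, 6.0, 8.0])   # cm from the anode
voltages = np.array([10.0, 7.5, 6.0, 3.0, 0.0])   # V

# Local voltage gradient (V/cm) between each adjacent electrode pair;
# unequal values reveal local non-uniformity even when the global
# gradient (10 V over 8 cm = 1.25 V/cm) looks uniform.
gradients = -np.diff(voltages) / np.diff(positions)
```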
A Comparison of Tsunami Impact to Sydney Harbour, Australia at Different Tidal Stages
Authors: Olivia A. Wilson, Hannah E. Power, Murray Kendall
Abstract:
Sydney Harbour is an iconic location with a dense population and low-lying development. On the east coast of Australia, facing the Pacific Ocean, it is exposed to several tsunamigenic trenches. This paper presents a component of the most detailed assessment to date of the potential for earthquake-generated tsunami impact on Sydney Harbour. Models in this study use dynamic tides to account for tide-tsunami interaction. Sydney Harbour's tidal range is 1.5 m, and the spring tides from January 2015 that are used in the modelling for this study are close to the full tidal range. The tsunami wave trains modelled include hypothetical tsunami generated by earthquakes of magnitude 7.5, 8.0, 8.5, and 9.0 Mw from the Puysegur and New Hebrides trenches, as well as representations of the historical 1960 Chilean and 2011 Tohoku events. All wave trains are modelled so that the peak wave coincides with both a low tide and a high tide. A single wave train, representing a 9.0 Mw earthquake at the Puysegur trench, is modelled with peak waves coinciding with every hour across a 12-hour tidal phase. Using the hydrodynamic model ANUGA, results are compared according to the impact parameters of inundation area, depth variation and current speeds. Results show that both maximum inundation area and depth variation are tide dependent. Maximum inundation area increases when coincident with a higher tide; however, hazardous inundation is only observed for the larger waves modelled: NH90high and P90high. The maximum and minimum depths are deeper on higher tides and shallower on lower tides. The difference between maximum and minimum depths varies across different tidal phases, although the differences are slight. Maximum current speeds are shown to be a significant hazard for Sydney Harbour; however, they do not show consistent patterns according to tide-tsunami phasing.
The maximum current speed hazard is shown to be greater in specific locations such as Spit Bridge, a narrow channel with extensive marine infrastructure. The results presented for Sydney Harbour are novel, and the conclusions are consistent with previous modelling efforts in the greater area. It is shown that tide must be a consideration for both tsunami modelling and emergency management planning. Modelling with peak tsunami waves coinciding with a high tide would be a conservative approach; however, it must be considered that maximum current speeds may be higher on other tides.
Keywords: emergency management, Sydney, tide-tsunami interaction, tsunami impact
Biochar from Empty Fruit Bunches Generated in Palm Oil Extraction and Its Nutrient Contribution to Cultivated Soils with Elaeis guineensis in Casanare, Colombia
Authors: Alvarado M. Lady G., Ortiz V. Yaylenne, Quintero B. Quelbis R.
Abstract:
The oil palm sector has seen significant growth in Colombia after the introduction of policies to stimulate the use of biofuels, which ultimately contributes to the reduction of greenhouse gases (GHG) that deteriorate not only the environment but also people's health. However, the policy of using biofuels has been strongly questioned for the impacts it can generate; one example is the increase of other, more harmful GHGs such as CH₄, which underlies the amount of solid waste generated. The department of Casanare is estimated to be one of the country's major palm oil producers, given that it has recently expanded its sown area, which implies an increase in the waste generated, primarily in the industrial stage. For this reason, the following study evaluated the agronomic potential of biochar obtained from empty fruit bunches and its nutritional contribution to cultivated soils with Elaeis guineensis in Casanare, Colombia. The biochar was obtained by slow pyrolysis of the bunches in a retort oven at an average temperature of 190 °C and a residence time of 8 hours. The final product was taken to the laboratory for physical and chemical analysis, along with a soil sample from a cultivation of Elaeis guineensis located in Tauramena, Casanare. With the results obtained, plus bibliographical reports of the nutrient demand of this crop, the possible nutritional contribution of the biochar was determined. The cultivation requirements are estimated at 12.1 kg.ha⁻¹ of nitrogen, 59.3 kg.ha⁻¹ of potassium, 31.5 kg.ha⁻¹ of magnesium and 5.6 kg.ha⁻¹ of phosphorus, against biochar contributions of 143.1 kg.ha⁻¹, 1204.5 kg.ha⁻¹, 39.2 kg.ha⁻¹ and 71.6 kg.ha⁻¹ respectively. The incorporation of biochar into the soil would significantly improve the concentrations of N, P, K and Mg, nutrients considered important in the yield of oil palm, coupled with the importance of nutrient recycling in sustainable agricultural production systems.
Biochar application improves the physical properties of soils, mainly moisture retention. It also regulates the availability of nutrients for plant absorption, with economic savings in the application of synthetic fertilizers and irrigation water. In addition, it becomes an alternative for managing agricultural waste, reducing involuntary emissions of greenhouse gases to the environment by decomposition in the field and reducing the CO₂ content in the atmosphere.
Keywords: biochar, nutrient recycling, oil palm, pyrolysis
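The comparison between crop demand and biochar contribution reduces to simple per-nutrient arithmetic; a sketch using the figures reported above (taking the magnesium demand as 31.5 kg·ha⁻¹, and assuming the reported contributions are per hectare of applied biochar):

```python
# Reported nutrient demand of the oil palm crop vs. estimated
# contribution from the empty-fruit-bunch biochar (kg/ha).
demand = {"N": 12.1, "K": 59.3, "Mg": 31.5, "P": 5.6}
biochar = {"N": 143.1, "K": 1204.5, "Mg": 39.2, "P": 71.6}

surplus = {k: biochar[k] - demand[k] for k in demand}    # kg/ha left over
coverage = {k: biochar[k] / demand[k] for k in demand}   # fraction of demand met
```

By this accounting, the biochar would cover the full demand for all four nutrients.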
Effect of Two Different Methods of Juice Processing on the Anthocyanins and Polyphenolics of Blueberry (Vaccinium corymbosum)
Authors: Onur Ercan, Buket Askin, Erdogan Kucukoner
Abstract:
Blueberry (Vaccinium corymbosum, cv. Bluegold) has become a popular beverage fruit due to its nutritional value, including vitamins, minerals, and antioxidants. In this study, the effects of pressing, mashing, enzymatic treatment, and pasteurization on the anthocyanins, colour, and polyphenolics of blueberry juice (BJ) were studied. The blueberry juice was produced with two different methods: direct juice extraction (DJE) and a mash treatment process (MTP). After crude blueberry juice (CBJ) production, the samples were first treated with commercial enzymes [Novoferm-61 (Novozymes A/S), 2-10 mL/L] to break down the hydrocolloid polysaccharides, mainly pectin and starch. The enzymes were added at various concentrations; the highest transmittance, 66.53%, was obtained with Novoferm-61 at a concentration of 2 mL/L. After enzymatic treatment, clarification trials were applied to the enzymatically treated BJs by adding various amounts of bentonite (10%, w/v), gelatin (1%, w/v) and kiselsol (15%, v/v), and the turbidities of the clarified samples were determined. However, since there were no significant differences in transmittance between samples, only enzymatic treatment was applied in the blueberry juice processing (DDBJ, depectinized direct blueberry juice). An initial pressing trial, made to evaluate press function, showed that pressing fresh blueberries with no other processing did not render adequate juice due to a lack of liquefaction. Therefore, the blueberries were mashed into small pieces (3 mm) and then the enzymatic treatments and clarification trials were performed. Finally, both BJ samples were pasteurized. Compositional analyses, colour properties, polyphenols and antioxidant properties were compared. Enzymatic treatment caused significant reductions in anthocyanin (ACN) content (30%) in direct blueberry juice processing (DBJ), while there was a significant increase in the mash treatment process (MTP).
Overall anthocyanin levels were higher in treated samples after each processing step in MTP, while polyphenolic levels were slightly higher for both processes (DBJ and MTP). A reduction in ACNs and polyphenolics occurred only after pasteurization. These results indicate that both methods are suitable for obtaining fresh blueberry juice. In addition, when the juice was examined during the processing stages, anthocyanin content, phenolic content, antioxidant activity, and juice yield were all higher with the MTP method than with the DBJ method; the MTP method should therefore be preferred for processing blueberries into juice. Keywords: anthocyanins, blueberry, depectinization, polyphenols
Procedia PDF Downloads 94
382 An Evaluation of the Lae City Road Network Improvement Project
Authors: Murray Matarab Konzang
Abstract:
The Lae Port Development Project, the Four-Lane Highway, and other developments in the extraction industry with direct road links to Lae City are predicted to have a significant impact on its road network system. This paper evaluates the Lae roads improvement program, with forecasts on planning and economics and on the installation of bypasses to ease congestion, provide effective and convenient transport services for bulk goods, and reduce travel time. A land-use transportation study and plans for a local-area traffic management scheme are considered. City roads face increased traffic volumes, inadequate pavement widths, poor transport plans, and facilities unable to meet this transportation demand. Lae also has a drainage system that might not withstand a 100-year flood. Proper evaluation, planning, design, and intersection analysis are needed to assess the road network system and thus recommend improvements and estimate future growth. Repetitive, cyclic loading by heavy commercial vehicles with different axle configurations weakens and tears the surface of flexible pavements, so small cracks occur; rainwater seeps through and over time creates potholes. Effective planning starts from experimental research and appropriate design standards to ensure firm embankments, proper drains, and quality pavement material. This paper addresses traffic problems as well as road pavement, intersection capacities, and pedestrian flow during peak hours. The outcome of this research is to identify heavily trafficked road sections and to recommend treatments to reduce traffic congestion, road classification, and proposals for bypass routes and improvement. The first part of this study describes transport- and traffic-related problems within the city. 
The second part identifies the challenges imposed by traffic- and road-related problems, and the third recommends solutions after analysis of traffic data indicating the current capacities of road intersections, closing with recommended treatments for improvement and future growth. Keywords: Lae, road network, highway, vehicle traffic, planning
Procedia PDF Downloads 358
381 Geochemical Study of the Bound Hydrocarbon in the Asphaltene of Biodegraded Oils of Cambay Basin
Authors: Sayani Chatterjee, Kusum Lata Pangtey, Sarita Singh, Harvir Singh
Abstract:
Biodegradation leads to a systematic alteration of the chemical and physical properties of crude oil, showing sequential depletion of n-alkanes, cycloalkanes, and aromatics, which increases its specific gravity, viscosity, and the abundance of heteroatom-containing compounds. Biodegradation changes the molecular fingerprints and geochemical parameters of degraded oils, thus making source and maturity identification inconclusive or ambiguous. Asphaltene is equivalent to the most labile part of the respective kerogen and generally has a high molecular weight. Its complex chemical structure, with substantial microporous units, makes it suitable to occlude hydrocarbons expelled from the source. The occluded molecules are well preserved by the macromolecular structure and thus protected from secondary alteration; they retain primary organic geochemical information over geological time. The present study involves the extraction of these occluded hydrocarbons from the asphaltene cage through mild oxidative degradation, using mild oxidants such as hydrogen peroxide (H₂O₂) and acetic acid (CH₃COOH) on purified asphaltenes of the biodegraded oils of the Mansa, Lanwa and Santhal fields in the Cambay Basin. The extracted occluded hydrocarbons were studied to establish oil-to-oil and oil-to-source correlations in the Mehsana block of the Cambay Basin. GC and GC-MS analyses of n-alkanes and biomarkers in these occluded hydrocarbons show biomarker imprints similar to those of the normal oils in the area, which are hence correlatable with them. The abundance of C29 steranes and the presence of oleanane, gammacerane and 4-methyl steranes indicate that the oils are derived from terrestrial organic matter deposited in a stratified saline water column in a marine environment, with moderate maturity (VRc 0.6-0.8). The oil-source correlation study suggests that the oils are derived from the Jotana-Warosan Low area. 
The developed geochemical technique for extracting the occluded hydrocarbons effectively resolves the ambiguity resulting from the inconclusive fingerprints of biodegraded oils, and the method can be applied to other biodegraded oils as well. Keywords: asphaltene, biomarkers, correlation, mild oxidation, occluded hydrocarbon
Procedia PDF Downloads 158
380 Assessment of Selected Marine Organisms from Malaysian Coastal Areas for Inhibitory Activity against the Chikungunya Virus
Authors: Yik Sin Chan, Nam Weng Sit, Fook Yee Chye, van Ofwegen Leen, de Voogd Nicole, Kong Soo Khoo
Abstract:
Chikungunya fever is an arboviral disease transmitted by Aedes mosquitoes. It has resulted in epidemics in tropical countries of the Indian Ocean and Southeast Asian regions. The recent spread of the disease to temperate countries such as France and Italy, coupled with the absence of vaccines and effective antiviral drugs, makes chikungunya fever a worldwide health threat. This study investigates the anti-chikungunya virus activity of selected marine organisms collected from Malaysian coastal areas, including seaweeds (Caulerpa racemosa, Caulerpa sertularioides and Kappaphycus alvarezii), a soft coral (Lobophytum microlobulatum) and a sponge (Spheciospongia vagabunda). Following lyophilization (oven drying at 40°C for K. alvarezii) and grinding to powder form, each sample was subjected to sequential solvent extraction using hexane, chloroform, ethyl acetate, ethanol, methanol and distilled water in order to extract bioactive compounds. Antiviral activity was evaluated using monkey kidney epithelial (Vero) cells infected with the virus (multiplicity of infection = 1), with cell viability determined by the Neutral Red uptake assay. Of the 30 extracts, 70% showed weak inhibitory activity, with cell viability ≤ 30%. Seven extracts exhibited moderate inhibitory activity (cell viability 31%-69%): the chloroform, ethyl acetate, ethanol and methanol extracts of C. racemosa; the chloroform and ethyl acetate extracts of L. microlobulatum; and the chloroform extract of C. sertularioides. Only the hexane and ethanol extracts of L. microlobulatum showed strong inhibitory activity against the virus, giving cell viabilities (mean ± SD; n = 3) of 73.3 ± 2.6% and 79.2 ± 0.9%, respectively. The corresponding mean 50% effective concentrations (EC50) were 14.2 ± 0.2 and 115.3 ± 1.2 µg/mL, respectively. The ethanol extract of the soft coral L. microlobulatum appears to hold the most promise for further characterization of its active principles, as it possessed a greater selectivity index (SI > 5.6) than the hexane extract (SI = 2.1). Keywords: antiviral, seaweed, sponge, soft coral, vero cell
Procedia PDF Downloads 289
379 From Shallow Semantic Representation to Deeper One: Verb Decomposition Approach
Authors: Aliaksandr Huminski
Abstract:
Semantic Role Labeling (SRL), a shallow semantic parsing approach, involves recognizing and labeling the arguments of a verb in a sentence. Verb participants are linked with specific semantic roles (Agent, Patient, Instrument, Location, etc.). Thus, SRL can answer key questions such as 'Who', 'When', 'What', and 'Where' in a text, and it is widely applied in dialog systems, question answering, named entity recognition, information retrieval, and other fields of NLP. However, SRL has the following flaw: two sentences with identical (or almost identical) meaning can have different semantic role structures. Consider two sentences: (1) John put butter on the bread. (2) John buttered the bread. The SRL for (1) and (2) differs significantly: for the verb put in (1) it is [Agent + Patient + Goal], but for the verb butter in (2) it is [Agent + Goal]. This happens because of one of the most interesting and intriguing features of a verb: its ability to capture participants, as in the case of the verb butter, or their features, as, say, in the case of the verb drink, where the participant's feature of being liquid is shared with the verb. This capture amounts to a total fusion of meaning and cannot be decomposed in a direct way (in comparison with compound verbs like babysit or breastfeed). From this perspective, SRL is too shallow to represent semantic structure. If the key point of semantic representation is the opportunity to use it for making inferences and finding hidden reasons, it assumes by default that two different but semantically identical sentences must have the same semantic structure; otherwise we will draw different inferences from the same meaning. To overcome the above-mentioned flaw, the following approach is suggested. 
Assume that: P is a participant of a relation; F is a feature of a participant; Vcp is a verb that captures a participant; Vcf is a verb that captures a feature of a participant; and Vpr is a primitive verb, i.e. a verb that does not capture any participant and represents only a relation. In other words, a primitive verb is a verb whose meaning does not include meanings from its surroundings. Then Vcp and Vcf can be decomposed as: Vcp = Vpr + P; Vcf = Vpr + F. If all Vcp and Vcf are represented this way, then the primitive verbs Vpr can be considered a canonical form for SRL. As a result, there will be no hidden participants caught by a verb, since all participants will be explicitly unfolded. An obvious example of a Vpr is the verb go, which represents pure movement; in this scheme, the verb drink can be represented as man-made movement of a liquid in a specific direction. Extracting and using primitive verbs for SRL creates a canonical representation that is unique for semantically identical sentences. This leads to the unification of semantic representation, and the critical flaw of SRL described above is resolved. Keywords: decomposition, labeling, primitive verbs, semantic roles
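As a sketch, the proposed canonicalization can be made concrete in a few lines of code; the two-entry lexicon below is a hypothetical illustration, not a resource from the paper:

```python
# Sketch of the proposed canonical decomposition: every capturing verb
# (Vcp/Vcf) maps to a primitive verb (Vpr) plus whatever it captures.
# The lexicon entries are illustrative assumptions, not from the paper.
LEXICON = {
    # Vcp: 'butter' captures its Patient participant ('butter')
    "butter": {"primitive": "put", "captured": {"Patient": "butter"}},
    # 'put' is treated here as (approximately) a primitive relation
    "put": {"primitive": "put", "captured": {}},
}

def canonicalize(verb, roles):
    """Unfold captured participants so that semantically identical
    sentences share one canonical role structure."""
    entry = LEXICON[verb]
    unfolded = dict(roles)
    unfolded.update(entry["captured"])
    return entry["primitive"], unfolded

# (1) John put butter on the bread.  -> [Agent + Patient + Goal]
s1 = canonicalize("put", {"Agent": "John", "Patient": "butter", "Goal": "bread"})
# (2) John buttered the bread.       -> surface roles [Agent + Goal]
s2 = canonicalize("butter", {"Agent": "John", "Goal": "bread"})

print(s1 == s2)  # True: both reduce to the same canonical structure
```

Both sentences collapse onto the primitive verb put with the full role set [Agent + Patient + Goal], which is exactly the unification of semantic representation the abstract argues for.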
Procedia PDF Downloads 367
378 Moulding Photovoice to Community: Supporting Aboriginal People Experiencing Homelessness to Share Their Stories through Photography
Authors: Jocelyn Jones, Louise Southalan, Lindey Andrews, Mandy Wilson, Emma Vieira, Jackie Oakley, Dorothy Bagshaw, Alice V. Brown, Patrick Egan, Duc Dau, Lucy Spanswick
Abstract:
Working with people experiencing homelessness requires careful use of methods that support them to share their experiences comfortably. This is particularly important for Aboriginal and Torres Strait Islander peoples, the traditional owners of Australia, who have experienced intergenerational and compounding trauma since colonisation. Aboriginal communities regularly experience research fatigue and distrust research's potential for impact. They often view research as extractive: a process of taking knowledge that empowers the research team and its institution rather than benefiting those being researched. Through a partnership between an Aboriginal Community Controlled Organisation and a university research institute, we conducted a community-driven research project with 70-90 Aboriginal people experiencing homelessness in Perth, Western Australia. The project aimed to listen to and advocate for the voices of those experiencing homelessness, guided by the Aboriginal community. In consultation with Aboriginal Elders, we selected methods that are considered culturally safe, including for those who would prefer to express their experiences creatively. This led us to run a series of Photovoice workshops, an established method that supports people to share their experiences through photography. This method treats participants as experts and is regularly used with marginalised groups across the world. We detail our experience and lessons in using Photovoice with Aboriginal community members experiencing homelessness, including the ways the method needed to be moulded to community needs and driven by individual choices, such as being flexible about the length of time participants would spend with us, how we would introduce the method to them, and providing support workers for participants when taking photos. 
We also discuss lessons in establishing and retaining engagement, and how the method succeeded in supporting participants to share their stories comfortably. Finally, we outline the insights into homelessness that the method offered, including the difficulty participants experienced in transitioning from homelessness to accommodation and the diversity of hopes that people who have experienced homelessness hold for the future. Keywords: Aboriginal and Torres Strait Islander peoples, photovoice, homelessness, community-led research
Procedia PDF Downloads 100
377 Alphabet Recognition Using Pixel Probability Distribution
Authors: Vaidehi Murarka, Sneha Mehta, Dishant Upadhyay
Abstract:
Our project topic is "Alphabet Recognition using pixel probability distribution". The project uses techniques from image processing and machine learning in computer vision. Alphabet recognition is the mechanical or electronic translation of scanned images of handwritten, typewritten or printed text into machine-encoded text. It is widely used to convert books and documents into electronic files. Alphabet-recognition-based OCR applications are sometimes used in signature recognition, which is employed in banks and other high-security buildings. One popular mobile application reads a visiting card and stores it directly to the contacts. OCRs are also used in radar systems for reading the license plates of speeding vehicles, among many other applications. Our project was implemented using Visual Studio and OpenCV (Open Source Computer Vision), and the algorithm is based on neural networks (machine learning). The project was implemented in three modules: (1) Training: this module handles database generation. The database was generated using two methods: (a) run-time generation, i.e. database generation at compilation time using the built-in fonts of the OpenCV library; human intervention is not necessary for generating this database. (b) Contour detection: a JPEG template containing different fonts of a letter is converted to a weighted matrix using specialized functions (contour detection and blob detection) of OpenCV. The main advantage of this type of database generation is that the algorithm becomes self-learning, and the final database requires little memory to store (119 KB precisely). (2) Preprocessing: the input image is pre-processed using image-processing operations such as adaptive thresholding, binarization and dilation, and is made ready for segmentation. Segmentation includes the extraction of lines, words, and letters from the processed text image. 
(3) Testing and prediction: the extracted letters are classified and predicted using the neural network algorithm. The algorithm recognizes a letter based on mathematical parameters calculated from the database and the weight matrix of the segmented image. Keywords: contour-detection, neural networks, pre-processing, recognition coefficient, runtime-template generation, segmentation, weight matrix
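A minimal sketch of the preprocessing and segmentation steps, under stated assumptions: the project itself uses OpenCV's adaptive thresholding and contour detection, but a NumPy-only global threshold and vertical-projection split on invented toy data illustrate the same idea:

```python
import numpy as np

# Toy grayscale "text line": two dark letter-like blobs on a white background.
img = np.full((8, 12), 255, dtype=np.uint8)
img[2:6, 1:4] = 20    # first "letter", columns 1-3
img[2:6, 7:11] = 30   # second "letter", columns 7-10

# Binarize (stand-in for OpenCV's adaptive thresholding).
binary = (img < 128).astype(np.uint8)   # 1 = ink, 0 = background

# Segment letters via a vertical projection profile: blank columns
# separate consecutive characters.
profile = binary.sum(axis=0)
letters, start = [], None
for x, count in enumerate(profile):
    if count and start is None:
        start = x                       # a letter begins
    elif not count and start is not None:
        letters.append((start, x))      # a letter ends
        start = None
if start is not None:
    letters.append((start, len(profile)))

print(letters)  # [(1, 4), (7, 11)] -- column spans of the two letters
```

Each recovered column span would then be cropped and passed to the classifier of module (3).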
Procedia PDF Downloads 389
376 Structural Analysis and Modelling in an Evolving Iron Ore Operation
Authors: Sameh Shahin, Nannang Arrys
Abstract:
Optimizing pit slope stability and reducing the strip ratio of a mining operation are two key tasks in geotechnical engineering. With growing demand for minerals and increasing extraction costs, companies are constantly re-evaluating the viability of mineral deposits and challenging their geological understanding. Within Rio Tinto Iron Ore, the Structural Geology (SG) team investigates and collects critical data, such as point-based orientations, mapping, and geological inferences from adjacent pits, to re-model deposits where previous interpretations have failed to account for structurally controlled slope failures. Utilizing innovative data collection methods and data-driven investigation, SG aims to address the root causes of slope instability. Committing to a resource grid drilling campaign as the primary source of data will often bias data collection toward a specific orientation and significantly reduce the capability to identify and qualify complexity. Consequently, these limitations make it difficult to construct a realistic and coherent structural model that identifies adverse structural domains. Without considering complexity and without the capability to capture these structural domains, mining operations run the risk of inadequately designed slopes that may fail and potentially harm people. Regional structural trends have been considered in conjunction with surface and in-pit mapping data to model multi-batter fold structures that were absent from previous iterations of the structural model. The risk is evident in newly identified dip-slope and rock-mass-controlled sectors of the geotechnical design, rather than a ubiquitous dip-slope sector across the pit. The reward is two-fold: 1) providing sectors of rock-mass-controlled design in previously interpreted structurally controlled domains, and 2) the opportunity to optimize the slope angle for mineral recovery and a reduced strip ratio. 
A further outcome is a high-confidence model whose structures and geometries can account for historic slope instabilities in structurally controlled domains where design assumptions had failed. Keywords: structural geology, geotechnical design, optimization, slope stability, risk mitigation
Procedia PDF Downloads 47
375 Deciphering Orangutan Drawing Behavior Using Artificial Intelligence
Authors: Benjamin Beltzung, Marie Pelé, Julien P. Renoult, Cédric Sueur
Abstract:
To this day, it is not known whether drawing is a specifically human behavior or whether this behavior finds its origins in ancestor species. An interesting window onto this question is to analyze drawing behavior in species genetically close to humans, such as non-human primates. A good candidate for this approach is the orangutan, which shares 97% of our genes and exhibits multiple human-like behaviors. Focusing on figurative aspects may not be suitable for orangutans' drawings, which may appear as scribbles yet still carry meaning. Manual feature selection would introduce an anthropocentric bias, as the features selected by humans may not match those relevant for orangutans. In the present study, we used deep learning to analyze the drawings of a female orangutan named Molly († 2011), who produced 1,299 drawings during her last five years as part of a behavioral enrichment program at the Tama Zoo in Japan. We investigate multiple ways to decipher Molly's drawings. First, we demonstrate the existence of differences between seasons by training a deep learning model to classify Molly's drawings according to season. Then, to understand and interpret these seasonal differences, we analyze how the information spreads within the network, from shallow to deep layers, where early layers encode simple local features and deep layers encode more complex and global information. More precisely, we investigate the impact of feature complexity on classification accuracy through feature extraction fed to a Support Vector Machine. Last, we leverage style transfer to dissociate features associated with drawing style from those describing the representational content, and analyze the relative importance of these two types of features in explaining seasonal variation. Content features were relevant for the classification, showing the presence of meaning in these non-figurative drawings and the ability of deep learning to decipher these differences. 
The style of the drawings was also relevant, as style features encoded enough information for classification better than random. The accuracy of style features was higher for deeper layers, demonstrating and highlighting the variation of style between seasons in Molly's drawings. Through this study, we show how deep learning can help find meaning in non-figurative drawings and interpret their differences. Keywords: cognition, deep learning, drawing behavior, interpretability
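The layer-features-to-SVM step can be sketched as follows. Since the study's CNN activations are not available here, the two feature sets are simulated stand-ins (an assumption), constructed so that the "deep" set is more class-separable than the "shallow" one:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 100
labels = np.repeat([0, 1], n)  # two "seasons" as classes

# Stand-ins for activations extracted from a pretrained CNN:
# "shallow-layer" features with heavy class overlap...
shallow = rng.normal(0.0, 1.0, size=(2 * n, 32))
shallow[n:] += 0.2
# ...and "deep-layer" features with clearer class separation.
deep = rng.normal(0.0, 1.0, size=(2 * n, 32))
deep[n:] += 2.0

# Feed each feature set to a linear SVM and compare cross-validated accuracy.
results = {}
for name, feats in [("shallow", shallow), ("deep", deep)]:
    results[name] = cross_val_score(SVC(kernel="linear"), feats, labels, cv=5).mean()
    print(name, round(results[name], 2))
```

Comparing the two accuracies mirrors the study's question of how classification performance changes with feature complexity from shallow to deep layers.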
Procedia PDF Downloads 165
374 Desulphurization of Waste Tire Pyrolytic Oil (TPO) Using Photodegradation and Adsorption Techniques
Authors: Moshe Mello, Hilary Rutto, Tumisang Seodigeng
Abstract:
The nature of tires makes them extremely challenging to recycle: their chemically cross-linked polymer is neither fusible nor soluble, so tires cannot be remolded into other shapes without serious degradation. Open dumping of tires pollutes the soil, contaminates underground water, and provides ideal breeding grounds for disease-carrying vermin. The thermal decomposition of tires by pyrolysis produces char, gases and oil. Oils derived from waste tires have properties in common with commercial diesel fuel. The problem with the light oil derived from the pyrolysis of waste tires is its high sulfur content (> 1.0 wt.%), so it emits harmful sulfur oxide (SOx) gases to the atmosphere when combusted in diesel engines. Desulphurization of TPO is necessary given increasingly stringent environmental regulations worldwide. Hydrodesulphurization (HDS) is the commonly practiced technique for removing sulfur species from liquid hydrocarbons; however, HDS fails in the presence of complex sulfur species such as dibenzothiophene (DBT), which are present in TPO. This study investigates the viability of photodegradation (photocatalytic oxidative desulphurization) and adsorptive desulphurization technologies for the efficient removal of complex and non-complex sulfur species from TPO. The study focuses on optimizing the cleaning process (removal of impurities and asphaltenes) by varying the process parameters temperature, stirring speed, acid/oil ratio, and time. The treated TPO will then be sent for vacuum distillation to attain the desired diesel-like fuel. The effects of temperature, pressure and time will be determined for vacuum distillation of both the raw TPO and the acid-treated oil for comparison purposes. 
Polycyclic sulfides present in the distilled (diesel-like) light oil will be oxidized predominantly to the corresponding sulfoxides and sulfones via a photocatalyzed system using TiO2 as the catalyst and hydrogen peroxide as the oxidizing agent, with acetonitrile finally used as the extraction solvent. Adsorptive desulphurization will then be used to adsorb traces of sulfurous compounds remaining after the photocatalytic desulphurization step. This desulphurization sequence is expected to give high desulphurization efficiency with reasonable oil recovery. Keywords: adsorption, asphaltenes, photocatalytic oxidation, pyrolysis
Procedia PDF Downloads 272
373 Antibacterial Effects of Some Medicinal and Aromatic Plant Extracts on Pathogenic Bacteria Isolated from Pear Orchards
Authors: Kubilay Kurtulus Bastas
Abstract:
Bacterial diseases are very destructive and cause economic losses on pears. Plant extracts that show promise for the management of plant diseases are environmentally safe and long-lasting, and extracts of certain plants contain alkaloids, tannins, quinones, coumarins, phenolic compounds, and phytoalexins. In this study, bacteria were isolated from different parts of pear trees exhibiting characteristic symptoms of bacterial diseases in Central Anatolia, Turkey. The pathogenic bacteria were identified by morphological, physiological, biochemical and molecular methods as fire blight (Erwinia amylovora, 39%), bacterial blossom blast and blister bark (Pseudomonas syringae pv. syringae, 22%), and crown gall (Rhizobium radiobacter, 1%) from different pear cultivars, and the virulence levels of the pathogens were determined with pathogenicity tests. The air-dried material of 25 plants was ground into fine powder, and extraction was performed at room temperature by maceration with 80% (v/v) methanol in distilled water. The minimum inhibitory concentration (MIC) values were determined using a modified disc diffusion method at five different concentrations, with streptomycin sulphate as the chemical control. Bacterial suspensions were prepared at densities of 10⁸ CFU ml⁻¹, and 100 µl of each suspension was spread on TSA medium. Antimicrobial activity was evaluated by measuring the inhibition zones against the test organisms. Among the tested plants, Origanum vulgare, Hedera helix, Satureja hortensis, Rhus coriaria, Eucalyptus globulus, Rosmarinus officinalis, Ocimum basilicum, Salvia officinalis, Cuminum cyminum and Thymus vulgaris showed good antibacterial activity, inhibiting the growth of the pathogens with inhibition zone diameters ranging from 7 to 27 mm at 20% (w/v) in absolute methanol under in vitro conditions. In vivo, the highest efficacies were 27% in reducing tumor formation by R. radiobacter, and 48% and 41% in reducing shoot blight caused by E. amylovora and P. s. pv. syringae on pear seedlings, respectively. The data obtained indicate that some plant extracts may be used against bacterial diseases of pome fruits within sustainable and organic management programs. Keywords: bacteria, eco-friendly management, organic, pear, plant extract
Procedia PDF Downloads 335
372 Photoswitchable and Polar-Dependent Fluorescence of Diarylethenes
Authors: Sofia Lazareva, Artem Smolentsev
Abstract:
Fluorescent photochromic materials attract strong interest due to their possible applications in organic photonics, such as optical logic systems, optical memory, and visualizing sensors, as well as in the characterization of polymers and biological systems. In photochromic fluorescence-switching systems, the emission of a fluorophore is modulated between 'on' and 'off' via the photoisomerization of photochromic moieties, resulting in effective resonance energy transfer (FRET). In the current work, we have studied both the photochromic and fluorescent properties of several diarylethenes. The coloured forms of these compounds are not fluorescent because of efficient intramolecular energy transfer. Spectral and photochromic parameters of the investigated substances were measured in five solvents of different polarity. The quantum yields of the photochromic transformation A↔B (ΦA→B and ΦB→A) as well as the extinction coefficients of the B isomers were determined by a kinetic method. The photocyclization quantum yield of all compounds decreases with increasing solvent polarity. In addition, solvent polarity was found to affect the fluorescence significantly: increasing the dielectric constant of the solvent results in a strong shift of the emission band, from 450 nm (n-hexane) to 550 nm (DMSO and ethanol), for all three compounds. Moreover, the emission, intense in polar solvents, becomes weak and hardly detectable in n-hexane. The only exception to this trend is an abnormally low fluorescence quantum yield in ethanol, presumably caused by the loss of the electron-donating properties of the nitrogen atom due to protonation. The effect of protonation was confirmed by the addition of concentrated HCl to the solution, which resulted in the complete disappearance of the fluorescence band. Excited-state dynamics were investigated by ultrafast optical spectroscopy methods. 
Kinetic curves of excited-state absorption and fluorescence decay were measured, and the lifetimes of the transient states were calculated from these data. The mechanism of the ring-opening reaction was found to be polarity dependent. Comparative analysis of the kinetics measured in acetonitrile and hexane reveals differences in the relaxation dynamics after the laser pulse. Most importantly, two decay processes are present in acetonitrile, whereas only one is present in hexane. This supports an assumption, made on the basis of preliminary steady-state experiments, that stabilization of a TICT state occurs in polar solvents. The results thus support the hypothesis of a two-channel mechanism of energy relaxation in the compounds studied. Keywords: diarylethenes, fluorescence switching, FRET, photochromism, TICT state
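Lifetime extraction of this kind usually amounts to mono- vs. bi-exponential fitting of the decay traces. A minimal sketch with synthetic data (the lifetimes below are invented placeholders, not the measured values) could be:

```python
import numpy as np
from scipy.optimize import curve_fit

def mono(t, a, tau):
    # single decay channel
    return a * np.exp(-t / tau)

def bi(t, a1, tau1, a2, tau2):
    # two decay channels
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

t = np.linspace(0, 50, 500)           # delay axis (arbitrary units)
rng = np.random.default_rng(1)

# "Hexane-like" trace: a single decay process.
hex_trace = mono(t, 1.0, 5.0) + rng.normal(0, 0.005, t.size)
# "Acetonitrile-like" trace: two decay processes (e.g. with a TICT state).
acn_trace = bi(t, 0.6, 2.0, 0.4, 15.0) + rng.normal(0, 0.005, t.size)

popt_hex, _ = curve_fit(mono, t, hex_trace, p0=[1.0, 3.0])
tau_hex = popt_hex[1]                 # single recovered lifetime (~5)

popt_acn, _ = curve_fit(bi, t, acn_trace, p0=[0.5, 1.0, 0.5, 10.0])
taus = sorted(popt_acn[[1, 3]])       # two recovered lifetimes (~2 and ~15)

print(round(tau_hex, 1), [round(x, 1) for x in taus])
```

Recovering one lifetime from the first trace but two from the second mirrors the one-process vs. two-process contrast reported between hexane and acetonitrile.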
Procedia PDF Downloads 679
371 Transmedia and Platformized Political Discourse in a Growing Democracy: A Study of Nigeria’s 2023 General Elections
Authors: Tunde Ope-Davies
Abstract:
Transmediality and platformization, as online content-sharing protocols, have continued to accentuate the unprecedented digital revolution unfolding across the world. The rapid transformation of all sectors brought by this revolution continues to spotlight the increasing importance of new media technologies in redefining and reshaping the rhythm and dynamics of our private and public discursive practices. Equally, social and political activities are impacted daily through the creation and transmission of political discourse content on multi-channel platforms such as mobile telephone communication, social media networks, and the internet. Digital platforms have become central to the production, processing, and distribution of multimodal social data and cultural content. The platformization paradigm thus underpins our understanding of how digital platforms enhance the production and heterogeneous distribution of media and cultural content, and of how this process facilitates socioeconomic and political activities. The use of multiple digital platforms to share and transmit political discourse material, synchronously and asynchronously, has gained momentum in the last few years. Nigeria's 2023 general elections amplified the use of social media and other online platforms as tools for electioneering campaigns, socio-political mobilization, and civic engagement. The study therefore focuses on transmedia and platformed political discourse as a new strategy to promote political candidates and their manifestos in order to mobilize support and woo voters. This innovative transmedia digital discourse model involves a constellation of online texts and images transmitted through different online platforms almost simultaneously. 
The data for the study were extracted from the 2023 general election campaigns in Nigeria between January and March 2023 through media monitoring, manual download, and the use of software to harvest online electioneering campaign material. I adopted a discursive-analytic qualitative technique with toolkits drawn from a computer-mediated multimodal discourse paradigm. The study maps the progressive development of digital political discourse in this young democracy. The findings also demonstrate the ongoing transformation of modern democratic practice through platform-dependent and transmedia political discourse: political actors and media practitioners now deploy layers of social media network platforms to convey messages and mobilize supporters, aggregating and maximizing the impact of their media campaigns and audience reach. Keywords: social media, digital humanities, political discourse, platformized discourse, multimodal discourse
Procedia PDF Downloads 85
370 [Keynote Talk]: Production Flow Coordination on Supply Chains: Brazilian Case Studies
Authors: Maico R. Severino, Laura G. Caixeta, Nadine M. Costa, Raísa L. T. Napoleão, Éverton F. V. Valle, Diego D. Calixto, Danielle Oliveira
Abstract:
One of the biggest barriers that companies face nowadays is the coordination of production flow in their Supply Chains (SC). In this study, coordination is understood as a mechanism for integrating the entire production channel, with everyone involved focused on achieving the same goals. Sometimes, this coordination is attempted through the use of logistics practices or production planning and control methods. No papers were found in the literature that presented the combined use of logistics practices and production planning and control methods. The main objective of this paper is to propose solutions for six case studies combining logistics practices and Ordering Systems (OS). The methodology used in this study was a conceptual decision-making model. This model contains six phases: a) the analysis of the types and characteristics of relationships in the SC; b) the choice of the OS; c) the choice of the logistics practices; d) the development of alternative proposals of combined use; e) the analysis of the consistency of the chosen alternative; f) the qualitative and quantitative assessment of the impact on the coordination of the production flow and the verification of the applicability of the proposal in the real case. This study was conducted on six Brazilian SC of different sectors: footwear, food and beverages, garment, sugarcane, mineral and metal mechanical. 
The results of this study showed that the coordination of the production flow improved through the following proposals: a) for the footwear industry, the use of Period Batch Control (PBC), Quick Response (QR) and Enterprise Resource Planning (ERP); b) for the food and beverage sector, firstly the use of Electronic Data Interchange (EDI), ERP, Continuous Replenishment (CR) and Drum-Buffer-Rope (DBR) (for situations in which the plants of the two companies are distant), and secondly EDI, ERP, Milk-Run and a Continuous Review System (for situations in which the plants of the two companies are close); c) for the garment industry, the use of Collaborative Planning, Forecasting, and Replenishment (CPFR) and a Constant Work-In-Process (CONWIP) system; d) for the sugarcane sector, the use of EDI, ERP and a CONWIP system; e) for the mineral processing industry, the use of Vendor Managed Inventory (VMI), EDI and a Max-Min control system; f) for the metal mechanical sector, the use of a CONWIP system and Continuous Replenishment (CR). It should be emphasized that the proposals are recommended exclusively for the client-supplier relationships studied; therefore, they cannot be generalized to other cases. What can be generalized, however, is the methodology used to choose the best practices for each case. Based on the study, it can be concluded that the combined use of OS and logistics practices enables better coordination of production flow in SC.
Keywords: supply chain management, production flow coordination, logistics practices, ordering systems
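The sector-specific proposals reported by the study can be restated as a simple lookup table, a minimal illustrative sketch in Python (the `recommend` helper and the dictionary key names are our own labels, not part of the study's methodology):

```python
# Sector-to-proposal mapping restating the case-study results; the dictionary
# structure is only an illustrative encoding of the abstract's list.
proposals = {
    "footwear": ["PBC", "QR", "ERP"],
    "food and beverages (distant plants)": ["EDI", "ERP", "CR", "DBR"],
    "food and beverages (close plants)": ["EDI", "ERP", "Milk-Run", "Continuous Review System"],
    "garment": ["CPFR", "CONWIP"],
    "sugarcane": ["EDI", "ERP", "CONWIP"],
    "mineral": ["VMI", "EDI", "Max-Min Control System"],
    "metal mechanical": ["CONWIP", "CR"],
}

def recommend(sector):
    """Look up the combined OS + logistics-practice proposal for a sector.

    Returns an empty list for sectors outside the six studied cases,
    reflecting the authors' caveat that results do not generalize.
    """
    return proposals.get(sector, [])
```

As the authors stress, it is the six-phase selection methodology, not this table, that generalizes to other client-supplier relationships.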
Procedia PDF Downloads 208
369 Integration of EEG and Motion Tracking Sensors for Objective Measure of Attention-Deficit Hyperactivity Disorder in Pre-Schoolers
Authors: Neha Bhattacharyya, Soumendra Singh, Amrita Banerjee, Ria Ghosh, Oindrila Sinha, Nairit Das, Rajkumar Gayen, Somya Subhra Pal, Sahely Ganguly, Tanmoy Dasgupta, Tanusree Dasgupta, Pulak Mondal, Aniruddha Adhikari, Sharmila Sarkar, Debasish Bhattacharyya, Asim Kumar Mallick, Om Prakash Singh, Samir Kumar Pal
Abstract:
Background: We aim to develop an integrated device comprising a single-probe EEG and a CCD-based motion sensor for a more objective measure of Attention-Deficit Hyperactivity Disorder (ADHD). While the integrated device (MAHD) relies on the EEG signal (spectral density of the beta wave) for the assessment of attention during a given structured task (painting three segments of a circle using three different colors, namely red, green and blue), the CCD sensor depicts the movement pattern of the subjects engaged in a continuous performance task (CPT). A statistical analysis of the attention and movement patterns was performed, and the accuracy of the completed tasks was analysed using indigenously developed software. The device with the embedded software, called MAHD, is intended to improve certainty with criterion E (i.e. whether symptoms are better explained by another condition). Methods: We used the EEG signal from a single-channel dry sensor placed on the frontal lobe of the head of the subjects (3- to 5-year-old pre-schoolers). During the painting of three segments of a circle using three distinct colors (red, green, and blue), the absolute power of the delta and beta EEG waves from the subjects is found to be correlated with relaxation and attention/cognitive-load conditions, respectively. While the relaxation condition of the subject hints at hyperactivity, a more direct CCD-based motion sensor is used to track the physical movement of the subject engaged in a continuous performance task (CPT), i.e., the separation of variously colored balls from one table to another. We used our indigenously developed software for the statistical analysis to derive a scale for the objective assessment of ADHD. We also compared our scale with conventional clinical ADHD evaluation. Results: In a limited clinical trial with preliminary statistical analysis, we found a significant correlation between the objective assessment of the ADHD subjects and the clinician’s conventional evaluation. 
Conclusion: MAHD, the integrated device, is intended as an auxiliary tool to improve the accuracy of ADHD diagnosis by supporting greater criterion E certainty.
Keywords: ADHD, CPT, EEG signal, motion sensor, psychometric test
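The beta-band spectral power used above as an attention proxy can be sketched in a few lines of Python. This is a generic periodogram band-power computation on a synthetic trace, not the authors' MAHD software; the sampling rate, band limits, and signal composition are illustrative assumptions:

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Absolute power in a frequency band from a one-sided periodogram."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * signal.size)
    mask = (freqs >= low) & (freqs <= high)
    # Sum PSD bins in the band, scaled by the frequency resolution
    return psd[mask].sum() * (freqs[1] - freqs[0])

rng = np.random.default_rng(0)
fs = 256                                # assumed sampling rate, Hz
t = np.arange(0, 10, 1.0 / fs)
# Synthetic trace: a dominant 20 Hz (beta-band) oscillation over noise,
# standing in for an attentive-state frontal EEG recording.
eeg = np.sin(2 * np.pi * 20 * t) + 0.2 * rng.standard_normal(t.size)

beta = band_power(eeg, fs, 13, 30)      # attention / cognitive-load proxy
delta = band_power(eeg, fs, 0.5, 4)     # relaxation proxy
```

For this attentive-state synthetic signal, the beta-band power dominates the delta-band power, mirroring the correlation the abstract describes.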
Procedia PDF Downloads 99
368 Facilitated Massive Open Online Course (MOOC) Based Teacher Professional Development in Kazakhstan: Connectivism-Oriented Practices
Authors: A. Kalizhanova, T. Shelestova
Abstract:
Teacher professional development (TPD) in Kazakhstan has followed a fairly standard format for decades, with teachers learning new information from a lecturer and being tested using multiple-choice questions. In the online world, self-access courses have become increasingly popular. Due to their extensive multimedia content, peer-reviewed assignments, adaptable class times, and instruction from top university faculty from across the world, massive open online courses (MOOCs) have found a home in Kazakhstan's system for lifelong learning. Recent studies indicate the limited use of connectivism-based tools such as discussion forums by Kazakhstani pre-service and in-service English teachers, whose professional interests are limited to obtaining certificates rather than enhancing their teaching abilities and exchanging knowledge with colleagues. This paper highlights the significance of connectivism-based tools and instruments, such as MOOCs, for the continuous professional development of pre- and in-service English teachers, facilitators' roles, and their strategies for enhancing trainees' conceptual knowledge within the MOOCs' curriculum and online learning skills. A code-extraction method was utilized to review the most pertinent papers on Connectivism Theory, facilitators' roles in TPD, and connectivism-based tools such as MOOCs. Three experts, former active participants in a series of projects initiated across Kazakhstan to improve the efficacy of MOOCs, evaluated the excerpts and selected the most appropriate ones to propose a matrix of teacher professional competencies that can be acquired through MOOCs. In this paper, we also examine some of the strategies employed by course instructors to boost their students' English skills and knowledge of course material, both inside and outside of the MOOC platform. 
Small-group interactive learning contributed to participants' language and subject conceptual knowledge and prepared them for peer-reviewed assignments in the MOOCs. Both formal and informal continuing education institutions can use the findings of this study to support teachers in gaining experience with MOOCs and creating their own online courses.
Keywords: connectivism-based tools, teacher professional development, massive open online courses, facilitators, Kazakhstani context
Procedia PDF Downloads 80
367 Marine Environmental Monitoring Using an Open Source Autonomous Marine Surface Vehicle
Authors: U. Pruthviraj, Praveen Kumar R. A. K. Athul, K. V. Gangadharan, S. Rao Shrikantha
Abstract:
An open-source-based autonomous unmanned marine surface vehicle (UMSV) is developed for marine applications such as pollution control, environmental monitoring and thermal imaging. A double rotomoulded hull boat is deployed, which is rugged, tough, quick to deploy and fast-moving. It is suitable for environmental monitoring, and it is designed for easy maintenance. A 2 HP electric outboard marine motor is used, which is powered by a lithium-ion battery and can also be charged from a solar charger. All connections are completely waterproof to IP67 rating. At full throttle, the marine motor is capable of speeds of up to 7 km/h. The motor is integrated with an open-source controller based on a Cortex-M4F for adjusting the direction of the motor. This UMSV can be operated in three modes: semi-autonomous, manual and fully automated. One of the channels of a 2.4 GHz 8-channel radio-link transmitter is used for toggling between the different modes of the UMSV. An on-board GPS system has been fitted to the electric outboard marine motor to find the range and GPS position. The entire system can be assembled in the field in less than 10 minutes. A FLIR Lepton thermal camera core is integrated with a 64-bit quad-core Linux-based open-source processor, facilitating real-time capture of thermal images; the results are stored on a micro SD card, which serves as the data storage device for the system. The thermal camera is interfaced to the processor through the SPI protocol. These thermal images are used for finding oil spills and for looking for people who are drowning in low visibility during the night. A real-time clock (RTC) module is attached to the battery to provide the date and time of the thermal images captured. For the live video feed, a 900 MHz long-range video transmitter and receiver are set up, by which a range of 40 miles has been achieved at higher power output. 
A multi-parameter probe is used to measure the following parameters: conductivity, salinity, resistivity, density, dissolved oxygen content, ORP (oxidation-reduction potential), pH level, temperature, water level and pressure (absolute). The maximum pressure it can withstand is 160 psi, at depths of up to 100 m. This work represents a field demonstration of an open-source-based autonomous navigation system for a marine surface vehicle.
Keywords: open source, autonomous navigation, environmental monitoring, UMSV, outboard motor, multi-parameter probe
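Position fixes from an on-board GPS receiver like the one described typically arrive as NMEA 0183 sentences. A minimal sketch of decoding a GGA sentence into decimal degrees follows; the example sentence is a standard textbook fix, and the helper names are illustrative, not part of the authors' system (which may parse positions differently):

```python
def parse_gga(sentence):
    """Convert a NMEA GGA sentence to (latitude, longitude) in decimal degrees."""
    fields = sentence.split(",")

    def to_degrees(value, hemisphere, deg_digits):
        # NMEA encodes position as ddmm.mmmm (latitude) or dddmm.mmmm (longitude)
        degrees = float(value[:deg_digits])
        minutes = float(value[deg_digits:])
        decimal = degrees + minutes / 60.0
        # South and West hemispheres are negative in decimal-degree convention
        return -decimal if hemisphere in ("S", "W") else decimal

    lat = to_degrees(fields[2], fields[3], 2)
    lon = to_degrees(fields[4], fields[5], 3)
    return lat, lon

# Widely used example GGA sentence (a fix near Munich)
lat, lon = parse_gga("$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47")
```

A production parser would also verify the trailing checksum and the fix-quality field before trusting the coordinates.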
Procedia PDF Downloads 241
366 Thermo-Hydro-Mechanical-Chemical Coupling in Enhanced Geothermal Systems: Challenges and Opportunities
Authors: Esmael Makarian, Ayub Elyasi, Fatemeh Saberi, Olusegun Stanley Tomomewo
Abstract:
Geothermal reservoirs (GTRs) have garnered global recognition as a sustainable energy source. Integrated Thermo-Hydro-Mechanical-Chemical (THMC) coupling proves to be a practical and effective method for optimizing production in GTRs. The study outcomes demonstrate that THMC coupling serves as a versatile and valuable tool, offering in-depth insights into GTRs and enhancing their operational efficiency. This is achieved through the analysis of temperature and pressure changes and their impacts on mechanical properties, structural integrity, fracture aperture, permeability, and heat extraction efficiency. Moreover, THMC coupling facilitates the assessment of potential benefits and risks associated with different geothermal technologies, considering the complex thermal, hydraulic, mechanical, and chemical interactions within the reservoirs. However, the utilization of THMC coupling in GTRs presents a multitude of challenges. These challenges include accurately modeling and predicting behavior due to the interconnected nature of the processes, limited data availability leading to uncertainties, the risk of induced seismic events for nearby communities, scaling and mineral deposition reducing operational efficiency, and concerns about the reservoirs' long-term sustainability. In addition, material degradation, environmental impacts, technical challenges in monitoring and control, accurate assessment of resource potential, and regulatory and social acceptance further complicate geothermal projects. Addressing these multifaceted challenges is crucial for the successful and sustainable utilization of geothermal energy resources. This paper aims to illuminate the challenges and opportunities associated with THMC coupling in enhanced geothermal systems. Practical solutions and strategies for mitigating these challenges are discussed, emphasizing the need for interdisciplinary approaches, improved data collection and modeling techniques, and advanced monitoring and control systems. 
Overcoming these challenges is imperative for unlocking the full potential of geothermal energy and making a substantial contribution to the global energy transition and sustainable development.
Keywords: geothermal reservoirs, THMC coupling, interdisciplinary approaches, challenges and opportunities, sustainable utilization
Procedia PDF Downloads 69
365 Artificial Habitat Mapping in Adriatic Sea
Authors: Annalisa Gaetani, Anna Nora Tassetti, Gianna Fabi
Abstract:
Hydroacoustic technology is an efficient tool for studying the sea environment: the most recent advances in artificial habitat mapping involve acoustic systems to investigate fish abundance, distribution and behavior in specific areas. Along with detailed high-coverage bathymetric mapping of the seabed, the high-frequency multibeam echosounder (MBES) offers the potential of detecting the fine-scale distribution of fish aggregations, combining its ability to detect the seafloor and the water column at the same time. By surveying fish-school distribution around artificial structures, MBES makes it possible to evaluate how their presence modifies the natural biological habitat over time in terms of fish attraction and abundance. In recent years, artificial habitat mapping campaigns have been carried out by CNR-ISMAR in the Adriatic Sea: fish assemblages aggregating at offshore gas platforms and artificial reefs have been systematically monitored employing different kinds of methodologies. This work focuses on two case studies: a gas extraction platform installed at a depth of 80 meters in the central Adriatic Sea, 30 miles off the coast of Ancona, and the concrete and steel artificial reef of Senigallia, deployed by CNR-ISMAR about 1.2 miles offshore at a depth of 11.2 m. Relating the MBES data (metrical dimensions of fish assemblages, shape, depth, density, etc.) to the results coming from other methodologies, such as experimental fishing surveys and underwater video cameras, it has been possible to investigate the biological assemblage attracted by artificial structures, hypothesizing which species populate the investigated area and their spatial dislocation from these artificial structures. 
By processing MBES bathymetric and water-column data, 3D virtual scenes of the artificial habitats have been created, providing an intuitive depiction of their state and allowing evaluation over time of changes in their dimensional characteristics and in the depth disposition of fish schools. These MBES surveys play a leading part in the general multi-year programs carried out by CNR-ISMAR with the aim of assessing potential biological changes linked to human activities.
Keywords: artificial habitat mapping, fish assemblages, hydroacoustic technology, multibeam echosounder
Procedia PDF Downloads 260