1142 Predictive Semi-Empirical NOx Model for Diesel Engine
Authors: Saurabh Sharma, Yong Sun, Bruce Vernham
Abstract:
Accurate prediction of NOx emission is a continuous challenge in the field of diesel engine-out emission modeling. Performing experiments for every condition and scenario costs a significant amount of money and man-hours; therefore, a model-based development strategy has been implemented to address this issue. NOx formation is highly dependent on the burned gas temperature and the O2 concentration inside the cylinder. Current empirical models are developed by calibrating parameters representing the engine operating conditions against measured NOx, which limits the prediction of purely empirical models to the region in which they were calibrated. An alternative solution is presented in this paper, which focuses on the utilization of in-cylinder combustion parameters to form a predictive semi-empirical NOx model. The result of this work is a fast and predictive NOx model built from physical parameters and empirical correlations. The model is developed from steady-state data collected over the entire operating region of the engine and from a predictive combustion model built in Gamma Technologies (GT)-Power using the Direct Injection (DI)-Pulse combustion object. In this approach, the temperatures in both the burned and unburned zones are considered over the combustion period, i.e., from Intake Valve Closing (IVC) to Exhaust Valve Opening (EVO). The oxygen concentration consumed in the burned zone and the trapped fuel mass are also considered in developing the reported model. Several statistical methods are used to construct the model, including individual machine learning methods and ensemble machine learning methods. A detailed validation of the model on multiple diesel engines is reported in this work. A substantial number of cases are tested for different engine configurations over a large span of speed and load points.
Different sweeps of operating conditions, such as Exhaust Gas Recirculation (EGR), injection timing and Variable Valve Timing (VVT), are also considered for the validation. The model shows very good predictability and robustness at both sea-level and altitude conditions under different ambient conditions. Its various advantages, such as high accuracy and robustness across operating conditions, low computational time and the lower number of data points required for calibration, establish a platform on which the model-based approach can be used for the engine calibration and development process. Moreover, this work aims to establish a framework for future model development for various other targets such as soot, Combustion Noise Level (CNL), NO2/NOx ratio, etc.
Keywords: diesel engine, machine learning, NOₓ emission, semi-empirical
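The strong dependence of NOx formation on burned-zone temperature and O2 concentration described in this abstract can be illustrated with a minimal semi-empirical sketch. This is not the authors' actual model: the activation temperature is the standard extended-Zeldovich value for the rate-limiting N2 + O step, and the pre-factor is an illustrative calibration constant.

```python
import math

def thermal_no_rate(t_burned_k, o2_mole_frac, n2_mole_frac=0.78, a_cal=6.0e16):
    """Zeldovich-type thermal NO formation rate (arbitrary units).

    Semi-empirical form: rate ~ A / sqrt(T) * exp(-69090 / T) * [O2]^0.5 * [N2],
    where 69,090 K is the activation temperature of the rate-limiting
    N2 + O -> NO + N reaction and A is an empirically calibrated constant.
    """
    return (a_cal / math.sqrt(t_burned_k)
            * math.exp(-69090.0 / t_burned_k)
            * math.sqrt(o2_mole_frac) * n2_mole_frac)

# NO formation is extremely sensitive to burned-zone temperature:
# here a 200 K rise increases the rate by nearly an order of magnitude.
low = thermal_no_rate(2400.0, 0.05)
high = thermal_no_rate(2600.0, 0.05)
```

This exponential sensitivity is why semi-empirical models that track the burned-zone temperature from IVC to EVO generalize better than purely empirical fits of engine operating conditions.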
Procedia PDF Downloads 113
1141 A Dynamic Mechanical Thermal T-Peel Test Approach to Characterize Interfacial Behavior of Polymeric Textile Composites
Authors: J. R. Büttler, T. Pham
Abstract:
Basic understanding of interfacial mechanisms is important for the development of polymer composites. For this purpose, we need techniques to analyze the quality of interphases, their chemical and physical interactions, and their strength and fracture resistance. To investigate interfacial phenomena in detail, advanced characterization techniques are favorable. Dynamic mechanical thermal analysis (DMTA) using a rheological system is a sensitive tool. T-peel tests were performed with this system to investigate the temperature-dependent peel behavior of woven textile composites. A model system was made of polyamide (PA) woven fabric laminated with films of polypropylene (PP) or PP modified by grafting with maleic anhydride (PP-g-MAH). Firstly, control measurements were performed with the PP matrices alone. Polymer melt investigations, as well as the extensional stress, extensional viscosity and extensional relaxation modulus at -10 °C, 100 °C and 170 °C, demonstrate similar viscoelastic behavior for films made of PP-g-MAH and the non-modified PP control. Frequency sweeps showed that PP-g-MAH has a zero-phase viscosity of around 1600 Pa·s and the PP control a similar zero-phase viscosity of 1345 Pa·s. The gelation points are also similar, at 2.42×10⁴ Pa (118 rad/s) and 2.81×10⁴ Pa (161 rad/s) for the PP control and PP-g-MAH, respectively. Secondly, the textile composite was analyzed. The extensional stress of PA66 fabric laminated with either the PP control or PP-g-MAH at -10 °C, 25 °C and 170 °C for strain rates of 0.001–1 s⁻¹ was investigated. The laminates containing the modified PP require more stress for T-peeling. However, the strengthening effect due to the modification decreases with increasing temperature, and at 170 °C, just above the melting temperature of the matrix, the difference disappears. Independent of the matrix used in the textile composite, the extensional stress decreases with increasing temperature.
It appears that the more viscous the matrix, the weaker the laminar adhesion. Possibly, the measurement is influenced by the fact that the laminate becomes stiffer at lower temperatures. Adhesive lap-shear testing at room temperature supports the findings obtained with the T-peel test. Additional analysis of the textile composite at the microscopic level confirms that the fibers are well embedded in the matrix. Atomic force microscopy (AFM) imaging of a cross section of the composite shows no gaps between the fibers and the matrix. Measurements of the water contact angle show that the MAH-grafted PP is more polar than the virgin PP, which suggests a more favorable chemical interaction of PP-g-MAH with PA compared to the non-modified PP. Overall, this study indicates that T-peel testing by DMTA is a suitable technique for gaining deeper insight into polymeric textile composites.
Keywords: dynamic mechanical thermal analysis, interphase, polyamide, polypropylene, textile composite
Procedia PDF Downloads 128
1140 Methods of Detoxification of Nuts With Aflatoxin B1 Contamination
Authors: Auteleyeva Laura, Maikanov Balgabai, Smagulova Ayana
Abstract:
To find and select detoxification methods, patent and information research was conducted, as a result of which 68 patents for inventions were found: 14 from the near abroad (Russia) and, from farther abroad, 27 from China, 6 from the USA, 1 from South Korea, 2 from Germany, 4 from Mexico, 7 from Yugoslavia, and 1 security document each from Austria, Taiwan, Belarus, Denmark, Italy, Japan and Canada. Aflatoxin B₁ in various nuts was determined by two methods: the enzyme immunoassay "RIDASCREEN® FAST Aflatoxin", with optical density determined on a RIDA®ABSORPTION 96 microplate spectrophotometer with RIDASOFT® Win.NET software (Germany), and high-performance liquid chromatography (HPLC, Waters Corporation, USA) according to GOST 30711-2001. For experimental contamination of the nuts, the strain A. flavus KWIK-STIK was cultivated on Czapek medium (France), with subsequent infection of various nuts (peanuts, peanuts in shell, badam, walnuts with and without shells, pistachios). Based on our research, we selected two detoxification methods: method 1, a combined method (5% citric acid solution + microwave at 640 W for 3 min + UV for 20 min), and method 2, a chemical method using leaves of various plants, Artemisia terra-albae, Thymus vulgaris and Callogonum affilium, collected in the Akmola region (Artemisia terra-albae, Thymus vulgaris) and Western Kazakhstan (Callogonum affilium). The first stage was the production of ethanol extracts of Artemisia terra-albae, Thymus vulgaris and Callogonum affilium. To obtain them, 100 g of plant raw material was taken and dissolved in 70% ethyl alcohol. Extraction was carried out for 2 hours at the boiling point of the solvent under a reflux condenser, using a "Sapphire" ultrasonic bath. The obtained extracts were evaporated on an IKA RV 10 rotary evaporator. At the second stage, the three samples obtained were tested for antimicrobial and antifungal activity.
Extracts of Thymus vulgaris and Callogonum affilium showed high antimicrobial and antifungal activity. Artemisia terra-albae extract showed high antimicrobial activity and low antifungal activity. When testing method 1, it was found that in the first and third experimental groups the concentration of aflatoxin B1 in walnut samples decreased by 63% and 65%, respectively, but these values still exceeded the maximum permissible concentrations, while the nuts in the second and third experimental groups had a tart lemon flavor. When testing method 2, a 91% decrease in the concentration of aflatoxin B1 to a safe level (0.0038 mg/kg) was observed in nuts of the 1st and 2nd experimental groups (Artemisia terra-albae, Thymus vulgaris), while in samples of the 2nd and 3rd experimental groups a decrease in the amount of aflatoxin B1 to a safe level was also observed.
Keywords: nuts, aflatoxin B1, mycotoxins
Procedia PDF Downloads 86
1139 Organic Matter Distribution in Bazhenov Source Rock: Insights from Sequential Extraction and Molecular Geochemistry
Authors: Margarita S. Tikhonova, Alireza Baniasad, Anton G. Kalmykov, Georgy A. Kalmykov, Ralf Littke
Abstract:
There is a high complexity in the pore structure of organic-rich rocks, caused by the combination of inter-particle porosity from inorganic mineral matter and ultrafine intra-particle porosity from both organic matter and clay minerals. Fluids are retained in that pore space, but there are major uncertainties in how and where the fluids are stored and to what extent they are accessible or trapped in 'closed' pores. A large degree of tortuosity may lead to fractionation of organic matter, so that lighter, flexible compounds diffuse to the reservoir whereas more complex compounds may be locked in place. Additionally, part of the hydrocarbons can be bound to solid organic matter (kerogen) and the mineral matrix during expulsion and migration. Larger compounds can occupy thin channels, so that clogging or oil and gas entrapment will occur. Sequential extraction applying different solvents is a powerful tool that provides more information about the distribution of trapped organic matter. The Upper Jurassic – Lower Cretaceous Bazhenov shale is one of the most petroliferous source rocks in West Siberia, Russia. Given its variable mineral composition, pore space distribution and thermal maturation, there are high uncertainties in the distribution and composition of organic matter in this formation. To address this issue, the geological and geochemical properties of 30 samples were considered, including mineral composition (XRD and XRF), structure and texture (thin-section microscopy), organic matter content, type and thermal maturity (Rock-Eval), as well as the molecular composition (GC-FID and GC-MS) of the materials extracted at each step of the sequential extraction. Sequential extraction was performed with a Soxhlet apparatus using different solvents, i.e., n-hexane, chloroform and ethanol-benzene (1:1 v:v), first on core plugs and later on pulverized material.
The results indicate that the studied samples are mainly composed of type II kerogen, with TOC contents varying from 5 to 25%. The thermal maturity ranges from immature to late oil window. Whereas the clay content decreases with increasing maturity, the amount of silica increases in the studied samples. According to the molecular geochemistry, hydrocarbons stored in open and closed pore space reveal different geochemical fingerprints. The results improve our understanding of hydrocarbon expulsion and migration in the organic-rich Bazhenov shale and therefore allow a better estimation of the hydrocarbon potential of this formation.
Keywords: Bazhenov formation, bitumen, molecular geochemistry, sequential extraction
Procedia PDF Downloads 169
1138 The Impact of Corporate Social Responsibility Perception on Organizational Commitment: The Case of Cabin Crew in a Civil Aviation Company
Authors: Şeyda Kaya
Abstract:
The aim of this study is to examine the relationship between corporate social responsibility perception and organizational commitment among Turkish cabin crew. At the same time, the social responsibility perception and organizational commitment scores of the participants were compared according to their gender, age, education level, title, and work experience. In the globalizing world, businesses have developed innovative marketing methods in order to survive and strengthen their place in the market. Nowadays, consumers are connected to the brand with an emotional bond rather than being just consumers. Corporate social responsibility projects provide social benefit on the one hand and, on the other, increase the brand awareness of businesses by providing credibility in the eyes of consumers. The rapid increase in competition requires businesses to use their human resources, their most important resource for sustaining their existence, in the most effective and efficient way. For this reason, the concept of 'organizational commitment' has become an important research topic for businesses and academics. Although there are studies in the literature on the effect of corporate social responsibility perception on organizational commitment in the banking, finance and tourism sectors, to the best of our knowledge there are no studies conducted specifically for the Turkish aviation sector. A personal information form, a CSR scale, an Importance of CSR scale and an organizational commitment scale were used as data collection tools in the research. The CSR scale created by Türker (2006) was used to find out how employees felt about CSR. Through the Importance of CSR scale, a subscale of the Perceived Role of Ethics and Social Responsibility (PRESOR) scale that Etheredge (1999) converted into a two-factor framework, the significance of social responsibility for employees was assessed.
For organizational commitment, the Organizational Commitment Questionnaire (OCQ) created by Mowday, Steers, and Porter (1979), which uses 15 items to evaluate global commitment to the organization, was employed. As a result of the study, there is a significant positive relationship between the participants' scores on the CSR scale sub-dimensions (CSR to Employees, CSR to Customers, CSR to Society, CSR to Government, CSR to the Natural Environment, CSR to the Next Generation, CSR to Non-Governmental Organizations), the Importance of CSR, and Organizational Commitment. In other words, as the participants' corporate social responsibility scores increase, their organizational commitment increases. To summarize the findings of our study, the scores obtained from the CSR scale and the scores obtained from the organizational commitment scale have a positive and significant relationship: if the participants value the corporate social responsibility projects of the institution they work for and believe that it devotes time and effort to them, their organizational commitment to that institution increases. Similarly, the scores obtained from the Importance of CSR scale and the organizational commitment scale also have a positive and significant relationship: as the importance the participants attach to corporate social responsibility projects increases, their organizational commitment to the institution they work for also increases.
Keywords: corporate social responsibility, organizational commitment, Turkish cabin crew, aviation
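The positive scale-score relationships reported in this abstract rest on correlation analysis. A minimal sketch of the Pearson coefficient is shown below; the per-participant scores are invented purely for illustration, not the study's data.

```python
def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two score lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sum((a - mean_x) ** 2 for a in x) ** 0.5
    sd_y = sum((b - mean_y) ** 2 for b in y) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical per-participant scores: CSR perception vs. commitment.
csr_scores = [2.1, 3.4, 3.0, 4.2, 4.8, 3.7]
commitment = [2.5, 3.1, 3.3, 4.0, 4.6, 3.5]
r = pearson_r(csr_scores, commitment)  # strongly positive correlation
```

A value of r near +1 corresponds to the "positive and significant relationship" the study reports between CSR perception and organizational commitment (significance itself would additionally require a test against the sample size).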
Procedia PDF Downloads 109
1137 Development of Intellectual Property Information Services in Zimbabwe’s University Libraries: Assessing the Current Status and Mapping the Future Direction
Authors: Jonathan Munyoro, Takawira Machimbidza, Stephen Mutula
Abstract:
The study investigates the current status of Intellectual Property (IP) information services in Zimbabwe's university libraries. Specifically, the study assesses the current IP information services offered in Zimbabwe’s university libraries, identifies challenges to the development of comprehensive IP information services in Zimbabwe’s university libraries, and suggests solutions for the development of IP information services in Zimbabwe’s university libraries. The study is born out of a realisation that research on IP information services in university libraries has received little attention, especially in developing country contexts, despite the fact that there are calls for heightened participation of university libraries in IP information services. In Zimbabwe, the launch of the National Intellectual Property Policy and Implementation Strategy 2018-2022 and the introduction of the Education 5.0 concept are set to significantly change the IP landscape in the country. Education 5.0 places more emphasis on innovation and industrialisation (in addition to teaching, community service, and research), and has the potential to shift the focus and level of IP output produced in higher and tertiary education institutions beyond copyrights and more towards commercially exploited patents, utility models, and industrial designs. The growing importance of IP commercialisation in universities creates a need for appropriate IP information services to assist students, academics, researchers, administrators, start-ups, entrepreneurs, and inventors. The critical challenge for university libraries is to reposition themselves and remain relevant in the new trajectory. Designing specialised information services to support increased IP generation and commercialisation appears to be an opportunity for university libraries to stay relevant in the knowledge economy. 
However, IP information services in Zimbabwe’s universities appear to be incomplete and focused mostly on assisting with research publications and copyright-related activities. Research on the existing status of IP services in university libraries in Zimbabwe is therefore necessary to help identify gaps and provide solutions to stimulate the growth of new forms of such services. The study employed a quantitative approach. An online questionnaire was administered to 57 academic librarians from 15 university libraries. Findings show that the current focus of the surveyed institutions is on providing scientific research support services (15), disseminating/sharing university research output (14), and copyright activities (12). More specialised IP information services, such as IP education and training, patent information services, IP consulting services, IP online service platforms, and web-based IP information services, are largely unavailable in Zimbabwean university libraries. Results reveal that the underlying challenge in the development of IP information services in Zimbabwe's university libraries is insufficient IP knowledge among academic librarians, exacerbated by inadequate IP management frameworks in university institutions. The study proposes a framework for the entrenchment of IP information services in Zimbabwe's university libraries.
Keywords: academic libraries, information services, intellectual property, IP knowledge, university libraries, Zimbabwe
Procedia PDF Downloads 154
1136 Pond Site Diagnosis: Monoclonal Antibody-Based Farmer Level Tests to Detect the Acute Hepatopancreatic Necrosis Disease in Shrimp
Authors: B. T. Naveen Kumar, Anuj Tyagi, Niraj Kumar Singh, Visanu Boonyawiwat, A. H. Shanthanagouda, Orawan Boodde, K. M. Shankar, Prakash Patil, Shubhkaramjeet Kaur
Abstract:
Early mortality syndrome (EMS)/Acute Hepatopancreatic Necrosis Disease (AHPND) has emerged as a major obstacle for shrimp farming around the world. It is caused by a strain of Vibrio parahaemolyticus. A possible prevention and control measure is early and rapid detection of the pathogen in broodstock and post-larvae, and monitoring of the shrimp during the culture period. Polymerase chain reaction (PCR)-based early detection methods are good, but they are costly, time-consuming and require a sophisticated laboratory. The present study was conducted to develop a simple, sensitive and rapid farmer-level diagnostic kit for the reliable detection of AHPND in shrimp. A panel of monoclonal antibodies (MAbs) was raised against the recombinant PirB protein (rPirB). First, an immunodot assay was developed using MAbs G3B8 and G3H2, which showed specific reactivity to the purified rPirB protein with no cross-reactivity to other shrimp pathogens (AHPND-free Vibrio parahaemolyticus (Indian strains), V. anguillarum, WSSV, Aeromonas hydrophila, and Aphanomyces invadans). The immunodot developed using MAb G3B8 is more sensitive than that with MAb G3H2. However, the immunodot takes almost 2.5 hours to complete, with several hands-on steps. Therefore, a flow-through assay (FTA) was developed using a plastic cassette containing a nitrocellulose membrane with absorbent pads below. The sample was dotted in the test zone on the nitrocellulose membrane, followed by the continuous addition of five solutions in the order of i) blocking buffer (BSA), ii) primary antibody (MAb), iii) washing solution, iv) secondary antibody and v) chromogen substrate (TMB). Clear purple dots against a white background were considered positive reactions. The FTA developed using MAb G3B8 is more sensitive than that with MAb G3H2. In the FTA, the two MAbs showed specific reactivity to the purified rPirB protein and not to other shrimp pathogens.
The FTA is simple enough for farmer/field-level use, sensitive, and rapid, requiring only 8-10 min for completion. The tests can be developed into kits, which will be ideal for use in biosecurity, for first-line screening (at the port or pond site), and during monitoring and surveillance programmes, overall supporting good management practices to reduce the risk of the disease.
Keywords: acute hepatopancreatic necrosis disease, AHPND, flow-through assay, FTA, farmer level, immunodot, pond site, shrimp
Procedia PDF Downloads 172
1135 Investigation of a Single Feedstock Particle during Pyrolysis in Fluidized Bed Reactors via X-Ray Imaging Technique
Authors: Stefano Iannello, Massimiliano Materazzi
Abstract:
Fluidized bed reactor technologies are one of the most valuable pathways for the thermochemical conversion of biogenic fuels due to their good operating flexibility. Nevertheless, there are still issues related to the mixing and separation of heterogeneous phases during operation with highly volatile feedstocks, including biomass and waste. At high temperatures, the volatile content of the feedstock is released in the form of so-called endogenous bubbles, which generally exert a “lift” effect on the particle itself by dragging it up to the bed surface. This phenomenon leads to a high release of volatile matter into the freeboard and limited mass and heat transfer with particles of the bed inventory. The aim of this work is to get a better understanding of the behaviour of a single reacting particle in a hot fluidized bed reactor during the devolatilization stage. The analysis was undertaken at different fluidization regimes and temperatures to closely mirror the operating conditions of waste-to-energy processes. Beechwood and polypropylene particles were used to resemble the biomass and plastic fractions present in waste materials, respectively. A non-invasive X-ray technique was coupled with particle tracking algorithms to characterize the motion of a single feedstock particle during devolatilization with high resolution. A high-energy X-ray beam passes through the vessel, where absorption occurs depending on the distribution and amount of solids and fluids along the beam path. A high-speed video camera, synchronised to the beam, provides frame-by-frame imaging of the flow patterns of fluids and solids within the fluidized bed at up to 72 fps (frames per second). A comprehensive mathematical model has been developed to validate the experimental results. Beechwood and polypropylene particles showed very different dynamic behaviour during the pyrolysis stage.
When the feedstock is fed from the bottom, the plastic material tends to spend more time within the bed than the biomass. This behaviour can be attributed to the presence of the endogenous bubbles, whose drag effect is more pronounced during the devolatilization of biomass, resulting in a lower residence time of the particle within the bed. At the typical operating temperatures of thermochemical conversions, the synthetic polymer softens and melts, and the bed particles attach to its outer surface, generating a wet plastic-sand agglomerate. Consequently, this additional layer of sand may hinder the rapid evolution of volatiles in the form of endogenous bubbles, so that only a weak drag effect acts on the feedstock itself. Information about the mixing and segregation of solid feedstock is of prime importance for the design and development of more efficient industrial-scale operations.
Keywords: fluidized bed, pyrolysis, waste feedstock, X-ray
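The particle tracking mentioned in this abstract, following one feedstock particle across successive X-ray frames, can be sketched as a greedy nearest-neighbour linker. The coordinates, jump threshold, and linking logic below are hypothetical illustrations, not the authors' algorithm.

```python
def link_trajectory(start, frames, max_jump):
    """Link one particle across frames by nearest-neighbour matching.

    start:    (x, y) position detected in the first frame
    frames:   list of per-frame detection lists [(x, y), ...]
    max_jump: maximum plausible displacement between frames; if the
              nearest detection is farther, the last position is kept
              (treated as a missed detection).
    """
    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    traj = [start]
    for detections in frames:
        nearest = min(detections, key=lambda p: dist(traj[-1], p))
        traj.append(nearest if dist(traj[-1], nearest) <= max_jump else traj[-1])
    return traj

# A tracer rising through the bed (y = height), alongside a second,
# spurious detection in each frame that the linker must ignore.
frames = [[(10, 12), (40, 5)], [(11, 15), (41, 6)], [(12, 19), (39, 4)]]
path = link_trajectory((10, 10), frames, max_jump=6.0)
rise = path[-1][1] - path[0][1]  # net vertical displacement
```

From such a trajectory one can derive the quantities the study reports, e.g. the residence time of the particle below the bed surface or its rise velocity during endogenous-bubble lift.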
Procedia PDF Downloads 170
1134 The Association of Southeast Asian Nations (ASEAN) and the Dynamics of Resistance to Sovereignty Violation: The Case of East Timor (1975-1999)
Authors: Laura Southgate
Abstract:
The Association of Southeast Asian Nations (ASEAN), as well as much of the scholarship on the organisation, celebrates its ability to uphold the principle of regional autonomy, understood as upholding the norm of non-intervention by external powers in regional affairs. Yet, in practice, this norm has been repeatedly violated. This dichotomy between rhetoric and practice suggests an interesting avenue for further study. The East Timor crisis (1975-1999) has been selected as a case study to test the dynamics of ASEAN state resistance to sovereignty violation in two distinct timeframes: Indonesia’s initial invasion of the territory in 1975, and the ensuing humanitarian crisis in 1999, which resulted in a UN-mandated, Australian-led peacekeeping intervention force. These time periods demonstrate variation on the dependent variable. It is necessary to observe covariation in order to derive observations in support of a causal theory. To establish covariation, my independent variable is a continuous variable characterised by variation in the convergence of interests. A change in this variable should change the value of the dependent variable, thus establishing causal direction. This paper investigates the history of ASEAN’s relationship to the norm of non-intervention. It offers an alternative understanding of ASEAN’s history, written in terms of the relationship between a key ASEAN state, which I call a ‘vanguard state’, and selected external powers. This paper will consider when ASEAN resistance to sovereignty violation has succeeded and when it has failed. It will contend that variation in the outcomes of vanguard state resistance to sovereignty violation is best explained by the level of interest convergence between the ASEAN vanguard state and designated external actors.
Evidence will be provided to support the hypothesis that in 1999, ASEAN’s failure to resist violations of the sovereignty of Indonesia was a consequence of low interest convergence between Indonesia and the external powers. Conversely, in 1975, ASEAN’s ability to resist violations of the sovereignty of Indonesia was a consequence of high interest convergence between Indonesia and the external powers. As the vanguard state, Indonesia was able to apply pressure on the ASEAN states and obtain unanimous support for its East Timor policy in 1975 and 1999. However, the key factor explaining the variance in outcomes in both time periods resides in the critical role played by external actors. This view represents a serious challenge to much of the existing scholarship that emphasises ASEAN’s ability to defend regional autonomy. As these cases attempt to show, ASEAN autonomy is much more contingent than portrayed in the existing literature.
Keywords: ASEAN, East Timor, intervention, sovereignty
Procedia PDF Downloads 357
1133 High Impact Biostratigraphic Study
Abstract:
The re-calibration of the Campanian to Maastrichtian of some parts of the Anambra Basin was carried out using samples from two exploration wells, Amama-1 (219 m – 1829 m) and Bara-1 (317 m – 1594 m). Palynological and paleontological analyses were carried out on 100 ditch cutting samples. The faunal and floral successions were of terrestrial and marine origin, as described and logged. The wells penetrated four stratigraphic units in the Anambra Basin (the Nkporo, Mamu, Ajali and Nsukka) and yielded well-preserved foraminifera and palynomorphs. Amama-1 yielded 53 species of foraminifera and 69 species of palynomorphs, with 12 genera; Bara-1 yielded 25 species of foraminifera and 101 species of palynomorphs. Amama-1 permitted the recognition of 21 genera with 31 foraminiferal assemblage zones, 32 pollen and 37 spore assemblage zones, and dinoflagellate cysts, with a biozonation ranging from late Campanian to early Paleocene. Bara-1 yielded 60 pollen and 41 spore assemblage zones and 18 dinoflagellate cysts. The zones, in stratigraphically ascending order, for the foraminifera and palynomorphs are as follows. Amama-1 foraminifera: Biozone A, Globotruncanella havanensis zone, Late Campanian – Maastrichtian (695 – 1829 m); Biozone B, Morozovella velascoensis zone, Early Paleocene (165 – 695 m). Bara-1 foraminifera: Biozone A, Globotruncanella havanensis zone, Late Campanian (1512 m); Biozone B, Bolivina afra / B. explicata zone, Maastrichtian (634 – 1204 m); Biozone C, indeterminate (305 – 634 m). Palynology, Amama-1: A, Ctenolophonidites costatus zone, Early Maastrichtian (1829 m); B, Retidiporites miniporatus zone, Late Maastrichtian (1274 m); C, Constructipollenites ineffectus zone, Early Paleocene (695 m). Bara-1: A, Droseridites senonicus zone, Late Campanian (994 – 1600 m); B, Ctenolophonidites costatus zone, Early Maastrichtian (713 – 994 m); C, Retidiporites miniporatus zone, Late Maastrichtian (305 – 713 m). The paleo-environment of deposition was determined to range from non-marine to outer neritic.
A detailed categorization of the palynomorphs into terrestrially derived and marine derived palynomorphs, based on the distribution of three broad vegetation types (mangrove, freshwater swamp and hinterland communities), was used to evaluate sea level fluctuations with respect to the sediments deposited in the basins, each linked with a particular depositional system tract. Amama-1 recorded four maximum flooding surfaces (MFS) at depths of 165 – 1829 m, dated between 61 Ma and 76 Ma, and three sequence boundaries (SB) at depths of 1048 m – 1533 m and 1581 m; Bara-1 recorded maximum flooding surfaces at 634 m – 1387 m, dated 69.5 Ma – 82 Ma, and four sequence boundaries (SB) at 552 m – 876 m, dated 68 Ma – 77.5 Ma, respectively. The ecostratigraphic description is characterised by the prominent expansion of the hinterland component, consisting of the mangrove to lowland rainforest and Afromontane – savannah vegetation.
Keywords: foraminifera, palynomorphs, Campanian, Maastrichtian, ecostratigraphy, Anambra
Procedia PDF Downloads 28
1132 Modeling Geogenic Groundwater Contamination Risk with the Groundwater Assessment Platform (GAP)
Authors: Joel Podgorski, Manouchehr Amini, Annette Johnson, Michael Berg
Abstract:
One-third of the world’s population relies on groundwater for its drinking water. Natural geogenic arsenic and fluoride contaminate ~10% of wells. Prolonged exposure to high levels of arsenic can result in various internal cancers, while high levels of fluoride are responsible for the development of dental and crippling skeletal fluorosis. In poor urban and rural settings, the provision of drinking water free of geogenic contamination can be a major challenge. To efficiently apply limited resources to the testing of wells, water resource managers need to know where geogenically contaminated groundwater is likely to occur. The Groundwater Assessment Platform (GAP) fulfills this need by providing state-of-the-art global arsenic and fluoride contamination hazard maps as well as enabling users to create their own groundwater quality models. The global risk models were produced by logistic regression of arsenic and fluoride measurements using various soil, geological and climate parameters as predictor variables. The maps display the probability of encountering concentrations of arsenic or fluoride exceeding the World Health Organization’s (WHO) stipulated concentration limits of 10 µg/L or 1.5 mg/L, respectively. In addition to a reconsideration of the relevant geochemical settings, these second-generation maps represent a great improvement over the previous risk maps due to a significant increase in data quantity and resolution. For example, there is a 10-fold increase in the number of measured data points, and the resolution of the predictor variables is generally 60 times greater. These same predictor variable datasets are available on the GAP platform for visualization as well as for use with a modeling tool. The latter requires that users upload their own concentration measurements and select the predictor variables that they wish to incorporate in their models.
In addition, users can upload additional predictor variable datasets either as features or coverages. Such models can represent an improvement over the global models already supplied, since (a) users may be able to use their own, more detailed datasets of measured concentrations and (b) the various processes leading to arsenic and fluoride groundwater contamination can be isolated more effectively on a smaller scale, thereby resulting in a more accurate model. All maps, including user-created risk models, can be downloaded as PDFs. There is also the option to share data and to collaborate in a secure environment through the creation of communities. In summary, GAP provides users with the means to reliably and efficiently produce models specific to their region of interest by making available the latest datasets of predictor variables along with the necessary modeling infrastructure.
Keywords: arsenic, fluoride, groundwater contamination, logistic regression
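The hazard maps above are the output of logistic regression on soil, geological, and climate predictors. As a minimal sketch of the underlying calculation, the fragment below turns a linear combination of predictor values into the probability of exceeding the WHO arsenic limit via the logistic function; the predictor names and coefficient values are invented for illustration and are not the fitted GAP parameters.

```python
import math

def exceedance_probability(predictors, coefficients, intercept):
    """Logistic-regression hazard: probability that arsenic exceeds the
    WHO limit of 10 ug/L, given predictor values. The weights below are
    illustrative placeholders, not the values fitted in GAP."""
    z = intercept + sum(c * x for c, x in zip(coefficients, predictors))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical predictor vector: [aridity index, soil pH score, Holocene-sediment flag]
coef = [0.8, 0.5, 1.2]   # illustrative weights
p = exceedance_probability([0.6, 1.1, 1.0], coef, intercept=-2.0)
print(round(p, 3))
```

A map cell is then coloured by this probability, so the output is a continuous hazard surface rather than a binary contaminated/uncontaminated call.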
Procedia PDF Downloads 347
1131 Semiotics of the New Commercial Music Paradigm
Authors: Mladen Milicevic
Abstract:
This presentation will address how the statistical analysis of digitized popular music influences music creation and emotionally manipulates consumers. Furthermore, it will deal with the semiological aspect of the uniformization of musical taste in order to predict the potential revenues generated by popular music sales. In the USA, we live in an age where most of the popular music (i.e., music that generates substantial revenue) has been digitized. It is safe to say that almost everything produced in the last 10 years is already digitized (available on iTunes, Spotify, YouTube, or some other platform). Depending on marketing viability and its potential to generate additional revenue, most of the “older” music is still being digitized. Once the music is turned into a digital audio file, it can be computer-analyzed in all kinds of respects, and the same goes for the lyrics, because they also exist as a digital text file to which any NCapture-style analysis may be applied. So, by employing statistical examination of different popular music metrics such as tempo, form, pronouns, introduction length, song length, archetypes, subject matter, and repetition of the title, the commercial result may be predicted. Polyphonic HMI (Human Media Interface) introduced the concept of the hit song science computer program in 2003. The company asserted that machine learning could create a music profile to predict hit songs from its audio features. Thus, it has been established that a successful pop song must: have 100 bpm or more; have an 8-second intro; use the pronoun 'you' within 20 seconds of the start of the song; hit the bridge (middle 8) between 2 minutes and 2 minutes 30 seconds; average 7 repetitions of the title; and create an expectation and fulfill it in the title.
For a country song: 100 bpm or less for a male artist; a 14-second intro; use of the pronoun 'you' within the first 20 seconds of the intro; a bridge (middle 8) between 2 minutes and 2 minutes 30 seconds; 7 repetitions of the title; and an expectation created and fulfilled within 60 seconds. This approach to commercial popular music minimizes the human influence when it comes to which “artist” a record label is going to sign and market. Twenty years ago, music experts in the A&R (Artists and Repertoire) departments of the record labels made personal aesthetic judgments based on their extensive experience in the music industry. Now, computer music-analyzing programs are replacing them in an attempt to minimize the investment risk of the panicking record labels, in an environment where nobody can predict the future of the recording industry. The impact on consumers' taste through the narrow bottleneck of the above-mentioned music selection by the record labels has created some very peculiar effects, not only on the taste of popular music consumers but also on the creative output of music artists. The meaning of this semiological shift is the main focus of this research and paper presentation.
Keywords: music, semiology, commercial, taste
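The pop-song criteria listed above amount to a simple rule set. A toy sketch of how such a checklist could be encoded is below; the field names and the example track are invented, and this is not Polyphonic HMI's actual scoring model, which worked on audio features rather than hand-entered metadata.

```python
def pop_hit_checklist(song):
    """Check a track against the hit-song-science pop criteria quoted in
    the abstract. 'song' is a dict of hand-entered, illustrative fields."""
    return {
        "tempo >= 100 bpm": song["bpm"] >= 100,
        "intro <= 8 s": song["intro_sec"] <= 8,
        "'you' within first 20 s": song["first_you_sec"] <= 20,
        "bridge between 2:00 and 2:30": 120 <= song["bridge_sec"] <= 150,
        "about 7 title repetitions": song["title_repeats"] >= 7,
    }

# Invented example track that happens to satisfy every rule
song = {"bpm": 104, "intro_sec": 8, "first_you_sec": 12,
        "bridge_sec": 135, "title_repeats": 7}
print(all(pop_hit_checklist(song).values()))
```

Returning the individual checks, rather than a single boolean, mirrors how such tools report which conventions a track breaks.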
Procedia PDF Downloads 392
1130 Modeling Engagement with Multimodal Multisensor Data: The Continuous Performance Test as an Objective Tool to Track Flow
Authors: Mohammad H. Taheri, David J. Brown, Nasser Sherkat
Abstract:
Engagement is one of the most important factors in determining successful outcomes and deep learning in students. Existing approaches to detecting student engagement involve periodic human observations that are subject to inter-rater reliability problems. Our solution uses real-time multimodal multisensor data labeled by objective performance outcomes to infer the engagement of students. The study involves four students with a combined diagnosis of cerebral palsy and a learning disability who took part in a 3-month trial over 59 sessions. Multimodal multisensor data were collected while they participated in a continuous performance test. Eye gaze, electroencephalogram, body pose, and interaction data were used to create a model of student engagement through objective labeling from the continuous performance test outcomes. In order to achieve this, a type of continuous performance test is introduced, the Seek-X type. Nine features were extracted, including high-level handpicked compound features. Using leave-one-out cross-validation, a series of different machine learning approaches were evaluated. Overall, the random forest classification approach achieved the best classification results. Using random forest, 93.3% classification accuracy for engagement and 42.9% accuracy for disengagement were achieved. We compared these results to outcomes from different models: AdaBoost, decision tree, k-nearest neighbor, naïve Bayes, neural network, and support vector machine. We showed that the multisensor approach achieved higher accuracy than using features from any reduced set of sensors. We found that using high-level handpicked features can improve the classification accuracy in every sensor mode. Our approach is robust to both sensor fallout and occlusions. The single most important sensor feature for the classification of engagement and distraction was shown to be eye gaze.
It has been shown that we can accurately predict the level of engagement of students with learning disabilities in a real-time approach that is not subject to inter-rater reliability problems, does not require human observation, and is not reliant on a single mode of sensor input. This will help teachers design interventions for a heterogeneous group of students, where teachers cannot possibly attend to each of their individual needs. Our approach can be used to identify those with the greatest learning challenges so that all students are supported to reach their full potential.
Keywords: affective computing in education, affect detection, continuous performance test, engagement, flow, HCI, interaction, learning disabilities, machine learning, multimodal, multisensor, physiological sensors, student engagement
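The evaluation protocol used above, leave-one-out cross-validation, can be sketched independently of the specific classifier. The fragment below runs the LOOCV loop with a simple 1-nearest-neighbour stand-in for the random forest used in the study (a full random forest would need an ML library); the toy feature vectors and engagement labels are invented.

```python
def loo_accuracy(features, labels, classify):
    """Leave-one-out cross-validation: each sample is held out once,
    the classifier sees the rest, and accuracy is averaged."""
    correct = 0
    for i in range(len(features)):
        train_X = features[:i] + features[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        if classify(train_X, train_y, features[i]) == labels[i]:
            correct += 1
    return correct / len(features)

def one_nn(train_X, train_y, x):
    """1-nearest-neighbour stand-in for the random forest in the study."""
    dists = [sum((a - b) ** 2 for a, b in zip(x, t)) for t in train_X]
    return train_y[dists.index(min(dists))]

# Toy data: two well-separated engagement/disengagement clusters
X = [[0.1, 0.2], [0.2, 0.1], [0.15, 0.15],
     [0.9, 0.8], [0.8, 0.9], [0.85, 0.85]]
y = ["engaged", "engaged", "engaged",
     "disengaged", "disengaged", "disengaged"]
print(loo_accuracy(X, y, one_nn))
```

LOOCV is a natural choice for a study of only 59 sessions across four students, since it wastes no data on a fixed held-out split.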
Procedia PDF Downloads 93
1129 Quantifying the Effects of Canopy Cover and Cover Crop Species on Water Use Partitioning in Micro-Sprinkler Irrigated Orchards in South Africa
Authors: Zanele Ntshidi, Sebinasi Dzikiti, Dominic Mazvimavi
Abstract:
South Africa is a dry country, and yet it is ranked as the 8th largest exporter of fresh apples (Malus domestica) globally. Prime apple-producing regions are in the Eastern and Western Cape Provinces of the country, where all the fruit is grown under irrigation. Climate change models predict increasingly drier future conditions in these regions, and the frequency and severity of droughts are expected to increase. For the sustainability and growth of the fruit industry, it is important to minimize non-beneficial water losses from the orchard floor. The aims of this study were, firstly, to compare the water use of cover crop species used in South African orchards, for which there is currently no information. The second aim was to investigate how orchard water use (evapotranspiration) was partitioned into beneficial (tree transpiration) and non-beneficial (orchard floor evaporation) water uses for micro-sprinkler irrigated orchards with different canopy covers. This information is important in order to explore opportunities to minimize non-beneficial water losses. Six cover crop species (four exotic and two indigenous) were grown in 2 L pots in a greenhouse. Cover crop transpiration was measured using the gravimetric method on clear days. To establish how water use was partitioned in orchards, evapotranspiration (ET) was measured using an open path eddy covariance system, while tree transpiration was measured hourly throughout the season (October to June) on six trees per orchard using the heat ratio sap flow method. On selected clear days, soil evaporation was measured hourly from sunrise to sunset using six micro-lysimeters situated at different wet/dry and sun/shade positions on the orchard floor. Transpiration of cover crops was measured using miniature (2 mm Ø) stem heat balance sap flow gauges. The greenhouse study showed that exotic cover crops had significantly higher (p < 0.01) average transpiration rates (~3.7 L/m²/d) than the indigenous species (~2.2 L/m²/d).
In young non-bearing orchards, orchard floor evaporative fluxes accounted for more than 60% of orchard ET, while this ranged from 10 to 30% in mature orchards with a high canopy cover. While exotic cover crops are preferred by most farmers, this study shows that they use larger quantities of water than indigenous species. This in turn contributes to a larger orchard floor evaporation flux. In young orchards, non-beneficial losses can be minimized by adopting drip or short-range micro-sprinkler methods that reduce the wetted soil fraction, thereby conserving water.
Keywords: evapotranspiration, sap flow, soil evaporation, transpiration
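The partitioning described above treats orchard floor evaporation as the residual between eddy-covariance ET and sap-flow tree transpiration. A minimal sketch of that bookkeeping, with invented daily flux values in mm/day:

```python
def partition_et(evapotranspiration, tree_transpiration):
    """Split orchard ET (eddy covariance) into beneficial tree
    transpiration (sap flow) and non-beneficial orchard floor
    evaporation, treated here as the residual. Values are illustrative,
    not measurements from the study."""
    floor_evaporation = evapotranspiration - tree_transpiration
    return {
        "beneficial_fraction": tree_transpiration / evapotranspiration,
        "floor_fraction": floor_evaporation / evapotranspiration,
    }

# Young, non-bearing orchard: small trees transpire little, so the
# floor flux dominates (consistent with the >60% figure above)
young = partition_et(evapotranspiration=4.0, tree_transpiration=1.4)
print(round(young["floor_fraction"], 2))
```

In practice the micro-lysimeter and cover-crop sap-flow measurements serve as an independent check on this residual rather than being derived from it.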
Procedia PDF Downloads 387
1128 Poly(propylene fumarate) Copolymers with Phosphonic Acid-based Monomers Designed as Bone Tissue Engineering Scaffolds
Authors: Görkem Cemali̇, Avram Aruh, Gamze Torun Köse, Erde Can ŞAfak
Abstract:
In order to heal bone disorders, the conventional methods, which involve the use of autologous and allogenous bone grafts or permanent implants, have certain disadvantages such as limited supply, disease transmission, or adverse immune response. A biodegradable material that acts as structural support to the damaged bone area and serves as a scaffold that enhances bone regeneration and guides bone formation is one desirable solution. Poly(propylene fumarate) (PPF), an unsaturated polyester that can be copolymerized with appropriate vinyl monomers to give biodegradable network structures, is a promising candidate polymer for preparing bone tissue engineering scaffolds. In this study, hydroxyl-terminated PPF was synthesized and thermally cured with vinyl phosphonic acid (VPA) and diethyl vinyl phosphonate (VPES) in the presence of the radical initiator benzoyl peroxide (BP), with changing co-monomer weight ratios (10-40 wt%). In addition, the synthesized PPF was cured with the VPES comonomer at body temperature (37 °C) in the presence of BP initiator, N,N-dimethyl-p-toluidine catalyst, and varying amounts of beta-tricalcium phosphate (0-20 wt% β-TCP) as filler via radical polymerization to prepare composite materials that can be used in injectable forms. The thermomechanical properties, compressive properties, hydrophilicity, and biodegradability of the PPF/VPA and PPF/VPES copolymers were determined and analyzed with respect to the copolymer composition. Biocompatibility of the resulting polymers and their composites was determined by the MTS assay, osteoblast activity was explored with von Kossa, alkaline phosphatase, and osteocalcin activity analyses, and the effects of VPA and VPES comonomer composition on these properties were investigated. Thermally cured PPF/VPA and PPF/VPES copolymers with different compositions exhibited compressive modulus and strength values in the wide ranges of 10–836 MPa and 14–119 MPa, respectively.
MTS assay studies showed that the majority of the tested compositions were biocompatible, and the overall results indicated that PPF/VPA and PPF/VPES network polymers show significant potential for applications as bone tissue engineering scaffolds, where varying the PPF and co-monomer ratio provides adjustable and controllable properties of the end product. The body-temperature-cured PPF/VPES/β-TCP composites exhibited significantly lower compressive modulus and strength values than the thermally cured PPF/VPES copolymers and were therefore found to be useful as scaffolds for cartilage tissue engineering applications.
Keywords: biodegradable, bone tissue, copolymer, poly(propylene fumarate), scaffold
Procedia PDF Downloads 165
1127 Geoinformation Technology of Agricultural Monitoring Using Multi-Temporal Satellite Imagery
Authors: Olena Kavats, Dmitry Khramov, Kateryna Sergieieva, Vladimir Vasyliev, Iurii Kavats
Abstract:
Geoinformation technologies for space agromonitoring are a means of operative decision-making support in the tasks of managing the agricultural sector of the economy. Existing technologies use satellite images in the optical range of the electromagnetic spectrum. Time series of optical images often contain gaps due to the presence of clouds and haze. A geoinformation technology has been created that allows gaps in time series of optical images (Sentinel-2, Landsat-8, PROBA-V, MODIS) to be filled with radar survey data (Sentinel-1), and uses information about the agrometeorological conditions of the growing season for individual monitoring years. The technology supports crop classification and mapping for the spring-summer (winter and spring crops) and autumn-winter (winter crops) periods of vegetation, monitoring of the dynamics of seasonal changes in crop state, and crop yield forecasting. Crop classification is based on supervised classification algorithms and takes into account the peculiarities of crop growth at different vegetation stages (dates of sowing, emergence, active vegetation, and harvesting) and agricultural land characteristics (row spacing, seedling density, etc.). A catalog of samples of the main agricultural crops (Ukraine) was created, and crop spectral signatures were calculated with the preliminary removal of row spacing, cloud cover, and cloud shadows in order to construct time series of crop growth characteristics. The obtained data is used in grain crop growth tracking and in the timely detection of deviations of growth trends from reference samples of a given crop for a selected date. Statistical models of crop yield forecasting were created in the form of linear and nonlinear interconnections between crop yield indicators and crop state characteristics (temperature, precipitation, vegetation indices, etc.). Predicted values of grain crop yield are evaluated with an accuracy of up to 95%.
The developed technology was used for agricultural area monitoring in a number of regions of Great Britain and Ukraine using the EOS Crop Monitoring Platform (https://crop-monitoring.eos.com). The obtained results allow us to conclude that the joint use of Sentinel-1 and Sentinel-2 images improves the separation of winter crops (rapeseed, wheat, barley) in the early stages of vegetation (October-December). It also allows successful separation of soybean, corn, and sunflower sowing areas, which are quite similar in their spectral characteristics.
Keywords: geoinformation technology, crop classification, crop yield prediction, agricultural monitoring, EOS Crop Monitoring Platform
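The yield models mentioned above relate yield indicators to crop state characteristics. As an illustrative sketch, the fragment below fits a one-predictor linear model by ordinary least squares, relating an invented vegetation-index series to invented yield figures; the models in the actual technology use more predictors and also nonlinear forms.

```python
def fit_linear(x, y):
    """Ordinary least squares for a single predictor: the simplest form
    of the linear yield models described above (e.g. yield vs. a
    vegetation index). The data fed in below are invented."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

ndvi = [0.45, 0.55, 0.60, 0.70, 0.75]      # seasonal peak NDVI (invented)
yield_t_ha = [2.1, 3.0, 3.4, 4.3, 4.7]     # grain yield, t/ha (invented)
slope, intercept = fit_linear(ndvi, yield_t_ha)
print(round(slope * 0.65 + intercept, 2))  # predicted yield at NDVI = 0.65
```

Adding temperature and precipitation as further predictors turns this into the multivariate regressions the abstract describes.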
Procedia PDF Downloads 454
1126 Salmonella Emerging Serotypes in Northwestern Italy: Genetic Characterization by Pulsed-Field Gel Electrophoresis
Authors: Clara Tramuta, Floris Irene, Daniela Manila Bianchi, Monica Pitti, Giulia Federica Cazzaniga, Lucia Decastelli
Abstract:
This work presents the results obtained by the Regional Reference Centre for Salmonella Typing (CeRTiS) in a retrospective study aimed at investigating, through Pulsed-Field Gel Electrophoresis (PFGE) analysis, the genetic relatedness of emerging Salmonella serotypes of human origin circulating in the North-West of Italy. A further goal of this work was to create a regional database to facilitate foodborne outbreak investigation and to detect outbreaks at an earlier stage. A total of 112 strains, isolated from 2016 to 2018 in hospital laboratories, were included in this study. The isolates were previously identified as Salmonella according to standard microbiological techniques, and serotyping was performed according to ISO 6579-3 and the Kauffmann-White scheme using O and H antisera (Statens Serum Institut®). All strains were characterized by PFGE: the analysis was conducted according to a standardized PulseNet protocol. The restriction enzyme XbaI was used to generate several distinguishable genomic fragments on the agarose gel. PFGE was performed on a CHEF Mapper system, separating large fragments and generating comparable genetic patterns. The agarose gel was then stained with GelRed® and photographed under ultraviolet transillumination. The PFGE patterns obtained from the 112 strains were compared using Bionumerics version 7.6 software with the Dice coefficient with 2% band tolerance and 2% optimization. For each serotype, the data obtained with PFGE were compared according to the geographical origin and the year in which the strains were isolated. Salmonella strains were identified as follows: S. Derby n. 34; S. Infantis n. 38; S. Napoli n. 40. All the isolates had appreciable restriction digestion patterns ranging from approximately 40 to 1100 kb. In general, a fairly heterogeneous distribution of pulsotypes emerged in the different provinces. Cluster analysis indicated high genetic similarity (≥ 83%) among strains of S. Derby (n.
36; 95%) and S. Napoli (n. 38; 95%) circulating in north-western Italy. The study underlines the genomic similarities shared by the emerging Salmonella strains in Northwest Italy and allowed the creation of a database to detect outbreaks at an early stage. Therefore, the results confirmed that PFGE is a powerful and discriminatory tool to investigate the genetic relationships among strains in order to monitor and control the spread of salmonellosis outbreaks. Pulsed-field gel electrophoresis (PFGE) still represents one of the most suitable approaches for characterizing strains, in particular for laboratories for which NGS techniques are not available.
Keywords: emerging Salmonella serotypes, genetic characterization, human strains, PFGE
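The Dice band-matching coefficient used in the Bionumerics comparison can be sketched as follows. The XbaI fragment profiles below are hypothetical, and the 2% tolerance mirrors the band-tolerance setting quoted above; Bionumerics additionally applies position optimization, which this sketch omits.

```python
def dice_similarity(bands_a, bands_b, tolerance=0.02):
    """Dice band-matching coefficient for two PFGE restriction patterns:
    2 * shared bands / (bands in A + bands in B). Band sizes are in kb;
    two bands match if they differ by less than the given tolerance."""
    matched = 0
    unused = list(bands_b)
    for a in bands_a:
        for b in unused:
            if abs(a - b) <= tolerance * max(a, b):
                matched += 1
                unused.remove(b)  # each band may match only once
                break
    return 2 * matched / (len(bands_a) + len(bands_b))

# Two hypothetical XbaI profiles (fragment sizes in kb)
profile_1 = [40, 85, 160, 310, 560, 1100]
profile_2 = [40, 86, 160, 300, 560, 1100]
print(round(dice_similarity(profile_1, profile_2), 2))
```

Pairwise coefficients like this, clustered with UPGMA, produce the similarity dendrograms from which the ≥ 83% clusters above are read.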
Procedia PDF Downloads 105
1125 Predicting Blockchain Technology Installation Cost in Supply Chain System through Supervised Learning
Authors: Hossein Havaeji, Tony Wong, Thien-My Dao
Abstract:
1. Research Problems and Research Objectives: A Blockchain Technology-enabled Supply Chain System (BT-enabled SCS) is a system using BT to drive SCS transparency, security, durability, and process integrity, as SCS data is not always visible, available, or trusted. The costs of operating BT in the SCS are a common problem in several organizations. The costs must be estimated, as they can impact existing cost control strategies. To account for system and deployment costs, it is necessary to overcome the following hurdle: the costs of developing and running a BT in SCS are not yet clear in most cases. Many industries aiming to use BT pay special attention to BT installation cost, which has a direct impact on the total costs of SCS. Predicting BT installation cost in SCS may help managers decide whether BT is to be an economic advantage. The purpose of the research is to identify some main BT installation cost components in SCS needed for deeper cost analysis. We then identify and categorize the main groups of cost components in more detail to utilize them in the prediction process. The second objective is to determine the suitable Supervised Learning technique for predicting the costs of developing and running BT in SCS in a particular case study. The last aim is to investigate how the running BT cost can be involved in the total cost of SCS. 2. Work Performed: Applied successfully in various fields, Supervised Learning is a method of framing the data, preparing it, and training the chosen model. It is a learning model directed at predicting an outcome measurement based on a set of unseen input data. The following steps must be conducted to pursue the objectives of our subject. The first step is a literature review to identify the different cost components of BT installation in SCS.
Based on the literature review, we choose Supervised Learning methods suitable for BT installation cost prediction in SCS. According to the literature review, Supervised Learning algorithms that provide a powerful tool to classify BT installation components and predict BT installation cost include the Support Vector Regression (SVR) algorithm, the Back Propagation (BP) neural network, and the Artificial Neural Network (ANN). Choosing a case study to feed data into the models constitutes the third step. Finally, we will propose the best predictive performance to find the minimum BT installation costs in SCS. 3. Expected Results and Conclusion: This study aims to propose a cost prediction of BT installation in SCS with the help of Supervised Learning algorithms. In a first attempt, we will select a case study in the field of BT-enabled SCS, and then use Supervised Learning algorithms to predict BT installation cost in SCS. We continue to find the best predictive performance for developing and running BT in SCS. Finally, the paper will be presented at the conference.
Keywords: blockchain technology, blockchain technology-enabled supply chain system, installation cost, supervised learning
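Until the case-study data is collected, the prediction step can only be sketched. The fragment below uses a nearest-neighbour average as a minimal supervised-regression stand-in for the SVR and neural-network models named above (those require an ML library); the cost-component features and cost figures are entirely invented for illustration.

```python
def knn_regress(train_X, train_y, x, k=2):
    """Average of the k nearest training examples: a minimal stand-in
    for the SVR / BP / ANN regressors named in the abstract. Inputs are
    feature vectors of BT installation cost components."""
    dists = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, t)), yi)
        for t, yi in zip(train_X, train_y)
    )
    nearest = [yi for _, yi in dists[:k]]
    return sum(nearest) / k

# Hypothetical cost-component vectors per past project:
# [validator nodes, transactions/day (thousands), system integrations]
projects = [[4, 10, 2], [8, 25, 3], [16, 60, 5], [20, 80, 6]]
costs_kusd = [120, 210, 420, 520]  # invented installation costs, k$

estimate = knn_regress(projects, costs_kusd, [10, 30, 4])
print(estimate)
```

Whatever regressor is substituted in, the workflow is the one the abstract describes: frame historical projects as feature vectors, train on known installation costs, then query with the planned deployment.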
Procedia PDF Downloads 119
1124 Linguistic Competence Analysis and the Development of Speaking Instructional Material
Authors: Felipa M. Rico
Abstract:
Linguistic oral competence plays a vital role in attaining effective communication. Since the English language is universally used and a high-demand skill in the workplace, mastery is the expected output from learners. To achieve this, learners should be given integrated, differentiated tasks which help them develop and strengthen the expected skills. This study aimed to develop supplementary speaking instructional material to enhance the English linguistic competence of Grade 9 students in the areas of pronunciation, intonation and stress, voice projection, diction, and fluency. A descriptive analysis was utilized to analyze the students' level of speaking performance in order to employ appropriate strategies. There were two sets of respondents: 178 Grade 9 students selected through stratified random sampling. The other set comprised English teachers who evaluated the usefulness of the devised teaching materials. A teacher conducted a speaking test, and activities were employed to analyze the speaking needs of the students. Observation and recordings were also used to evaluate the students’ performance. The findings revealed that the English pronunciation of the students was slightly unclear at times, but generally fair. There were lapses, but they generally rated moderate in intonation and stress because of other language interference. In terms of voice projection, students have an erratic, high-volume pitch. For diction, the students’ ability to produce comprehensible language is limited, and as to fluency, the choice of vocabulary and use of structure were severely limited. Based on the analysis of the students’ speaking needs, the supplementary material devised was based on Nunan’s IM model, incorporating contexts of daily life and global work settings, considering the principle that language is best learned in actual meaningful situations.
To widen the mastery of the skill, a rich learning environment, filled with a variety of instructional materials, tends to foster faster acquisition of the requisite skills for sustained learning and development. The role of the IM is to encourage information to stick in the learners’ minds, as what is seen is understood more than what is heard. Teachers said they found the IM “very useful.” This implies that English teachers could adopt the materials to improve the speaking skills of students. Further, teachers should provide varied opportunities for students to get involved in real-life situations where they could take turns asking and answering questions and share information related to the activities. This would minimize anxiety among students in the use of the English language.
Keywords: diction, fluency, intonation, instructional materials, linguistic competence
Procedia PDF Downloads 240
1123 The Re-Emergence of Russia Foreign Policy (Case Study: Middle East)
Authors: Maryam Azish
Abstract:
Russia, as an emerging global player in recent years, has claimed a special place in the Middle East. Despite all the challenges it has faced over the years, it has always considered its presence in various fields with a strategy that has defined its maneuvering power as a level of competition and even confrontation with the United States. Therefore, its current approach is considered important as that of an influential actor in the Middle East. After the collapse of the Soviet Union, when the Russians withdrew completely from the Middle East, the regional scene remained almost entirely in American hands. With the start of the US-led wars in Iraq and Afghanistan and the subsequent developments that led to US military and political setbacks, a new chapter in regional security was created, in which ISIL and Taliban terrorism went along with the Arab Spring to destabilize the Middle East. Because of this, the Americans took every opportunity to strengthen their military presence. Iraq, Syria, and Afghanistan have always been the three areas where terrorism took shape, and the countries of the region have each reacted to this evil phenomenon accordingly. The West dealt with this phenomenon on a case-by-case basis in the general circumstances that created the fluid situation in the Arab countries and the region. Russian President Vladimir Putin accused the US of falling asleep in the face of ISIS and terrorism in Syria. In fact, this was an opportunity for the Russians to revive their presence in Syria. This article suggests that utilizing the politics of recognition along with constructivist theory will offer a better understanding of Russia’s endeavors to promote its international position. Accordingly, Russia’s distinctiveness and its ambitions for great-power status have played a vital role in shaping national interests and, subsequently, foreign policy, particularly in the Putin era.
The focal claim of the paper is that Russia’s foreign policy cannot be adequately explained through realist methods. Consequently, with an aim to fill the prevailing vacuum, this study exploits the politics of recognition in the context of constructivism to examine Russia’s foreign policy in the Middle East. The results of this paper show that the key aim of Russian foreign policy discourse, alongside increasing power and wealth, is to be recognized in, and restored to, the position of a great power in the global system. The Syrian crisis has created an opportunity for Russia to consolidate its position in the evolving global and regional order, renewing its long history of active and prevalent presence in the Middle East while countering US unilateralism. In the meantime, the writer thinks that the question of the West’s recognition of Russia’s position in the global system has played a foremost role in serving its national interests.
Keywords: constructivism, foreign policy, Middle East, Russia, regionalism
Procedia PDF Downloads 148
1122 Minding the Gap: Consumer Contracts in the Age of Online Information Flow
Authors: Samuel I. Becher, Tal Z. Zarsky
Abstract:
The digital world has become part of our DNA. The way e-commerce, human behavior, and law interact and affect one another is rapidly and significantly changing. Among other things, the internet equips consumers with a variety of platforms to share information in a volume we could not imagine before. As part of this development, online information flows allow consumers to learn about businesses and their contracts in an efficient and quick manner. Consumers can become informed by the impressions that other, experienced consumers share and spread. In other words, consumers may familiarize themselves with the contents of contracts through the experiences that other consumers have had. Online and offline, the relationships between consumers and businesses are most frequently governed by consumer standard form contracts. For decades, such contracts have been assumed to be one-sided and biased against consumers. Consumer law seeks to alleviate this bias and empower consumers. Legislatures, consumer organizations, scholars, and judges are constantly looking for clever ways to protect consumers from unscrupulous firms and unfair behaviors. While consumer-business relationships are theoretically administered by standardized contracts, firms do not always follow these contracts in practice. At times, there is a significant disparity between what the written contract stipulates and what consumers experience de facto. That is, there is a crucial gap (“the Gap”) between how firms draft their contracts on the one hand, and how firms actually treat consumers on the other. Interestingly, the Gap is frequently manifested in deviation from the written contract in favor of consumers. In other words, firms often exercise a lenient approach in spite of the stringent written contracts they draft. This essay examines whether, counter-intuitively, policy makers should add firms’ leniency to the growing list of firms’ suspicious behaviors.
At first glance, firms should be allowed, if not encouraged, to exercise leniency. Many legal regimes are looking for ways to cope with unfair contract terms in consumer contracts. Naturally, therefore, consumer law should enable, if not encourage, firms’ lenient practices. Firms’ willingness to deviate from their strict contracts in order to benefit consumers seems like a sensible approach. Apparently, such behavior should not be second-guessed. However, at times online tools, firms’ behaviors, and human psychology result in a toxic mix. Beneficial and helpful online information should be treated with due respect, as it may occasionally have surprising and harmful qualities. In this essay, we illustrate that technological changes turn the Gap into a key component in consumers' understanding, or misunderstanding, of consumer contracts. In short, a Gap may distort consumers’ perception and undermine rational decision-making. Consequently, this essay explores whether, counter-intuitively, consumer law should sanction firms that create a Gap and use it. It examines when firms’ leniency should be considered manipulative or exercised in bad faith. It then investigates whether firms should be allowed to enforce the written contract even if they deliberately and consistently deviated from it.
Keywords: consumer contracts, consumer protection, information flow, law and economics, law and technology, paper deal v firms' behavior
Procedia PDF Downloads 195
1121 Weapon-Being: Weaponized Design and Object-Oriented Ontology in Hypermodern Times
Authors: John Dimopoulos
Abstract:
This proposal attempts a refabrication of Heidegger’s classic thing-being and object-being analysis in order to provide better ontological tools for understanding contemporary culture, technology, and society. In his work, Heidegger sought to understand and comment on the problem of technology in an era of rampant innovation and increased perils for society and the planet. Today we seem to be at another crossroads in this course, coming after postmodernity, during which dreams and dangers of modernity augmented with critical speculations of the post-war era take shape. The new era which we are now living in, referred to as hypermodernity by researchers in various fields such as architecture and cultural theory, is defined by the horizontal implementation of digital technologies, cybernetic networks, and mixed reality. Technology today is rapidly approaching a turning point, namely the point of no return for humanity’s supervision over its creations. The techno-scientific civilization of the 21st century creates a series of problems, progressively more difficult and complex to solve and impossible to ignore, climate change, data safety, cyber depression, and digital stress being some of the most prevalent. Humans often have no other option than to address technology-induced problems with even more technology, as in the case of neuron networks, machine learning, and AI, thus widening the gap between creating technological artifacts and understanding their broad impact and possible future development. As all technical disciplines and particularly design, become enmeshed in a matrix of digital hyper-objects, a conceptual toolbox that allows us to handle the new reality becomes more and more necessary. Weaponized design, prevalent in many fields, such as social and traditional media, urban planning, industrial design, advertising, and the internet in general, hints towards an increase in conflicts. 
These conflicts between tech companies, stakeholders, and users, with implications for politics, work, education, and production, as apparent in the cases of the Amazon workers’ strikes, Donald Trump’s 2016 campaign, the Facebook and Microsoft data scandals, and more, are often opaque to the wider public, thus consolidating new elites and technocratic classes and making the public scene less and less democratic. The new category proposed, weapon-being, is outlined with respect to the basic function of reducing complexity, subtracting materials, actants, and parameters, not strictly in favor of a humanistic re-orientation but within a more inclusive ontology of objects and subjects. Utilizing insights of Object-Oriented Ontology (OOO) and its schematization of technological objects, an outline for a radical ontology of technology is approached.
Keywords: design, hypermodernity, object-oriented ontology, weapon-being
Procedia PDF Downloads 152
1120 Prevalence of Antibiotic Resistant Enterococci in Treated Wastewater Effluent in Durban, South Africa and Characterization of Vancomycin and High-Level Gentamicin-Resistant Strains
Authors: S. H. Gasa, L. Singh, B. Pillay, A. O. Olaniran
Abstract:
Wastewater treatment plants (WWTPs) have been implicated as the leading reservoir for antibiotic resistant bacteria (ARB), including Enterococcus spp., and antibiotic resistance genes (ARGs) worldwide. Enterococci are a group of clinically significant bacteria that have gained much attention as a result of their antibiotic resistance. They play a significant role as a principal cause of nosocomial infections and in the dissemination of antimicrobial resistance genes in the environment. The main objective of this study was to ascertain the role of WWTPs in Durban, South Africa as potential reservoirs for antibiotic resistant Enterococci (ARE) and their related ARGs. Furthermore, the antibiogram and resistance gene profiles of Enterococci species recovered from treated wastewater effluent and receiving surface water in Durban were also investigated. Using the membrane filtration technique, Enterococcus selective agar and selected antibiotics, ARE were enumerated in samples (influent, activated sludge, before chlorination and final effluent) collected from two WWTPs, as well as from upstream and downstream of the receiving surface water. Two hundred Enterococcus isolates recovered from the treated effluent and receiving surface water were identified by biochemical and PCR-based methods, and their antibiotic resistance profiles were determined by the Kirby-Bauer disc diffusion assay, while PCR-based assays were used to detect the presence of resistance and virulence genes. A high prevalence of ARE was obtained at both WWTPs, with values reaching a maximum of 40%. The influent and activated sludge samples contained the greatest prevalence of ARE, with lower values observed in the before- and after-chlorination samples. Of the 44 vancomycin and high-level gentamicin-resistant isolates, 11 were identified as E. faecium, 18 as E. faecalis, and 4 as E. hirae, while 11 were classified as “other” Enterococci species.
High-level resistance to gentamicin (39%) and to vancomycin (61%) was recorded in the species tested. The most commonly detected virulence gene was gelE (44%), followed by asa1 (40%), while cylA and esp were detected in only 2% of the isolates. The most prevalent aminoglycoside resistance genes were aac(6')-Ie-aph(2''), aph(3')-IIIa, and ant(6')-Ia, detected in 43%, 45% and 41% of the isolates, respectively. A positive correlation was observed between phenotypes resistant to high levels of aminoglycosides and the presence of all aminoglycoside resistance genes. Resistance genes for glycopeptides, vanB (37%) and vanC-1 (25%), and macrolides, ermB (11%) and ermC (54%), were detected in the isolates. These results show the need for more efficient wastewater treatment and disposal in order to prevent the release of virulent and antibiotic resistant Enterococci species and safeguard public health.
Keywords: antibiogram, enterococci, gentamicin, vancomycin, virulence signatures
Procedia PDF Downloads 218
1119 Profitability and Productivity Performance of the Selected Public Sector Banks in India
Authors: Sudipto Jana
Abstract:
Background and significance of the study: The banking industry acts as a catalyst for industrial and agricultural growth, and also bears on the livelihood and welfare of citizens. In the pre-liberalization era, the banking system in India was characterized by unmatched growth and extensive branch expansion. At the time of the financial sector reforms, the Reserve Bank of India issued regulatory norms concerning capital adequacy, income recognition, asset classification and provisioning that have progressively converged with international best practices. Bank management continuously monitors the success, effectiveness, productivity and performance of the bank, since good performance, high productivity and efficiency underpin the achievement of the bank management's targets as well as the aims of the bank. In a similar way, the performance of any economy depends upon the efficiency and effectiveness of its financial system, which in turn shapes the nation's economic growth indicators. Profitability and productivity are among the most relevant parameters of any banking group. In view of this, this study examines the profitability and productivity performance of selected public sector banks in India. Methodology: This study is based on secondary data obtained from the Reserve Bank of India database for the period 2006 to 2015. Four public sector banks were purposively selected, namely, State Bank of India, United Bank of India, Punjab National Bank and Allahabad Bank. In order to analyze performance in relation to profitability and productivity, productivity performance indicators in terms of capital adequacy ratio, burden ratio, business per employee, spread per employee and advances per employee, and profitability performance indicators in terms of return on assets, return on equity, return on advances and return on branch, have been considered.
In the course of analysis, descriptive statistics, correlation statistics and multiple regression have been used. Major findings: Descriptive statistics indicate that the productivity performance of State Bank of India is more satisfactory than that of the other public sector banks in India, but the management of productivity is unsatisfactory for all the public sector banks under study. Correlation statistics point out that profitability is strongly positively related to productivity performance for all the public sector banks under study. Multiple regression results show that as profitability increases, profit per employee increases and net non-performing assets decrease. Concluding statements: The productivity and profitability performance of United Bank of India, Allahabad Bank and Punjab National Bank is unsatisfactory due to poor management of asset quality as well as poor management efficiency. Government intervention is needed so that profitability and productivity performance improve in the near future.
Keywords: India, productivity, profitability, public sector banks
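As a rough illustration of the multiple-regression step described above, the sketch below regresses a profitability indicator (return on assets) on productivity and asset-quality indicators. All figures, variable choices and coefficients are synthetic and hypothetical, not the RBI data used in the study.

```python
import numpy as np

# Hypothetical illustration only: synthetic bank-year observations, not RBI data.
rng = np.random.default_rng(0)
n = 40

business_per_employee = rng.uniform(50, 150, n)  # productivity indicator
spread_per_employee = rng.uniform(2, 8, n)       # productivity indicator
net_npa_ratio = rng.uniform(0.5, 5.0, n)         # asset-quality indicator (%)

# Assumed relationship: ROA rises with productivity, falls with net NPAs.
roa = (0.004 * business_per_employee
       + 0.08 * spread_per_employee
       - 0.10 * net_npa_ratio
       + rng.normal(0, 0.05, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), business_per_employee,
                     spread_per_employee, net_npa_ratio])
beta, *_ = np.linalg.lstsq(X, roa, rcond=None)
print(beta)  # intercept, then one slope per indicator
```

The signs of the fitted slopes are what the abstract's finding turns on: positive on the productivity indicators, negative on net non-performing assets.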
Procedia PDF Downloads 428
1118 Degradation of the Cu-DOM Complex by Bacteria: A Way to Increase Phytoextraction of Copper in a Vineyard Soil
Authors: Justine Garraud, Hervé Capiaux, Cécile Le Guern, Pierre Gaudin, Clémentine Lapie, Samuel Chaffron, Erwan Delage, Thierry Lebeau
Abstract:
The repeated use of Bordeaux mixture (copper sulphate) and other chemical forms of copper (Cu) has led to its accumulation in wine-growing soils for more than a century, to the point of modifying the ecosystem of these soils. Phytoextraction of copper could progressively reduce the Cu load in these soils, and even allow copper to be recycled (e.g. as a micronutrient in animal nutrition) by cultivating the extracting plants in the inter-rows of the vineyards. Soil clean-up usually requires several years because the chemical speciation of Cu in solution is dominated by forms complexed with dissolved organic matter (DOM) that are not phytoavailable, unlike the "free" forms (Cu2+). Indeed, more than 98% of the Cu in solution is bound to DOM. The selection and inoculation in vineyard soils of bacteria (bioaugmentation) able to degrade Cu-DOM complexes could increase the phytoavailable pool of Cu2+ in the soil solution (in addition to bacteria which first mobilize Cu into solution from the soil bearing phases) in order to increase phytoextraction performance. In this study, seven Cu-accumulating plants potentially usable in the inter-rows were tested for their Cu phytoextraction capacity in hydroponics (ryegrass, brown mustard, buckwheat, hemp, sunflower, oats, and chicory). Also, a bacterial consortium was tested: Pseudomonas sp., previously studied for its ability to mobilize Cu through the pyoverdine siderophore (a complexing agent) and potentially to degrade Cu-DOM complexes, and a second bacterium (to be selected) able to promote the survival of Pseudomonas sp. following its inoculation into soil. An interaction-network method was used, based on the notions of co-occurrence and, therefore, of bacterial abundance found in the same soils. Bacteria from the EcoVitiSol project (Alsace, France) were targeted. The final step consists of coupling the bacterial consortium with the chosen plant in soil pots.
The degradation of Cu-DOM complexes is measured on the basis of the absorption index at 254 nm, which gives insight into the aromaticity of the DOM. The “free” Cu in solution (from the mobilization of Cu and/or the degradation of Cu-DOM complexes) is assessed by measuring pCu. Finally, Cu accumulation in plants is measured by ICP-AES. The selection of the plant is currently being finalized. The interaction-network method identified the strongest positive interactions of Flavobacterium sp. with Pseudomonas sp. These bacteria are both PGPR (plant growth-promoting rhizobacteria) with the ability to improve plant growth and to mobilize Cu from the soil bearing phases (siderophores). Also, these bacteria are known to degrade phenolic groups, which are highly present in DOM. They could therefore contribute to the degradation of Cu-DOM complexes. The results of the upcoming bacteria-plant coupling tests in pots will also be presented.
Keywords: Cu-DOM complexes, bioaugmentation, phytoavailability, phytoextraction
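For reference, pCu (not defined in the abstract, but standard usage in speciation studies) is the negative decadic logarithm of the free cupric-ion activity, so a lower pCu corresponds to a larger phytoavailable pool of free Cu²⁺:

```latex
\mathrm{pCu} = -\log_{10} a_{\mathrm{Cu^{2+}}}
```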
Procedia PDF Downloads 79
1117 Ethical, Legal and Societal Aspects of Unmanned Aircraft in Defence
Authors: Henning Lahmann, Benjamyn I. Scott, Bart Custers
Abstract:
Suboptimal adoption of AI in defence organisations carries risks for the protection of the freedom, safety, and security of society. Despite the vast opportunities that defence AI technology presents, there are also a variety of ethical, legal, and societal concerns. To ensure the successful use of AI technology by the military, ethical, legal, and societal aspects (ELSA) need to be considered, and the concerns they raise continuously addressed at all levels. This includes ELSA considerations during the design, manufacturing and maintenance of AI-based systems, as well as their utilisation via appropriate military doctrine and training. This raises the question of how defence organisations can remain strategically competitive and at the edge of military innovation while respecting the values of their citizens. This paper will explain the set-up and share preliminary results of a 4-year research project commissioned by the National Research Council in the Netherlands on the ethical, legal, and societal aspects of AI in defence. The project plans to develop a future-proof, independent, and consultative ecosystem for the responsible use of AI in the defence domain. In order to achieve this, the lab shall devise a context-dependent methodology that focuses on the ‘analysis’, ‘design’ and ‘evaluation’ of ELSA of AI-based applications within the military context, which include inter alia unmanned aircraft. This is bolstered by the Lab also recognising and complementing existing methods with regard to human-machine teaming, explainable algorithms, and value-sensitive design. Such methods will be modified for the military context and applied to pertinent case studies. These case studies include, among others, the application of autonomous robots (incl. semi-autonomous ones) and AI-based methods against cognitive warfare.
As the perception of the application of AI in the military context, by both society and defence personnel, is important, the Lab will study how these perceptions evolve and vary in different contexts. Furthermore, the Lab will monitor developments in the global technological, military and societal spheres, as they may influence people’s perception. Although the emphasis of the research project is on different forms of AI in defence, it focuses on several case studies. One of these case studies is on unmanned aircraft, which will also be the focus of the paper. Hence, ethical, legal, and societal aspects of unmanned aircraft in the defence domain will be discussed in detail, including but not limited to privacy issues. Typical issues concern security (for people, objects, data or other aircraft), privacy (sensitive data, hindrance, annoyance, data collection, function creep), chilling effects, the PlayStation mentality, and PTSD.
Keywords: autonomous weapon systems, unmanned aircraft, human-machine teaming, meaningful human control, value-sensitive design
Procedia PDF Downloads 91
1116 Densities and Volumetric Properties of {Difurylmethane + [(C5 – C8) N-Alkane or an Amide]} Binary Systems at 293.15, 298.15 and 303.15 K: Modelling Excess Molar Volumes by Prigogine-Flory-Patterson Theory
Authors: Belcher Fulele, W. A. A. Ddamba
Abstract:
The study of solvent systems contributes to the understanding of intermolecular interactions that occur in binary mixtures. These interactions involve, among others, strong dipole-dipole interactions and weak van der Waals interactions, which are of significant application in pharmaceuticals, solvent extraction, reactor design, and solvent handling and storage processes. Binary mixtures of solvents can thus be used as a model to interpret the thermodynamic behavior that occurs in a real solution mixture. Densities of pure DFM, n-alkanes (n-pentane, n-hexane, n-heptane and n-octane) and amides (N-methylformamide, N-ethylformamide, N,N-dimethylformamide and N,N-dimethylacetamide), as well as their [DFM + ((C5-C8) n-alkane or amide)] binary mixtures over the entire composition range, have been reported at temperatures of 293.15, 298.15 and 303.15 K and atmospheric pressure. These data have been used to derive the thermodynamic properties: the excess molar volume of solution, apparent molar volumes, excess partial molar volumes, limiting excess partial molar volumes, and limiting partial molar volumes of each component of a binary mixture. The results are discussed in terms of possible intermolecular interactions and structural effects that occur in the binary mixtures. The variation of excess molar volume with DFM composition for the [DFM + (C5-C7) n-alkane] binary mixtures exhibits sigmoidal behavior, while for the [DFM + n-octane] binary system a positive deviation of the excess molar volume function was observed over the entire composition range. For each of the [DFM + (C5-C8) n-alkane] binary mixtures, the excess molar volume decreased with increasing temperature. The excess molar volume for each [DFM + (NMF or NEF or DMF or DMA)] binary system was negative over the entire DFM composition range at each of the three temperatures investigated. The negative deviations in excess molar volume values follow the order: DMA > DMF > NEF > NMF.
An increase in temperature has a greater effect on component self-association than on complex formation between molecules of the components in the [DFM + (NMF or NEF or DMF or DMA)] binary mixtures, which shifts the equilibrium towards complex formation and gives a drop in excess molar volume with increasing temperature. The Prigogine-Flory-Patterson model has been applied at 298.15 K and reveals that the free-volume term is the most important contribution to the experimental excess molar volume data for the [DFM + (n-pentane or n-octane)] binary systems. For the [DFM + (NMF or DMF or DMA)] binary mixtures, the interactional term and the characteristic pressure term are the most important contributions in describing the sign of the experimental excess molar volume. These mixture systems contribute to the understanding of the interactions of polar solvents (amides, as protein analogues) with non-polar solvents (alkanes) in biological systems.
Keywords: alkanes, amides, excess thermodynamic parameters, Prigogine-Flory-Patterson model
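For context (the abstract does not state these relations explicitly, but they are the standard treatment in density-based studies of this kind), the excess molar volume of a binary mixture is obtained from the measured densities, and its composition dependence is usually smoothed with a Redlich-Kister polynomial:

```latex
V_{m}^{E} \;=\; \frac{x_{1}M_{1} + x_{2}M_{2}}{\rho_{m}}
          \;-\; \frac{x_{1}M_{1}}{\rho_{1}}
          \;-\; \frac{x_{2}M_{2}}{\rho_{2}},
\qquad
V_{m}^{E} \;=\; x_{1}x_{2}\sum_{k=0}^{n} A_{k}\,(x_{1}-x_{2})^{k}
```

where $x_i$ and $M_i$ are the mole fractions and molar masses, $\rho_m$ is the mixture density, $\rho_i$ are the pure-component densities, and $A_k$ are fitting coefficients. Negative $V_m^{E}$ signals contraction on mixing (as reported here for the amide systems), positive values signal expansion.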
Procedia PDF Downloads 354
1115 Harvesting Value-added Products Through Anodic Electrocatalytic Upgrading Intermediate Compounds Utilizing Biomass to Accelerating Hydrogen Evolution
Authors: Mehran Nozari-Asbemarz, Italo Pisano, Simin Arshi, Edmond Magner, James J. Leahy
Abstract:
Integrating electrolytic synthesis with renewable energy makes it feasible to address urgent environmental and energy challenges. Conventional water electrolyzers produce H₂ and O₂ concurrently, demanding additional gas-separation procedures to prevent contamination of the H₂ with O₂. Moreover, the oxygen evolution reaction (OER), which is sluggish and has a low overall energy conversion efficiency, does not deliver a significant value product at the electrode surface. Compared to conventional water electrolysis, integrating electrolytic hydrogen generation from water with thermodynamically more advantageous aqueous organic oxidation processes can increase energy conversion efficiency and create value-added compounds instead of oxygen at the anode. One strategy is to use renewable and sustainable carbon sources from biomass, which has a large annual production capacity and presents a significant opportunity to supplement carbon sourced from fossil fuels. Numerous catalytic techniques have been researched in order to utilize biomass economically. Because of its safe operating conditions, excellent energy efficiency, and reasonable control over production rate and selectivity through electrochemical parameters, electrocatalytic upgrading stands out as an appealing choice among the numerous biomass refinery technologies. Therefore, we propose a broad framework for coupling H₂ generation from water splitting with oxidative biomass upgrading processes. Representative biomass-derived targets were considered for oxidative upgrading using a hierarchically porous CoFe-MOF/LDH @ Graphite Paper bifunctional electrocatalyst, including glucose, ethanol, benzyl alcohol, furfural, and 5-hydroxymethylfurfural (HMF). The potential required to support 50 mA cm-2 is considerably lower (by ~380 mV) than the potential required for the OER. All of these compounds can be oxidized to yield liquid byproducts with economic benefit.
The electrocatalytic oxidation of glucose to the value-added products gluconic acid, glucuronic acid, and glucaric acid was examined in detail. The cell potential for combined H₂ production and glucose oxidation was substantially lower than for water splitting (1.44 V(RHE) vs. 1.82 V(RHE) at 50 mA cm-2). In addition, the oxidation byproduct at the anode was significantly more valuable than O₂, taking advantage of the more favorable glucose oxidation in comparison to the OER. Overall, such a combination of the HER and oxidative biomass valorization using electrocatalysts prevents the production of potentially explosive H₂/O₂ mixtures and produces high-value products at both electrodes with a lower voltage input, thereby increasing the efficiency and activity of electrocatalytic conversion.
Keywords: biomass, electrocatalytic, glucose oxidation, hydrogen evolution
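The energy implication of the reported cell voltages (1.44 vs. 1.82 V at 50 mA cm-2) can be sketched with Faraday's law. This is a back-of-the-envelope estimate, not a figure from the abstract: it assumes 100% Faradaic efficiency and ignores balance-of-plant losses.

```python
F = 96485.0      # Faraday constant, C mol^-1
M_H2 = 2.016e-3  # molar mass of H2, kg mol^-1

def energy_kwh_per_kg_h2(cell_voltage: float) -> float:
    """Electrical energy to evolve 1 kg of H2 at a given cell voltage,
    assuming 2 electrons per H2 and 100% Faradaic efficiency."""
    joules_per_mol = 2 * F * cell_voltage
    return joules_per_mol / M_H2 / 3.6e6  # J -> kWh

e_water = energy_kwh_per_kg_h2(1.82)    # conventional water splitting
e_glucose = energy_kwh_per_kg_h2(1.44)  # with anodic glucose oxidation
saving = 100 * (1 - e_glucose / e_water)
print(f"{e_water:.1f} vs {e_glucose:.1f} kWh/kg H2 ({saving:.0f}% saving)")
```

Under these assumptions the 0.38 V reduction in cell voltage translates into roughly a one-fifth cut in electrical energy per kilogram of hydrogen, before counting the value of the anode products.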
Procedia PDF Downloads 93
1114 Investigating the English Speech Processing System of EFL Japanese Older Children
Authors: Hiromi Kawai
Abstract:
This study investigates the nature of EFL older children’s L2 perceptive and productive abilities using classroom data, in order to find a pedagogical solution to the teaching of L2 sounds at an early stage of learning in a formal school setting. It is still inconclusive whether older children receiving only formal EFL school instruction at the initial stage of L2 learning are able to attain native-like perception and production in English within the very limited amount of exposure to the target language available. Given the lack of studies on EFL Japanese children’s acquisition of English segments, the researcher uses a model of L1 speech processing which was developed for investigating L1 English children’s speech and literacy difficulties within a psycholinguistic framework. The model is composed of an input channel, an output channel, and lexical representations, and examines how a child receives information from spoken or written language, remembers and stores it within the lexical representations, and how the child selects and produces spoken or written words. Concerning language universality and language specificity in the language acquisition process, the model's aim of identifying sound errors in L1 English children accords with the author’s intention to assess the English sound abilities of older Japanese children at the novice level of English in an EFL setting. 104 students in Grade 5 (between the ages of 10 and 11 years old) of an elementary school in Tokyo participated in this study. Four tests measuring their perceptive ability and three oral repetition tests measuring their productive ability were conducted with/without reference to lexical representation. All the test items were analyzed to calculate item facility (IF) indices, and correlational analyses and Structural Equation Modeling (SEM) were conducted to examine the relationship between receptive ability and productive ability.
IF analysis showed that (1) the participants were better at perceiving a segment than producing one, (2) they had difficulty in the auditory discrimination of paired consonants when one of the pair does not exist in the Japanese inventory, (3) they had difficulty in both perceiving and producing English vowels, and (4) their L1 loanword knowledge influenced their ability to perceive and produce L2 sounds. The result of the multiple regression modeling showed that the two production tests could predict the participants’ auditory ability with real English words. The result of the SEM supported the hypothesis that perceptive ability affects productive ability. Based on these findings, the author discusses a possible explicit method of teaching English segments to EFL older children in a formal school setting.
Keywords: EFL older children, English segments, perception, production, speech processing system
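The item facility (IF) index used above is simply the proportion of test-takers who answer an item correctly. A minimal sketch on a hypothetical 0/1 response matrix (illustrative data only, not the Grade 5 results reported here):

```python
import numpy as np

# Hypothetical scored responses: rows = pupils, columns = test items (1 = correct).
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 1],
])

# Item facility: proportion correct per item; values near 0 flag difficult items,
# values near 1 flag easy ones.
item_facility = responses.mean(axis=0)
print(item_facility)
```

Comparing IF values for matched perception and production items is what licenses statements like finding (1), that a segment is easier to perceive than to produce.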
Procedia PDF Downloads 243
1113 Global Experiences in Dealing with Biological Epidemics with an Emphasis on COVID-19 Disease: Approaches and Strategies
Authors: Marziye Hadian, Alireza Jabbari
Abstract:
Background: The World Health Organization has identified COVID-19 as a public health emergency and is urging governments to stop the transmission of the virus by adopting appropriate policies. In this regard, authorities have taken different approaches to cut the chain of transmission or control the spread of the disease. The questions we now face include: What are these approaches? What tools should be used to implement each preventive protocol? And what is the impact of each approach? Objective: The aim of this study was to determine the approaches to biological epidemics and the related prevention tools, with an emphasis on the COVID-19 disease. Data sources: Databases including the ISI Web of Science, PubMed, Scopus, Science Direct, Ovid, and ProQuest were employed for data extraction. Furthermore, authentic sources such as the WHO website, the published reports of relevant countries, as well as the Worldometer website were evaluated for grey literature. The time frame of the study was from 1 December 2019 to 30 May 2020. Methods: The present study was a systematic review of publications related to prevention strategies for the COVID-19 disease. The study was carried out based on the PRISMA guidelines, with CASP used for articles and AACODS for grey literature. Results: The study findings showed that, in order to confront the COVID-19 epidemic, there are in general three approaches, "mitigation", "active control" and "suppression", and four strategies, "quarantine", "isolation", "social distancing" and "lockdown", in both individual and social dimensions for dealing with epidemics. The selection and implementation of each approach requires specific strategies and has different effects when it comes to controlling and inhibiting the disease. Key finding: One possible approach to controlling the disease is to change individual behavior and lifestyle.
In addition to these prevention strategies, the use of masks, the observance of personal hygiene principles such as regular hand washing and keeping contaminated hands away from the face, as well as the observance of public health principles such as sneezing and coughing etiquette and the safe disposal of personal protective equipment, must be strictly maintained. These measures have not been included in the category of prevention tools, yet they have a great impact on controlling an epidemic, especially the novel coronavirus epidemic. Conclusion: Although the use of different approaches to control and inhibit biological epidemics depends on numerous variables, global experience suggests that some of these approaches are ineffective. Drawing on previous experience worldwide, along with the current experience of individual countries, can be very helpful in choosing the appropriate approach for each country in accordance with its characteristics, and can lead to a reduction of possible costs at the national and international levels.
Keywords: novel coronavirus, COVID-19, approaches, prevention tools, prevention strategies
Procedia PDF Downloads 126