Search results for: selective partial update
1843 Mobulid Ray Post-Release Mortality to Assess the Feasibility of Live-Release Management Measures
Authors: Sila K. Sari, Betty J.L. Laglbauer, Muhammad G. Salim, Irianies C. Gozali, Iqbal Herwata, Fahmi Fahmi, Selvia Oktaviyani, Isabel Ender, Sarah Lewis, Abraham Sianipar, Mark Erdmann
Abstract:
Taking strides towards the sustainable use of marine stocks requires science-based management of target fish populations and reduction of bycatch in non-selective fisheries. Among elasmobranchs, mobulid rays face a high extinction risk due to their intrinsic vulnerability to fishing, and their conservation has been recognized as a strong priority both in Indonesia and worldwide. Despite their common vulnerabilities to fishing pressure due to slow growth, late maturation and low fecundity, only manta rays, but not devil rays, are protected in Indonesian waters. However, both manta and devil rays are captured in non-selective fisheries, in particular drift gillnets, since their habitat overlaps with fishing grounds for primary target species (e.g. marlin, swordfish and bullet tuna off the coast of Muncar). For this reason, mobulid populations are being heavily impacted, and while national-level protections are crucial to help conservation, they may not suffice alone to ensure population sustainability. In order to assess the potential of applying live-release management measures to conserve mobulids captured as bycatch in drift gillnets, we deployed pop-up survival archival transmitters to assess post-release mortality in Indonesian mobulid rays. We also assessed which fishing practices, in particular soak duration, affected post-release mortality in order to draw relevant conclusions for management.
Keywords: Mobulid, Devil ray, Manta ray, Bycatch
Procedia PDF Downloads 139
1842 Image Segmentation Using Active Contours Based on Anisotropic Diffusion
Authors: Shafiullah Soomro
Abstract:
Active contours are one of the image segmentation techniques whose goal is to capture the required object boundaries within an image. In this paper, we propose a novel image segmentation method using an active contour method based on an anisotropic diffusion feature enhancement technique. Traditional active contour methods use only pixel information to perform segmentation, which produces inaccurate results when an image has noise or a complex background. We use the Perona and Malik diffusion scheme for feature enhancement, which sharpens the object boundaries and blurs the background variations. Our main contribution is the formulation of a new SPF (signed pressure force) function, which uses global intensity information across the regions. By minimizing an energy function within a partial differential framework, the proposed method captures semantically meaningful boundaries instead of catching uninteresting regions. Finally, we use a Gaussian kernel, which eliminates the problem of reinitialization of the level set function. We use several synthetic and real images from different modalities to validate the performance of the proposed method. In the experimental section, we found that the proposed method performs better both qualitatively and quantitatively and yields results with higher accuracy compared to other state-of-the-art methods.
Keywords: active contours, anisotropic diffusion, level-set, partial differential equations
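The Perona-Malik diffusion scheme the abstract relies on for feature enhancement can be sketched in a few lines of NumPy. The iteration count, edge threshold `kappa`, time step, and periodic boundary handling below are illustrative choices, not the authors' settings:

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=20.0, dt=0.15):
    """Sketch of Perona-Malik anisotropic diffusion: smooths homogeneous
    regions while preserving strong edges. Periodic boundaries via np.roll
    are used here for brevity."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # finite-difference gradients toward the four neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u,  1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u,  1, axis=1) - u
        # conduction coefficient g(|grad u|) = exp(-(|grad u|/kappa)^2):
        # close to 1 in flat regions (smoothing), close to 0 at strong edges
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u
```

With `dt <= 0.25` the explicit update is stable; noise inside flat regions is smoothed while a high-contrast step edge is left essentially intact.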
Procedia PDF Downloads 143
1841 An Investigation on Fresh and Hardened Properties of Concrete While Using Polyethylene Terephthalate (PET) as Aggregate
Authors: Md. Jahidul Islam, A. K. M. Rakinul Islam, M. Salamah Meherier
Abstract:
This study investigates the suitability of using plastic, such as polyethylene terephthalate (PET), as a partial replacement of natural coarse and fine aggregates (for example, brick chips and natural sand) to produce lightweight concrete for load-bearing structural members. The plastic coarse aggregate (PCA) and plastic fine aggregate (PFA) were produced from melted polyethylene terephthalate (PET) bottles. Tests were conducted using three different water–cement (w/c) ratios, namely 0.42, 0.48, and 0.57, where PCA and PFA were used as 50% replacement of coarse and fine aggregate, respectively. Fresh and hardened properties of concrete were compared for natural aggregate concrete (NAC), PCA concrete (PCC) and PFA concrete (PFC). The compressive strength of concrete at 28 days varied with the water–cement ratio for both PCC and PFC. Between PCC and PFC, PFA concrete showed both the highest compressive strength (23.7 MPa) at the 0.42 w/c ratio and the lowest compressive strength (13.7 MPa) at the 0.57 w/c ratio. A significant reduction in concrete density was mostly observed for PCC samples, ranging between 1977–1924 kg/m³. With the increase in water–cement ratio, PCC achieved higher workability compared to both NAC and PFC. It was found that concrete containing either PCA or PFA achieved the compressive strength required for structural use as a partial replacement of the natural aggregate, but to obtain the desired lower density as lightweight concrete, PCA is best suited.
Keywords: polyethylene terephthalate, plastic aggregate, concrete, fresh and hardened properties
Procedia PDF Downloads 414
1840 The Selective Reduction of Morita-Baylis-Hillman Adduct-Derived Ketones Using Various Ketoreductase Enzyme Preparations
Authors: Nompumelelo P. Mathebula, Roger A. Sheldon, Daniel P. Pienaar, Moira L. Bode
Abstract:
The preparation of enantiopure Morita-Baylis-Hillman (MBH) adducts remains a challenge in organic chemistry. MBH adducts are highly functionalised compounds which act as key intermediates in the preparation of compounds of medicinal importance. MBH adducts are prepared in racemic form by reacting various aldehydes and activated alkenes in the presence of DABCO. Enantiopure MBH adducts can be obtained by employing enzymatic kinetic resolution (EKR). This technique has been successfully demonstrated in our group, amongst others, using lipases in either hydrolysis or transesterification reactions. As these methods only allow 50% of each enantiomer to be obtained, our interest grew in exploring other enzymatic methods for the synthesis of enantiopure MBH adducts where, theoretically, 100% of the desired enantiomer could be obtained. Dehydrogenase enzymes can be employed on prochiral substrates to obtain optically pure compounds by reducing carbon-carbon double bonds or the carbonyl groups of ketones. Ketoreductases have historically been used to obtain enantiopure secondary alcohols on an industrial scale. Ketoreductases are NAD(P)H-dependent enzymes and thus require nicotinamide as a cofactor. This project focuses on employing ketoreductase enzymes to selectively reduce ketones derived from Morita-Baylis-Hillman (MBH) adducts in order to obtain these adducts in enantiopure form. Results obtained from this study will be reported. Good enantioselectivity was observed using a range of different ketoreductases; however, reactions were complicated by the formation of an unexpected by-product, which was characterised by single-crystal X-ray crystallography. Methods to minimise by-product formation are currently being investigated.
Keywords: ketoreductase, Morita-Baylis-Hillman, selective reduction, X-ray crystallography
Procedia PDF Downloads 37
1839 Revealing the Nitrogen Reaction Pathway for the Catalytic Oxidative Denitrification of Fuels
Authors: Michael Huber, Maximilian J. Poller, Jens Tochtermann, Wolfgang Korth, Andreas Jess, Jakob Albert
Abstract:
Aside from desulfurisation, the denitrogenation of fuels is of great importance to minimize the environmental impact of transport emissions. The oxidative reaction pathway of organic nitrogen in catalytic oxidative denitrogenation could be successfully elucidated. This is the first time such a pathway could be traced in detail in non-microbial systems. It was found that the organic nitrogen is first oxidized to nitrate, which is subsequently reduced to molecular nitrogen via nitrous oxide. Hereby, the organic substrate serves as a reducing agent. The discovery of this pathway is an important milestone for the further development of fuel denitrogenation technologies. The United Nations aims to counteract global warming with Net Zero Emissions (NZE) commitments; however, it is not yet foreseeable when crude oil-based fuels will become obsolete. In 2021, more than 50 million barrels per day (mb/d) were consumed for the transport sector alone. Above all, heteroatoms such as sulfur or nitrogen produce SO₂ and NOx during combustion in engines, which is harmful not only to the climate but also to health. Therefore, in refineries, these heteroatoms are removed by hydrotreating to produce clean fuels. However, this catalytic reaction is inhibited by the basic, nitrogenous reactants (e.g., quinoline) as well as by NH3. The lone pair of the nitrogen atom forms strong pi-bonds to the active sites of the hydrotreating catalyst, which diminishes its activity. To maximize the desulfurization and denitrogenation effectiveness in comparison to extraction and adsorption alone, selective oxidation is typically combined with either extraction or selective adsorption. The selective oxidation produces more polar compounds that can be removed from the non-polar oil in a separate step.
The extraction step can also be carried out in parallel to the oxidation reaction, as a result of in situ separation of the oxidation products (ECODS; extractive catalytic oxidative desulfurization). In this process, H8PV5Mo7O40 (HPA-5) is employed as a homogeneous polyoxometalate (POM) catalyst in an aqueous phase, whereas the sulfur-containing fuel components are oxidized, after diffusion from the organic fuel phase into the aqueous catalyst phase, to form highly polar products such as H₂SO₄ and carboxylic acids, which are thereby extracted from the organic fuel phase and accumulate in the aqueous phase. In contrast to the inhibiting properties of the basic nitrogen compounds in hydrotreating, the oxidative desulfurization improves with simultaneous denitrogenation in this system (ECODN; extractive catalytic oxidative denitrogenation). The reaction pathway of ECODS has already been well studied. In contrast, the oxidation of nitrogen compounds in ECODN is not yet well understood and requires more detailed investigation.
Keywords: oxidative reaction pathway, denitrogenation of fuels, molecular catalysis, polyoxometalate
Procedia PDF Downloads 151
1838 Mediating Role of Social Responsibility on the Relationship between Consumer Awareness of Green Marketing and Purchase Intentions
Authors: Norazah Mohd Suki, Norbayah Mohd Suki
Abstract:
This research examines the mediating effect of corporate social responsibility on the relationship between consumer awareness of green marketing and purchase intentions in the retail setting. Data from 200 valid questionnaires were analyzed using the partial least squares (PLS) approach to structural equation modelling with the SmartPLS computer program, version 2.0, as the research data do not necessarily follow a multivariate normal distribution and the method is less sensitive to sample size than covariance-based approaches. PLS results revealed that corporate social responsibility partially mediated the link between consumer awareness of green marketing and purchase intentions of the product in the retail setting. Marketing managers should allocate a sufficient portion of their budget to appropriate corporate social responsibility activities by engaging in voluntary programs, for a positive return on investment leading to increased business profitability and long-run business sustainability. The findings on the mediating effect of corporate social responsibility add a new impetus to the growing literature and preceding discoveries on consumer green marketing awareness, which is inadequately researched in the Malaysian setting. Directions for future research are also presented.
Keywords: green marketing awareness, social responsibility, partial least squares, purchase intention
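The mediation logic tested here can be illustrated with a bootstrapped indirect-effect check. This NumPy sketch uses ordinary least squares as a simple stand-in for the SmartPLS PLS-SEM estimation; the variable roles (x = green marketing awareness, m = CSR, y = purchase intention) are labels only, and the synthetic data are not the study's:

```python
import numpy as np

def indirect_effect(x, m, y):
    """a*b indirect effect from two OLS fits: m ~ x, then y ~ x + m."""
    X1 = np.column_stack([np.ones_like(x), x])
    a = np.linalg.lstsq(X1, m, rcond=None)[0][1]   # x -> m path
    X2 = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][2]   # m -> y path, x controlled
    return a * b

def bootstrap_ci(x, m, y, n_boot=500, seed=0):
    """Percentile bootstrap CI for the indirect effect; partial mediation
    is supported when the interval excludes zero and the direct path remains."""
    rng = np.random.default_rng(seed)
    n = len(x)
    effects = [indirect_effect(*(v[idx] for v in (x, m, y)))
               for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(effects, [2.5, 97.5])
```

On simulated data with known paths (a = 0.5, b = 0.4), the estimated indirect effect lands near 0.2 and the bootstrap interval excludes zero.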
Procedia PDF Downloads 580
1837 Accurate Cortical Reconstruction in Narrow Sulci with Zero-Non-Zero Distance (ZNZD) Vector Field
Authors: Somojit Saha, Rohit K. Chatterjee, Sarit K. Das, Avijit Kar
Abstract:
A new force field is designed for propagation of the parametric contour into deep, narrow cortical folds in the application of knowledge-based reconstruction of the cerebral cortex from MR images of the brain. The design of this force field is highly inspired by the Generalized Gradient Vector Flow (GGVF) model but differs markedly in how it manipulates image information to determine the direction of propagation of the contour. While GGVF uses an edge map as its main driving force, the newly designed force field uses the map of distances between zero-valued pixels and their nearest non-zero-valued pixel as its main driving force. Hence, it is called the Zero-Non-Zero Distance (ZNZD) force field. The objective of this force field is forceful propagation of the contour, beyond spurious convergence due to the partial volume effect (PVE), into narrow sulcal folds. Being a function of the corresponding non-zero pixel value, the force field has an inherent ability to determine the spuriousness of an edge automatically. It is effectively applied, along with some morphological processing, in cortical reconstruction to breach the hindrance of PVE in narrow sulci where conventional GGVF fails.
Keywords: deformable model, external force field, partial volume effect, cortical reconstruction, MR image of brain
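The ZNZD map itself — the distance from each zero-valued pixel to its nearest non-zero pixel, together with that pixel's value — can be sketched with a brute-force NumPy computation; the full force-field construction described in the abstract is not reproduced here, and the implementation assumes the image contains at least one non-zero pixel:

```python
import numpy as np

def znzd_map(img):
    """For every pixel, compute the distance to the nearest non-zero pixel
    and the value of that pixel (the two quantities the ZNZD field is
    described as using). Brute force: fine for small images."""
    nz = np.argwhere(img != 0)                 # coordinates of non-zero pixels
    H, W = img.shape
    yy, xx = np.mgrid[0:H, 0:W]
    # squared distance from every pixel to every non-zero pixel: (H, W, N)
    d2 = (yy[..., None] - nz[:, 0]) ** 2 + (xx[..., None] - nz[:, 1]) ** 2
    k = d2.argmin(axis=-1)                     # index of the nearest non-zero pixel
    dist = np.sqrt(d2.min(axis=-1))
    nearest_val = img[nz[k, 0], nz[k, 1]]      # its value, used to judge spuriousness
    return dist, nearest_val
```

Non-zero pixels get distance 0; zero-valued pixels carry both the distance to and the value of their nearest non-zero neighbour.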
Procedia PDF Downloads 367
1836 The 5-HT1A Receptor Biased Agonists, NLX-101 and NLX-204, Elicit Rapid-Acting Antidepressant Activity in Rat Similar to Ketamine and via GABAergic Mechanisms
Authors: A. Newman-Tancredi, R. Depoortère, P. Gruca, E. Litwa, M. Lason, M. Papp
Abstract:
The N-methyl-D-aspartic acid (NMDA) receptor antagonist, ketamine, can elicit rapid-acting antidepressant (RAAD) effects in treatment-resistant patients, but it requires parenteral co-administration with a classical antidepressant under medical supervision. In addition, ketamine can also produce serious side effects that limit its long-term use, and there is much interest in identifying RAADs based on ketamine’s mechanism of action but with safer profiles. Ketamine elicits GABAergic interneuron inhibition, glutamatergic neuron stimulation, and, notably, activation of serotonin 5-HT1A receptors in the prefrontal cortex (PFC). Direct activation of the latter receptor subpopulation with selective ‘biased agonists’ may therefore be a promising strategy to identify novel RAADs and, consistent with this hypothesis, the prototypical cortical biased agonist, NLX-101, exhibited robust RAAD-like activity in the chronic mild stress model of depression (CMS). The present study compared the effects of a novel, selective 5-HT1A receptor-biased agonist, NLX-204, with those of ketamine and NLX-101. Materials and methods: CMS procedure was conducted on Wistar rats; drugs were administered either intraperitoneally (i.p.) or by bilateral intracortical microinjection. Ketamine: 10 mg/kg i.p. or 10 µg/side in PFC; NLX-204 and NLX-101: 0.08 and 0.16 mg/kg i.p. or 16 µg/side in PFC. In addition, interaction studies were carried out with systemic NLX-204 or NLX-101 (each at 0.16 mg/kg i.p.) in combination with intracortical WAY-100635 (selective 5-HT1A receptor antagonist; 2 µg/side) or muscimol (GABA-A receptor agonist, 12.5 ng/side). Anhedonia was assessed by CMS-induced decrease in sucrose solution consumption; anxiety-like behavior was assessed using the Elevated Plus Maze (EPM), and cognitive impairment was assessed by the Novel Object Recognition (NOR) test. 
Results: A single administration of NLX-204 was sufficient to reverse the CMS-induced deficit in sucrose consumption, similarly to ketamine and NLX-101. NLX-204 also reduced CMS-induced anxiety in the EPM and abolished CMS-induced NOR deficits. These effects were maintained (EPM and NOR) or enhanced (sucrose consumption) over a subsequent 2-week period of treatment. The anti-anhedonic response of the drugs was also maintained for several weeks following treatment discontinuation, suggesting that they had sustained effects on neuronal networks. A single PFC administration of NLX-204 reversed deficient sucrose consumption, similarly to ketamine and NLX-101. Moreover, the anti-anhedonic activities of systemic NLX-204 and NLX-101 were abolished by coadministration with intracortical WAY-100635 or muscimol. Conclusions: (i) The antidepressant-like activity of NLX-204 in the rat CMS model was as rapid as that of ketamine or NLX-101, supporting the targeting of cortical 5-HT1A receptors with selective biased agonists to achieve RAAD effects. (ii) The anti-anhedonic activity of systemic NLX-204 was mimicked by local administration of the compound in the PFC, confirming the involvement of cortical circuits in its RAAD-like effects. (iii) Notably, the effects of systemic NLX-204 and NLX-101 were abolished by PFC administration of muscimol, indicating that they act by (indirectly) eliciting a reduction in cortical GABAergic neurotransmission. This is consistent with ketamine’s mechanism of action and suggests that there are converging NMDA and 5-HT1A receptor signaling cascades in the PFC underlying the RAAD-like activities of ketamine and NLX-204. Acknowledgements: The study was financially supported by NCN grant no. 2019/35/B/NZ7/00787.
Keywords: depression, ketamine, serotonin, 5-HT1A receptor, chronic mild stress
Procedia PDF Downloads 78
1835 Adequacy of Second-Generation Laryngeal Mask Airway during Prolonged Abdominal Surgery
Authors: Sukhee Park, Gaab Soo Kim
Abstract:
Purpose: We aimed to evaluate the adequacy of second-generation laryngeal mask airway use during prolonged abdominal surgery with respect to ventilation, oxygenation, postoperative pulmonary complications (PPC), and postoperative non-pulmonary complications in living donor kidney transplant (LDKT) surgery. Methods: In total, 257 recipients who underwent LDKT using either the laryngeal mask airway ProSeal (LMA-P) or an endotracheal tube (ETT) were retrospectively analyzed. Arterial partial pressure of carbon dioxide (PaCO2) and the ratio of arterial partial pressure of oxygen to fractional inspired oxygen (PFR) during surgery were compared between the two groups. In addition, PPC, including pulmonary aspiration, and postoperative non-pulmonary complications, including nausea, vomiting, hoarseness, vocal cord palsy, delirium, and atrial fibrillation, were also compared. Results: PaCO2 and PFR during surgery were not significantly different between the two groups. PPC were also not significantly different between the two groups. Interestingly, the incidence of delirium was significantly lower in the LMA-P group than in the ETT group (3.0% vs. 10.3%, P = 0.029). Conclusions: During prolonged abdominal surgery such as LDKT, a second-generation laryngeal mask airway offers adequate ventilation and oxygenation and can be considered a suitable alternative to ETT.
Keywords: laryngeal mask airway, prolonged abdominal surgery, kidney transplantation, postoperative pulmonary complication
Procedia PDF Downloads 129
1834 A Multi-Templated Fe-Ni-Cu Ion Imprinted Polymer for the Selective and Simultaneous Removal of Toxic Metallic Ions from Wastewater
Authors: Morlu Stevens, Bareki Batlokwa
Abstract:
The use of treated wastewater is widely employed to compensate for the scarcity of safe and uncontaminated freshwater. However, the presence of toxic heavy metal ions in wastewater poses a health hazard to animals and the environment; hence the importance of an effective technique to tackle the challenge. A multi-templated ion imprinted sorbent (Fe,Ni,Cu-IIP) for the simultaneous removal of heavy metal ions from wastewater was synthesised employing molecular imprinting technology (MIT) via a thermal free-radical bulk polymerization technique. Methacrylic acid (MAA) was employed as the functional monomer, ethylene glycol dimethacrylate (EGDMA) as the cross-linking agent, azobisisobutyronitrile (AIBN) as the initiator, Fe, Ni and Cu ions as template ions, and 1,10-phenanthroline as the complexing agent. The template ions were exhaustively washed off the synthesized polymer by solvent extraction in several washing steps, while periodically increasing the solvent (HCl) concentration from 1.0 M to 10.0 M. The physical and chemical properties of the sorbents were investigated using Fourier Transform Infrared Spectroscopy (FT-IR), X-ray Diffraction (XRD) and Atomic Force Microscopy (AFM). Operational parameters such as contact time, pH and sorbent dosage were optimized to evaluate the effectiveness of the sorbents, with optima found to be 15 min, 7.5 and 666.7 mg/L, respectively. Selectivity of the ion-imprinted polymers and competitive sorption between the template and similar ions were studied; the sorbents showed good selectivity towards the targeted metal ions, removing 90% - 98% of the templated ions compared to 58% - 62% of similar ions. The sorbents were further applied for the selective removal of Fe, Ni and Cu from real wastewater samples, and recoveries of 92.14 ± 0.16% - 106.09 ± 0.17% and linearities of R² = 0.9993 - R² = 0.9997 were achieved.
Keywords: ion imprinting, ion imprinted polymers, heavy metals, wastewater
Procedia PDF Downloads 286
1833 Effects of Temperature and Cysteine Addition on Formation of Flavor from Maillard Reaction Using Xylose and Rapeseed Meal Peptide
Authors: Zuoyong Zhang, Min Yu, Jinlong Zhao, Shudong He
Abstract:
The Maillard reaction can produce flavor-enhancing substances through chemical crosslinking between the free amino groups of a protein or polypeptide and the carbonyl of a reducing sugar. In this research, solutions of rapeseed meal peptide and D-xylose with or without L-cysteine (RXC or RX) were heated over a range of temperatures (80-140 °C) for 2 h. It was observed that RXs showed severe browning, while RXCs showed a greater pH decrease with increasing temperature. The correlations among the quantitative sensory descriptive analysis, free amino acid (FAA) and GC–MS data of RXCs and RXs were then analyzed using the partial least squares regression method. Results suggested that the Maillard reaction product (MRP) with cysteine formed at 120 °C (RXC-120) had greater sensory properties, especially meat-like flavor, compared to the other MRPs. Meanwhile, the FAA data revealed that glutamic acid and glycine not only made a positive contribution to the meaty aroma but also showed a significant and positive influence on the umami taste of RXs. Moreover, the sulfur-containing compounds showed a significant positive correlation with the meat-like flavor of RXCs, while RXs depended on furans and nitrogen-containing compounds, with a more caramel-like flavor. Therefore, an MRP with a strong meaty flavor could be obtained at 120 °C by the addition of cysteine.
Keywords: rapeseed meal, Maillard reaction, sensory characteristics, FAA, GC–MS, partial least squares regression
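A minimal PLS1 (NIPALS) regression of the kind used to relate a sensory score to correlated chemical predictors (e.g. FAA levels or GC–MS peak areas) can be sketched as follows; the component count and the synthetic data in the usage note are illustrative, not the study's:

```python
import numpy as np

def pls1(X, y, n_comp=2):
    """Minimal PLS1 via NIPALS: returns regression coefficients B for
    centred data, suitable when predictors are many and collinear."""
    X = X - X.mean(0)
    y = y - y.mean()
    Xk, W, P, q = X.copy(), [], [], []
    for _ in range(n_comp):
        w = Xk.T @ y                       # weight: covariance direction
        w /= np.linalg.norm(w)
        t = Xk @ w                         # score
        tt = t @ t
        p = Xk.T @ t / tt                  # X loading
        qk = (y @ t) / tt                  # y loading
        Xk = Xk - np.outer(t, p)           # deflate X
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.solve(P.T @ W, q) # B = W (P'W)^-1 q

# usage: yhat = (X_new - X_train.mean(0)) @ B + y_train.mean()
```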
Procedia PDF Downloads 238
1832 Amazonian Native Biomass Residue for Sustainable Development of Isolated Communities
Authors: Bruna C. Brasileiro, José Alberto S. Sá, Brigida R. P. Rocha
Abstract:
The development of the Amazon region has been tied to large-scale projects associated with economic cycles. These economic cycles originated from policies implemented by successive governments that exploited the resources but have not yet been able to improve the local population's quality of life. The development strategies implanted were based on vertical planning centered on the State, which did not know, and showed no interest in knowing, the local needs and potentialities. The future of this region is a challenge that depends on a model of development based on human progress, associated with the intelligent, selective and environmentally safe exploitation of natural resources, grounded in renewable and non-polluting energy generation sources, a differential factor for attracting new investments in a context of global energy and environmental crisis. In this process, the planning and support of the Brazilian State, local government, and selective international partnerships are essential. Residual biomass utilization enables sustainable development through the integration of the production chain and the energy generation process, which could improve employment conditions and the income of riverside communities. Therefore, this research discusses how the use of local residual biomass (açaí lumps) could be an important instrument of sustainable development for isolated communities located at the Alcobaça Sustainable Development Reserve (SDR), Tucuruí, Pará State, since in this region the most accessible energy sources for those who can pay are fossil fuels, which reach about 54% of final energy consumption; the integration between the açaí productive chain and the use of renewable energy sources can also promote less environmental impact and decrease the use of fossil fuels and carbon dioxide emissions.
Keywords: Amazon, biomass, renewable energy, sustainability
Procedia PDF Downloads 282
1831 Green Materials for Hot Mixed Asphalt Production
Authors: Salisu Dahiru, Jibrin M. Kaura, Abubakar I. Jumare, Sulaiman M. Mahmood
Abstract:
Reclaimed asphalt, used automobile tires, and rice husk are regarded as waste. These materials could be used in the construction of new roads and in road rehabilitation. An investigation into the production of a Green Hot Mixed Asphalt (GHMA) pavement is presented, using Reclaimed Asphalt Pavement (RAP) as a partial replacement for coarse aggregate, Crumb Rubber (CR) from waste automobile tires as a modifier for the bitumen binder, and Rice Husk Ash (RHA) as a partial replacement for the ordinary Portland cement (OPC) filler, for road construction and rehabilitation. 30% reclaimed asphalt of the total aggregate, 15% crumb rubber of the total binder content, 5% rice husk ash of the total mix, and 5.2% crumb rubber modified bitumen (CRMB) content were recommended for optimum performance. Loss of Marshall stability was investigated on the mix with the recommended optimum CRMB. The mix showed good performance, with only about 13% loss of stability after 24 hours of immersion in a hot water bath, as against the roughly 24% Marshall stability loss reported in previous studies for conventional Hot Mixed Asphalt (HMA).
Keywords: rice husk, reclaimed asphalt, filler, crumb rubber, bitumen content, green hot mix asphalt
Procedia PDF Downloads 299
1830 Explicit Numerical Approximations for a Pricing Weather Derivatives Model
Authors: Clarinda V. Nhangumbe, Ercília Sousa
Abstract:
Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard or plain vanilla products, or structured or exotic products. The underlying asset, in this case, is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the structure of weather derivatives shows the weakness of the Black-Scholes framework. Therefore, under the risk-neutral probability measure, the option price of a weather contract can be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulations and implicit finite difference schemes conjugated with semi-Lagrangian methods. In this paper, two explicit methods are proposed, namely, first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is the fact that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.
Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives
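A simplified explicit step for a model problem of the stated type (hyperbolic in x, parabolic in y) might look as follows. This sketch uses first-order upwind for both convection terms and central differences for the diffusion term, with forward Euler in time; it is a stand-in for illustration, not the paper's exact upwind/Lax-Wendroff combination, and the model equation and coefficients are assumptions:

```python
import numpy as np

def explicit_step(u, a, b, d, dx, dy, dt):
    """One explicit time step for the model equation
        u_t + a u_x + b u_y = d u_yy
    on a grid where x runs along axis 0 and y along axis 1.
    Assumes a, b, d >= 0; boundary values are left fixed (Dirichlet-like).
    Stability needs roughly a*dt/dx + b*dt/dy <= 1 and d*dt/dy**2 <= 1/2."""
    un = u.copy()
    # first-order upwind: backward differences in the flow direction
    ux = (un[1:-1, 1:-1] - un[:-2, 1:-1]) / dx
    uy = (un[1:-1, 1:-1] - un[1:-1, :-2]) / dy
    # central second difference for the parabolic (diffusion) term
    uyy = (un[1:-1, 2:] - 2 * un[1:-1, 1:-1] + un[1:-1, :-2]) / dy ** 2
    u[1:-1, 1:-1] = un[1:-1, 1:-1] + dt * (-a * ux - b * uy + d * uyy)
    return u
```

With pure diffusion (a = b = 0) the scheme conserves interior mass and smooths a point source, which is a quick sanity check before adding the convection terms.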
Procedia PDF Downloads 67
1829 Impact of Length of Straw by the Use of a Straw Mill on the Selective Feeding of Young Cattle and Their Effects for the Cattle
Authors: Heiko Scholz
Abstract:
When feeding high quality silage to heifers from the age of two, there is a risk of energy oversupply. Depending on the feeding value or scarce availability of silage or corn silage, diets with high proportions of straw are often used. For an energetically standardized young cattle diet, the straw proportion can be more than 20% of dry matter. It was investigated whether grinding the straw with a straw mill significantly limits selective feeding. The investigation was carried out with young cattle in their second year. 78 animals were kept and fed under similar conditions in two groups. The experimental group (EG) consisted of cattle aged 12 to 15 months, and in the control group (CG) the cattle were 15 to 20 months old. The experimental feeding took place over five days, and feed distribution and residual feed were weighed. The ration of the EG contained straw ground with the straw mill, and the CG was further fed rotor-cut pressed straw. To assess selective feeding, samples of the feed distribution and the remaining feed were analyzed with the particle separator box, and the crude protein and energy contents were determined. Grinding the straw increased the daily feed intake. In the EG, an increase in feed intake was observed after the grinding of the straw. Feed intake on the day of the change from long to ground straw increased by more than 2.0 kg of DM per animal. In the following days, the feed intake increased by 0.9 kg DM per animal and day on average (7.4 vs. 8.3 kg DM per day). The results of the sieve distribution of the residual feed point to differentiated feeding behavior between the groups. In the EG, the particle length of the residual feed largely matched that of the offered ration. The acid-base balance (NSBA) values of the EG are within normal limits. If straw shares of 25% and more are fed to young cattle (heifers), the particle length of the straw has a significant impact on the selective feeding behavior. A particle length of 1.5 cm, compared to 7.5 cm long straw, largely prevented discarding of the straw at the feeding barn. The feed intake increases when short straw is mixed into the TMR.
Keywords: straw mill, heifer, feed selection, dry matter intake
Procedia PDF Downloads 167
1828 Applying Element Free Galerkin Method on Beam and Plate
Authors: Mahdad M’hamed, Belaidi Idir
Abstract:
This paper develops a meshless approach, called the Element Free Galerkin (EFG) method, which is based on the weak form of the partial differential governing equations and employs Moving Least Squares (MLS) interpolation to construct the meshless shape functions. The variational weak form is used in the EFG, where the trial and test functions are approximated by the MLS approximation. Since the shape functions constructed by this discretization have the weight function property based on the randomly distributed points, the essential boundary conditions can be implemented easily. The local weak form of the partial differential governing equations is obtained by the weighted residual method within a simple local quadrature domain. A spline function with high continuity is used as the weight function. The presently developed EFG method is a truly meshless method, as it does not require a mesh, either for the construction of the shape functions or for the integration of the local weak form. Several numerical examples of two-dimensional static structural analysis are presented to illustrate the performance of the present EFG method. They show that the EFG method is highly efficient to implement and highly accurate in computation. The present method is used to analyze the static deflection of beams and of a plate with a hole.
Keywords: numerical computation, element-free Galerkin (EFG), moving least squares (MLS), meshless methods
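A one-dimensional MLS approximation of the kind EFG builds its shape functions from can be sketched as follows; the linear basis, quartic spline weight, and support radius are illustrative choices, not the paper's:

```python
import numpy as np

def mls_approx(x_nodes, u_nodes, x_eval, radius=1.5):
    """1D Moving Least Squares sketch: at each evaluation point, fit a
    local weighted linear model p = [1, x] using only nodes inside the
    compact support, then evaluate it there."""
    def weight(r):
        # quartic spline weight: 1 at r=0, smoothly 0 at r>=1
        r = np.abs(r)
        return np.where(r < 1, 1 - 6 * r**2 + 8 * r**3 - 3 * r**4, 0.0)

    x_eval = np.atleast_1d(np.asarray(x_eval, float))
    out = np.empty_like(x_eval)
    for i, x in enumerate(x_eval):
        w = weight((x - x_nodes) / radius)
        P = np.column_stack([np.ones_like(x_nodes), x_nodes])
        A = P.T @ (w[:, None] * P)       # moment matrix (needs enough nodes in support)
        b = P.T @ (w * u_nodes)
        a = np.linalg.solve(A, b)        # local linear fit coefficients
        out[i] = a[0] + a[1] * x
    return out
```

A defining property of MLS with a linear basis is exact reproduction of linear fields, which is a convenient correctness check.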
Procedia PDF Downloads 2621827 An Alternative Credit Scoring System in China’s Consumer Lendingmarket: A System Based on Digital Footprint Data
Authors: Minjuan Sun
Abstract:
Ever since the late 1990s, China has experienced explosive growth in consumer lending, especially in short-term consumer loans, among which, the growth rate of non-bank lending has surpassed bank lending due to the development in financial technology. On the other hand, China does not have a universal credit scoring and registration system that can guide lenders during the processes of credit evaluation and risk control, for example, an individual’s bank credit records are not available for online lenders to see and vice versa. Given this context, the purpose of this paper is three-fold. First, we explore if and how alternative digital footprint data can be utilized to assess borrower’s creditworthiness. Then, we perform a comparative analysis of machine learning methods for the canonical problem of credit default prediction. Finally, we analyze, from an institutional point of view, the necessity of establishing a viable and nationally universal credit registration and scoring system utilizing online digital footprints, so that more people in China can have better access to the consumption loan market. Two different types of digital footprint data are utilized to match with bank’s loan default records. Each separately captures distinct dimensions of a person’s characteristics, such as his shopping patterns and certain aspects of his personality or inferred demographics revealed by social media features like profile image and nickname. We find both datasets can generate either acceptable or excellent prediction results, and different types of data tend to complement each other to get better performances. 
Typically, the traditional types of data banks normally use, such as income, occupation, and credit history, update over longer cycles, so they cannot reflect more immediate changes, like a change in financial status caused by a business crisis; whereas digital footprints can update daily, weekly, or monthly, and are thus capable of providing a more comprehensive profile of the borrower's credit capabilities and risks. From this empirical and quantitative examination, we believe digital footprints can become an alternative information source for creditworthiness assessment because of their near-universal data coverage, and because they can by and large resolve the "thin-file" issue, since digital footprints come in much larger volume and higher frequency.Keywords: credit score, digital footprint, Fintech, machine learning
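The canonical default-prediction setup described above can be sketched with entirely synthetic data; the two "footprint" features and their coefficients below are invented for illustration, and the model is a plain logistic regression rather than any specific method from the paper. AUC is computed as a rank statistic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# Two hypothetical digital-footprint features (invented for illustration),
# e.g. shopping regularity and share of late-night activity
X = rng.normal(size=(n, 2))
true_logits = 1.2 * X[:, 0] - 0.8 * X[:, 1] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-true_logits))).astype(float)

# Plain logistic regression fitted by gradient descent
Xb = np.column_stack([np.ones(n), X])
w = np.zeros(3)
for _ in range(500):
    p = 1 / (1 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / n

# AUC as a rank statistic: probability a defaulter outscores a non-defaulter
scores = Xb @ w
pos, neg = scores[y == 1], scores[y == 0]
auc = float(np.mean(pos[:, None] > neg[None, :]))
print(round(auc, 2))
```

A comparative study like the one in the paper would repeat this loop over several model families and report the resulting AUCs side by side.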
Procedia PDF Downloads 1321826 Case of A Huge Retroperitoneal Abscess Spanning from the Diaphragm to the Pelvic Brim
Authors: Christopher Leung, Tony Kim, Rebecca Lendzion, Scott Mackenzie
Abstract:
Retroperitoneal abscesses are a rare but serious condition with often delayed diagnosis, non-specific symptoms, multiple causes, and high morbidity and mortality. With the advent of more readily available cross-sectional imaging, retroperitoneal abscesses are treated earlier and better outcomes are achieved. Occasionally, a retroperitoneal abscess presents as a huge collection, as evident in this 53-year-old male. With a background of chronic renal disease and left partial nephrectomy, this gentleman presented with a one-month history of left flank pain without any other symptoms, including fevers or abdominal pain. CT of the abdomen and pelvis demonstrated a huge retroperitoneal abscess spanning from the diaphragm, abutting the spleen, down to the iliopsoas muscle and abutting the iliac vessels at the pelvic brim. This large retroperitoneal abscess required open drainage as well as drainage by interventional radiology. A long course of intravenous antibiotics and multiple drainages were required to drain the abscess. His blood culture and fluid culture grew Proteus species, suggesting a urinary source, likely from his non-functioning kidney, which had undergone partial nephrectomy. Such a huge retroperitoneal abscess has rarely been described in the literature. The learning point here is that the basic principle of source control and antibiotics is paramount in treating retroperitoneal abscesses regardless of the size of the abscess.Keywords: retroperitoneal abscess, retroperitoneal mass, sepsis, genitourinary infection
Procedia PDF Downloads 1961825 Status and Results from EXO-200
Authors: Ryan Maclellan
Abstract:
EXO-200 has provided one of the most sensitive searches for neutrinoless double-beta decay utilizing 175 kg of enriched liquid xenon in an ultra-low background time projection chamber. This detector has demonstrated excellent energy resolution and background rejection capabilities. Using the first two years of data, EXO-200 has set a limit of 1.1x10^25 years at 90% C.L. on the neutrinoless double-beta decay half-life of Xe-136. The experiment has experienced a brief hiatus in data taking during a temporary shutdown of its host facility: the Waste Isolation Pilot Plant. EXO-200 expects to resume data taking in earnest this fall with upgraded detector electronics. Results from the analysis of EXO-200 data and an update on the current status of EXO-200 will be presented.Keywords: double-beta, Majorana, neutrino, neutrinoless
Procedia PDF Downloads 3871824 The Effects of Vocational Training on Offender Rehabilitation in Nigerian Correctional Institutions
Authors: Hadi Mohammed
Abstract:
The introduction of vocational education and training (VET) in correctional institutions as part of prisoner rehabilitation programs is meant to help offenders develop marketable job skills and reduce re-offending, thereby increasing the likelihood of successful reintegration back into their community. Offenders who participate in vocational education and training are significantly less likely to return to prison after release and are more likely to find employment after release than offenders who did not receive such training. Those who participated in vocational training were 28% more likely to be employed after release from prison than those who did not receive such training. This paper examined the effects of vocational training on offender rehabilitation as well as the effects of vocational training on the relationship between reformation and reintegration in Nigerian correctional institutions. To address this, two research questions were formulated to guide the research. A survey research design was employed. The participants were 200 offenders in Nigerian correctional institutions. Questionnaire items were administered. Mean, standard deviation, and partial correlation were used for the data analysis. The findings revealed that vocational training has helped in offender rehabilitation in Nigerian correctional institutions. Similarly, there was a moderate significant positive partial correlation between reformation and reintegration, controlling for vocational training, r=0.461, n=221, p<0.005, with a moderate level of reformation being associated with a moderate level of reintegration. Based on the findings of the study, it was recommended that Nigerian correctional institutions should strengthen their vocational training programs so that offenders are properly rehabilitated.Keywords: correctional institutions, vocational education and training, offender rehabilitation
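The partial correlation reported above (r = 0.461 between reformation and reintegration, controlling for vocational training) is the correlation of residuals after regressing both variables on the control. The sketch below uses synthetic data with invented effect sizes, purely to show the computation, not to reproduce the study's result.

```python
import numpy as np

def partial_corr(x, y, z):
    # Correlate the residuals of x and y after regressing each on the
    # control variable z (with an intercept)
    Z = np.column_stack([np.ones_like(z), z])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return float(np.corrcoef(rx, ry)[0, 1])

rng = np.random.default_rng(1)
n = 221
training = rng.normal(size=n)                 # vocational training (control)
reform = 0.6 * training + rng.normal(size=n)  # reformation score
reinteg = 0.5 * training + 0.5 * reform + rng.normal(size=n)  # reintegration
print(round(partial_corr(reform, reinteg, training), 3))  # moderate positive
```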
Procedia PDF Downloads 1341823 Partial Least Square Regression for High-Dimensional and High-Correlated Data
Authors: Mohammed Abdullah Alshahrani
Abstract:
The research focuses on investigating the use of partial least squares (PLS) methodology for addressing challenges associated with high-dimensional correlated data. Recent technological advancements have led to experiments producing data characterized by a large number of variables compared to observations, with substantial inter-variable correlations. Such data patterns are common in chemometrics, where near-infrared (NIR) spectrometer calibrations record chemical absorbance levels across hundreds of wavelengths, and in genomics, where thousands of genomic regions' copy number alterations (CNA) are recorded from cancer patients. PLS serves as a widely used method for analyzing high-dimensional data, functioning as a regression tool in chemometrics and a classification method in genomics. It handles data complexity by creating latent variables (components) from original variables. However, applying PLS can present challenges. The study investigates key areas to address these challenges, including unifying interpretations across three main PLS algorithms and exploring unusual negative shrinkage factors encountered during model fitting. The research presents an alternative approach to addressing the interpretation challenge of predictor weights associated with PLS. Sparse estimation of predictor weights is employed using a penalty function combining a lasso penalty for sparsity and a Cauchy distribution-based penalty to account for variable dependencies. The results demonstrate sparse and grouped weight estimates, aiding interpretation and prediction tasks in genomic data analysis. High-dimensional data scenarios, where predictors outnumber observations, are common in regression analysis applications. Ordinary least squares regression (OLS), the standard method, performs inadequately with high-dimensional and highly correlated data. 
Copy number alterations (CNA) in key genes have been linked to disease phenotypes, highlighting the importance of accurate classification of gene expression data in bioinformatics and biology using regularized methods like PLS for regression and classification.Keywords: partial least square regression, genetics data, negative filter factors, high dimensional data, high correlated data
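The component-building idea behind PLS can be illustrated with a bare-bones NIPALS PLS1 in numpy. This is the generic textbook algorithm, not the authors' sparse-penalized variant, and the synthetic p ≫ n data below is invented for illustration.

```python
import numpy as np

def pls1(X, y, n_components):
    # NIPALS PLS1: build latent components maximizing covariance with y,
    # deflating X and y after each component
    X, y = X.copy(), y.copy()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)
        t = X @ w                     # scores
        p = X.T @ t / (t @ t)         # X loadings
        qk = y @ t / (t @ t)          # y loading
        X -= np.outer(t, p)           # deflation
        y -= qk * t
        W.append(w); P.append(p); q.append(qk)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    # Regression coefficients mapped back to the original predictor space
    return W @ np.linalg.solve(P.T @ W, q)

rng = np.random.default_rng(2)
n, p = 50, 200                        # far more predictors than observations
latent = rng.normal(size=(n, 3))      # three true latent factors
X = latent @ rng.normal(size=(3, p)) + 0.01 * rng.normal(size=(n, p))
y = latent @ np.array([1.0, -2.0, 0.5])
beta = pls1(X - X.mean(0), y - y.mean(), 3)
pred = (X - X.mean(0)) @ beta + y.mean()
print(round(float(np.corrcoef(pred, y)[0, 1]), 3))  # near 1
```

OLS would be underdetermined in this n = 50, p = 200 setting, while three PLS components suffice because the predictors share a low-dimensional latent structure, which is exactly the regime the abstract describes for NIR and CNA data.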
Procedia PDF Downloads 131822 Severity Index Level in Effectively Managing Medium Voltage Underground Power Cable
Authors: Mohd Azraei Pangah Pa'at, Mohd Ruzlin Mohd Mokhtar, Norhidayu Rameli, Tashia Marie Anthony, Huzainie Shafi Abd Halim
Abstract:
Partial Discharge (PD) diagnostic mapping testing is one of the main diagnostic testing techniques widely used in the field or in onsite testing of underground power cables at the medium voltage level. The existence of PD activity is an early indication of insulation weakness; hence early detection of PD activity provides an initial prediction of the condition of the cable. To effectively manage the results of a PD mapping test, it is important to have acceptable criteria to facilitate the prioritization of mitigation actions. Tenaga Nasional Berhad (TNB), through its Distribution Network (DN) division, developed a PD severity model named the Severity Index (SI) for offline PD mapping tests in 2007, based on onsite test experience. However, the severity index's recommended actions have never been revised since its establishment. At present, PD measurement data have increased extensively, so the severity level indication and the effectiveness of the recommended actions can be analyzed and verified again. Based on the new revision, the recommended action to be taken will be able to reflect the actual defect condition, and hence will accurately prioritize preventive action plans and minimize maintenance expenditure.Keywords: partial discharge, severity index, diagnostic testing, medium voltage, power cable
Procedia PDF Downloads 1461821 Research on Malware Application Patterns of Using Permission Monitoring System
Authors: Seung-Hwan Ju, Yo-Han Choi, Hee-Suk Seo, Tae-Kyung Kim
Abstract:
This study investigates the permissions requested by Android applications and the possibility of identifying suspicious applications based only on information presented to the user before an application is downloaded. The pattern analysis is based on a smaller data set consisting of confirmed malicious applications. The method is evaluated based on its ability to recognize malicious potential in the analyzed applications. In this study, we develop a system that monitors mobile application permissions at application update. This study is a service-based malware analysis grounded in mobile security research.Keywords: malware patterns, application permission, application analysis, security
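A minimal sketch of the update-monitoring idea: flag an update that silently adds permissions from a sensitive set. The rule and the risky-permission list below are hypothetical illustrations, not the authors' trained patterns.

```python
# Hypothetical rule (not the authors' trained patterns): flag an update
# that requests new permissions from a risk-sensitive set that the
# installed version never held.
RISKY = {"READ_SMS", "SEND_SMS", "READ_CONTACTS", "RECORD_AUDIO"}

def flag_update(installed_perms, updated_perms):
    # Permissions added by the update, restricted to the monitored set
    added = set(updated_perms) - set(installed_perms)
    return sorted(added & RISKY)

print(flag_update({"INTERNET"}, {"INTERNET", "SEND_SMS", "CAMERA"}))
# ['SEND_SMS'] -- CAMERA is also new, but not in the monitored set
```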
Procedia PDF Downloads 4881820 Nanowire Sensor Based on Novel Impedance Spectroscopy Approach
Authors: Valeriy M. Kondratev, Ekaterina A. Vyacheslavova, Talgat Shugabaev, Alexander S. Gudovskikh, Alexey D. Bolshakov
Abstract:
Modern sensorics imposes strict requirements on biosensor characteristics, especially technological feasibility and selectivity. There is growing interest in the analysis of biological markers of human health, which indirectly testify to pathological processes in the body. Such markers include acids and alkalis produced by the human body, in particular ammonia and hydrochloric acid, which are found in human sweat, blood, and urine, as well as in gastric juice. Biosensors based on modern nanomaterials, especially low-dimensional ones, can be used for the detection of these markers. Most classical adsorption sensors based on metal and silicon oxides are considered non-selective because they identically change their electrical resistance (or impedance) under the adsorption of different target analytes. This work demonstrates a feasible frequency-resistive method of electrical impedance spectroscopy data analysis. The approach makes it possible to obtain selectivity in adsorption sensors of a resistive type. The potential of the method is demonstrated by analyzing the impedance spectra of silicon nanowires in the presence of NH3 and HCl vapors with concentrations of about 125 mmol/L (2 ppm) and water vapor. We demonstrate the possibility of unambiguously distinguishing the sensory signal of NH3 adsorption from that of HCl. Moreover, the method is found applicable to analysis of the composition of a mixture of ammonia and hydrochloric acid vapors without cross-sensitivity to water. The presented silicon sensor can be used to detect diseases of the gastrointestinal tract through the qualitative and quantitative detection of ammonia and hydrochloric acid content in biological samples. The method of data analysis can be directly translated to other nanomaterials to analyze their applicability in the field of biosensing.Keywords: electrical impedance spectroscopy, spectroscopy data analysis, selective adsorption sensor, nanotechnology
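The frequency-resistive idea can be illustrated with a parallel-RC equivalent circuit, a common fit model in impedance spectroscopy. The circuit model and the fit parameters below are assumptions for illustration, not measured values for the nanowire sensor: two analytes producing the same DC resistance can still be separated by the shape of the spectrum.

```python
import numpy as np

def parallel_rc_impedance(R, C, f):
    # |Z| of a parallel RC equivalent circuit at frequency f
    w = 2 * np.pi * f
    return np.abs(R / (1 + 1j * w * R * C))

f = np.logspace(1, 6, 6)  # 10 Hz .. 1 MHz
# Illustrative fit parameters after exposure to two analytes: same
# resistance, different capacitance (invented numbers, not measured data)
z_a = parallel_rc_impedance(R=2.0e5, C=1e-9, f=f)
z_b = parallel_rc_impedance(R=2.0e5, C=2e-8, f=f)

# A DC-resistance readout sees almost no difference ...
print(round(float(z_a[0] / z_b[0]), 2))  # close to 1 at low frequency
# ... but the spectra separate strongly at high frequency
print(bool(z_a[-1] > 10 * z_b[-1]))
```

This is the sense in which analyzing the full spectrum, rather than a single resistance value, can restore selectivity in an otherwise non-selective resistive sensor.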
Procedia PDF Downloads 821819 Glyco-Conjugated Gold Nanorods Based Biosensor for Optical Detection and Photothermal Ablation of Food Borne Bacteria
Authors: Shimayali Kaushal, Nitesh Priyadarshi, Nitin Kumar Singhal
Abstract:
Foodborne bacterial species have been identified as major pathogens in most severe pathogen-related diseases among humans, resulting in great losses to human health and the food industry. Conventional methods like plating and enzyme-linked immunosorbent assay (ELISA) are time-consuming, laborious, and require specialized instruments. Nanotechnology has emerged as a great field for the rapid detection of pathogens in recent years. The AuNR material has good electro-optical properties due to its large light absorption band and scattering in the surface plasmon resonance wavelength regions. By exploiting the sugar-based adhesion properties of microorganisms, we can use glycoconjugate-capped gold nanorods as a potential nanobiosensor to detect foodborne pathogens. In the present study, polyethylene glycol (PEG) coated gold nanorods (AuNRs) were prepared and functionalized with different types of carbohydrates and further characterized by UV-Visible spectrophotometry, dynamic light scattering (DLS), and transmission electron microscopy (TEM). The reactivity of the above nanobiosensor was probed by a lectin binding assay and also with different strains of foodborne bacteria using spectrophotometric and microscopic techniques. Due to the specific interaction of the probe with foodborne bacteria (Escherichia coli, Pseudomonas aeruginosa), our nanoprobe has shown significant and selective ablation of targeted bacteria. Our findings suggest that our nanoprobe can be an ideal candidate for the selective optical detection of food pathogens and can reduce losses to the food industry.Keywords: glyco-conjugates, gold nanorods, nanobiosensor, nanoprobe
Procedia PDF Downloads 1141818 A Clustering Algorithm for Massive Texts
Authors: Ming Liu, Chong Wu, Bingquan Liu, Lei Chen
Abstract:
Internet users have to face a massive amount of textual data every day. Organizing texts into categories can help users dig useful information out of large-scale text collections. Clustering, in fact, is one of the most promising tools for categorizing texts due to its unsupervised characteristic. Unfortunately, most traditional clustering algorithms lose their high quality on large-scale text collections. This situation is mainly attributable to the high-dimensional vectors generated from texts. To effectively and efficiently cluster large-scale text collections, this paper proposes a vector-reconstruction-based clustering algorithm. Only the features that can represent the cluster are preserved in the cluster's representative vector. The algorithm alternately repeats two sub-processes until it converges. One is the partial tuning sub-process, where feature weights are fine-tuned iteratively. To accelerate clustering, an intersection-based similarity measurement and its corresponding neuron adjustment function are proposed and implemented in this sub-process. The other is the overall tuning sub-process, where the features are reallocated among different clusters. In this sub-process, the features useless for representing a cluster are removed from the cluster's representative vector. Experimental results on three text collections (including two small-scale and one large-scale text collection) demonstrate that our algorithm obtains high quality on both small-scale and large-scale text collections.Keywords: vector reconstruction, large-scale text clustering, partial tuning sub-process, overall tuning sub-process
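The abstract does not spell out its intersection-based measure, so the snippet below is only a generic sketch of the idea: score a document against a cluster's representative vector using shared features alone, which keeps the cost proportional to the size of the intersection rather than the full vocabulary. The scoring formula is an assumption for illustration.

```python
def intersection_similarity(doc_vec, cluster_rep):
    # Score using only features present in BOTH sparse vectors; for sparse
    # text vectors this touches far fewer terms than a full cosine
    shared = doc_vec.keys() & cluster_rep.keys()
    num = sum(min(doc_vec[t], cluster_rep[t]) for t in shared)
    den = sum(doc_vec.values())
    return num / den if den else 0.0

doc = {"nlp": 2, "cluster": 1, "text": 3}
rep = {"cluster": 4, "text": 1, "vector": 2}
print(round(intersection_similarity(doc, rep), 3))  # (min(1,4) + min(3,1)) / 6
```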
Procedia PDF Downloads 4081817 Humans Trust Building in Robots with the Help of Explanations
Authors: Misbah Javaid, Vladimir Estivill-Castro, Rene Hexel
Abstract:
The field of robotics is advancing rapidly to the point where robots have become an integral part of modern society. These robots collaborate and contribute productively with humans, compensating for some shortcomings in human abilities and complementing them with their skills. Effective teamwork between humans and robots demands investigation of the critical issue of trust. The field of human-computer interaction (HCI) has already examined the trust humans place in technical systems, mostly on issues like reliability and accuracy of performance. Early work in the area of expert systems suggested that automatic generation of explanations improved the trust and acceptability of these systems. In this work, we augmented a robot with user-invoked explanation generation. To measure the effect of explanations on humans' level of trust, we collected subjective survey measures and behavioral data in a human-robot team task in an interactive, adversarial, and partial-information environment. The results showed that with the explanation capability, humans not only understood and recognized the robot as an expert team partner, but human learning and human-robot team performance also significantly improved because of the meaningful interaction with the robot in the human-robot team. Moreover, by observing distinctive outcomes, we expect our research outcomes will also provide insights into the further improvement of trustworthy human-robot relationships.Keywords: explanation interface, adversaries, partial observability, trust building
Procedia PDF Downloads 1781816 Sequential Data Assimilation with High-Frequency (HF) Radar Surface Current
Authors: Lei Ren, Michael Hartnett, Stephen Nash
Abstract:
Abundant surface current measurements from an HF radar system in a coastal area are assimilated into a model to improve its forecasting ability. A simple sequential data assimilation scheme, Direct Insertion (DI), is applied to update the model forecast states. The influence of Direct Insertion data assimilation over time is analyzed at one reference point. Vector maps of surface currents from the models are compared with HF radar measurements. The Root-Mean-Squared Error (RMSE) between modeling results and HF radar measurements is calculated for the last four days with no data assimilation.Keywords: data assimilation, CODAR, HF radar, surface current, direct insertion
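Direct Insertion is the simplest assimilation update: wherever an observation exists, overwrite the forecast with it. A toy 1-D sketch (the grid, the constant model bias, and the radar coverage are invented for illustration):

```python
import numpy as np

def direct_insertion(model_state, obs, obs_idx):
    # Direct Insertion: overwrite forecast values at observed grid points
    updated = model_state.copy()
    updated[obs_idx] = obs
    return updated

def rmse(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

truth = np.sin(np.linspace(0, np.pi, 10))   # "true" surface current (toy)
forecast = truth + 0.3                      # biased model forecast
radar_idx = np.array([1, 3, 5, 7])          # grid points covered by HF radar
analysis = direct_insertion(forecast, truth[radar_idx], radar_idx)

print(round(rmse(forecast, truth), 2))      # 0.3 before assimilation
print(round(rmse(analysis, truth), 2))      # smaller after insertion
```

In a real shelf-sea model the inserted currents would also need spatial blending at the edge of radar coverage, which is one reason the influence of DI is tracked over time at a reference point.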
Procedia PDF Downloads 5421815 An Experiment of Three-Dimensional Point Clouds Using GoPro
Authors: Jong-Hwa Kim, Mu-Wook Pyeon, Yang-dam Eo, Ill-Woong Jang
Abstract:
The construction of geo-spatial information has recently been developing toward multi-dimensional geo-spatial information. The community constructing spatial information is also expanding from a small number of experts to the general public. In addition, studies are in progress using a variety of devices, with the aim of near real-time updates. In this paper, stereo images were obtained using a GoPro device, which is widely available to the general public as well as to experts. After correcting the distortion of the images, point clouds were acquired by using SIFT and DLT. Based on this experiment, the paper presents the possibility of creating a near real-time digital map using a video device that is readily available in everyday life.Keywords: GoPro, SIFT, DLT, point clouds
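Once SIFT has matched points between the two (distortion-corrected) views, the DLT step amounts to linear two-view triangulation. The sketch below shows that step in numpy; the camera intrinsics, baseline, and test point are invented numbers, not the paper's GoPro calibration.

```python
import numpy as np

def dlt_triangulate(P1, P2, x1, x2):
    # Linear (DLT) triangulation: stack the constraints x ~ P X into
    # A X = 0 and take the null vector of A via SVD
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    X = np.linalg.svd(A)[2][-1]
    return X[:3] / X[3]            # dehomogenize

K = np.array([[800.0, 0, 320], [0, 800, 240], [0, 0, 1]])        # toy intrinsics
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                # left view
P2 = K @ np.hstack([np.eye(3), np.array([[-0.1], [0], [0]])])    # right view, 10 cm baseline
Xw = np.array([0.2, -0.1, 2.0])                                  # ground-truth 3-D point
x1 = P1 @ np.append(Xw, 1); x1 = x1[:2] / x1[2]                  # projected pixels
x2 = P2 @ np.append(Xw, 1); x2 = x2[:2] / x2[2]
print(np.round(dlt_triangulate(P1, P2, x1, x2), 3))              # recovers Xw
```

Repeating this over every SIFT match yields the point cloud; with noisy matches the SVD solution is the least-squares choice rather than an exact one.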
Procedia PDF Downloads 4411814 Opacity Synthesis with Orwellian Observers
Authors: Moez Yeddes
Abstract:
The property of opacity is widely used in the formal verification of security in computer systems and protocols. Opacity is a general language-theoretic scheme for many security properties of systems, providing a framework in which several security properties of a system can be expressed. A secret behaviour of a system is opaque if a passive attacker can never deduce its occurrence from the system observation. Instead of considering the case of static observability, where the set of observable events is fixed off-line, or dynamic observability, where the set of observable events changes over time depending on the history of the trace, we introduce Orwellian partial observability, where unobservable events are not revealed provided that no downgrading event ever occurs in the future of the trace. Orwellian partial observability is needed to model intransitive information flow. This Orwellian observability is known as the ipurge function. We showed in previous work that verifying whether a regular secret is opaque for a regular language L w.r.t. an Orwellian projection is PSPACE-complete, while it has been proved undecidable even for a regular language L w.r.t. a general Orwellian observation function. In this paper, we address two problems of opacification of a regular secret ϕ for a regular language L w.r.t. an Orwellian projection: given L and a secret ϕ ∈ L, the first problem consists of computing some minimal regular super-language M of L, if it exists, such that ϕ is opaque for M; the second consists of computing the supremal sub-language M′ of L such that ϕ is opaque for M′. We derive both language-theoretic characterizations and algorithms to solve these two dual problems.Keywords: security policies, opacity, formal verification, Orwellian observation
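For intuition, opacity under a plain static projection, which is the baseline case the paper generalizes (the Orwellian setting additionally makes observability conditional on future downgrading events), can be checked directly on finite trace sets:

```python
def observe(trace, observable):
    # Static projection: erase unobservable events
    return tuple(e for e in trace if e in observable)

def is_opaque(language, secret, observable):
    # Opaque iff every secret trace shares its observation with at least
    # one non-secret trace, so the attacker can never be certain
    cover = {observe(t, observable) for t in language - secret}
    return all(observe(t, observable) in cover for t in secret)

L = {("a", "h", "b"), ("a", "b"), ("h", "a")}
secret = {("a", "h", "b")}
print(is_opaque(L, secret, observable={"a", "b"}))       # True: ("a","b") masks it
print(is_opaque(L, secret, observable={"a", "h", "b"}))  # False: h is visible
```

The opacification problems in the paper work in the opposite direction: instead of checking a fixed L, they enlarge L minimally (or restrict it maximally) until the secret becomes opaque.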
Procedia PDF Downloads 201