Search results for: conventional techniques
8492 Using Data Mining Techniques to Evaluate the Different Factors Affecting the Academic Performance of Students at the Faculty of Information Technology in Hashemite University in Jordan
Authors: Feras Hanandeh, Majdi Shannag
Abstract:
This research studies the different factors that could affect the accumulative average of students at the Faculty of Information Technology at Hashemite University. The paper examines the students' personal information, background and academic records, and how this information affects their ability to obtain high grades. The student information used in the study is extracted from the students' academic records. Data mining tools and techniques are used to decide which attribute(s) affect the students' accumulative average. The results show that the most important factor affecting the students' accumulative average is the student acceptance type. A decision tree model and rules were built to determine how students can obtain high grades in their courses. The overall accuracy of the model is 44%, which is an acceptable rate.
Keywords: data mining, classification, extracting rules, decision tree
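As a rough illustration of the workflow this abstract describes (training a decision tree on student records and extracting readable rules), the sketch below uses scikit-learn on a tiny invented data set; the attribute names, encoding and records are assumptions, not the study's actual data.

```python
# Minimal sketch (not the authors' code): fit a decision tree on hypothetical
# student records and print the resulting if-then rules.
from sklearn.tree import DecisionTreeClassifier, export_text
import pandas as pd

# Hypothetical records: acceptance type and high school average vs. accumulative-average class
records = pd.DataFrame({
    "acceptance_type": [0, 1, 2, 0, 1, 2, 0, 1],   # e.g. regular / parallel / international (assumed coding)
    "high_school_avg": [92, 78, 85, 95, 70, 88, 81, 90],
    "avg_class":       ["high", "low", "high", "high", "low", "high", "low", "high"],
})

X = records[["acceptance_type", "high_school_avg"]]
y = records["avg_class"]

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Readable rules, analogous in spirit to the rules reported in the abstract
print(export_text(tree, feature_names=list(X.columns)))
```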
Procedia PDF Downloads 416
8491 Production and Application of Organic Waste Compost for Urban Agriculture in Emerging Cities
Authors: Alemayehu Agizew Woldeamanuel, Mekonnen Maschal Tarekegn, Raj Mohan Balakrishina
Abstract:
Composting is one of the conventional techniques adopted for organic waste management, but the practice is very limited in emerging cities even though most of the waste generated is organic. This paper aims to examine the viability of composting for organic waste management in the emerging city of Addis Ababa, Ethiopia, by addressing the composting practice, the quality of the compost, and the application of compost in urban agriculture. The study collects data through compost laboratory testing and a survey of urban farm households, and uses descriptive analysis of the state of compost production and application, physicochemical analysis of the compost samples, and regression analysis of the urban farmers' willingness to pay for compost. The findings of the study indicate that composting is practised at a small scale, most producers use unsorted feedstock materials, aerobic composting is dominantly used, and the maturation period ranges from four to ten weeks. The carbon content of the compost ranges from 30.8 to 277.1 depending on the type of feedstock applied, and this surpasses the ideal proportions for the C:N ratio. The total nitrogen, pH, organic matter, and moisture content are relatively optimal. The levels of heavy metals measured for Mn, Cu, Pb, Cd and Cr⁶⁺ in the compost samples are also insignificant. In the urban agriculture sector, chemical fertilizer is the dominant type of soil input in crop production, but vegetable producers use a combination of fertilizer and other organic inputs, including compost. The willingness to pay for compost depends on income, household size, gender, type of soil inputs, monitoring of soil fertility, the main product of the farm, farming method and farm ownership. Finally, this study recommends collaboration among stakeholders along the waste value chain, awareness creation on the benefits of composting, and addressing the challenges faced by both compost producers and users.
Keywords: composting, emerging city, organic waste management, urban agriculture
Procedia PDF Downloads 308
8490 [Keynote Talk]: Software Reliability Assessment and Fault Tolerance: Issues and Challenges
Authors: T. Gayen
Abstract:
Although several software reliability models exist today, there is still no versatile model that can be used for the reliability assessment of software. Complex software has a large number of states (unlike hardware), so it becomes practically difficult to test the software completely. Irrespective of the amount of testing one does, it is sometimes extremely difficult to assure that the final software product is fault free. The black box software reliability models are found to be quite uncertain for the reliability assessment of various systems. Mission critical applications need to be highly reliable, yet it is not always possible to ensure the development of a highly reliable system. Hence, in order to achieve fault-free operation of software, mechanisms are developed to handle the faults remaining in the system even after development. Although several such techniques are currently in use to achieve fault tolerance, these mechanisms may not always be suitable for all systems. Hence, this discussion is focused on analyzing the issues and challenges faced with the existing techniques for reliability assessment and fault tolerance of various software systems.
Keywords: black box, fault tolerance, failure, software reliability
Procedia PDF Downloads 426
8489 Secret Sharing in Visual Cryptography Using NVSS and Data Hiding Techniques
Authors: Misha Alexander, S. B. Waykar
Abstract:
Visual cryptography is a special unbreakable encryption technique that transforms the secret image into shares of random noisy pixels. These shares are transmitted over the network, and because of their noisy texture they attract the attention of attackers. To address this issue, a Natural Visual Secret Sharing (NVSS) scheme was introduced that uses natural shares, either in digital or printed form, to generate the noisy secret share. This scheme greatly reduces the transmission risk but causes distortion in the retrieved secret image through variation in the settings and properties of the digital devices used to capture the natural image during the encryption/decryption phase. This paper proposes a new NVSS scheme that extracts the secret key from randomly selected, unaltered multiple natural images. To further improve the security of the shares, data hiding techniques such as steganography and alpha channel watermarking are proposed.
Keywords: decryption, encryption, natural visual secret sharing, natural images, noisy share, pixel swapping
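The following sketch only illustrates the general idea behind natural-image-based secret sharing (a key derived from an unaltered natural image is XOR-ed with the secret so that only a noise-like share is transmitted); it is not the scheme proposed in the paper, and the feature-extraction step is an assumption for illustration.

```python
# Minimal sketch of natural-image-based secret sharing (illustrative only).
import numpy as np

def extract_key(natural_img: np.ndarray) -> np.ndarray:
    # Hypothetical feature extraction: the least-significant bits of the
    # natural image serve as a noise-like keystream.
    return (natural_img & 1).astype(np.uint8)

def make_noisy_share(secret_bits: np.ndarray, natural_img: np.ndarray) -> np.ndarray:
    return np.bitwise_xor(secret_bits, extract_key(natural_img))   # transmitted noisy share

def recover_secret(noisy_share: np.ndarray, natural_img: np.ndarray) -> np.ndarray:
    return np.bitwise_xor(noisy_share, extract_key(natural_img))

rng = np.random.default_rng(0)
secret = rng.integers(0, 2, size=(64, 64), dtype=np.uint8)      # binary secret image
natural = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)   # stands in for a natural photo

share = make_noisy_share(secret, natural)
assert np.array_equal(recover_secret(share, natural), secret)
```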
Procedia PDF Downloads 404
8488 Achieving Success in NPD Projects
Authors: Ankush Agrawal, Nadia Bhuiyan
Abstract:
The new product development (NPD) literature emphasizes the importance of introducing new products on the market for continuing business success. New products are responsible for employment, economic growth, technological progress, and high standards of living. Therefore, the study of NPD and the processes through which they emerge is important. The goal of our research is to propose a framework of critical success factors, metrics, and tools and techniques for implementing metrics for each stage of the new product development (NPD) process. An extensive literature review was undertaken to investigate decades of studies on NPD success and how it can be achieved. These studies were scanned for common factors for firms that enjoyed success of new products on the market. The paper summarizes NPD success factors, suggests metrics that should be used to measure these factors, and proposes tools and techniques to make use of these metrics. This was done for each stage of the NPD process, and brought together in a framework that the authors propose should be followed for complex NPD projects. While many studies have been conducted on critical success factors for NPD, these studies tend to be fragmented and focus on one or a few phases of the NPD process.
Keywords: new product development, performance, critical success factors, framework
Procedia PDF Downloads 399
8487 Special Features of Phacoemulsification Technique for Dense Cataracts
Authors: Shilkin A.G., Goncharov D.V., Rotanov D.A., Voitecha M.A., Kulyagina Y.I., Mochalova U.E.
Abstract:
Context: Phacoemulsification is a surgical technique used to remove cataracts, but it has a higher number of complications when dense cataracts are present. The risk factors include a thin posterior capsule, dense nucleus fragments, and prolonged exposure to high-power ultrasound. To minimize these complications, various methods are used. Research aim: The aim of this study is to develop and implement optimal methods of ultrasound phacoemulsification for dense cataracts in order to minimize postoperative complications. Methodology: The study involved 36 eyes of dogs with dense cataracts over a period of 5 years. The surgeries were performed using a LEICA 844 surgical microscope and an Oertli Faros phacoemulsifier. The surgical techniques included the optimal technique for breaking the nucleus, bimanual surgery, and the use of Akahoshi prechoppers. Findings: The complications observed during surgery included rupture of the posterior capsule and the need for anterior vitrectomy. Complications in the postoperative period included corneal edema and uveitis. Theoretical importance: This study contributes to the field by providing insights into the special features of phacoemulsification for dense cataracts. It highlights the importance of using specific techniques and settings to minimize complications. Data collection and analysis procedures: The data for the study were collected from surgeries performed on dogs with dense cataracts. The complications were documented and analyzed. Question addressed: The study addressed the question of how to minimize complications during phacoemulsification surgery for dense cataracts. Conclusion: By following the optimal techniques and settings and using prechoppers, surgery for dense cataracts can be made safer and faster, minimizing the risks and complications.
Keywords: dense cataracts, phacoemulsification, phacoemulsification of cataracts in elderly dogs, complications of phacoemulsification
Procedia PDF Downloads 62
8486 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do, since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with its prominent features, and the significant problems and issues that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to predict correctly based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 90
8485 Employing Visual Culture to Enhance Initial Adult Maltese Language Acquisition
Authors: Jacqueline Żammit
Abstract:
Recent research indicates that the utilization of right-brain strategies holds significant implications for the acquisition of language skills. Nevertheless, the utilization of visual culture as a means to stimulate these strategies and amplify language retention among adults engaging in second language (L2) learning remains a relatively unexplored area. This investigation delves into the impact of visual culture on activating right-brain processes during the initial stages of language acquisition, particularly in the context of teaching Maltese as a second language (ML2) to adult learners. By employing a qualitative research approach, this study convenes a focus group comprising twenty-seven educators to delve into a range of visual culture techniques integrated within language instruction. The collected data is subjected to thematic analysis using NVivo software. The findings underscore a variety of impactful visual culture techniques, encompassing activities such as drawing, sketching, interactive matching games, orthographic mapping, memory palace strategies, wordless picture books, picture-centered learning methodologies, infographics, Face Memory Game, Spot the Difference, Word Search Puzzles, the Hidden Object Game, educational videos, the Shadow Matching technique, Find the Differences exercises, and color-coded methodologies. These identified techniques hold potential for application within ML2 classes for adult learners. Consequently, this study not only provides insights into optimizing language learning through specific visual culture strategies but also furnishes practical recommendations for enhancing language competencies and skills.
Keywords: visual culture, right-brain strategies, second language acquisition, Maltese as a second language, visual aids, language-based activities
Procedia PDF Downloads 61
8484 ViraPart: A Text Refinement Framework for Automatic Speech Recognition and Natural Language Processing Tasks in Persian
Authors: Narges Farokhshad, Milad Molazadeh, Saman Jamalabbasi, Hamed Babaei Giglou, Saeed Bibak
Abstract:
The Persian language is an inflectional subject-object-verb language. This fact makes Persian a more ambiguous language to process. However, using techniques such as Zero-Width Non-Joiner (ZWNJ) recognition, punctuation restoration, and Persian Ezafe construction leads to a more understandable and precise language. In most of the work on Persian, these techniques are addressed individually. Despite that, we believe that for text refinement in Persian, all of these tasks are necessary. In this work, we propose the ViraPart framework, which uses an embedded ParsBERT at its core for text clarification. First, we use the BERT variant for Persian, followed by a classifier layer for the classification procedures. Next, we combine the models' outputs to produce clear text. In the end, the proposed model achieves averaged macro F1 scores of 96.90%, 92.13%, and 98.50% for ZWNJ recognition, punctuation restoration, and Persian Ezafe construction, respectively. Experimental results show that our proposed approach is very effective in text refinement for the Persian language.
Keywords: Persian Ezafe, punctuation, ZWNJ, NLP, ParsBERT, transformers
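A rough sketch of the kind of architecture described (a ParsBERT encoder with a token-classification head) is given below; the checkpoint name, the single combined label set and the inference loop are assumptions for illustration and do not reproduce the paper's three-classifier setup.

```python
# Sketch only: BERT encoder for Persian with a token-classification layer.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

MODEL = "HooshvareLab/bert-base-parsbert-uncased"   # assumed ParsBERT checkpoint
labels = ["O", "ZWNJ", "PUNCT", "EZAFE"]            # illustrative tag set (not the paper's)

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForTokenClassification.from_pretrained(MODEL, num_labels=len(labels))

text = "کتاب های دانش آموزان"                       # raw Persian text to be refined
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**enc).logits                    # shape: (1, seq_len, num_labels)

pred = logits.argmax(dim=-1)[0]
for tok, tag in zip(tokenizer.convert_ids_to_tokens(enc["input_ids"][0]), pred):
    print(tok, labels[tag])                         # untrained head: outputs are illustrative
```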
Procedia PDF Downloads 218
8483 Using Machine Learning Techniques for Autism Spectrum Disorder Analysis and Detection in Children
Authors: Norah Mohammed Alshahrani, Abdulaziz Almaleh
Abstract:
Autism Spectrum Disorder (ASD) is a condition related to brain development that affects how a person recognises and communicates with others, resulting in difficulties with social interaction and communication, and its prevalence is constantly growing. Early recognition of ASD allows children to lead safe and healthy lives and helps doctors with accurate diagnosis and management of the condition. Therefore, it is crucial to develop a method that achieves good results with high accuracy for the detection of ASD in children. In this paper, ASD datasets of toddlers and children have been analyzed. We employed the following machine learning techniques to explore ASD: Random Forest (RF), Decision Tree (DT), Naïve Bayes (NB) and Support Vector Machine (SVM). Feature selection was then used to provide fewer attributes from the ASD datasets while preserving model performance. As a result, we found that the best result was provided by the Support Vector Machine (SVM), achieving an accuracy of 0.98 on the toddler dataset and 0.99 on the children dataset.
Keywords: autism spectrum disorder, machine learning, feature selection, support vector machine
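A minimal sketch of the pipeline described (feature selection followed by RF, DT, NB and SVM compared on a held-out split) is shown below using scikit-learn; the data are synthetic stand-ins for the ASD screening datasets.

```python
# Sketch of the comparison workflow on synthetic data (not the ASD datasets).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 20))                    # stands in for screening features
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)     # synthetic ASD / non-ASD label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "SVM": SVC(kernel="rbf"),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "DT": DecisionTreeClassifier(random_state=0),
    "NB": GaussianNB(),
}

for name, clf in models.items():
    # Feature selection keeps fewer attributes while preserving the workflow
    pipe = make_pipeline(SelectKBest(f_classif, k=10), clf)
    print(name, pipe.fit(X_tr, y_tr).score(X_te, y_te))
```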
Procedia PDF Downloads 152
8482 Condition Assessment of Reinforced Concrete Bridge Deck Using Ground Penetrating Radar
Authors: Azin Shakibabarough, Mojtaba Valinejadshoubi, Ashutosh Bagchi
Abstract:
Catastrophic bridge failures happen due to lack of inspection, design deficiencies, and extreme events such as flooding or earthquakes. A Bridge Management System (BMS) is utilized to reduce such accidents through proper design and frequent inspection. Visual inspection cannot detect subsurface defects, so Non-Destructive Evaluation (NDE) techniques are used to remove these barriers as far as possible. Among all NDE techniques, Ground Penetrating Radar (GPR) has proved to be a highly effective device for detecting internal defects in a reinforced concrete bridge deck. GPR is used for detecting rebar location and rebar corrosion in the reinforced concrete deck. A GPR profile is composed of a series of hyperbolas, in which a clear hyperbola denotes sound rebar while a blurred hyperbola or signal attenuation indicates corroded rebar. Interpretation of GPR images is implemented through numerical analysis or visualization. Researchers have recently found that interpretation through visualization is more precise than interpretation through numerical analysis, but visualization is a time-consuming and highly subjective process. Automating the interpretation of GPR images through visualization can solve these problems. After interpretation of all scans of a bridge, condition assessment is conducted based on the generated corrosion map. However, such a condition assessment is neither objective nor precise. Condition assessment based on structural integrity and strength parameters can make it more objective and precise. The main purpose of this study is to present an automated interpretation method for a reinforced concrete bridge deck through a visualization technique. In the end, a combined analysis of the structural condition of the bridge is implemented.
Keywords: bridge condition assessment, ground penetrating radar, GPR, NDE techniques, visualization
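As an illustration of how a corrosion map can be derived from attenuated rebar reflections, the toy sketch below thresholds normalized reflection amplitudes; the amplitudes and the -6 dB threshold are invented and do not represent the authors' automated interpretation method.

```python
# Toy sketch: flag attenuated (low-amplitude) rebar reflections as suspected corrosion.
import numpy as np

# Hypothetical grid of picked rebar reflection amplitudes (dB) over the deck
amplitudes_db = np.array([
    [-2.0, -3.1, -9.5, -4.0],
    [-1.5, -8.7, -10.2, -3.3],
    [-2.4, -2.9, -3.5, -2.8],
])

# Normalize to the strongest reflection and threshold (threshold is illustrative)
relative_db = amplitudes_db - amplitudes_db.max()
corrosion_map = relative_db < -6.0     # True = attenuated signal, suspected corrosion

print(corrosion_map)
```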
Procedia PDF Downloads 149
8481 Standardization of Miniature Neutron Research Reactor and Occupational Safety Analysis
Authors: Raymond Limen Njinga
Abstract:
The comparator factors (Fc) for miniature research reactors are of great importance in the field of nuclear physics, as they provide an accurate basis for the evaluation of elements in all forms of samples via k0-NAA techniques. The Fc was initially simulated theoretically; thereafter, a series of experiments was performed to validate the results. In this situation, the experimental values were obtained using an Au(0.1%)-Al alloy monitor foil and a neutron flux setting of 5.00E+11 cm⁻².s⁻¹. As observed in the inner irradiation position, the average experimental value of 7.120E+05 was reported against the theoretical value of 7.330E+05. In comparison, a percentage deviation of 2.86 (from the theoretical value) was observed. In the case of the outer irradiation position, the experimental value of 1.170E+06 was recorded against the theoretical value of 1.210E+06, with a percentage deviation of 3.31 (from the theoretical value). The equivalent dose rates at 5 m from a neutron flux of 5.00E+11 cm⁻².s⁻¹, within the neutron energies of 1 keV, 10 keV, 100 keV, 500 keV, 1 MeV, 5 MeV and 10 MeV, were calculated to be 0.01 Sv/h, 0.01 Sv/h, 0.03 Sv/h, 0.15 Sv/h, 0.21 Sv/h and 0.25 Sv/h, respectively, and the total dose over a period of one hour was obtained to be 0.66 Sv.
Keywords: neutron flux, comparator factor, NAA techniques, neutron energy, equivalent dose
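A worked check of the arithmetic reported above (percentage deviations of the comparator factors from theory, and the summed dose rate) is given below.

```python
# Verify the reported percentage deviations and the total dose over one hour.
theoretical = {"inner": 7.330e5, "outer": 1.210e6}
experimental = {"inner": 7.120e5, "outer": 1.170e6}

for pos in theoretical:
    dev = 100 * (theoretical[pos] - experimental[pos]) / theoretical[pos]
    print(f"{pos}: {dev:.2f} % deviation")          # inner ~ 2.86 %, outer ~ 3.31 %

dose_rates_sv_per_h = [0.01, 0.01, 0.03, 0.15, 0.21, 0.25]
print("total over one hour ~", sum(dose_rates_sv_per_h), "Sv")   # ~ 0.66 Sv
```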
Procedia PDF Downloads 183
8480 The Effect of Electromagnetic Stirring during Solidification of Nickel Based Alloys
Authors: Ricardo Paiva, Rui Soares, Felix Harnau, Bruno Fragoso
Abstract:
Nickel-based alloys are materials well suited for service in extreme environments subjected to pressure and heat. Some industrial applications for nickel-based alloys are aerospace and jet engines, oil and gas extraction, pollution control and waste processing, and the automotive and marine industries. It is generally recognized that grain refinement is an effective methodology to improve the quality of cast parts. Conventional grain refinement techniques involve the addition of inoculation substances, the control of solidification conditions, or thermomechanical treatment with recrystallization. However, such methods often lead to non-uniform grain size distribution and the formation of hard phases, which are detrimental to both wear performance and biocompatibility. Stirring of the melt by electromagnetic fields has been widely used in continuous casting with success for grain refinement, solute redistribution, and surface quality improvement. Despite these advantages, little attention has been paid so far to the use of this approach in functional castings such as investment casting. Furthermore, the effect of electromagnetic stirring (EMS) fields on nickel-based alloys is not known. In line with the gaps and needs of the state of the art, the present research work targets to promote new advances in controlling the grain size and morphology of investment cast nickel-based alloys. For such a purpose, a set of experimental tests was conducted. A high-frequency induction furnace with vacuum and controlled atmosphere was used to cast the Inconel 718 alloy in ceramic shells. A coil surrounded the casting chamber in order to induce electromagnetic stirring during solidification. Aiming to assess the effect of the electromagnetic stirring on Ni alloys, the samples were subjected to microstructural analysis and mechanical tests. The results show that electromagnetic stirring can be an effective methodology to modify the grain size and mechanical properties of investment-cast parts.
Keywords: investment casting, grain refinement, electromagnetic stirring, nickel alloys
Procedia PDF Downloads 133
8479 Application of Artificial Neural Network for Prediction of High Tensile Steel Strands in Post-Tensioned Slabs
Authors: Gaurav Sancheti
Abstract:
This study presents an approach based on Artificial Neural Networks (ANNs) for determining the quantity of High Tensile Steel (HTS) strands required in post-tensioned (PT) slabs. Various PT slab configurations were generated by varying the span and depth of the slab. For each of these slab configurations, the quantity of required HTS strands was recorded. ANNs with the backpropagation algorithm and varying architectures were developed, and their performance was evaluated in terms of Mean Square Error (MSE). The recorded data for the quantity of HTS strands was used as the training database for the developed ANNs. The networks were validated using various validation techniques. The results show that the proposed ANNs have great potential, with good prediction and generalization capability.
Keywords: artificial neural networks, back propagation, conceptual design, high tensile steel strands, post tensioned slabs, validation techniques
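The sketch below shows the kind of backpropagation network the abstract describes, mapping slab span and depth to the required number of HTS strands and scoring it with MSE; the network size and the synthetic design data are assumptions, not the study's database.

```python
# Sketch: small backpropagation regressor on synthetic slab configurations.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
span = rng.uniform(6.0, 12.0, 200)                 # m, hypothetical slab spans
depth = rng.uniform(0.18, 0.30, 200)               # m, hypothetical slab depths
strands = 2.5 * span / depth + rng.normal(0, 2, 200)   # synthetic stand-in for design data

X = np.column_stack([span, depth])
X_tr, X_te, y_tr, y_te = train_test_split(X, strands, test_size=0.25, random_state=0)

ann = MLPRegressor(hidden_layer_sizes=(8, 8), max_iter=5000, random_state=0)
ann.fit(X_tr, y_tr)

print("MSE:", mean_squared_error(y_te, ann.predict(X_te)))
```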
Procedia PDF Downloads 221
8478 The Gold Standard Treatment Plan for Vitiligo: A Review on Conventional and Updated Treatment Methods
Authors: Kritin K. Verma, Brian L. Ransdell
Abstract:
White patches are a symptom of vitiligo, a chronic autoimmune dermatological condition that causes a loss of pigmentation in the skin. Vitiligo can affect self-esteem and quality of life while also being associated with the development of other autoimmune diseases. Current treatments in allopathy and homeopathy exist; some treatments have been found to be toxic, whereas others have been helpful. Allopathy offers several treatment plans, such as phototherapy, skin lightening preparations, immunosuppressive drugs, combined modality therapy, and steroid medications, to improve vitiligo. This presentation will review the FDA-approved topical cream Opzelura, a JAK inhibitor, and its effects on limiting vitiligo progression. Meanwhile, other non-conventional methods, such as Arsenic Sulphuratum Flavum used in homeopathy, will be debunked based on the current literature. Most treatments still serve to arrest progression and induce skin repigmentation. Treatment plans may differ between patients due to the location of depigmentation on the skin. Since there is no gold standard plan for treating patients with vitiligo, the oral presentation will review all topical and systemic pharmacological therapies that fight depigmentation of the skin and categorize their validity through a systematic review of the literature. Since treatment plans are limited in nature, all treatment methods will be mentioned, and an attempt will be made to define a gold standard treatment process for these patients.
Keywords: vitiligo, phototherapy, immunosuppressive drugs, skin lightening preparations, combined modality therapy, arsenic sulphuratum flavum, homeopathy, allopathy, gold standard, Opzelura
Procedia PDF Downloads 87
8477 Investigation of Fumaric Acid Radiolysis Using Gamma Irradiation
Authors: Wafa Jahouach-Rabai, Khouloud Ouerghi, Zohra Azzouz-Berriche, Faouzi Hosni
Abstract:
Widely used organic products from the pharmaceutical industry, essentially carboxylic acids, have been detected in environmental systems. To address this issue, the degradation efficiency of these contaminants was evaluated using an advanced oxidation process (AOP), namely an ionization process, as an alternative to conventional water treatment technologies. This process permits the generation of radical reactions to directly degrade organic pollutants in wastewater. In fact, gamma irradiation of aqueous solutions produces several reactive radicals, essentially the hydroxyl radical (OH), to destroy recalcitrant pollutants. Different concentrations of aqueous solutions of fumaric acid (FA) were considered in this study (0.1-1 mmol/L), which were treated with irradiation doses from 1 to 15 kGy at a rate of 6.1 kGy/h in a pilot-scale ionizing system (⁶⁰Co irradiator). Variations of the main parameters influencing degradation efficiency versus absorbed doses were studied with the aim of optimizing the total mineralization of the considered pollutants. A preliminary degradation pathway up to complete mineralization into CO₂ has been suggested based on the detection of residual degradation derivatives using different techniques, namely high performance liquid chromatography (HPLC) and electron paramagnetic resonance spectroscopy (EPR). The results revealed total destruction of the treated compound, which demonstrates the efficiency of this process in water remediation. We investigated the reactivity of hydroxyl radicals generated by irradiation on the dicarboxylic acid (FA) in aqueous solutions, leading to its degradation into other smaller molecules. In fact, gamma irradiation of FA leads to the formation of hydroxylated intermediates such as the hydroxycarbonyl radical, which were identified by EPR spectroscopy. Finally, pilot plant irradiation facilities improve the applicability of radiation technology on a large scale.
Keywords: AOP, radiolysis, fumaric acid, gamma irradiation, hydroxyl radical, EPR, HPLC
Procedia PDF Downloads 173
8476 Safety of Built Infrastructure: Single Degree of Freedom Approach to Blast Resistant RC Wall Panels
Authors: Muizz Sanni-Anibire
Abstract:
The 21st century has witnessed growing concerns for the protection of built facilities against natural and man-made disasters. Studies on earthquake-, fire-, and explosion-resistant buildings now dominate the arena. To protect people and facilities from the effects of explosions, reinforced concrete walls have been designed to be blast resistant. Understanding the performance of these walls is a key step in ensuring the safety of built facilities. Blast walls are mostly designed using simple techniques such as the single degree of freedom (SDOF) method, despite the increasing use of multi-degree of freedom techniques such as the finite element method. This study is the first stage of continuing research into the safety and reliability of blast walls. It presents the SDOF approach applied to the analysis of a concrete wall panel under three representative bomb scenarios: a motorcycle bomb (50 kg), a car bomb (400 kg), and a van bomb with a capacity of 1500 kg of TNT explosive.
Keywords: blast wall, safety, protection, explosion
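A minimal SDOF sketch of this type of analysis is given below: an equivalent mass-spring system for the wall panel driven by a triangular blast pulse and integrated with the central-difference method; all parameter values are hypothetical and are not taken from the study.

```python
# SDOF sketch: M*y'' + K*y = F(t) with a triangular blast pulse (assumed values).
import numpy as np

M = 3000.0        # equivalent mass, kg (assumed)
K = 2.0e7         # equivalent elastic stiffness, N/m (assumed)
F0 = 5.0e5        # peak reflected blast force, N (assumed)
td = 0.005        # positive phase duration, s (assumed)

dt = 1.0e-5
t = np.arange(0.0, 0.05, dt)
F = np.where(t < td, F0 * (1.0 - t / td), 0.0)      # triangular pulse

y = np.zeros_like(t)
y[1] = 0.5 * (F[0] / M) * dt**2                     # start-up step from initial acceleration
for i in range(1, len(t) - 1):
    acc = (F[i] - K * y[i]) / M                     # equation of motion at step i
    y[i + 1] = 2.0 * y[i] - y[i - 1] + acc * dt**2  # central-difference update

print("peak displacement ~ %.1f mm" % (1000.0 * y.max()))
```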
Procedia PDF Downloads 263
8475 Effects of Different Processing Methods on Composition, Physicochemical and Morphological Properties of MR263 Rice Flour
Authors: R. Asmeda, A. Noorlaila, M. H. Norziah
Abstract:
This research work was conducted to investigate the effects of different grinding techniques during the milling of rice grains on the physicochemical characteristics of the rice flour produced. Dry grinding, semi-wet grinding, and wet grinding were employed to produce the rice flour. The results indicated that the different grinding methods significantly (p ≤ 0.05) affected the physicochemical and functional properties of the starch, except for the carbohydrate content, x-ray diffraction pattern and breakdown viscosity. The dry grinding technique caused the highest percentage of starch damage compared to semi-wet and wet grinding. Protein, fat and ash contents were highest in rice flour obtained by dry grinding. It was found that wet grinding produced flour with the smallest average particle size (8.52 µm), resulting in the highest process yield (73.14%). Pasting profiles revealed that dry grinding produced rice flour with significantly the lowest pasting temperature and the highest setback viscosity.
Keywords: average particle size, grinding techniques, physicochemical characteristics, rice flour
Procedia PDF Downloads 191
8474 3D Point Cloud Model Color Adjustment by Combining Terrestrial Laser Scanner and Close Range Photogrammetry Datasets
Authors: M. Pepe, S. Ackermann, L. Fregonese, C. Achille
Abstract:
3D models obtained with advanced survey techniques such as close-range photogrammetry and laser scanning are nowadays particularly appreciated in the Cultural Heritage and Archaeology fields. In order to produce high quality models representing archaeological evidence and anthropological artifacts, the appearance of the model (i.e. color), beyond the geometric accuracy, is not a negligible aspect. The integration of close-range photogrammetry survey techniques with the laser scanner is still a topic of study and research. Combining point cloud data sets of the same object generated with both technologies, or with the same technology but registered at different moments and/or under different natural light conditions, can produce a final point cloud with accentuated color dissimilarities. In this paper, a methodology to unify the different data sets, to improve the chromatic quality and to highlight further details by balancing the point color will be presented.
Keywords: color models, cultural heritage, laser scanner, photogrammetry
Procedia PDF Downloads 280
8473 Comparative DNA Binding of Iron and Manganese Complexes by Spectroscopic and ITC Techniques and Antibacterial Activity
Authors: Maryam Nejat Dehkordi, Per Lincoln, Hassan Momtaz
Abstract:
The interaction of Schiff base complexes of iron and manganese (iron [N,N'-bis(5-(triphenylphosphonium methyl)salicylidene)-1,2-ethanediamine] chloride, [Fe Salen]Cl, and manganese [N,N'-bis(5-(triphenylphosphonium methyl)salicylidene)-1,2-ethanediamine] acetate) with DNA was investigated by spectroscopic and isothermal titration calorimetry (ITC) techniques. The absorbance spectra of the complexes show hyper- and hypochromism in the presence of DNA, which is an indication of the interaction of the complexes with DNA. The linear dichroism (LD) measurements confirmed the bending of DNA in the presence of the complexes. Furthermore, isothermal titration calorimetry experiments confirmed that the complexes bind to DNA on the basis of both electrostatic and hydrophobic interactions. Moreover, the ITC profile exhibits the existence of two binding phases for the complex. The antibacterial activity of the ligand and complexes was tested in vitro to evaluate their activity against gram-positive and gram-negative bacteria.
Keywords: Schiff base complexes, ct-DNA, linear dichroism (LD), isothermal titration calorimetry (ITC), antibacterial activity
Procedia PDF Downloads 471
8472 Electrochemical Recovery of Lithium from Geothermal Brines
Authors: Sanaz Mosadeghsedghi, Mathew Hudder, Mohammad Ali Baghbanzadeh, Charbel Atallah, Seyedeh Laleh Dashtban Kenari, Konstantin Volchek
Abstract:
Lithium has recently been extensively used in lithium-ion batteries (LIBs) for electric vehicles and portable electronic devices. The conventional evaporative approach to recover and concentrate lithium is extremely slow and may take 10-24 months to concentrate lithium from dilute sources, such as geothermal brines. To respond to the increasing industrial lithium demand, alternative extraction and concentration technologies should be developed to recover lithium from brines with low concentrations. In this study, a combination of electrocoagulation (EC) and electrodialysis (ED) was evaluated for the recovery of lithium from geothermal brines. The brine samples in this study, collected in Western Canada, had lithium concentrations of 50-75 mg/L against a background of much higher (over 10,000 times) concentrations of sodium. This very high sodium-to-lithium ratio poses challenges to the conventional direct lithium extraction processes which employ lithium-selective adsorbents. EC was used to co-precipitate lithium using a sacrificial aluminium electrode. The precipitate was then dissolved, and the leachate was treated using ED to separate and concentrate lithium from other ions. The focus of this paper is on the study of ED, including a two-step ED process with a mono-valent selective stage to separate lithium from multi-valent cations, followed by a bipolar ED stage to convert lithium chloride (LiCl) to a LiOH product. Eventually, the ED cell was reconfigured by combining mono-valent selective cation exchange membranes with the bipolar membranes, merging the two ED steps into one. Using this process at optimum conditions, over 95% of the co-existing cations were removed and the purity of lithium increased to over 90% in the final product.
Keywords: electrochemical separation, electrocoagulation, electrodialysis, lithium extraction
Procedia PDF Downloads 94
8471 Increasing Solubility and Bioavailability of Fluvastatin through Transdermal Nanoemulsion Gel Delivery System for the Treatment of Osteoporosis
Authors: Ramandeep Kaur, Makula Ajitha
Abstract:
Fluvastatin has been reported to increase bone mineral density in osteoporosis over the last decade. The systemically administered drug undergoes extensive hepatic first-pass metabolism, so only a very small, insignificant amount of the drug reaches the bone tissue. The present study aims to deliver fluvastatin in the form of a nanoemulsion (NE) gel directly to the bone tissue through the transdermal route, thereby bypassing hepatic first-pass metabolism. The NE formulation consisted of isopropyl myristate as the oil, Tween 80 as the surfactant, Transcutol as the co-surfactant and water as the aqueous phase. Pseudoternary phase diagrams were constructed using the aqueous titration method, and the NEs obtained were subjected to thermodynamic-kinetic stability studies. The stable NE formulations were evaluated for their droplet size, zeta potential, and transmission electron microscopy (TEM). The nano-sized formulations were incorporated into a 0.5% Carbopol 934 gel matrix. The ex-vivo permeation behaviour of selected formulations through rat skin was investigated and compared with conventional formulations (suspension and emulsion). Further, an in-vivo pharmacokinetic study was carried out using male Wistar rats. The optimized NE formulation's mean droplet size was 11.66±3.2 nm with a polydispersity index of 0.117. The permeation flux of the NE gel formulations was found to be significantly higher than that of the conventional formulations, i.e. suspension and emulsion. The in-vivo pharmacokinetic study showed a significant (1.25-fold) increase in the bioavailability of fluvastatin compared with the oral formulation. Thus, it can be concluded that a NE gel was successfully developed for the transdermal delivery of fluvastatin for the treatment of osteoporosis.
Keywords: fluvastatin, nanoemulsion gel, osteoporosis, transdermal
Procedia PDF Downloads 189
8470 Create a Brand Value Assessment Model to Choosing a Cosmetic Brand in Tehran Combining DEMATEL Techniques and Multi-Stage ANFIS
Authors: Hamed Saremi, Suzan Taghavy, Seyed Mohammad Hanif Sanjari, Mostafa Kahali
Abstract:
One of the challenges for manufacturing and service companies in providing a product or service is having a brand that is recognized by consumers in target markets. Companies carry out most of their processes with similar capabilities, but the constant threat of damaging internal and external factors can prevent brands from rising, and more companies are reaching the stage of bankruptcy. This paper tries to identify and analyze effective indicators of brand equity and presents an intelligent model to prevent possible damage. In this study, the indicators of brand equity are identified based on a literature study and expert opinions; the set of indicators is then analyzed using the DEMATEL technique, and a multi-stage Adaptive Neuro-Fuzzy Inference System (ANFIS) is used to design a multi-stage intelligent system for the assessment of brand equity.
Keywords: brand, cosmetic product, ANFIS, DEMATEL
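The snippet below sketches the DEMATEL step used to screen indicators (normalize the direct-influence matrix, compute the total-relation matrix, then prominence and relation); the 4x4 influence matrix is purely illustrative and is not the expert data from the study.

```python
# DEMATEL sketch with an invented 4x4 direct-influence matrix (0-4 scale).
import numpy as np

A = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [2, 1, 0, 3],
    [1, 2, 1, 0],
], dtype=float)

s = max(A.sum(axis=1).max(), A.sum(axis=0).max())
D = A / s                                   # normalized direct-influence matrix
T = D @ np.linalg.inv(np.eye(len(A)) - D)   # total-relation matrix

r = T.sum(axis=1)   # influence given by each indicator
c = T.sum(axis=0)   # influence received by each indicator
print("prominence (r + c):", np.round(r + c, 2))
print("relation   (r - c):", np.round(r - c, 2))   # > 0: net cause indicator
```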
Procedia PDF Downloads 417
8469 Automatic Lead Qualification with Opinion Mining in Customer Relationship Management Projects
Authors: Victor Radich, Tania Basso, Regina Moraes
Abstract:
Lead qualification is one of the main procedures in Customer Relationship Management (CRM) projects. Its main goal is to identify potential consumers who have the ideal characteristics to establish a profitable and long-term relationship with a certain organization. Social networks can be an important source of data for identifying and qualifying leads since interest in specific products or services can be identified from the users' expressed feelings of (dis)satisfaction. In this context, this work proposes the use of machine learning techniques and sentiment analysis as an extra step in the lead qualification process in order to improve it. In addition to machine learning models, sentiment analysis or opinion mining can be used to understand the evaluation that the user makes of a particular service, product, or brand. The results obtained so far have shown that it is possible to extract data from social networks and combine the techniques for a more complete classification.
Keywords: lead qualification, sentiment analysis, opinion mining, machine learning, CRM, lead scoring
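A rough sketch of the proposed extra step (classify the sentiment of social-media posts and fold the result into a lead score) is shown below with a toy bag-of-words model; the training posts, weights and base score are invented for illustration.

```python
# Toy sketch: sentiment probability from posts blended into a lead score.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_posts = [
    "love this service, works perfectly",
    "great product, would recommend",
    "terrible support, very disappointed",
    "worst purchase ever, total waste",
]
train_labels = [1, 1, 0, 0]                     # 1 = positive, 0 = negative

sentiment = make_pipeline(TfidfVectorizer(), LogisticRegression())
sentiment.fit(train_posts, train_labels)

new_posts = ["looking for something like this, the demo was great"]
p_positive = sentiment.predict_proba(new_posts)[0, 1]

base_score = 0.4                                   # score from conventional CRM attributes (assumed)
lead_score = 0.7 * base_score + 0.3 * p_positive   # blending weights are illustrative
print("lead score:", round(lead_score, 2))
```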
Procedia PDF Downloads 85
8468 Parametric Study on the Development of Earth Pressures Behind Integral Bridge Abutments Under Cyclic Translational Movements
Authors: Lila D. Sigdel, Chin J. Leo, Samanthika Liyanapathirana, Pan Hu, Minghao Lu
Abstract:
Integral bridges are a class of bridges with integral or semi-integral abutments, designed without expansion joints in the bridge deck of the superstructure. Integral bridges are economical alternatives to conventional jointed bridges, with lower maintenance costs and greater durability, thereby improving social and economic stability for the community. Integral bridges have also been proven to be effective in lowering the overall construction cost compared to conventional types of bridges. However, there is significant uncertainty related to the design and analysis of integral bridges in response to cyclic thermal movements induced by deck expansion and contraction. The cyclic thermal movements of the abutments increase the lateral earth pressures on the abutment and its foundation, leading to soil settlement and heaving of the backfill soil. Thus, the primary objective of this paper is to investigate the soil-abutment interaction under cyclic translational movement of the abutment. Results from five experiments conducted to simulate different magnitudes of cyclic translational movements of abutments induced by thermal changes are presented, focusing on lateral earth pressure development at the abutment-soil interface. Test results show that the cycle number and the magnitude of cyclic translational movements have significant effects on the escalation of lateral earth pressures. Experimentally observed earth pressure distributions behind the integral abutment were compared with current design approaches, which shows that most of the practices underpredict the lateral earth pressure.
Keywords: integral bridge, cyclic thermal movement, lateral earth pressure, soil-structure interaction
Procedia PDF Downloads 114
8467 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purpose of developing their business. Therefore, the recently decentralized data management environment relies on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 432
8466 Minimizing Vehicular Traffic via Integrated Land Use Development: A Heuristic Optimization Approach
Authors: Babu Veeregowda, Rongfang Liu
Abstract:
The current traffic impact assessment methodology and environmental quality review process for the approval of land development projects are conventional, stagnant, and one-dimensional. The environmental review policy and procedure lack direction on regulating or seeking alternative land uses and sizes that exploit the existing or surrounding elements of the built environment (the '4 Ds' of development: Density, Diversity, Design, and Distance to Transit) or smart growth principles, which influence travel behavior and have a significant effect in reducing vehicular traffic. Additionally, the environmental review policy does not give directions on how to incorporate urban planning into the development, such as incorporating non-motorized roadway elements like sidewalks, bus shelters, and access to community facilities. This research developed a methodology to optimize the mix of land uses and sizes using a heuristic optimization process to minimize auto-dependent development and to meet the interests of key stakeholders. A case study of the Willets Point Mixed Use Development in Queens, New York, was used to assess the benefits of the methodology. The approved Willets Point Mixed Use project was based on the maximum envelope of size and land use types allowed by the current conventional urban renewal plans. This paper also evaluates parking accumulation for various land uses to explore the potential for shared parking to further optimize the mix of land uses and sizes. This research is very timely and useful to many stakeholders interested in understanding the benefits of integrated land uses and their development.
Keywords: traffic impact, mixed use, optimization, trip generation
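As a toy illustration of a heuristic search over a land-use mix that minimizes trip generation, the sketch below uses simple random search; the trip rates, floor-area target and constraints are invented and are not the Willets Point data, and the study's actual heuristic is not reproduced here.

```python
# Toy heuristic: random search for the land-use mix with the fewest generated trips.
import random

trip_rate = {"residential": 0.5, "retail": 3.8, "office": 1.1, "hotel": 0.6}  # trips per 100 m2 (assumed)
total_area = 100_000     # m2 of floor area to allocate (illustrative)
min_share = 0.10         # every use keeps at least 10 % of the area

def trips(mix):
    return sum(trip_rate[u] * area / 100.0 for u, area in mix.items())

def random_mix():
    w = {u: random.random() for u in trip_rate}
    s = sum(w.values())
    n = len(trip_rate)
    # Each use gets min_share plus a random portion of the remaining area
    return {u: (min_share + (1 - n * min_share) * w[u] / s) * total_area for u in trip_rate}

random.seed(0)
best = min((random_mix() for _ in range(10_000)), key=trips)
print({u: round(a) for u, a in best.items()}, "->", round(trips(best)), "peak-hour trips")
```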
Procedia PDF Downloads 214
8465 Examples from a Traditional Sismo-Resistant Architecture
Authors: Amira Zatir, Abderahmane Mokhtari, Amina Foufa, Sara Zatir
Abstract:
In several regions of the world, there exist numerous historic monuments, buildings and housing environments, built in traditional ways, which have survived earthquakes even in zones where the seismic risk is particularly high. These constructions, stemming from vernacular architecture, make it possible, through their resistance to earthquakes over time, to identify various "local" earthquake-resistant techniques. Through the examples and experiences presented, it can be observed that in traditional construction two major and somewhat opposite principles govern earthquake-resistant building. The first is very high flexibility, to which correspond very light constructions, such as the Japanese, Turkish and even Chinese wooden constructions; the second is very high rigidity, to which correspond more or less heavy and massive masonry constructions, in particular stone, found especially in the Mediterranean Basin and in the historic sanctuary of Machu Picchu. To these are added sensible and well-thought-out construction techniques, including the use of humble materials such as earth and adobe. Ancient communities were able to face seismic risks thanks to their know-how, reflected in intelligently designed constructions that testify to a local seismic culture.
Keywords: earthquake, architecture, traditional, construction, resistance
Procedia PDF Downloads 420
8464 Application of Nanofibers in Heavy Metal (HM) Filtration
Authors: Abhijeet Kumar, Palaniswamy N. K.
Abstract:
Heavy metal contamination in water sources endangers both the environment and human health. Various water filtration techniques have been employed to date for the purification and removal of hazardous metals from water. Among the existing methods, nanofibers have emerged as a viable alternative for effective heavy metal removal in recent years because of their unique qualities, such as a large surface area, an interconnected porous structure, and customizable surface chemistry. Among the numerous manufacturing techniques, solution blow spinning has gained popularity as a versatile process for producing nanofibers with customized properties. This paper seeks to offer a complete overview of the use of nanofibers for heavy metal filtration, particularly those produced using solution blow spinning. The review discusses current advances in nanofiber materials, production processes, and heavy metal removal performance. Furthermore, the field's difficulties and future opportunities are examined in order to direct future research and development activities.
Keywords: heavy metals, nanofiber composite, filter membranes, adsorption, impaction
Procedia PDF Downloads 68
8463 Intensification of Process Kinetics for Conversion of Organic Volatiles into Syngas Using Non-Thermal Plasma
Authors: Palash Kumar Mollick, Leire Olazar, Laura Santamaria, Pablo Comendador, Manomita Mollick, Gartzen Lopez, Martin Olazar
Abstract:
The entire world remains skeptical about this silver-lining technology for converting plastic waste into valuable synthetic gas. At this juncture, besides the adequately studied conventional catalytic process for steam reforming, a non-thermal plasma is being introduced. Organic volatiles are produced in the first step by pyrolysing the plastic materials. The resultant lightweight olefins and carbon monoxide are the major components that undergo a steam reforming process to produce syngas. A non-thermal plasma consists of ionized gases and free electrons with an electronic temperature as high as 10³ K. Organic volatiles are, in general, relatively unreactive and thus demand a large bond-breaking energy. A conventional catalyst is incapable of providing the required activation energy, leading to poor thermodynamic equilibrium, whereas a non-thermal plasma can actively collide with reactants to produce a rich mix of reactive species, including vibrationally or electronically excited molecules, radicals, atoms, and ions. In addition, non-thermal plasma provides non-equilibrium conditions, leading to electric discharge only in certain degrees of freedom without affecting the intrinsic chemical conditions of the participating reactants and products. In this work, we report thermodynamic and kinetic aspects of the conversion of organic volatiles into syngas using a non-thermal plasma. Detailed characteristics of the plasma and its effect on the overall yield of the process will be presented.
Keywords: non thermal plasma, plasma catalysis, steam reforming, syngas, plastic waste, green energy
Procedia PDF Downloads 71