Search results for: Numerical method simulation
Perceptions of Greenhouse Vegetable Growers Regarding Use of Biological Control Practices: A Case Study in Jiroft County, Iran
Authors: Hossein Shabanali Fami, Omid Sharifi, Javad Ghasemi, Mahtab Pouratashi, Mona Sadat Moghadasian
Abstract:
The main purpose of this study was to investigate the perception of greenhouse vegetable growers regarding the use of biological control practices during the growing season. The statistical population of the study included greenhouse vegetable growers in Jiroft county (N=1862). A sample of 137 vegetable growers was selected using a random sampling method. Data were collected via a questionnaire. The validity of the instrument was confirmed by the faculty members of the Department of Agricultural Development and Management at the University of Tehran. Cronbach's alpha was applied to estimate the reliability, which showed a high reliability for the instrument. Data were analyzed using SPSS/Windows 13.5. The results revealed that greenhouse vegetable growers had a moderate level of perception regarding biological control practices. Levels of vegetable growers' perceptions regarding biological control practices differed on the basis of their academic qualifications as well as educational level and job. In addition, the results indicated that about 54.1% of the variation in vegetable growers' perceptions could be explained by variables such as awareness of biological control practices, knowledge of pests, annual production and age.
Keywords: Greenhouse, biological control, biological agents, perception, vegetable grower.
Role of Organic Wastewater Constituents in Iron Redox Cycling for Ferric Sludge Reuse in the Fenton-Based Treatment
Authors: J. Bolobajev, M. Trapido, A. Goi
Abstract:
The practical application of the Fenton-based treatment method for the purification of water contaminated with organic compounds is limited mainly because of the large amount of ferric sludge formed during the treatment, in which ferrous iron (Fe(II)) is used as the activator of the hydrogen peroxide oxidation processes. Reuse of ferric sludge collected from clarifiers to substitute Fe(II) salts allows the total cost of Fenton-type treatment technologies to be reduced and the accumulation of hazardous ferric waste to be minimized. Dissolution of ferric iron (Fe(III)) from the sludge to the liquid phase at acidic pH, and the autocatalytic transformation of Fe(III) to Fe(II) by phenolic compounds (tannic acid, lignin, phenol, catechol, pyrogallol and hydroquinone) added or present as water/wastewater constituents, were found to be essentially involved in the Fenton-based oxidation mechanism. The observed enhanced formation of highly reactive species, hydroxyl radicals, resulted in a substantial increase in organic contaminant degradation. Sludge reuse at acidic pH and in the presence of ferric iron reductants is a novel strategy in the application of Fenton-based treatment for the purification of water contaminated with organic compounds.
Keywords: Ferric sludge reuse, ferric iron reductant, water treatment, organic pollutant.
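For orientation, a minimal sketch of the textbook Fenton chemistry that the abstract builds on; the reduction of Fe(III) by a generic phenolic reductant ArOH is shown schematically and is an assumption about the form, not a stoichiometry taken from the paper.

```latex
\begin{align*}
\mathrm{Fe^{2+}} + \mathrm{H_2O_2} &\rightarrow \mathrm{Fe^{3+}} + \mathrm{OH^{-}} + {}^{\bullet}\mathrm{OH} && \text{(Fenton reaction)}\\
\mathrm{Fe^{3+}} + \mathrm{H_2O_2} &\rightarrow \mathrm{Fe^{2+}} + \mathrm{HO_2^{\bullet}} + \mathrm{H^{+}} && \text{(slow Fe(II) regeneration)}\\
\mathrm{Fe^{3+}} + \mathrm{ArOH} &\rightarrow \mathrm{Fe^{2+}} + \mathrm{ArO^{\bullet}} + \mathrm{H^{+}} && \text{(reduction by phenolic constituents)}
\end{align*}
```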
Lean Environmental Management Integration System (LEMIS) Framework Development
Authors: Puvanasvaran, A. P., Suresh V., N. Norazlin
Abstract:
The Lean Environmental Management Integration System (LEMIS) framework development is an integration of lean core elements and ISO 14001. Curiosity about the relationship between continuous improvement and the sustainability of lean implementation has driven this study toward LEMIS. Characteristics of the ISO 14001 standard clauses and the core elements of lean principles are explored from past studies and literature reviews. A survey was carried out on ISO 14001 certified companies to examine continual improvement through implementation of the ISO 14001 standard. The study found that there is a significant and positive relationship between the lean principles of value, value stream, flow, pull and perfection and the ISO 14001 requirements. LEMIS is significant in supporting continuous improvement and sustainability. The integration system can be implemented in any manufacturing company. It raises awareness of why organizations need to sustain their environmental management systems. In the meantime, the lean principles can be adapted in order to streamline the daily activities of the company. Throughout the study, it was proven that there is no sacrifice or trade-off between lean principles and ISO 14001 requirements. The framework developed in the study can be further simplified in the future, especially the method of crossing each sub-requirement of the ISO 14001 standard with the core elements of lean principles used in this study.
Keywords: LEMIS, ISO 14001, integration, framework.
Building Information Modeling-Based Approach for Automatic Quantity Take-off and Cost Estimation
Authors: Lo Kar Yin, Law Ka Mei
Abstract:
Architectural, engineering, construction and operations (AECO) industry practitioners have adapted well to the dynamic construction market through the fundamental training of their disciplines. Further driven by the pandemic since 2019, great strides have been made in virtual environments, and the closest possible collaboration is sought with project teams without boundaries. Using a Building Information Modeling-based approach and qualitative analysis, this paper reviews the quantity take-off (QTO) and cost estimation process through modeling techniques, in liaison with suppliers, fabricators, subcontractors, contractors, designers, consultants and service providers in the construction industry value chain. The aim is automatic project cost budgeting, project cost control, and cost evaluation of design options for in-situ reinforced-concrete construction and Modular Integrated Construction (MiC) at the design stage, as well as variation of works and cash flow/spending analysis at the construction stage as far as practicable, with a view to sharing the findings to enhance mutual trust and co-operation among AECO industry practitioners. It also fosters development of a common prototype for the design-and-build project delivery method under NEC4 Engineering and Construction Contract (ECC) Options A and C.
Keywords: Building Information Modeling, cost estimation, quantity take-off, modeling techniques.
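As a rough illustration of the automatic quantity take-off idea (not the authors' actual toolchain), the sketch below sums concrete volumes and costs from a list of modelled elements; the element dimensions and the unit rate are hypothetical.

```python
# Hypothetical sketch of BIM-style quantity take-off and cost estimation.
# Element data and unit rates are illustrative, not from the paper.
from dataclasses import dataclass

@dataclass
class ConcreteElement:
    name: str
    length_m: float
    width_m: float
    depth_m: float

    def volume_m3(self) -> float:
        # Quantity take-off: volume derived from modelled dimensions.
        return self.length_m * self.width_m * self.depth_m

UNIT_RATE_HKD_PER_M3 = 1800.0  # assumed in-situ RC supply-and-place rate

def estimate_cost(elements: list[ConcreteElement]) -> float:
    """Sum element volumes and multiply by the unit rate."""
    total_volume = sum(e.volume_m3() for e in elements)
    return total_volume * UNIT_RATE_HKD_PER_M3

if __name__ == "__main__":
    model = [ConcreteElement("slab", 6.0, 4.0, 0.2),
             ConcreteElement("beam", 6.0, 0.3, 0.5)]
    print(f"Estimated cost: {estimate_cost(model):,.0f} HKD")
```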
The Effect of Board Composition and Ownership Concentration on Earnings Management: Evidence from IRAN
Authors: F. Rahnamay Roodposhti, S. A. Nabavi Chashmi
Abstract:
The role of corporate governance is to reduce the divergence of interests between shareholders and managers. Corporate governance is more useful when managers have an incentive to deviate from shareholders' interests. One example of management's deviation from shareholders' interests is the management of earnings through the use of accounting accruals. This paper examines the association between internal corporate governance mechanisms (ownership concentration, board independence, and the existence of CEO-Chairman duality) and earnings management. Firm size and leverage are control variables. The population used in this study comprises firms listed on the Tehran Stock Exchange (TSE) between 2004 and 2008; the sample comprises 196 firms. The panel data method is employed as the technique to estimate the model. We find a significant negative association between ownership concentration and earnings management, between board independence and earnings management, and between the existence of CEO-Chairman duality and earnings management. This study also found a positive significant association between the control variables (firm size and leverage) and earnings management.
Keywords: Earnings management, board independence, ownership concentration, corporate governance.
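One plausible form of the panel regression implied by the abstract is sketched below; the variable names and the accruals-based proxy EM are illustrative assumptions, as the exact specification is not reproduced here.

```latex
EM_{it} = \beta_0 + \beta_1\,OWN_{it} + \beta_2\,BIND_{it} + \beta_3\,DUAL_{it}
        + \beta_4\,SIZE_{it} + \beta_5\,LEV_{it} + \varepsilon_{it}
```

Here EM is an accruals-based earnings management measure, OWN ownership concentration, BIND board independence, DUAL the CEO-Chairman duality dummy, and SIZE and LEV the control variables.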
Characterization and Geochemical Modeling of Cu and Zn Sorption Using Mixed Mineral Systems Injected with Iron Sulfide under Sulfidic-Anoxic Conditions I: Case Study of Cwmheidol Mine Waste Water, Wales, United Kingdom
Authors: D. E. Egirani, J. E. Andrews, A. R. Baker
Abstract:
This study investigates the sorption of Cu and Zn contained in natural mine wastewater, using mixed mineral systems under sulfidic-anoxic conditions. The mine wastewater was obtained from disused mine workings at Cwmheidol in Wales, United Kingdom. These contaminants flow into watercourses, including the River Rheidol, in which fishing activities exist. In an attempt to reduce the Cu-Zn levels of fish intake in the watercourses, single mineral systems and 1:1 mixed mineral systems of clay and goethite were tested with the mine wastewater for copper and zinc removal at variable pH. Modelling of hydroxyl complexes was carried out using the PHREEQC method. Reactions using the batch mode technique were conducted at room temperature. There were significant differences in the behaviour of copper and zinc removal using mixed mineral systems when compared to single mineral systems. All mixed mineral systems sorbed more Cu than Zn when tested with mine wastewater.
Keywords: Cu-Zn, hydroxyl complexes, kinetics, mixed mineral systems, reactivity.
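A minimal sketch of the batch-mode sorption bookkeeping that underlies such experiments: percent removal and a distribution coefficient Kd from initial and equilibrium concentrations. The numbers are made up for illustration and are not the study's data.

```python
# Illustrative batch sorption calculations; values are assumptions, not study data.
def percent_removal(c0_mg_l: float, ce_mg_l: float) -> float:
    """Fraction of metal removed from solution, in percent."""
    return 100.0 * (c0_mg_l - ce_mg_l) / c0_mg_l

def distribution_coefficient(c0_mg_l: float, ce_mg_l: float,
                             volume_l: float, mass_g: float) -> float:
    """Kd (L/g): sorbed amount per gram of mineral over equilibrium concentration."""
    sorbed_mg_per_g = (c0_mg_l - ce_mg_l) * volume_l / mass_g
    return sorbed_mg_per_g / ce_mg_l

if __name__ == "__main__":
    # Hypothetical Cu data for a 1:1 clay-goethite mixed mineral system.
    print(percent_removal(10.0, 2.5))                      # 75.0 %
    print(distribution_coefficient(10.0, 2.5, 0.05, 1.0))  # 0.15 L/g
```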
Effect of Rubber Treatment on Compressive Strength and Modulus of Elasticity of Self-Compacting Rubberized Concrete
Authors: I. Miličević, M. Hadzima Nyarko, R. Bušić, J. Simonović Radosavljević, M. Prokopijević, K. Vojisavljević
Abstract:
This paper investigates the effects of different treatment methods of rubber aggregates for self-compacting concrete (SCC) on compressive strength and modulus of elasticity. SCC mixtures with 10% replacement of fine aggregate with crumb rubber by total aggregate volume and with different aggregate treatment methods were investigated. The rubber aggregate was treated in three different ways: dry process, water soaking, and NaOH treatment plus water soaking. Properties of SCC in the fresh and hardened states were tested and evaluated. Scanning electron microscope (SEM) analyses of three different SCC patches were made and discussed. It was observed that applying the proposed NaOH plus water soaking method resulted in the improvement of fresh and hardened concrete properties. It resulted in a more uniform distribution of rubber particles in the cement matrix, a better bond between rubber particles and the cement matrix, and higher compressive strength of SCC rubberized concrete.
Keywords: Compressive strength, modulus of elasticity, NaOH treatment, rubber aggregate, self-compacting rubberized concrete, scanning electron microscope analysis.
Investigating Daylight Quality in Malaysian Government Office Buildings Through Daylight Factor and Surface Luminance
Authors: Mohd Zin Kandar, Mohd Sabere Sulaiman, Yong Razidah Rashid, Dilshan Remaz Ossen, Aminatuzuhariah M. Abdullah, Lim Yaik Wah, Mansour Nikpour
Abstract:
In recent years, there has been increasing interest in using daylight to save energy in buildings. In tropical regions, daylighting is always an energy saver. On the other hand, daylight provides visual comfort. According to standards, many criteria should be taken into consideration in order to achieve daylight utilization and visual comfort. The current standard in Malaysia, MS 1525, does not provide sufficient guidelines. Hence, more research is needed on daylight performance. If architects do not consider daylight design, it not only causes inconvenience in working spaces but also leads to more energy consumption as well as environmental pollution. This research surveyed daylight performance in five selected office buildings from different areas of Malaysia through an experimental method. Several parameters of daylight quality, such as daylight factor, surface luminance and surface luminance ratio, were measured in different rooms in each building. The results of this research demonstrated that most of the buildings were not designed for daylight utilization. Therefore, it is very important that architects follow the daylight design recommendations to reduce the consumption of electric power for artificial lighting while a sufficient quality of daylight is available.
Keywords: Daylight factor, Field measurement, Daylighting quality, Tropical
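For reference, the standard definition of the daylight factor measured in such surveys (the exact measurement protocol used in the buildings is not restated here):

```latex
DF = \frac{E_{\mathrm{in}}}{E_{\mathrm{out}}} \times 100\%
```

where $E_{\mathrm{in}}$ is the indoor illuminance at the point of interest and $E_{\mathrm{out}}$ is the simultaneous outdoor illuminance under an unobstructed overcast sky.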
Contribution of Electrochemical Treatment in Treating Textile Dye Wastewater
Authors: Usha N. Murthy, Rekha H. B., Mahaveer Devoor
Abstract:
The introduction of more stringent pollution regulations, together with financial and social pressures for sustainable development, has pressed toward limiting the volumes of industrial and domestic effluents discharged into the environment, as well as toward increasing the efforts within research and development of new or more efficient wastewater treatment technologies. Considering both discharge volume and effluent composition, wastewater generated by the textile industry is rated as the most polluting among all industrial sectors. The pollution load is mainly due to spent dye baths, which are composed of unreacted dyes, dispersing agents, surfactants, salts and organics. In the present investigation, the textile dye wastewater was characterized by high color, chemical oxygen demand (COD), total dissolved solids (TDS) and pH. The electrochemical oxidation process with four plate electrodes was carried out at five different current intensities, of which 0.14 A achieved the maximum percentage removal: 75% of COD and 83% of color. The COD removal rate in kg COD/h/m2 decreases with an increase in the current intensity, while the energy consumption increases with increasing current intensity. Hence, textile dye wastewater can be effectively pretreated by the electrochemical oxidation method, where the process limits objectionable color while leaving the COD associated with organics for natural degradation, thus causing a sustainable reduction in pollution load.
Keywords: Electrochemical treatment, COD, color.
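A small sketch of the figures of merit mentioned above, computed from illustrative batch data: removal efficiency and specific energy consumption per treated volume. The voltage, duration and cell volume are assumptions, not reported values.

```python
# Illustrative calculations for an electrochemical oxidation batch run.
# All numeric inputs here are assumptions, not data from the study.
def removal_efficiency(initial: float, final: float) -> float:
    """Percent removal of COD (or color) between influent and effluent."""
    return 100.0 * (initial - final) / initial

def specific_energy_kwh_per_m3(voltage_v: float, current_a: float,
                               time_h: float, volume_l: float) -> float:
    """Electrical energy consumed per cubic metre of treated wastewater."""
    energy_kwh = voltage_v * current_a * time_h / 1000.0
    return energy_kwh / (volume_l / 1000.0)

if __name__ == "__main__":
    print(removal_efficiency(1200.0, 300.0))                  # 75 % COD removal
    print(specific_energy_kwh_per_m3(12.0, 0.14, 2.0, 1.0))   # kWh/m3 for a 1 L cell
```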
Implementation-Oriented Discussion for Historical and Cultural Villages' Conservation Planning
Authors: Xing Zhang
Abstract:
Since the State Council of China issued the Regulations on the Conservation of Historical Cultural Towns and Villages in 2008, formulation of conservation planning has been carried out in national, provincial and municipal historical and cultural villages for protection needs, which provides a legal basis for inheritance of historical culture and protection of historical resources. Although the quantity and content of conservation planning are continually increasing, the implementation and application are still ambiguous. To solve the aforementioned problems, this paper explores methods to enhance the implementation of conservation planning from the perspective of planning formulation. Specifically, the technical framework of "overall objectives planning - sub-objectives planning - zoning guidelines - implementation by stages" is proposed to implement the planning objectives in different classifications and stages. Then, combined with details of the Qiqiao historical and cultural village conservation planning project in Ningbo, five sub-objectives are set, which are implemented through the village zoning guidelines. At the same time, the key points and specific projects in the near-term, medium-term and long-term work are clarified, and the spatial planning is transformed into an action plan with a time scale. The proposed framework and method provide a reference for the implementation and management of the conservation planning of historical and cultural villages in the future.
Keywords: Conservation planning, planning by stages, planning implementation, zoning guidelines.
Optimizing Dialogue Strategy Learning Using Learning Automata
Authors: G. Kumaravelan, R. Sivakumar
Abstract:
Modeling the behavior of dialogue management in the design of a spoken dialogue system using statistical methodologies is currently a growing research area. This paper presents work on developing an adaptive learning approach to optimize dialogue strategy. At the core of our system is a method formalizing dialogue management as sequential decision making under uncertainty whose underlying probabilistic structure is a Markov chain. Researchers have mostly focused on model-free algorithms for automating the design of dialogue management using machine learning techniques such as reinforcement learning. But in model-free algorithms there exists a dilemma between exploration and exploitation. Hence, we present a model-based online policy learning algorithm using interconnected learning automata for optimizing dialogue strategy. The proposed algorithm is capable of deriving an optimal policy that prescribes what action should be taken in various states of the conversation so as to maximize the expected total reward to attain the goal, and it incorporates good exploration and exploitation in its updates to improve the naturalness of human-computer interaction. We test the proposed approach on a railway information system using the PARADISE evaluation framework.
Keywords: Dialogue management, learning automata, reinforcement learning, spoken dialogue system.
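As a hedged sketch of the learning-automaton idea underlying this approach (not the authors' exact interconnected-automata scheme or reward definition), the classical linear reward-inaction automaton updates its action probabilities as follows.

```python
import random

# Minimal linear reward-inaction (L_R-I) learning automaton.
# A sketch of the general mechanism only; the paper's interconnected
# automata and dialogue reward definition are not reproduced here.
class LearningAutomaton:
    def __init__(self, n_actions: int, learning_rate: float = 0.1):
        self.p = [1.0 / n_actions] * n_actions  # action probabilities
        self.a = learning_rate

    def choose(self) -> int:
        return random.choices(range(len(self.p)), weights=self.p)[0]

    def update(self, action: int, reward: float) -> None:
        """Reward in [0, 1]; on reward, shift probability mass toward the chosen action."""
        for i in range(len(self.p)):
            if i == action:
                self.p[i] += self.a * reward * (1.0 - self.p[i])
            else:
                self.p[i] -= self.a * reward * self.p[i]

if __name__ == "__main__":
    la = LearningAutomaton(n_actions=3)
    for _ in range(100):
        act = la.choose()
        r = 1.0 if act == 0 else 0.0   # toy environment favouring action 0
        la.update(act, r)
    print(la.p)  # probability of action 0 should dominate
```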
An Authentic Algorithm for Ciphering and Deciphering Called Latin Djokovic
Authors: Diogen Babuc
Abstract:
The motivating question of this work is how many devote themselves to discovering something in the world of science, where much is discerned and revealed but, at the same time, much remains unknown. The insightful elements of this algorithm are the ciphering and deciphering algorithms of Playfair, Caesar, and Vigenère. Only a few of their main properties are taken and modified, with the aim of forming the specific functionality of the algorithm called Latin Djokovic. Specifically, a string is entered as input data. A key k is given, with a random value between the values a and b = a+3. The obtained value is stored in a variable with the aim of being constant during the run of the algorithm. In correlation to the given key, the string is divided into several groups of substrings, and each substring has a length of k characters. The next step involves encoding each substring from the list of existing substrings. Encoding is performed on the basis of the Caesar algorithm, i.e., shifting by k characters. However, that k is incremented by 1 when moving to the next substring in the list. When the value of k becomes greater than b + 1, it returns to its initial value. The algorithm is executed, following the same procedure, until the last substring in the list is traversed. Using this polyalphabetic method, ciphering and deciphering of strings are achieved. The algorithm also works for a 100-character string. The x character is not used when the number of characters in a substring is incompatible with the expected length. The algorithm is simple to implement, but it is questionable whether it works better than the other methods from the point of view of execution time and storage space.
Keywords: Ciphering and deciphering, authentic algorithm, polyalphabetic cipher, random key, methods comparison.
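A minimal sketch of the procedure as it is described above; the alphabet handling, the treatment of non-alphabet characters, and the exact reset rule are interpretations, not the authors' reference implementation.

```python
import random
import string

# Sketch of the described procedure: split the text into substrings of
# length k, Caesar-shift each substring, increment the shift for the next
# substring, and wrap the shift back to k once it exceeds b + 1.
ALPHABET = string.ascii_lowercase

def encipher(text: str, a: int = 3) -> tuple[str, int]:
    b = a + 3
    k = random.randint(a, b)          # key, kept constant as the group length
    groups = [text[i:i + k] for i in range(0, len(text), k)]
    shift, out = k, []
    for group in groups:
        for ch in group:
            if ch in ALPHABET:
                out.append(ALPHABET[(ALPHABET.index(ch) + shift) % 26])
            else:
                out.append(ch)         # leave non-alphabet characters unchanged
        shift += 1                     # next substring uses a larger shift
        if shift > b + 1:
            shift = k                  # return to the initial value
    return "".join(out), k

if __name__ == "__main__":
    cipher, key = encipher("attackatdawn")
    print(key, cipher)
```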
Investigating the Effective Parameters in Determining the Type of Traffic Congestion Pricing Schemes in Urban Streets
Authors: Saeed Sayyad Hagh Shomar
Abstract:
Traffic congestion pricing – as a strategy in travel demand management in urban areas to reduce traffic congestion, air pollution and noise pollution – has drawn much attention. Despite the satisfactory findings of this method, there are still problems in determining the best functional congestion pricing scheme for a given situation. Such problems in this process will result in further complications and even scheme failure. That is why proper knowledge of the significance of congestion pricing schemes and of the effective factors in choosing them can lead to the success of this strategy. In this study, first, a variety of traffic congestion pricing schemes and their components are introduced; then, their functional usage is discussed. Next, by analyzing and comparing the barriers, limitations and advantages, the selection criteria of pricing schemes are described. The results show that the selection of the best scheme depends on various parameters. Finally, based on an examination of the effective parameters, it is concluded that the implementation of area-based schemes (cordon and zonal) has been more successful in non-diversion of traffic. Considering the topology of cities and the fact that traffic congestion is often created in city centers, area-based schemes would be notably functional and appropriate.
Keywords: Congestion pricing, demand management, flat toll, variable toll.
Early Diagnosis of Alzheimer's Disease Using a Combination of Images Processing and Brain Signals
Authors: E. Irankhah, M. Zarif, E. Mazrooei Rad, K. Ghandehari
Abstract:
Alzheimer's prevalence is on the rise, and the disease comes with problems such as cessation of treatment, high cost of treatment, and the lack of early detection methods. The pathology of this disease causes the formation of protein deposits in the brain of patients, called amyloid plaques. Generally, the diagnosis of this disease is done by performing tests such as cerebrospinal (spinal cord) fluid analysis, CT scan and MRI, or mental tests and eye-tracing tests. In this paper, we tried to use the Medial Temporal Atrophy (MTA) method and the Leave One Out (LOO) cycle to extract the statistical properties of the three channels Fz, Pz, and Cz of ERP signals for early diagnosis of this disease. In the processing of the CT scan images, the accuracy of the results is 81% for the healthy person and 88% for the severe patient. After the processing of the ERP signals, the accuracy of the results for a healthy person is 81% in the delta band of the Cz channel and 90% in the alpha band of the Pz channel. In the results obtained from the signal processing, the results for the severe patient were 89% in the delta band of the Cz channel and 92% in the alpha band of the Pz channel.
Keywords: Alzheimer's disease, image and signal processing, medial temporal atrophy, LOO Cycle.
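As an illustration of the leave-one-out evaluation cycle mentioned above (the random features and the k-NN classifier are placeholders, not the study's actual MTA-based pipeline):

```python
# Sketch of leave-one-out evaluation on per-subject ERP band features.
# Synthetic data and classifier choice are illustrative assumptions only.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))          # e.g. delta/alpha power at Fz, Pz, Cz
y = np.array([0] * 10 + [1] * 10)     # 0 = healthy, 1 = patient (synthetic)

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = KNeighborsClassifier(n_neighbors=3).fit(X[train_idx], y[train_idx])
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])

print(f"LOO accuracy: {correct / len(y):.2f}")
```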
Optimization of Hemp Fiber Reinforced Concrete for Mix Design Method
Authors: Zoe Chang, Max Williams, Gautham Das
Abstract:
The purpose of this study is to evaluate the incorporation of hemp fibers (HF) in concrete. Hemp fiber reinforced concrete (HFRC) is becoming more popular as an alternative to regular mix designs. This study was done to evaluate the compressive strength of HFRC with respect to the mix procedure. HF were obtained from the manufacturer and hand processed to ensure uniformity in width and length. The fibers were added to the concrete as both wet and dry mixes to investigate and optimize the mix design process. Results indicated that the dry mix had a compressive strength of 1157 psi compared to 985 psi for the wet mix. This dry mix compressive strength was within range of the standard mix compressive strength of 1533 psi. The statistical analysis revealed that the mix design process needs further optimization and uniformity concerning the addition of HF. Regression analysis revealed that the standard mix design had a coefficient of 0.9 as compared to 0.375 for the dry mix, indicating a variation in the mixing process. While completing the dry mix, the addition of plain HF caused the fibers to intertwine, creating lumps and inconsistency. However, during the wet mixing process, combining water and HF before incorporation allows the fibers to disperse uniformly within the mix; hence, the regression analysis indicated a better coefficient of 0.55. This study concludes that HFRC is a viable alternative to regular mixes; however, more research surrounding its characteristics needs to be conducted.
Keywords: hemp fibers, hemp reinforced concrete, wet and dry, freeze thaw testing, compressive strength
Attention Based Fully Convolutional Neural Network for Simultaneous Detection and Segmentation of Optic Disc in Retinal Fundus Images
Authors: Sandip Sadhukhan, Arpita Sarkar, Debprasad Sinha, Goutam Kumar Ghorai, Gautam Sarkar, Ashis K. Dhara
Abstract:
Accurate segmentation of the optic disc is very important for computer-aided diagnosis of several ocular diseases such as glaucoma, diabetic retinopathy, and hypertensive retinopathy. This paper presents an accurate and fast optic disc detection and segmentation method using an attention based fully convolutional network. The network is trained from scratch using the fundus images of the extended MESSIDOR database, and the trained model is used for segmentation of the optic disc. False positives are removed based on morphological operations and shape features. The result is evaluated using three-fold cross-validation on six public fundus image databases: DIARETDB0, DIARETDB1, DRIVE, AV-INSPIRE, CHASE DB1 and MESSIDOR. The attention based fully convolutional network is robust and effective for detection and segmentation of the optic disc in images affected by diabetic retinopathy, and it outperforms existing techniques.
Keywords: Ocular diseases, retinal fundus image, optic disc detection and segmentation, fully convolutional network, overlap measure.
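The evaluation relies on an overlap measure between predicted and ground-truth optic disc masks; the sketch below shows the common Dice and Jaccard overlap computations, which may or may not be the exact metric definition used in the paper.

```python
import numpy as np

# Generic overlap measures between binary segmentation masks.
def dice(pred: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(pred, gt).sum()
    return 2.0 * inter / (pred.sum() + gt.sum())

def jaccard(pred: np.ndarray, gt: np.ndarray) -> float:
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union

if __name__ == "__main__":
    gt = np.zeros((64, 64), dtype=bool); gt[20:40, 20:40] = True
    pred = np.zeros((64, 64), dtype=bool); pred[22:42, 22:42] = True
    print(dice(pred, gt), jaccard(pred, gt))
```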
Use of Fuzzy Logic in the Corporate Reputation Assessment: Stock Market Investors’ Perspective
Authors: Tomasz L. Nawrocki, Danuta Szwajca
Abstract:
The growing importance of reputation in building enterprise value and achieving long-term competitive advantage creates the need for its measurement and evaluation for management purposes (effective reputation and its risk management). The paper presents a practical application of a self-developed corporate reputation assessment model from the viewpoint of stock market investors. The model has a pioneering character, and the example analysis performed for a selected industry is a form of specific test of this tool. In the proposed solution, three aspects - informational, financial and development, as well as social ones - were considered. It was also assumed that the individual sub-criteria will be based on public sources of information and that fuzzy logic will be used as the calculation apparatus capable of producing a synthetic final assessment. The main reason for developing this model was to fill the gap in the scope of synthetic measures of corporate reputation that would provide a higher degree of objectivity by relying on "hard" (not survey-based) and publicly available data. It should also be noted that the results obtained on the basis of the proposed corporate reputation assessment method give possibilities for various internal as well as inter-branch comparisons and for analysis of corporate reputation impact.
Keywords: Corporate reputation, fuzzy logic, fuzzy model, stock market investors.
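As a hedged illustration of using fuzzy logic as the calculation apparatus for a synthetic score (the model's actual sub-criteria, membership functions, weights and rule base are not published here and everything below is an assumption), the sketch fuzzifies three aspect scores with triangular memberships and defuzzifies a centroid-style aggregate.

```python
# Toy fuzzy aggregation of three reputation aspects into a synthetic score.
# Membership functions, labels and inputs are illustrative assumptions only.
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership: rises from a to peak b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(score: float) -> dict:
    """Degrees of membership in 'low', 'medium', 'high' for a 0-10 score."""
    return {"low": tri(score, -1, 0, 5),
            "medium": tri(score, 2, 5, 8),
            "high": tri(score, 5, 10, 11)}

def synthetic_reputation(aspects: dict) -> float:
    """Average the fuzzified aspect scores, then defuzzify by centroid."""
    centers = {"low": 2.0, "medium": 5.0, "high": 8.0}
    agg = {k: 0.0 for k in centers}
    for score in aspects.values():
        for label, mu in fuzzify(score).items():
            agg[label] += mu / len(aspects)
    num = sum(centers[k] * v for k, v in agg.items())
    den = sum(agg.values()) or 1.0
    return num / den

if __name__ == "__main__":
    print(synthetic_reputation({"informational": 7.0, "financial": 6.0, "social": 4.0}))
```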
Model Canvas and Process for Educational Game Design in Outcome-Based Education
Authors: Ratima Damkham, Natasha Dejdumrong, Priyakorn Pusawiro
Abstract:
This paper explores a game design solution to help game designers create educational games using the digital educational game model canvas (DEGMC) and the digital educational game form (DEGF), based on an Outcome-Based Education program. DEGMC and DEGF can help designers develop an overview of the game while designing and planning their own game. The tools allow designers to clearly assess players' abilities against learning outcomes and to support their game learning design. Designers can balance educational content and entertainment in designing a game by using the strategies of the Business Model Canvas, and they can design the gameplay and the assessment of players' abilities from the learning outcomes they need by referring to Constructive Alignment. Furthermore, they can use the design plan from this research to write their Game Design Document (GDD). The success of the research was evaluated from the perspectives of four experts in the education and computer fields. From the experiments, the canvas and form helped the game designers model their games according to the learning outcomes and analyze their own game elements. This method can be a path to research on educational game design in the future.
Keywords: Constructive alignment, constructivist theory, educational game, outcome-based education.
Riemannian Manifolds for Brain Extraction on Multi-modal Resonance Magnetic Images
Authors: Mohamed Gouskir, Belaid Bouikhalene, Hicham Aissaoui, Benachir Elhadadi
Abstract:
In this paper, we present an application of Riemannian geometry for processing non-Euclidean image data. We consider the image as residing in a Riemannian manifold, in order to develop a new method for brain edge detection and brain extraction. Automating this process is a challenge due to the high diversity in the appearance of brain tissue among different patients and sequences. The main contribution of this paper is the use of an edge-based anisotropic diffusion tensor for the segmentation task, integrating both image edge geometry and the Riemannian manifold (geodesic, metric tensor) to regularize the converging contour and extract complex anatomical structures. We check the accuracy of the segmentation results on simulated brain MRI scans of single T1-weighted, T2-weighted and Proton Density sequences. We validate our approach using two different databases: the BrainWeb database and the MRI Multiple Sclerosis Database (MRI MS DB). We have compared, qualitatively and quantitatively, our approach with the well-known brain extraction algorithms. We show that applying Riemannian manifolds to medical image analysis improves brain extraction results, in real time, outperforming the standard techniques.
Keywords: Riemannian manifolds, Riemannian tensor, brain segmentation, non-Euclidean data, brain extraction.
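For context, a hedged sketch of the kind of edge-based diffusion and Riemannian length functional the abstract refers to; the paper's actual tensor construction and metric are not reproduced, and the edge-stopping function below is the common Perona-Malik choice, given only as an assumption.

```latex
\frac{\partial I}{\partial t} = \operatorname{div}\!\big(c(\lVert \nabla I \rVert)\,\nabla I\big),
\qquad
c(s) = \exp\!\left(-\tfrac{s^{2}}{\kappa^{2}}\right),
\qquad
L(\gamma) = \int_{0}^{1} \sqrt{\dot{\gamma}(t)^{\top}\, g(\gamma(t))\, \dot{\gamma}(t)}\; dt
```

The first expression smooths the image while preserving edges, and the third measures contour length under a metric $g$ derived from the image, so that geodesics follow anatomical boundaries.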
Solid Dispersions of Cefixime Using β-Cyclodextrin: Characterization and in vitro Evaluation
Authors: Nagasamy Venkatesh Dhandapani, Amged Awad El-Gied
Abstract:
Cefixime, a BCS class II drug, is insoluble in water but freely soluble in acetone and in alcohol. The aqueous solubility of cefixime is poor, and it exhibits an exceptionally slow intrinsic dissolution rate. In the present study, cefixime and β-cyclodextrin (β-CD) solid dispersions were prepared with a view to studying the effect and influence of β-CD on the solubility and dissolution rate of this poorly aqueous soluble drug. The phase solubility profile revealed that the solubility of cefixime was increased in the presence of β-CD and was classified as AL-type. The effect of variables, such as the drug:carrier ratio, was studied. Physical characterization of the solid dispersion was carried out by Fourier transform infrared spectroscopy (FT-IR) and differential scanning calorimetry (DSC). These studies revealed a distinct loss of drug crystallinity in the solid molecular dispersions, ostensibly accounting for the enhancement of the dissolution rate in distilled water. The drug release from the prepared solid dispersion exhibited first-order kinetics. Solid dispersions of cefixime showed a 6.77-fold increase in dissolution rate over the pure drug.
Keywords: Cefixime, β-cyclodextrin, solid dispersions, kneading method, dissolution, release kinetics.
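For reference, the first-order release model that the dissolution data are reported to follow is commonly written as below; this is the textbook form, not the paper's fitted parameters.

```latex
\ln C_t = \ln C_0 - k\,t
\quad\Longleftrightarrow\quad
\log C_t = \log C_0 - \frac{k\,t}{2.303}
```

where $C_t$ is the amount of drug remaining to be released at time $t$ and $k$ is the first-order rate constant.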
Impact of Zn/Cr Ratio on ZnCrOx-SAPO-34 Bifunctional Catalyst for Direct Conversion of Syngas to Light Olefins
Authors: Yuxuan Huang, Weixin Qian, Hongfang Ma, Haitao Zhang, Weiyong Ying
Abstract:
Light olefins are important building blocks for the chemical industry. The direct conversion of syngas to light olefins has been investigated for decades. Meanwhile, the limit on light olefin selectivity described by the Anderson-Schulz-Flory (ASF) distribution model is still a great challenge for conventional Fischer-Tropsch synthesis. The emerging strategy called the oxide-zeolite concept (OX-ZEO) is a promising way to get rid of this limit. ZnCrOx was prepared by the co-precipitation method, and (NH4)2CO3 was used as the precipitant. SAPO-34 was prepared by hydrothermal synthesis; tetraethylammonium hydroxide (TEAOH) was used as the template, while silica sol, pseudo-boehmite, and phosphoric acid were the Si, Al and P sources, respectively. The bifunctional catalyst was prepared by mechanical mixing of ZnCrOx and SAPO-34. Catalytic reactions were carried out under H2/CO = 2, 380 ℃, 1 MPa and 6000 mL·gcat-1·h-1 in a fixed-bed reactor with a quartz lining. Catalysts were characterized by XRD, N2 adsorption-desorption, NH3-TPD, H2-TPR, and CO-TPD. The addition of Al as a structure promoter enhances CO conversion and selectivity to light olefins. The Zn/Cr ratio, which determines the active component content and chemisorption properties of the catalyst, influences CO conversion and selectivity to light olefins at the same time. A C2-4= distribution of 86% among hydrocarbons at a CO conversion of 14% was reached when Zn/Cr = 1.5.
Keywords: Light olefins, OX-ZEO, syngas, ZnCrOx.
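The quoted performance figures are normally computed from molar flow rates as below; these are the standard carbon-basis definitions (selectivity taken among hydrocarbons, CO2 excluded) and are stated as an assumption rather than the authors' exact carbon-balance treatment.

```latex
X_{\mathrm{CO}} = \frac{F_{\mathrm{CO,in}} - F_{\mathrm{CO,out}}}{F_{\mathrm{CO,in}}} \times 100\%,
\qquad
S_{\mathrm{C_2^{=}-C_4^{=}}} = \frac{\sum_{n=2}^{4} n\,F_{\mathrm{C}_n\mathrm{H}_{2n}}}{\sum_{\text{hydrocarbons}} n\,F_{\mathrm{C}_n\mathrm{H}_m}} \times 100\%
```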
NDENet: End-to-End Nighttime Dehazing and Enhancement
Authors: H. Baskar, A. S. Chakravarthy, P. Garg, D. Goel, A. S. Raj, K. Kumar, Lakshya, R. Parvatham, V. Sushant, B. Kumar Rout
Abstract:
In this paper, we present a computer vision task called nighttime dehaze-enhancement. This task aims to jointly perform dehazing and lightness enhancement. Our task fundamentally differs from nighttime dehazing – our goal is to jointly dehaze and enhance scenes, while nighttime dehazing aims only to dehaze scenes under a nighttime setting. In order to facilitate further research on this task, we release a benchmark dataset called the Reside-β Night dataset, consisting of 4122 nighttime hazed images from 2061 scenes and 2061 ground truth images. Moreover, we also propose a network called NDENet (Nighttime Dehaze-Enhancement Network), which jointly performs dehazing and low-light enhancement in an end-to-end manner. We evaluate our method on the proposed benchmark and achieve a Structural Similarity Index (SSIM) of 0.8962 and a Peak Signal to Noise Ratio (PSNR) of 26.25. We also compare our network with other baseline networks on our benchmark to demonstrate the effectiveness of our approach. We believe that nighttime dehaze-enhancement is an essential task, particularly for autonomous navigation applications, and hope that our work will open up new frontiers in research. The code for our network is made publicly available.
Keywords: Dehazing, image enhancement, nighttime, computer vision.
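Both reported metrics have standard definitions; PSNR, for instance, is computed from the mean squared error between the restored image $\hat{I}$ and the ground truth $I$:

```latex
\mathrm{PSNR} = 10 \log_{10}\!\left(\frac{\mathrm{MAX}_I^{2}}{\mathrm{MSE}}\right),
\qquad
\mathrm{MSE} = \frac{1}{HW}\sum_{i=1}^{H}\sum_{j=1}^{W}\big(I(i,j) - \hat{I}(i,j)\big)^{2}
```

with $\mathrm{MAX}_I$ the maximum possible pixel value (255 for 8-bit images).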
Methodology Issues and Design Approach of VLE on Mathematical Concepts Acquisition within Secondary Education in England
Authors: Aaron A. R. Nwabude
Abstract:
This study used a positivist quantitative approach to examine the mathematical concepts acquisition of KS4 (14-16) Special Education Needs (SENs) students within the school sector education in England. The research is based on a pilot study, and the design is completely holistic in its approach, with mixed methodologies. The study combines qualitative and quantitative methods of approach in gathering formative data for the design process. Although the approach could best be described as mixed method, it is fundamentally grounded in a strong positivist paradigm; hence the earlier understanding of the differentiation of the students, the student-teacher body, and the various elements of the indicators being measured, which requires an attenuated description of individual research subjects. The design process involves four phases with five key stages: literature review and document analysis, the survey, interview, and observation, and finally the analysis of the data set. The research identified the need for triangulation, with Reid's phases of data management providing a scaffold for the study. The study clearly identified the ideological and philosophical aspects of educational research design for the study of mathematics by special education needs (SENs) students in England using the virtual learning environment (VLE) platform.
Keywords: VLE, Special Education Needs, Key Stage 4, School, Mathematics, Concepts Acquisition.
Load Frequency Control of Nonlinear Interconnected Hydro-Thermal System Using Differential Evolution Technique
Authors: Banaja Mohanty, Prakash Kumar Hota
Abstract:
This paper presents a differential evolution algorithm to design robust PI and PID controllers for Load Frequency Control (LFC) of nonlinear interconnected power systems considering boiler dynamics, Governor Dead Band (GDB) and Generation Rate Constraint (GRC). The differential evolution algorithm is employed to search for the optimal controller parameters. The proposed method easily copes with nonlinear constraints. Further, the proposed controller is simple, effective and can ensure the desirable overall system performance. The superiority of the proposed approach has been shown by comparing the results with a published fuzzy logic controller for the same power systems. The comparison is done using various performance measures, such as overshoot, settling time and standard error criteria of frequency and tie-line power deviation, following a 1% step load perturbation in the hydro area. It is noticed that the dynamic performance of the proposed controller is better than that of the fuzzy logic controller. Furthermore, it is also seen that the proposed system is robust and is not affected by changes in the system parameters.
Keywords: Automatic Generation Control (AGC), Generation Rate Constraint (GRC), Governor Dead Band (GDB), Differential Evolution (DE).
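A hedged sketch of the differential evolution loop used to tune controller gains; the cost function here is a placeholder, and the paper's AGC model, GRC/GDB nonlinearities and actual performance index are not reproduced.

```python
import numpy as np

# Minimal DE/rand/1/bin optimizer for a PID gain vector [Kp, Ki, Kd].
# The cost function is a stand-in; in the paper it would be a performance
# index (e.g. ITAE of frequency and tie-line power deviations) obtained by
# simulating the nonlinear two-area AGC model.
def cost(gains: np.ndarray) -> float:
    kp, ki, kd = gains
    return (kp - 1.2) ** 2 + (ki - 0.6) ** 2 + (kd - 0.3) ** 2  # placeholder

def differential_evolution(bounds, pop_size=20, F=0.8, CR=0.9, iters=200, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    fit = np.array([cost(ind) for ind in pop])
    for _ in range(iters):
        for i in range(pop_size):
            idx = [j for j in range(pop_size) if j != i]
            a, b, c = pop[rng.choice(idx, 3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)          # mutation
            cross = rng.random(len(bounds)) < CR               # binomial crossover
            cross[rng.integers(len(bounds))] = True
            trial = np.where(cross, mutant, pop[i])
            if (f := cost(trial)) < fit[i]:                    # greedy selection
                pop[i], fit[i] = trial, f
    return pop[fit.argmin()], fit.min()

if __name__ == "__main__":
    best, best_cost = differential_evolution([(0, 5), (0, 5), (0, 5)])
    print(best, best_cost)
```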
The Governance of Islamic Banks in Morocco: Meaning, Strategic Vision and Purposes Attributed to the Governance System
Authors: Lalla Nezha Lakmiti, Abdelkahar Zahid
Abstract:
Due to setbacks on the international scene and the wave of cacophonous financial scandals affecting large international groups, the new Islamic finance industry is not immune, despite its initial resistance. The purpose of this paper is to understand and analyze the meaning of the corporate governance (CG) concept in Moroccan Islamic banking systems, with specific reference to their institutions. The research objective is also to identify the path taken and adopted by these banks recently set up in Morocco. The foundation is rooted in shari'a; in particular, no stakeholder (the shareholding approach) must be harmed, and ethical values are reflected in these parties' behavior. We chose a qualitative method with semi-structured interviews, in which six managers provided answers about their banking systems. Since these respondents held senior positions (directors) within their organizations, it is felt that they are well placed and have the necessary knowledge to provide us with information to answer the questions asked. The results identified the orientation of the participating banks and assessed how governance works, while determining which party is favoured: shareholders, stakeholders or both. This study discusses the conditions favorable to the harmonization of regulations and therefore to better integration between Islamic finance and conventional finance in the economic context of Morocco.
Keywords: Corporate governance, participating banks, stakeholders, shareholders, and interests.
Cold Spray Deposition of SS316L Powders on Al5052 Substrates and Their Potential Using for Biomedical Applications
Authors: B. Dikici, I. Ozdemir, M. Topuz
Abstract:
The corrosion behaviour of 316L stainless steel coatings obtained by the cold spray method was investigated in this study. 316L powders were deposited onto Al5052 aluminum substrates. The coatings were produced using nitrogen (N2) as the process gas. In order to further improve the corrosion and mechanical properties of the coatings, heat treatment was applied at 250 and 750 °C. The corrosion performances of the coatings were compared using the potentiodynamic scanning (PDS) technique under in-vitro conditions (in Ringer's solution at 37 °C). In addition, hardness and porosity tests were carried out on the coatings. Microstructural characterization of the coatings was carried out using scanning electron microscopy attached with an energy dispersive spectrometer (SEM-EDS) and the X-ray diffraction (XRD) technique. It was found that clean surfaces and good adhesion were achieved for particle/substrate bonding. The heat treatment process provided both elimination of the anisotropy in the coating and healing-up of the incomplete interfaces between the deposited particles. It was found that the corrosion potential of the coatings annealed at 750 °C was higher than that of commercial 316L stainless steel. Moreover, the microstructural investigations after the corrosion tests revealed that corrosion preferentially starts at inter-splat boundaries.
Keywords: 316L, biomaterials, cold spray, heat treatment.
Protein Secondary Structure Prediction Using Parallelized Rule Induction from Coverings
Authors: Leong Lee, Cyriac Kandoth, Jennifer L. Leopold, Ronald L. Frank
Abstract:
Protein 3D structure prediction has always been an important research area in bioinformatics. In particular, the prediction of secondary structure has been a well-studied research topic. Despite the recent breakthrough of combining multiple sequence alignment information and artificial intelligence algorithms to predict protein secondary structure, the Q3 accuracy of various computational prediction algorithms has rarely exceeded 75%. In a previous paper [1], this research team presented a rule-based method called RT-RICO (Relaxed Threshold Rule Induction from Coverings) to predict protein secondary structure. The average Q3 accuracy on the sample datasets using RT-RICO was 80.3%, an improvement over comparable computational methods. Although this demonstrated that RT-RICO might be a promising approach for predicting secondary structure, the algorithm's computational complexity and program running time limited its use. Herein, a parallelized implementation of a slightly modified RT-RICO approach is presented. This new version of the algorithm facilitated the testing of a much larger dataset of 396 protein domains [2]. Parallelized RT-RICO achieved a Q3 score of 74.6%, which is higher than the consensus prediction accuracy of 72.9% that was achieved for the same test dataset by a combination of four secondary structure prediction methods [2].
Keywords: Data mining, protein secondary structure prediction, parallelization.
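The Q3 score quoted throughout is simply the per-residue three-state accuracy; a small sketch of the generic definition (not tied to the RT-RICO code itself) follows.

```python
# Q3: fraction of residues whose predicted secondary-structure state
# (H = helix, E = strand, C = coil) matches the observed state.
def q3(predicted: str, observed: str) -> float:
    assert len(predicted) == len(observed)
    correct = sum(p == o for p, o in zip(predicted, observed))
    return 100.0 * correct / len(observed)

if __name__ == "__main__":
    # Toy example, not data from the paper.
    print(q3("HHHEECCCHH", "HHHEECCCCC"))  # 80.0
```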
Activity Recognition by Smartphone Accelerometer Data Using Ensemble Learning Methods
Authors: Eu Tteum Ha, Kwang Ryel Ryu
Abstract:
As smartphones are equipped with various sensors, there have been many studies focused on using these sensors to create valuable applications. Human activity recognition is one such application, motivated by various welfare applications such as support for the elderly, measurement of calorie consumption, and analyses of lifestyle and exercise patterns. One of the challenges one faces when using smartphone sensors for activity recognition is that the number of sensors should be minimized to save battery power. In this paper, we show that a fairly accurate classifier can be built that can distinguish ten different activities by using only a single sensor's data, i.e., the smartphone accelerometer data. The approach that we adopt to deal with this twelve-class problem uses various methods. The features used for classifying these activities include not only the magnitude of the acceleration vector at each time point, but also the maximum, the minimum, and the standard deviation of the vector magnitude within a time window. The experiments compared the performance of four kinds of basic multi-class classifiers and the performance of four kinds of ensemble learning methods based on three kinds of basic multi-class classifiers. The results show that the method with the highest accuracy is ECOC based on Random Forest.
Keywords: Ensemble learning, activity recognition, smartphone accelerometer.
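A small sketch of the windowed feature extraction described above; the window length and the synthetic signal are assumptions, and the per-sample magnitude is summarised here by its window mean for brevity.

```python
import numpy as np

# Features per time window: statistics of the acceleration-vector magnitude
# (mean, max, min, standard deviation), as described in the abstract.
def window_features(acc_xyz: np.ndarray, window: int = 128) -> np.ndarray:
    mag = np.linalg.norm(acc_xyz, axis=1)           # magnitude at each sample
    feats = []
    for start in range(0, len(mag) - window + 1, window):
        w = mag[start:start + window]
        feats.append([w.mean(), w.max(), w.min(), w.std()])
    return np.array(feats)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_walk = rng.normal(loc=[0.0, 0.0, 9.8], scale=0.5, size=(512, 3))
    print(window_features(fake_walk).shape)  # (4, 4): 4 windows x 4 features
```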
Effectiveness of Moringa oleifera Coagulant Protein as Natural Coagulant aid in Removal of Turbidity and Bacteria from Turbid Waters
Authors: B. Bina, M. H. Mehdinejad, Gunnel Dalhammer, Guna Rajarao, M. Nikaeen, H. Movahedian Attar
Abstract:
Coagulation of water involves the use of coagulating agents to bring the suspended matter in the raw water together for settling and the filtration stage. The present study aimed to examine the effects of aluminum sulfate as coagulant, in conjunction with Moringa oleifera coagulant protein as coagulant aid, on turbidity, hardness, and bacteria in turbid water. A conventional jar test apparatus was employed for the tests. The best removal was observed at a pH of 7 to 7.5 for all turbidities. Turbidity removal efficiency of 80% to 99% was achieved with Moringa oleifera coagulant protein as coagulant aid. The dosage of coagulant and coagulant aid decreased with increasing turbidity. In addition, Moringa oleifera coagulant protein significantly reduced the required dosage of primary coagulant. Residual Al3+ in the treated water was less than 0.2 mg/L and meets the Environmental Protection Agency guidelines. The results showed that a turbidity reduction of 85.9% to 98%, paralleled by a primary Escherichia coli reduction of 1-3 log units (99.2-99.97%), was obtained within the first 1 to 2 h of treatment. In conclusion, Moringa oleifera coagulant protein as coagulant aid can be used for drinking water treatment without the risk of organic or nutrient release. We demonstrated that the optimal design method is an efficient approach for optimization of the coagulation-flocculation process and appropriate for raw water treatment.
Keywords: MOCP, coagulant aid, turbidity removal, E. coli removal, water treatment.
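The bacterial removal above is expressed in log units; the standard conversion between log reduction and percent removal is

```latex
\text{log reduction} = \log_{10}\!\left(\frac{N_0}{N}\right),
\qquad
\%\ \text{removal} = \left(1 - 10^{-\,\text{log reduction}}\right) \times 100\%
```

so a 2-log reduction corresponds to 99% removal and a 3-log reduction to 99.9%.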
Clove Essential Oil Improves Lipid Peroxidation and Antioxidant Activity in Tilapia Fish Fillet Cooked by Grilling and Microwaving
Authors: E. Oskoueian, E. Maroufyan, Y.M. Goh, E. Ramezani-Fard, M. Ebrahimi
Abstract:
Fish meat plays an important role in human health as it contains high-quality protein. Tilapia is considered the third largest group of farmed fish. Oxidative deterioration of fish meat may occur during the cooking process. A proper cooking process, and the use of natural antioxidants to prevent oxidation and enhance the quality of the tilapia fish fillet, are therefore necessary. Hence, this research was carried out to evaluate the potential of clove essential oil to prevent lipid peroxidation and enhance the antioxidant activity of tilapia fish fillet cooked using microwaving and grilling methods. The results showed that cooking using the microwave significantly (p<0.05) increased lipid peroxidation and decreased the DPPH scavenging and ferric reducing activity of the fish fillet compared to the grilling method. The fortification of the fish fillet with clove essential oil prevented lipid peroxidation and significantly (p<0.05) enhanced the antioxidant activity of the fish fillet. Consequently, fortification of tilapia fish fillet with clove essential oil, followed by cooking on a grill, is recommended to obtain high-quality cooked fish meat.
Keywords: Antioxidant activity, fillet, fish, fortification, lipid peroxidation.
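For reference, DPPH radical-scavenging activity in such assays is usually calculated from absorbance readings as below; this is the generic formula, not the paper's specific protocol.

```latex
\%\ \text{inhibition} = \frac{A_{\text{control}} - A_{\text{sample}}}{A_{\text{control}}} \times 100\%
```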