Search results for: information systems failure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 19763

10793 Tunnelling Concepts in Overstressed Weak Rocks

Authors: Entfellner Manuel, Wannenmacher Helmut, Reisenbauer Josef, Schubert Wulf

Abstract:

When tunnelling in overstressed weak rocks ("squeezing ground"), two basic design approaches are available: the resistance principle and the yielding principle. The resistance principle relies on rigid support systems to withstand the ground pressure. The yielding principle, in contrast, prioritizes controlled deformation, allowing the ground to deform without compromising tunnel integrity. This paper highlights the beneficial factors of the yielding principle for conventionally excavated tunnels in overstressed weak rocks. In particular, the application of a ductile shotcrete lining with yielding elements is analysed in detail. Construction costs, safety, and short- and long-term stability are discussed.

Keywords: squeezing ground, yielding principle, yielding element, conventional tunneling

Procedia PDF Downloads 53
10792 Sinhala Sign Language to Grammatically Correct Sentences Using NLP

Authors: Anjalika Fernando, Banuka Athuraliya

Abstract:

This paper presents a comprehensive approach for converting Sinhala Sign Language (SSL) into grammatically correct sentences using Natural Language Processing (NLP) techniques in real time. While previous studies have explored various aspects of SSL translation, the research gap lies in the absence of grammar checking for SSL. This work aims to bridge this gap by proposing a two-stage methodology that leverages deep learning models to detect signs and translate them into coherent sentences, ensuring grammatical accuracy. The first stage of the approach involves the utilization of a Long Short-Term Memory (LSTM) deep learning model to recognize and interpret SSL signs. By training the LSTM model on a dataset of SSL gestures, it learns to accurately classify and translate these signs into textual representations. The LSTM model achieves a commendable accuracy rate of 94%, demonstrating its effectiveness in recognizing and translating SSL gestures. Building upon the successful recognition and translation of SSL signs, the second stage of the methodology focuses on improving the grammatical correctness of the translated sentences. The project employs a Neural Machine Translation (NMT) architecture, consisting of an encoder and decoder with LSTM components, to enhance the syntactic structure of the generated sentences. By training the NMT model on a parallel corpus of grammatically incorrect Sinhala sentences and their corresponding grammatically correct translations, it learns to generate coherent and grammatically accurate sentences. The NMT model achieves an impressive accuracy rate of 98%, affirming its capability to produce linguistically sound translations. The proposed approach offers significant contributions to the field of SSL translation and grammar correction. By addressing the critical issue of grammar checking, it enhances the usability and reliability of SSL translation systems, facilitating effective communication between hearing-impaired and non-sign-language users. Furthermore, the integration of deep learning techniques, such as LSTM and NMT, ensures the accuracy and robustness of the translation process. This research holds great potential for practical applications, including educational platforms, accessibility tools, and communication aids for the hearing-impaired. It also lays the foundation for future advancements in SSL translation systems, fostering inclusive and equal opportunities for the deaf community. Future work includes expanding the existing datasets to further improve the accuracy and generalization of the SSL translation system. Additionally, the development of a dedicated mobile application would enhance the accessibility and convenience of SSL translation on handheld devices. Efforts will also be made to enhance the current application for educational purposes, enabling individuals to learn and practice SSL more effectively. Another area of future exploration involves enabling two-way communication, allowing seamless interaction between sign-language users and non-sign-language users. In conclusion, this paper presents a novel approach for converting Sinhala Sign Language gestures into grammatically correct sentences using NLP techniques in real time. The two-stage methodology, comprising an LSTM model for sign detection and translation and an NMT model for grammar correction, achieves high accuracy rates of 94% and 98%, respectively. By addressing the lack of grammar checking in existing SSL translation research, this work contributes significantly to the development of more accurate and reliable SSL translation systems, thereby fostering effective communication and inclusivity for the hearing-impaired community.
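
A minimal sketch of the two-stage architecture described above, written with TensorFlow/Keras: an LSTM classifier for sign recognition followed by an LSTM encoder-decoder (NMT) for grammar correction. All layer sizes, sequence lengths, and vocabulary sizes are illustrative assumptions, not values from the paper, and the models are only defined here, not trained.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_SIGNS = 50                 # assumed size of the SSL sign vocabulary
FRAMES, FEATURES = 30, 126     # assumed frames per gesture, features per frame

# Stage 1: LSTM classifier mapping a gesture sequence to a sign label
sign_recognizer = models.Sequential([
    layers.Input(shape=(FRAMES, FEATURES)),
    layers.LSTM(64, return_sequences=True),
    layers.LSTM(64),
    layers.Dense(NUM_SIGNS, activation="softmax"),
])
sign_recognizer.compile(optimizer="adam",
                        loss="sparse_categorical_crossentropy",
                        metrics=["accuracy"])

# Stage 2: LSTM encoder-decoder (NMT) mapping the recognized gloss sequence
# to a grammatically correct Sinhala sentence
VOCAB, EMB, HIDDEN = 8000, 128, 256    # assumed vocabulary/embedding sizes
enc_in = layers.Input(shape=(None,))
enc_emb = layers.Embedding(VOCAB, EMB)(enc_in)
_, h, c = layers.LSTM(HIDDEN, return_state=True)(enc_emb)   # encoder states

dec_in = layers.Input(shape=(None,))
dec_emb = layers.Embedding(VOCAB, EMB)(dec_in)
dec_seq = layers.LSTM(HIDDEN, return_sequences=True)(dec_emb, initial_state=[h, c])
dec_out = layers.Dense(VOCAB, activation="softmax")(dec_seq)

grammar_corrector = models.Model([enc_in, dec_in], dec_out)
grammar_corrector.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```

In the pipeline described in the abstract, the output gloss sequence of the first model would be fed, token by token, into the second model's encoder.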

Keywords: Sinhala sign language, sign language, NLP, LSTM, NMT

Procedia PDF Downloads 84
10791 The Analysis of the Gizmos Online Program as a Mathematics Diagnostic Program: A Story from an Indonesian Private School

Authors: Shofiayuningtyas Luftiani

Abstract:

Some private schools in Indonesia have started integrating the online program Gizmos into the teaching-learning process. Gizmos was developed to supplement the existing curriculum by being integrated into instructional programs. The program offers inquiry-based simulations, in which students conduct explorations using a worksheet while teachers use the teacher guidelines to direct and assess students' performance. In this study, the discussion of Gizmos highlights its features as an assessment medium for mathematics learning for secondary school students. The discussion is based on a case study and a literature review from the Indonesian context. The purpose of applying Gizmos as an assessment medium refers to diagnostic assessment. As part of the diagnostic assessment, teachers review the student exploration sheet, analyze the students' difficulties in particular, and consider the findings in planning the future learning process. This assessment becomes important since the teacher needs data about students' persistent weaknesses. Additionally, the program also helps to build students' understanding through its interactive simulation. Currently, the assessment over-emphasizes the students' answers in the worksheet against the provided answer keys, while students perform their skills in translating the question, doing the simulation, and answering the question. However, the assessment should involve multiple perspectives and sources of students' performance, since the teacher should adjust the instructional programs to the complexity of students' learning needs and styles. Consequently, an approach to improving the assessment components is selected to challenge the current assessment. The purpose of this challenge is to involve not only cognitive diagnosis but also the analysis of skills and errors. Concerning the selected setting for this diagnostic assessment, which combines cognitive diagnosis, skills analysis, and error analysis, the teachers should create an assessment rubric. The rubric plays an important role as a guide providing a set of criteria for the assessment. Without a precise rubric, the teacher may ineffectively document and follow up the data about students at risk of failure. Furthermore, teachers who employ Gizmos for diagnostic assessment might encounter some obstacles. Based on the conditions of assessment in the selected setting, the obstacles involve time constraints, reluctance toward a higher teaching burden, and students' behavior. Consequently, the teacher who chooses Gizmos with those approaches has to plan, implement, and evaluate the assessment. The main point of this assessment is not the result of the students' worksheet; rather, the diagnostic assessment is a two-stage process: prompting and then effectively following up both individual weaknesses and those of the learning process. Ultimately, the discussion of Gizmos as a medium of diagnostic assessment refers to the effort to improve the mathematics learning process.

Keywords: diagnostic assessment, error analysis, Gizmos online program, skills analysis

Procedia PDF Downloads 168
10790 A Review of the Major Factors of Cost Overrun in Construction Projects

Authors: Hassan Abdelgadir

Abstract:

Cost overruns have harmed the economies and reputations of several construction companies around the world. Many project management systems have been developed to keep track of a project's budget. However, due to several cost overrun difficulties in the construction industry, cost management is still deemed inadequate. As a result, the goal of this term paper is to identify and group prospective construction project cost overrun reasons based on their origin groups. All potential cost overrun elements were rigorously checked through literature analysis before being divided into seven (7) groups of originating components: project, contract, client, contractor, consultant, labor, and external. Each potential factor was fully defined with examples.

Keywords: construction projects, cost overruns, construction company, factors affecting cost overruns

Procedia PDF Downloads 53
10789 MEAL Project–Modifying Eating Attitudes and Actions through Learning

Authors: E. Oliver, A. Cebolla, A. Dominguez, A. Gonzalez-Segura, E. de la Cruz, S. Albertini, L. Ferrini, K. Kronika, T. Nilsen, R. Baños

Abstract:

The main objective of MEAL is to develop a pedagogical tool to help teachers and nutritionists (students and professionals) acquire, train, promote, and deliver to children competencies in basic nutritional education and healthy eating behaviours. MEAL focuses on eating behaviours and not only on nutritional literacy, and will use new technologies such as Information and Communication Technologies (ICTs) and serious games (SG) platforms to consolidate nutritional competences and habits.

Keywords: nutritional education, pedagogical ICT platform, serious games, training course

Procedia PDF Downloads 511
10788 Comparing Community Detection Algorithms in Bipartite Networks

Authors: Ehsan Khademi, Mahdi Jalili

Abstract:

Despite their special features, bipartite networks are common in many systems. Real-world bipartite networks may show community structure, similar to what one can find in one-mode networks. However, the interpretation of community structure in bipartite networks differs from that in one-mode networks. In this manuscript, we compare a number of available methods that are frequently used to discover the community structure of bipartite networks. These methods are categorized into two broad classes. One class comprises methods that first transform the network into a one-mode network and then apply standard community detection algorithms; the other comprises algorithms that have been developed specifically for bipartite networks. These algorithms are applied to a model network with prescribed community structure.
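
A minimal sketch of the first class of methods described above, using networkx: the bipartite network is projected onto one node set and a standard one-mode community detection algorithm is run on the projection. The toy edge list is an illustrative assumption; note that the projection discards part of the bipartite structure, which is what motivates the second, bipartite-specific class of algorithms.

```python
import networkx as nx
from networkx.algorithms import bipartite, community

# toy bipartite network: "top" nodes 0-3 and "bottom" nodes a-d
B = nx.Graph()
B.add_nodes_from([0, 1, 2, 3], bipartite=0)
B.add_nodes_from(["a", "b", "c", "d"], bipartite=1)
B.add_edges_from([(0, "a"), (0, "b"), (1, "a"), (1, "b"),
                  (2, "c"), (2, "d"), (3, "c"), (3, "d")])

# class 1: transform into a one-mode network, then detect communities
top_nodes = {n for n, d in B.nodes(data=True) if d["bipartite"] == 0}
projection = bipartite.projected_graph(B, top_nodes)
communities = community.greedy_modularity_communities(projection)
print([set(c) for c in communities])   # expected: [{0, 1}, {2, 3}]
```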

Keywords: community detection, bipartite networks, co-clustering, modularity, network projection, complex networks

Procedia PDF Downloads 607
10787 Unsupervised Part-of-Speech Tagging for Amharic Using K-Means Clustering

Authors: Zelalem Fantahun

Abstract:

Part-of-speech tagging is the process of assigning a part-of-speech or other lexical class marker to each word in naturally occurring text. It is one of the most fundamental tasks in almost all natural language processing. In natural language processing, providing large amounts of manually annotated data is a knowledge acquisition bottleneck. Since Amharic is an under-resourced language, the lack of a tagged corpus is the bottleneck for natural language processing, especially for POS tagging. A promising direction to tackle this problem is to provide a system that does not require manually tagged data. In unsupervised learning, the learner is not provided with classifications; unsupervised algorithms seek out similarity between pieces of data in order to determine whether they can be characterized as forming a group. This paper explicates the development of an unsupervised part-of-speech tagger for Amharic using K-Means clustering, exploiting the large amount of raw text produced in day-to-day activities. In the development of the tagger, the following procedures are followed. First, the unlabeled data (raw text) is divided into 10 folds and the tokenization phase takes place; at this level, the raw text is chunked at the sentence level and then into words. The second phase is feature extraction, which includes word frequency and the syntactic and morphological features of a word. The third phase is clustering; among different clustering algorithms, K-Means is selected and implemented in this study to bring groups of similar words together. The fourth phase is mapping, which deals with looking at each cluster carefully and assigning the most common tag to the group. This study finds two features capable of distinguishing one part-of-speech from others, namely morphological features and positional information, and shows that it is possible to use unsupervised learning for Amharic POS tagging. In order to increase the performance of the unsupervised part-of-speech tagger, there is a need to incorporate other features not included in this study, such as semantic information. Finally, based on the experimental results, the performance of the system achieves a maximum of 81% accuracy.
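
A minimal sketch of the clustering and mapping phases described above: k-means groups word feature vectors, and each cluster is then assigned the most common tag found among a small labelled seed. The toy English sentences, the prefix/suffix codes, and the seed tags are illustrative stand-ins for the Amharic corpus and the morphological, positional, and frequency features used in the paper.

```python
from collections import Counter
import numpy as np
from sklearn.cluster import KMeans

def features(word, position, freq):
    # crude prefix/suffix codes plus length, position, and frequency;
    # these stand in for the richer features described in the paper
    return [sum(ord(c) for c in word[:2]) % 97,
            sum(ord(c) for c in word[-2:]) % 97,
            len(word), position, freq]

# phase 1: tokenised raw text (toy data)
sentences = [["he", "eats", "bread"], ["she", "eats", "rice"], ["he", "runs"]]
freqs = Counter(w for s in sentences for w in s)

# phase 2: feature extraction per token
words, X = [], []
for sent in sentences:
    for pos, w in enumerate(sent):
        words.append(w)
        X.append(features(w, pos, freqs[w]))

# phase 3: clustering with k-means
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.array(X))

# phase 4: mapping each cluster to its most common tag from a labelled seed
seed_tags = {"he": "PRON", "she": "PRON", "eats": "VERB", "runs": "VERB",
             "bread": "NOUN", "rice": "NOUN"}
for k in sorted(set(labels)):
    members = [w for w, l in zip(words, labels) if l == k]
    tag = Counter(seed_tags[w] for w in members if w in seed_tags).most_common(1)
    print(k, members, tag)
```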

Keywords: POS tagging, Amharic, unsupervised learning, k-means

Procedia PDF Downloads 427
10786 Rational Approach to Analysis and Construction of Curved Composite Box Girders in Bridges

Authors: Dongming Feng, Fangyin Zhang, Liling Cao

Abstract:

Horizontally curved steel-concrete composite box girders are extensively used in highway bridges. They consist of a reinforced concrete deck on top of a prefabricated steel box section beam, which exhibits a high torsional rigidity to resist the torsional effects induced by the curved structural geometry. This type of structural system is often constructed in two stages. In the composite section, tension is taken mainly by the steel box and compression by the concrete deck. The steel girders are delivered in large prefabricated U-shaped sections designed for ease of construction. They are then erected on site and overlaid by a cast-in-place reinforced concrete deck. The functionality of the composite section is not achieved until the closed section is formed by fully cured concrete. Since this kind of composite section is built in two stages, the erection of the open steel box presents some challenges to contractors. When the reinforced concrete slab is cast in place, special care should be taken with the bracings that prevent the open U-shaped steel box from global and local buckling. In the case of multiple steel boxes, the design detailing should pay enough attention to the installation requirements of the bracings connecting adjacent steel boxes to prevent global buckling. The slope in the transverse direction and the grade in the longitudinal direction result in some local deformation of the steel boxes that affects the connection of the bracings. During the design phase, it is common for engineers to model the curved composite box girder using one-dimensional beam elements. This is adequate to analyze the global behavior; however, it is unable to capture the local deformation that affects the installation of the field bracing connections. The presence of this local deformation may become a critical component controlling the construction tolerance, and overlooking it will produce inadequate structural details that eventually cause misalignment in the field and erection failure. This paper briefly describes the construction issues encountered in real structures, investigates the difference between beam element modeling and shell/solid element modeling, and examines their impact on the different construction stages. The P-delta effect due to the slope and curvature of the composite box girder is analyzed, and the secondary deformation is compared to the first-order response and evaluated for its impact on the installation of lateral bracings. The paper discusses a rational approach to preparing construction documents, and recommendations are made on the communication between engineers, erectors, and fabricators to smooth out the construction process.

Keywords: buckling, curved composite box girder, stage construction, structural detailing

Procedia PDF Downloads 110
10785 Opto-Mechanical Characterization of Aspheric Lenses from the Hybrid Method

Authors: Aliouane Toufik, Hamdi Amine, Bouzid Djamel

Abstract:

Aspheric optical components are an alternative to conventional lenses in the implementation of imaging systems for the visible range. Spherical lenses produce aberrations and are therefore unable to focus all the light into a single point. Aspherical lenses, instead, correct aberrations and provide better resolution even in compact systems incorporating a small number of lenses. The metrology of these components is very difficult, especially when the resolution requirements increase, and the insufficiency or complexity of conventional tools requires the development of specific characterization approaches. This work addresses that problem: its objectives are the study and comparison of different methods used to measure the surface radii of hybrid aspherical lenses.

Keywords: manufacture of lenses, aspherical surface, precision molding, radius of curvature, roughness

Procedia PDF Downloads 456
10784 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults

Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura

Abstract:

The diagnosis of sigmatism is mostly based on the observation of the articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for the classification of the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults and were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bit. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz.: ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCC (mel-frequency cepstral coefficient) and RMS (root mean square) values are calculated within each frame belonging to the analyzed phoneme. Additionally, 3 fricative formants along with corresponding amplitudes are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated, while all features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in the classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage: the first group consists of pathological phones, the other of normative ones. The proposed feature vector yields classification sensitivity and specificity measures above the 90% level for individual logatomes. Employing the fricative-formant information improves the MFCC-only classification results by an average of 5 percentage points. The study shows that the employment of specific parameters for the selected phones improves the efficiency of pathology detection compared to traditional methods of speech signal parameterization.
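
A minimal sketch of the feature aggregation and classification stages described above: 13 MFCCs and RMS are computed per frame, aggregated over the [s] segment (mean per MFCC coefficient, 75th percentile for the rest), and fed to a binary SVM. Fricative-formant extraction is omitted, and the file names and labels are hypothetical.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def segment_features(path, sr=44100):
    y, _ = librosa.load(path, sr=sr)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)    # 13 x n_frames
    rms = librosa.feature.rms(y=y)[0]                     # n_frames
    return np.concatenate([mfcc.mean(axis=1),             # average per coefficient
                           [np.percentile(rms, 75)]])     # 75th percentile

# hypothetical segmented recordings of the phone [s] with expert labels
paths = ["asa_01.wav", "ese_01.wav", "isi_01.wav", "usu_01.wav"]
labels = [0, 0, 1, 1]          # 0 = normative, 1 = pathological

X = np.vstack([segment_features(p) for p in paths])
clf = SVC(kernel="rbf").fit(X, labels)     # binary SVM at the recognition stage
```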

Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing

Procedia PDF Downloads 267
10783 Novel Framework for MIMO-Enhanced Robust Selection of Critical Control Factors in Auto Plastic Injection Moulding Quality Optimization

Authors: Seyed Esmail Seyedi Bariran, Khairul Salleh Mohamed Sahari

Abstract:

Apparent quality defects such as warpage, shrinkage, and weld lines are an unavoidable phenomenon in the mass production of auto plastic appearance parts. These frequently occurring manufacturing defects should be addressed concurrently so as to achieve a final product with acceptable quality standards. Determining the significant control factors that simultaneously affect multiple quality characteristics can significantly improve the optimization results by eliminating the deviating effect of so-called ineffective outliers. Hence, a robust quantitative approach needs to be developed upon which the major control factors and their levels can be effectively determined, to help improve the reliability of the optimal processing parameter design. The primary objective of the current study was therefore to develop a systematic methodology for the selection of significant control factors (SCF) relevant to the multiple quality optimization of auto plastic appearance parts. An auto bumper was used as a specimen with quality and production characteristics most representative of the APAP group. A preliminary failure modes and effects analysis (FMEA) was conducted to nominate a database of pseudo-significant control factors prior to the optimization phase. Later, CAE simulation with Moldflow analysis was implemented to model four common plastic injection quality defects relevant to the APAP group: warpage deflection, volumetric shrinkage, sink mark, and weld line. Furthermore, a step-backward elimination searching method (SESME) was developed for the systematic pre-optimization selection of SCF based on a hierarchical orthogonal array design and priority-based one-way analysis of variance (ANOVA). The development of the robust parameter design in the second phase was based on the DOE module of Minitab v.16 statistical software. Based on the F-test (F 0.05, 2, 14) one-way ANOVA results, it was concluded that for warpage deflection, material mixture percentage was the most significant control factor, yielding a 58.34% contribution, while for the other three quality defects, melt temperature was the most significant control factor, with 25.32%, 84.25%, and 34.57% contributions for sink mark, shrinkage, and weld line strength control, respectively. The results on the least significant control factors revealed injection fill time as the least significant factor for both warpage and sink mark, with respective contributions of 1.69% and 6.12%. On the other hand, for the shrinkage and weld line defects, the least significant control factors were holding pressure and mold temperature, with 0.23% and 4.05% overall contributions, respectively.
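
A minimal sketch of the one-way ANOVA step behind the contribution percentages reported above: a factor's percentage contribution is its between-level sum of squares over the total sum of squares, alongside the F-test. The response values (warpage deflection at three levels of one control factor) are illustrative assumptions, not Moldflow results.

```python
import numpy as np
from scipy import stats

# assumed warpage deflection (mm) at three levels of one control factor
levels = [np.array([1.10, 1.15, 1.08]),
          np.array([0.95, 0.98, 0.97]),
          np.array([0.80, 0.83, 0.79])]

f_stat, p_value = stats.f_oneway(*levels)

grand = np.mean(np.concatenate(levels))
ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in levels)
ss_total = sum(((g - grand) ** 2).sum() for g in levels)
contribution = 100 * ss_between / ss_total   # percentage contribution of the factor
print(f"F = {f_stat:.2f}, p = {p_value:.4f}, contribution = {contribution:.2f}%")
```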

Keywords: plastic injection moulding, quality optimization, FMEA, ANOVA, SESME, APAP

Procedia PDF Downloads 335
10782 Climate-Smart Agriculture Technologies and Determinants of Farmers’ Adoption Decisions in the Great Rift Valley of Ethiopia

Authors: Theodrose Sisay, Kindie Tesfaye, Mengistu Ketema, Nigussie Dechassa, Mezegebu Getnet

Abstract:

Agriculture is a sector that is very vulnerable to the effects of climate change and contributes anthropogenic greenhouse gas (GHG) emissions to the atmosphere. At the same time, by lowering emissions and adjusting to the change, agriculture can also help mitigate climate change. Utilizing Climate-Smart Agriculture (CSA) technologies that can sustainably boost productivity, improve resilience, and lower GHG emissions is therefore crucial. This study sought to identify the CSA technologies used by farmers and to assess adoption levels and the factors that influence them. In order to gather information from 384 smallholder farmers in the Great Rift Valley (GRV) of Ethiopia, a cross-sectional survey was carried out. Data were analysed using percentages, chi-square tests, t-tests, and a multivariate probit model. Results showed that crop diversification, agroforestry, and integrated soil fertility management were the most widely practiced technologies. The chi-square and t-test results showed significant differences between adopters and non-adopters across several attributes: households that were older and had higher incomes, greater credit access, climate information, better training and education, larger farms, and more frequent interactions with extension specialists were significantly more likely to adopt CSA technologies. The model results showed that the age, sex, and education of the household head, farmland size, livestock ownership, income, access to credit, climate information, training, and extension contact influenced the selection of CSA technologies. Therefore, effective action must be taken to remove barriers to the adoption of CSA technologies, and taking these adoption factors into account in policy and practice is anticipated to support smallholder farmers in adapting to climate change while lowering emissions.
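
A minimal sketch of the chi-square test of association between a household attribute and CSA adoption, one of the analyses named above; the contingency counts are illustrative assumptions, not survey data.

```python
import numpy as np
from scipy.stats import chi2_contingency

#                  adopters  non-adopters
table = np.array([[120,       60],     # with credit access (assumed counts)
                  [ 80,      124]])    # without credit access
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")   # a small p suggests an association
```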

Keywords: climate change, climate-smart agriculture, smallholder farmers, multivariate probit model

Procedia PDF Downloads 106
10781 A Decadal Flood Assessment Using Time-Series Satellite Data in Cambodia

Authors: Nguyen-Thanh Son

Abstract:

Floods are among the most frequent and costliest natural hazards. Flood disasters especially affect poor people in rural areas, who are heavily dependent on agriculture and have lower incomes. Cambodia is identified as one of the most climate-vulnerable countries in the world, ranked 13th out of 181 countries most affected by the impacts of climate change. Flood monitoring is thus a strategic priority at national and regional levels, because policymakers need reliable spatial and temporal information on flood-prone areas to form successful monitoring programs that reduce possible impacts on the country's economy and people's livelihoods. This study aims to develop methods for flood mapping and assessment from MODIS data in Cambodia. We processed the data for the period from 2000 to 2017, following three main steps: (1) data pre-processing to construct smooth time-series vegetation and water surface indices, (2) delineation of flood-prone areas, and (3) accuracy assessment. The results of the flood mapping were verified against the ground reference data, indicating an overall accuracy of 88.7% and a Kappa coefficient of 0.77. These results were reaffirmed by the close agreement between the flood-mapped area and the ground reference data, with a coefficient of determination (R²) of 0.94. The seasonally flooded areas observed for 2010, 2015, and 2016 were remarkably smaller than in other years, mainly attributed to the El Niño weather phenomenon exacerbated by the impacts of climate change. Although several sources potentially lowered the mapping accuracy of flood-prone areas, including image cloud contamination, mixed-pixel issues, and the low-resolution bias between the mapping results and the ground reference data, our methods produced satisfactory results for delineating the spatiotemporal evolution of floods. The results, in the form of quantitative information on spatiotemporal flood distributions, could be beneficial to policymakers in evaluating their management strategies for mitigating the negative effects of floods on agriculture and people's livelihoods in the country.
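
A minimal sketch of the accuracy-assessment step described above: overall accuracy and the Kappa coefficient computed between mapped flood classes and ground reference labels. The two label arrays are illustrative assumptions.

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score

reference = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]   # ground reference (1 = flooded)
mapped    = [1, 1, 0, 0, 1, 0, 0, 0, 1, 1]   # MODIS-derived flood classes
print("overall accuracy:", accuracy_score(reference, mapped))
print("Kappa coefficient:", cohen_kappa_score(reference, mapped))
```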

Keywords: MODIS, flood, mapping, Cambodia

Procedia PDF Downloads 110
10780 Applications of Greenhouse Data in Guatemala in the Analysis of Sustainability Indicators

Authors: Maria A. Castillo H., Andres R. Leandro, Jose F. Bienvenido B.

Abstract:

In 2015, Guatemala officially adopted the Sustainable Development Goals (SDG) according to the 2030 Agenda agreed by the United Nations. In 2016, these objectives and goals were reviewed, and the National Priorities were established within the K'atún 2032 National Development Plan. In 2019 and 2021, progress was evaluated with 120 defined indicators; the need to improve the quality and availability of the statistical data necessary for the analysis of sustainability indicators was detected, so the values to be reached in 2024 and 2032 were adjusted. The need for greater agricultural technology is one of the priorities established within SDG 2, "Zero Hunger". Within this area, protected agricultural production provides greater productivity throughout the year, reduces the use of chemical products to control pests and diseases, reduces the negative impact of climate, and improves product quality. During the crisis caused by Covid-19, there was an increase in exports of fruits and vegetables produced in greenhouses in Guatemala. However, this information was not considered in the 2021 revision of the Plan. The objective of this study is to evaluate the information available on greenhouse agricultural production and its integration into the sustainability indicators for Guatemala. This study was carried out in four phases: 1. Analysis of the goals established for SDG 2 and the indicators included in the K'atún Plan. 2. Analysis of environmental, social, and economic indicator models. 3. Definition of territorial levels at two geographic scales: departments and municipalities. 4. Diagnosis of the available data on technological agricultural production, with emphasis on greenhouses, at the two geographic scales. A summary of the results is presented for each phase, and finally some recommendations for future research are added. The main contribution of this work is to improve the available data that allow the incorporation of agricultural technology indicators into the established goals, to evaluate their impact on food security and nutrition, employment and investment, poverty, and the use of water and natural resources, and to provide a methodology applicable to other production models and other geographic areas.

Keywords: greenhouses, protected agriculture, sustainable indicators, Guatemala, sustainability, SDG

Procedia PDF Downloads 70
10779 Developing a GIS-Based Tool for the Management of Fats, Oils, and Grease (FOG): A Case Study of Thames Water Wastewater Catchment

Authors: Thomas D. Collin, Rachel Cunningham, Bruce Jefferson, Raffaella Villa

Abstract:

Fats, oils and grease (FOG) are by-products of food preparation and cooking processes. FOG enters wastewater systems through a variety of sources such as households, food service establishments, and industrial food facilities. Over time, if no source control is in place, FOG builds up on pipe walls, leading to blockages and potentially to sewer overflows, which are a major risk to the environment and human health. UK water utilities spend millions of pounds annually trying to control FOG. Although UK legislation specifies that discharge of such material is against the law, it is often complicated for water companies to identify and prosecute offenders, which leads to uncertainty about the attitude to take in terms of FOG management. Research is needed to seize the full potential of implementing current practices. The aim of this research was to undertake a comprehensive study documenting the extent of FOG problems in sewer lines and reinforcing existing knowledge. Data were collected to develop a model estimating the quantities of FOG available for recovery within Thames Water wastewater catchments, and Geographical Information System (GIS) software was used in conjunction with the model to integrate the data with a geographical component. FOG was responsible for at least one third of sewer blockages in the Thames Water waste area. A waste-based approach was developed through an extensive review to estimate the potential for FOG collection and recovery. Three main sources were identified: residential, commercial, and industrial, with commercial properties identified as one of the major FOG producers. The total potential FOG generated was estimated for the 354 wastewater catchments. Additionally, raw and settled sewage were sampled and analysed for FOG (as hexane extractable material) monthly at 20 sewage treatment works (STW) for three years. A good correlation was found between the sampled FOG and population equivalent (PE). On average, a difference of 43.03% was found between the estimated FOG (waste-based approach) and the sampled FOG (raw sewage sampling). It was suggested that the approach undertaken could overestimate the FOG available, that the sampling could only capture a fraction of the FOG arriving at STW, and/or that the difference could account for FOG accumulating in sewer lines. Furthermore, it was estimated that on average FOG could contribute up to 12.99% of the primary sludge removed. The model was further used to investigate the relationship between estimated FOG and the number of blockages: the higher the FOG potential, the higher the number of FOG-related blockages. The GIS-based tool was used to identify critical areas (i.e., areas with high FOG potential and a high number of FOG blockages). As reported in the literature, FOG was one of the main causes of sewer blockages. By identifying critical areas, the model further explored the potential for source control in terms of 'sewer relief' and waste recovery, helping to target where the benefits from the implementation of management strategies could be highest. However, FOG is still likely to persist throughout the networks, and further research is needed to assess downstream impacts (i.e., at STW).
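
A minimal sketch of the waste-based estimation underlying the GIS tool: per-catchment FOG potential computed as source counts multiplied by per-source yield coefficients and summed over the three source types. The catchment names, counts, and yield coefficients are illustrative assumptions, not Thames Water figures.

```python
import pandas as pd

# assumed FOG yield per source unit (tonnes/year)
yields = {"residential": 0.07, "commercial": 2.5, "industrial": 80.0}

# assumed source counts per catchment
catchments = pd.DataFrame(
    {"residential": [12000, 45000], "commercial": [150, 900], "industrial": [3, 12]},
    index=["Catchment A", "Catchment B"])

catchments["fog_potential_t_per_yr"] = sum(
    catchments[src] * y for src, y in yields.items())
print(catchments)
```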

Keywords: fat, FOG, GIS, grease, oil, sewer blockages, sewer networks

Procedia PDF Downloads 193
10778 Design to Cryogenic System for Dilution Refrigerator with Cavity and Superconducting Magnet

Authors: Ki Woong Lee

Abstract:

The Center for Axion and Precision Physics Research is searching for dark matter using 12-tesla superconducting magnets. A dilution refrigerator is used for the search experiments, together with the superconducting magnets and superconducting cavities. The dilution refrigerator requires a stable cryogenic environment using liquid helium; accordingly, a cryogenic system for a stable supply of liquid helium is to be established. This cryogenic system includes the liquefaction, supply, storage, and purification of liquid helium. This article presents the basic design, construction, and operation plans for building the cryogenic system.

Keywords: cryogenic system, dilution refrigerator, superconducting magnet, helium recovery system

Procedia PDF Downloads 106
10777 The Effect of the Structural Arrangement of Binary Bisamide Organogelators on their Self-Assembly Behavior

Authors: Elmira Ghanbari, Jan Van Esch, Stephen J. Picken, Sahil Aggarwal

Abstract:

Low-molecular-weight organogelators form gels by self-assembly into a crystalline network which immobilizes the organic solvent. For single bisamide organogelator systems, the effect of the molecular structure on the molecular interactions and the self-assembly behavior has been explored. The spatial arrangement of bisamide molecules in the gel state is driven by a combination of hydrogen bonding and Van der Waals interactions. The hydrogen-bonding pattern between the amide groups of bisamide molecules is regulated by the number of methylene spacers; an even number of methylene spacers between the two amide groups, in even-spaced bisamides, leads to an antiparallel orientation of the amide groups within a molecule. An even-spaced bisamide molecule with antiparallel amide groups can form two pairs of hydrogen bonds with molecules in the same plane, whereas an odd-spaced bisamide, with parallel directionality of the amide groups, can form four independent hydrogen bonds with four other bisamide molecules on different planes. The arrangement of bisamide molecules in the crystalline state and the interaction of these molecules thus depend on the molecular structure, particularly the parity of the spacer length between the amide groups in the bisamide molecule. In this study, the directionality of the amide groups has been exploited as a structural characteristic to affect the arrangement of molecules in the crystalline state and produce different binary bisamide gelators with different degrees of crystallinity. Odd- and even-spaced single bisamides were synthesized and blended to produce binary bisamide organogelators, which were characterized in order to understand the effect of the different directionality of the amide groups on the molecular interactions in the crystalline state. The pattern of molecular interactions between the blended molecules, mixing or phase separation, was monitored via differential scanning calorimetry (DSC) and crystallographic techniques: X-ray powder diffraction (XRD) and small-angle X-ray scattering (SAXS). The formation of lamellar structures for the odd- and even-spaced bisamide gelators was confirmed by the SAXS and XRD techniques. The DSC results showed that binary bisamide organogelators with different parities of methylene spacers (odd-even binary blends) have a higher tendency for phase separation compared to binary bisamides with the same parity (odd-odd or even-even binary blends). Phase separation in the binary odd-even bisamides was confirmed by the presence of individual (100) reflections of the odd and even lamellar structures. The structural characteristic of bisamide organogelators, the parity of the spacer length in binary systems, is thus a promising tool to control the arrangement of molecules and their crystalline structure.

Keywords: binary bisamide organogelators, crystalline structure, phase separation, self-assembly behavior

Procedia PDF Downloads 174
10776 Organic Farming for Sustainable Production of Some Promising Halophytic Species in Saline Environment

Authors: Medhat Tawfik, Ezzat Abd El Lateef, Bahr Amany, Mohamed Magda

Abstract:

Applying organic farming systems in biosaline agriculture is an unconventional approach for the sustainable use of marginal soils and desert land for planting non-traditional halophytic crops such as Leptochloa fusca, Kochia indica, Sporobolus virginicus and Spartina patens. These are highly salt-tolerant C4 halophytic forage plants that grow well in coastal salt marshes. Such halophytic plants will take an important place in the farming system, especially in coastal areas and on salt-affected land. They can be called environmentally smart crops because they ensure food security, contribute to energy security, guarantee environmental sustainability, and mitigate the negative impacts of climate change. Organic agriculture is the most important and widely practiced agro-ecological farming system. It is claimed to be the most sustainable approach and a long-term adaptation strategy. It promotes soil fertility and diversity at all levels and makes soils less susceptible to erosion. It is also reported to be a climate-change-resilient farming system, as it promotes the proper management of soil, water, biodiversity, and local knowledge and provides producers with ecologically sound management decisions. A field experiment was carried out at the Model Farm of the National Research Centre, El Tour, South Sinai, to study the impact of organic amendments (mycorrhiza at 1 kg/fed, charcoal at 4 tons/fed, and chicken manure at 5 tons/fed, in addition to a control treatment) on some growth characters, photosynthetic pigment content, some physiological aspects, i.e., proline and soluble carbohydrate content, succulence and osmotic pressure values, as well as nutritive values, i.e., crude fiber (CF), acid detergent fiber (ADF), neutral detergent fiber (NDF), ether extract (EE) and nitrogen-free extract (NFE), of four halophytic plant species (Leptochloa fusca, Kochia indica, Sporobolus virginicus and Spartina patens). Our results showed that the organic fertilizer treatments enhanced all the above characters compared with the control, with chicken manure outperforming the other treatments.

Keywords: organic agriculture, halophytic plants, saline environment, water security

Procedia PDF Downloads 212
10775 Interactive, Topic-Oriented Search Support by a Centroid-Based Text Categorisation

Authors: Mario Kubek, Herwig Unger

Abstract:

Centroid terms are single words that semantically and topically characterise text documents and so may serve as their very compact representation in automatic text processing. In the present paper, centroids are used to measure the relevance of text documents with respect to a given search query. Thus, a new graph-based paradigm for searching texts in large corpora is proposed and evaluated against keyword-based methods. The first promising experimental results demonstrate the usefulness of the centroid-based search procedure. It is shown that especially the routing of search queries in interactive and decentralised search systems can be greatly improved by applying this approach. A detailed discussion of further fields of application completes this contribution.
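
A minimal sketch of one way to compute a centroid term: build a term co-occurrence graph and pick the term with the smallest average shortest-path distance to all other terms. This is a simplified reading of the centroid idea; the toy document and the sentence-level co-occurrence window are illustrative assumptions.

```python
import itertools
import networkx as nx

doc = ["search engines route queries through peer nodes",
       "peer nodes exchange queries in decentralised search"]

# term co-occurrence graph (co-occurrence within a sentence, an assumed window)
G = nx.Graph()
for sentence in doc:
    G.add_edges_from(itertools.combinations(set(sentence.split()), 2))

# centroid term: minimal average distance to every other term
avg_dist = {t: sum(nx.shortest_path_length(G, t).values()) / (len(G) - 1)
            for t in G}
centroid = min(avg_dist, key=avg_dist.get)
print("centroid term:", centroid)
```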

Keywords: search algorithm, centroid, query, keyword, co-occurrence, categorisation

Procedia PDF Downloads 271
10774 Characterization of Aerosol Particles in Ilorin, Nigeria: Ground-Based Measurement Approach

Authors: Razaq A. Olaitan, Ayansina Ayanlade

Abstract:

Understanding aerosol properties is a main goal of global research, in order to lower the uncertainty in the trends and magnitude of aerosol particles associated with climate change. In order to identify aerosol particle types and optical properties, and the relationship between aerosol properties and particle concentration between 2019 and 2021, this study examined data from the ground-based sun/sky scanning radiometer of the aerosol robotic network (AERONET) in Ilorin, Nigeria. The AERONET version 2 algorithm was utilized to retrieve monthly data on aerosol optical depth and the Angstrom exponent. The version 3 algorithm, an almucantar level 2 inversion, was employed to retrieve daily data on single scattering albedo and aerosol size distribution. Excel 2016 was used to compute the monthly, seasonal, and annual means of the data. The distribution of different aerosol types was analyzed using scatterplots, and the optical properties of the aerosol were investigated using pertinent mathematical relations. Correlation statistics were employed to understand the relationships between particle concentration and particle properties. Based on the premise that aerosol characteristics must remain constant in both magnitude and trend across time and space, the study's findings indicate that the aerosol types identified between 2019 and 2021 were as follows: 29.22% urban industrial (UI), 37.08% desert (D), 10.67% biomass burning (BB), and 23.03% urban mix (Um). Convective wind systems, which frequently carry particles over long distances in the atmosphere, were responsible for the peak columnar aerosol loadings, observed during August of the study period. The study has shown that while coarse-mode particles dominate, fine particles are increasing in seasonal and annual trends; these trends are linked to biomass burning and human activities in the city. The study found that the majority of particles are highly absorbing black carbon, with the fine mode having a volume median radius of 0.08 to 0.12 µm. The investigation also revealed a positive correlation (r = 0.57) between changes in aerosol particle concentration and changes in aerosol properties. Human activity is increasing rapidly in Ilorin, causing changes in aerosol properties and indicating potential health risks from climate change and human influence on geological and environmental systems.

Keywords: aerosol loading, aerosol types, health risks, optical properties

Procedia PDF Downloads 41
10773 Assessment of Cytogenetic Damage as a Function of Radiofrequency Electromagnetic Radiations Exposure Measured by Electric Field Strength: A Gender Based Study

Authors: Ramanpreet, Gursatej Gandhi

Abstract:

Background: Dependence on the electromagnetic radiations involved in communication and information technologies has increased incredibly in the personal and professional world. Among the numerous radiation sources are fixed-site transmitters, mobile phone base stations, and power lines, besides indoor devices like cordless phones, WiFi, Bluetooth, TV, radio, microwave ovens, etc. Mobile phone base stations continuously emit radiofrequency radiation (RFR) even to those not using the devices. The consistent and widespread usage of wireless devices has built up electromagnetic fields everywhere. In fact, the radiofrequency electromagnetic field (RF-EMF) has insidiously become a part of the environment and, like any contaminant, may be health-hazardous, requiring assessment. Materials and Methods: In the present study, cytogenetic damage was assessed using the Buccal Micronucleus Cytome (BMCyt) assay as a function of radiation exposure, after Institutional Ethics Committee clearance of the study and written voluntary informed consent from the participants. General information, lifestyle patterns (diet, physical activity, smoking, drinking, use of mobile phones, internet, Wi-Fi usage, etc.), and genetic, reproductive (pedigrees), and medical histories were recorded on a pre-designed questionnaire. For this, 24-hour personal exposimeter measurements (PEM) were recorded for 60 unrelated healthy adults (40 cases residing in the vicinity of mobile phone base stations since their installation and 20 controls residing in areas with no base stations). The personal exposimeter collects information from all the sources generating EMF (TETRA, GSM, UMTS, DECT, and WLAN) as total RF-EMF uplink and downlink. Findings: The cases (n=40; 23-90 years) and the controls (n=20; 19-65 years) were matched for alcohol drinking, smoking habits, and mobile and cordless phone usage. The PEM in cases (149.28 ± 8.98 mV/m) revealed significantly higher (p=0.000) electric field strength compared to the value recorded in controls (80.40 ± 0.30 mV/m). The GSM 900 uplink (p=0.000), GSM 1800 downlink (p=0.000), UMTS (uplink, p=0.013, and downlink, p=0.001) and DECT (p=0.000) electric field strengths were significantly elevated in the cases compared to the controls. The electric field strength in the cases was highest from GSM 1800 (52.26 ± 4.49 mV/m), followed by GSM 900 (45.69 ± 4.98 mV/m), UMTS (25.03 ± 3.33 mV/m) and DECT (18.02 ± 2.14 mV/m), and was least from WLAN (8.26 ± 2.35 mV/m). The significantly (p=0.000) higher exposure of the cases was from GSM (97.96 ± 6.97 mV/m) in comparison to UMTS, DECT, and WLAN. The frequencies of micronuclei (1.86X, p=0.007) and nuclear buds (2.95X, p=0.002), and the cell death parameter (condensed chromatin cells; 1.75X, p=0.007), were significantly elevated in cases compared to controls, probably as a function of radiofrequency radiation exposure. Conclusion: In the absence of other exposure(s), any unrepaired cytogenetic damage is a cause of concern, as it can cause malignancy. A larger sample size with clinical assessment would provide more insight into such an effect.

Keywords: Buccal micronucleus cytome assay, cytogenetic damage, electric field strength, personal exposimeter

Procedia PDF Downloads 146
10772 A Study of Topical and Similarity of Sebum Layer Using Interactive Technology in Image Narratives

Authors: Chao Wang

Abstract:

Under the rapid innovation of information technology, the media play a very important role in the dissemination of information, presenting a face entirely different from that faced by the analogue generation. The involvement of narrative images, moreover, provides more possibilities for narrative text. "Images" are manufactured through the processes of aperture, camera shutter, and photosensitive development, recorded and stamped on paper or displayed on a computer screen, concretely saved. They exist in different forms of files, data, or evidence as the ultimate appearance of events. Through the interface of media and network platforms and the special visual field of the viewer, a class of body space exists and extends outward, as thin as a sebum layer, extremely soft and delicate yet full of real tension. The physical space of the sebum layer confuses the fact that physical objects exist and needs to be established under a perceived consensus; as at the scene, the existing concepts and boundaries of physical perception are blurred. The physical simulation of the sebum layer shapes a "topical-similarity" immersion, leading contemporary social-practice communities, groups, and network users into a kind of illusion without presence, i.e., a non-real illusion. From the investigation and discussion of the literature, the variability characteristics of time in digital movie editing and production (for example, slicing, rupture, setting, and resetting) are analyzed. Interactive eBooks have a unique interaction of "waiting-greeting" and "expectation-response" that makes the operation of the image narrative structure more open to interpretation. Works of digital editing and interactive technology are combined, and their concepts and results are further analyzed. After the digitization of interventional imaging and interactive technology, real events remain linked, and the media's handling cannot cut this relationship, as shown through movies, interactive art, and practical case discussion and analysis. Audiences need more rational thinking about the authenticity of the text carried by images.

Keywords: sebum layer, topical and similarity, interactive technology, image narrative

Procedia PDF Downloads 378
10771 Antimicrobial Properties of SEBS Compounds with Copper Microparticles

Authors: Vanda Ferreira Ribeiro, Daiane Tomacheski, Douglas Naue Simões, Michele Pitto, Ruth Marlene Campomanes Santana

Abstract:

Indoor environments, such as car cabins and public transportation vehicles, are places where users are subject to the air quality. Microorganisms (bacteria, fungi, yeasts) enter these environments through windows and ventilation systems and may use the organic particles present as a growth substrate. In addition, atmospheric pollutants can act as potential carbon and nitrogen sources for some microorganisms. Compounds based on SEBS copolymers, poly(styrene-b-(ethylene-co-butylene)-b-styrene), are a class of thermoplastic elastomers (TPEs), fully recyclable and largely used in automotive parts. Metals such as copper and silver have biocidal activity, and the production of SEBS compounds by melt blending with these agents can be a good option for producing compounds for the plastic parts of ventilation systems and automotive air conditioning, in order to minimize the problems caused by the growth of pathogenic microorganisms. In this sense, the aim of this work was to evaluate the effect of copper microparticles as an antimicrobial agent in compositions based on SEBS/PP/oil/calcite. Copper microparticles were used in weight proportions of 0%, 1%, 2% and 4%. The compounds were prepared using a co-rotating twin-screw extruder (L/D ratio of 40/1 and 16 mm screw diameter). The processing parameters were a screw rotation rate of 300 rpm with a temperature profile between 150 and 190°C. The SEBS-based TPE compounds were injection molded. The emissions of the compounds were characterized by the gravimetric fogging test. The compounds were characterized by physical (density and staining by contact), mechanical (hardness and tensile properties) and rheological (melt volume rate, MVR) properties. Antibacterial properties were evaluated against Staphylococcus aureus (S. aureus) and Escherichia coli (E. coli) strains. To evaluate antifungal ability, Aspergillus niger (A. niger), Candida albicans (C. albicans), Cladosporium cladosporioides (C. cladosporioides) and Penicillium chrysogenum (P. chrysogenum) were chosen. The results of the biological tests showed a reduction in bacteria of up to 88% for E. coli and up to 93% for S. aureus. The tests with fungi showed no conclusive results, because the sample without copper also inhibited the development of these microorganisms. The copper addition did not cause significant variations in the mechanical properties, the MVR, or the emission behavior of the compounds. The density increased with the copper content of the compounds.

Keywords: air conditioner, antimicrobial, copper, SEBS

Procedia PDF Downloads 269
10770 Social Media and Political Mobilization in Nigeria: A Study in E-Participation

Authors: Peter Amobi Chiamogu

Abstract:

Throughout history, communication has been the basis for mass mobilization and political education, with the media as a generic concept. Revolutions in ICTs have occasioned a limitless environment for the dissemination of information and ideas, especially with the seemingly pervasive access to, penetration of, and use of the internet, which has engendered a connected society. This study seeks to analyze the prospects and challenges of adapting social media for free elections, and how this process can enhance public policy making, implementation, and evaluation in a developing state.

Keywords: social media, e-participation, political mobilization, public policy, electioneering

Procedia PDF Downloads 335
10769 Confidence Envelopes for Parametric Model Selection Inference and Post-Model Selection Inference

Authors: I. M. L. Nadeesha Jayaweera, Adao Alex Trindade

Abstract:

In choosing a candidate model in likelihood-based modeling via an information criterion, the practitioner is often faced with the difficult task of deciding just how far up the ranked list to look. Motivated by this pragmatic necessity, we construct an uncertainty band for a generalized (model selection) information criterion (GIC), defined as a criterion whose limit in probability is identical to that of the normalized log-likelihood; this includes common special cases such as AIC and BIC. The method starts from the asymptotic normality of the GIC for the joint distribution of the candidate models in an independent and identically distributed (IID) data framework and proceeds by deriving the (asymptotically) exact distribution of the minimum. The calculation of an upper quantile for its distribution then involves the computation of multivariate Gaussian integrals, which is amenable to efficient implementation via the R package "mvtnorm". The performance of the methodology is tested on simulated data by checking the coverage probability of nominal upper quantiles and compared to the bootstrap. Both methods give coverages close to nominal for large samples, but the bootstrap is two orders of magnitude slower. The methodology is subsequently extended to two other commonly used model structures: regression and time series. In the regression case, we derive the corresponding asymptotically exact distribution of the minimum GIC by invoking Lindeberg-Feller-type conditions for triangular arrays and are thus able to similarly calculate upper quantiles for its distribution via multivariate Gaussian integration. The bootstrap once again provides a default competing procedure, and we find that similar comparative performance metrics hold as in the IID case. The time series case is complicated by a far more intricate asymptotic regime for the joint distribution of the model GIC statistics. Under a Gaussian likelihood, the default in most packages, one needs to derive the limiting distribution of a normalized quadratic form for a realization from a stationary series. Under conditions on the process satisfied by ARMA models, a multivariate normal limit is once again achieved. The bootstrap can, however, be employed for its computation, whence we are once again in the multivariate Gaussian integration paradigm for upper quantile evaluation. Comparisons of this bootstrap-aided semi-exact method with the full-blown bootstrap once again reveal similar performance but faster computation speeds. One of the most difficult problems in contemporary statistical methodological research is accounting for the extra variability introduced by model selection uncertainty, the so-called post-model selection inference (PMSI). We explore ways in which the GIC uncertainty band can be inverted to make inferences on the parameters. In the IID case this is attempted by pivoting the CDF of the asymptotically exact distribution of the minimum GIC. For inference on one parameter at a time and a small number of candidate models, this works well, whence the attained PMSI confidence intervals are wider than the MLE-based Wald intervals, as expected.
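
A minimal sketch of the quantile computation at the heart of the method, in Python rather than R: given the asymptotic multivariate normal law of the candidate models' GIC values, it approximates an upper quantile of their minimum by Monte Carlo, standing in for the exact multivariate Gaussian integration done with mvtnorm. The mean vector and covariance matrix are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([0.0, 0.3, 0.5])          # assumed GIC means for 3 candidate models
Sigma = np.array([[1.0, 0.6, 0.4],
                  [0.6, 1.0, 0.5],
                  [0.4, 0.5, 1.0]])     # assumed joint covariance

# draw from the joint asymptotic law and take the row-wise minimum
draws = rng.multivariate_normal(mu, Sigma, size=100_000)
min_gic = draws.min(axis=1)

# upper quantile of the minimum, delimiting the GIC uncertainty band
print("95% upper quantile of the minimum GIC:", np.quantile(min_gic, 0.95))
```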

Keywords: model selection inference, generalized information criteria, post-model selection, asymptotic theory

Procedia PDF Downloads 71
10768 Inherent Difficulties in Countering Islamophobia

Authors: Imbesat Daudi

Abstract:

Islamophobia, which is a billion-dollar industry, is widespread, especially in the United States, Europe, India, Israel, and countries whose Muslim minorities are at odds with governmental policies. Hatred of Islam in the West did not evolve spontaneously; it was methodically created. Islamophobia's current format has been designed to spread on its own, find a space in the Western psyche, and resist eradication. The hatred has been sustained by neoconservative ideologues and their allies, who are supported by the mainstream media. Social scientists have evaluated how ideas spread, why an idea can go viral, and where new ideas find space in our brains. This became possible because of advances in the computational power of software and computers. The spreading of ideas, including Islamophobia, follows an S-curve with three phases: an initial exploratory phase with a long lag period, an explosive phase if the idea goes viral, and a final phase in which the idea finds space in the human psyche. In the initial phase, ideas are quickly examined in a center in the prefrontal lobe. When an idea is deemed relevant, it is sent for evaluation to another center of the prefrontal lobe, where it is critically examined. Once it takes its final shape, the idea is sent as a finished product to a center in the occipital lobe. This center cannot critically evaluate ideas; it can only defend them from critics. Counterarguments, no matter how scientific, are automatically rejected. Therefore, arguments that could be highly effective in the early phases become counterproductive once ideas are stored in the occipital lobe. Anti-Islamophobic intellectuals have done a very good job of countering Islamophobic arguments. However, they have not been as effective as the neoconservative ideologues who have promoted anti-Muslim rhetoric based on half-truths, misinformation, or outright lies. The failure is partly due to the support pro-war activists receive from the mainstream media, state institutions, mega-corporations engaged in violent conflicts, and think tanks that supply Islamophobic arguments. However, there are also scientific reasons why anti-Islamophobic thinkers have been less effective: the dynamics of spreading ideas change once the ideas are stored in the occipital lobe. The human brain is incapable of further evaluating ideas once it accepts them as its own; therefore, a different strategy is required to be effective. This paper examines 1) why anti-Islamophobic intellectuals have failed to change the minds of non-Muslims and 2) the steps for countering hatred. Simply put, a new strategy is needed that can effectively counteract hatred of Islam and Muslims. Islamophobia is a disease that requires strong measures. Fighting hatred is always a challenge, but if we understand why Islamophobia is taking root in the twenty-first century, we can succeed in challenging Islamophobic arguments. That will need a coordinated effort of intellectuals, writers, and the media.

Keywords: Islamophobia, Islam and violence, anti-Islamophobia, demonization of Islam

Procedia PDF Downloads 34
10767 Multi-Scale Geographic Object-Based Image Analysis (GEOBIA) Approach to Segment Very High Resolution Images for the Extraction of New Degraded Zones: Application to the Region of Mécheria in the South-West of Algeria

Authors: Bensaid A., Mostephaoui T., Nedjai R.

Abstract:

A considerable area of Algerian land is threatened by the phenomenon of wind erosion. For a long time, wind erosion and its associated harmful effects on the natural environment have posed a serious threat, especially in the arid regions of the country. In recent years, as a result of the increasingly irrational exploitation of natural resources (fodder) and extensive land clearing, wind erosion has intensified markedly. The extent of degradation in the arid region of the Algerian Mécheria department has generated a new situation characterized by the reduction of vegetation cover, the decrease of land productivity, and sand encroachment on urban development zones. In this study, we investigate the potential of remote sensing and geographic information systems for detecting the spatial dynamics of the ancient dune cordons, based on the numerical processing of PlanetScope PSB.SB sensor images acquired on September 29, 2021. As a second step, we explore the use of a multi-scale geographic object-based image analysis (GEOBIA) approach to segment the high-spatial-resolution images acquired over heterogeneous surfaces that vary according to human influence on the environment. We used the fractal net evolution approach (FNEA) algorithm to segment the images (Baatz & Schäpe, 2000). Multispectral data, a digital terrain model layer, ground truth data, a normalized difference vegetation index (NDVI) layer, and a first-order texture (entropy) layer were used to segment the multispectral images at three segmentation scales, with an emphasis on accurately delineating the boundaries and components of the sand accumulation areas (dunes, dune fields, nebkhas, and barchans). It is important to note that each auxiliary data layer contributed to improving the segmentation at the different scales. The silted areas were then classified using a nearest-neighbor approach over the Naâma area. The classification of silted areas was successfully achieved over all study areas with an accuracy greater than 85%, although the results suggest that, overall, a higher degree of landscape heterogeneity may have a negative effect on segmentation and classification. Some areas suffered from the greatest over-segmentation and the lowest mapping accuracy (Kappa: 0.79), which was partially attributed to the confusion of a greater proportion of mixed siltation classes with both sandy areas and bare-ground patches. This research has demonstrated a technique based on very high-resolution images for mapping silted and degraded areas using GEOBIA, which can be applied to the study of other lands in the steppe areas of the northern countries of the African continent.
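
The FNEA algorithm itself ships with commercial GEOBIA software; as an open-source sketch of the same multi-scale idea, the Python fragment below stacks an NDVI layer onto an illustrative four-band array standing in for a PlanetScope scene and runs a graph-based segmentation (Felzenszwalb-Huttenlocher, used here as a stand-in for FNEA, not the paper's actual workflow) at three scale parameters:

```python
import numpy as np
from skimage.segmentation import felzenszwalb

def ndvi(nir, red, eps=1e-9):
    # Normalized difference vegetation index from the NIR and red bands.
    return (nir - red) / (nir + red + eps)

# Illustrative 4-band (B, G, R, NIR) reflectance array standing in for a scene.
img = np.random.rand(512, 512, 4).astype(np.float32)
veg = ndvi(img[..., 3], img[..., 2])

# Stack the NDVI layer as an extra channel, then segment at three scales;
# larger 'scale' values merge pixels into fewer, bigger image objects.
features = np.dstack([img, veg])
segments = {s: felzenszwalb(features, scale=s, sigma=0.8, min_size=30,
                            channel_axis=-1)
            for s in (50, 200, 800)}
for s, labels in segments.items():
    print(f"scale={s}: {labels.max() + 1} objects")
```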

Keywords: land development, GIS, sand dunes, segmentation, remote sensing

Procedia PDF Downloads 96
10766 A Grid Synchronization Method Based on Adaptive Notch Filter for SPV System with Modified MPPT

Authors: Priyanka Chaudhary, M. Rizwan

Abstract:

This paper presents a grid synchronization technique based on an adaptive notch filter for an SPV (Solar Photovoltaic) system, along with MPPT (Maximum Power Point Tracking) techniques. An efficient grid synchronization technique offers proficient detection of the various components of the grid signal, such as phase and frequency. It also acts as a barrier against harmonics and other disturbances in the grid signal. The grid synchronization technique provides a reference phase signal synchronized with the grid voltage, bringing the system in line with grid codes and power quality standards. Hence, the grid synchronization unit plays an important role in grid-connected SPV systems. Because the output of the PV array fluctuates with meteorological parameters such as irradiance, temperature, and wind, MPPT control is required to track the maximum power point of the PV array in order to maintain a constant DC voltage at the VSC (Voltage Source Converter) input. In this work, a variable step-size P&O (Perturb and Observe) MPPT technique with a DC/DC boost converter is used in the first stage of the system. This algorithm divides the dPpv/dVpv curve of the PV panel into three separate zones, i.e., zone 0, zone 1, and zone 2. A fine tracking step size is used in zone 0, while zones 1 and 2 require larger step sizes in order to obtain a high tracking speed. Further, an adaptive notch filter based control technique is proposed for the VSC in the PV generation system. The adaptive notch filter (ANF) approach is used to synchronize the interfaced PV system with the grid, maintaining the amplitude, phase, and frequency parameters as well as improving power quality. This technique offers the compensation of harmonic currents and reactive power with both linear and nonlinear loads. To maintain a constant DC-link voltage, a PI controller is also implemented and presented in this paper. The complete system has been designed, developed, and simulated using the SimPowerSystems and Simulink toolboxes of MATLAB. The performance analysis of the three-phase grid-connected solar photovoltaic system has been carried out on the basis of various parameters, such as PV output power, PV voltage, PV current, DC-link voltage, PCC (Point of Common Coupling) voltage, grid voltage, grid current, voltage source converter current, and power supplied by the voltage source converter. The results obtained from the proposed system are found to be satisfactory.
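
A minimal Python sketch of the three-zone, variable step-size P&O logic described above (the zone thresholds and step sizes are illustrative assumptions, not values from the paper):

```python
def variable_step_po(v, p, v_prev, p_prev, v_ref,
                     t1=0.5, t2=5.0, step0=0.1, step1=0.5, step2=2.0):
    """One iteration of a three-zone variable step-size P&O algorithm.

    The zone is chosen from |dP/dV|: zone 0 (near the MPP) uses a fine
    step, zones 1 and 2 use progressively larger steps for tracking speed.
    Thresholds (t1, t2, in W/V) and step sizes (in V) are illustrative.
    """
    dv, dp = v - v_prev, p - p_prev
    slope = dp / dv if abs(dv) > 1e-9 else 0.0
    mag = abs(slope)
    if mag < t1:
        step = step0      # zone 0: close to the MPP, perturb gently
    elif mag < t2:
        step = step1      # zone 1: moderate distance from the MPP
    else:
        step = step2      # zone 2: far from the MPP, largest step
    if slope > 0:
        return v_ref + step   # left of the MPP: raise the voltage reference
    if slope < 0:
        return v_ref - step   # right of the MPP: lower it
    return v_ref              # no power/voltage change: hold
```

In a full system, the returned voltage reference would feed the duty-cycle loop of the DC/DC boost converter; only the perturbation logic is sketched here.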

Keywords: solar photovoltaic systems, MPPT, voltage source converter, grid synchronization technique

Procedia PDF Downloads 581
10765 Robust Single/Multi-Bit Memristor-Based Memory

Authors: Ahmed Emara, Maged Ghoneima, Mohamed Dessouky

Abstract:

Demand for fast, low-power memories is increasing with the increase in IC complexity. In this paper, we introduce a proposal for a compact SRAM based on memristor devices. The compact size of the proposed cell (1T2M, compared to the 6T of traditional SRAMs) allows denser memories in the same area. We discuss the proposed memristor memory cell for single-/multi-bit data storage configurations, along with the writing and reading operations. Stored-data stability across successive read operations is illustrated, operational simulation results are presented, and the proposed design is compared with conventional SRAM and previously proposed memristor cells.
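
The paper's 1T2M cell is a circuit-level design, but the underlying storage principle can be sketched with the classic linear ion drift memristor model (Strukov et al., 2008). In the Python fragment below (all device parameters are illustrative, not taken from the paper), a voltage pulse train moves the device between resistance levels, which is what single- and multi-bit storage exploits:

```python
import numpy as np

# Linear ion drift memristor model (Strukov et al., 2008): a doped region
# of width w in series with an undoped region; total memristance
# interpolates between R_ON and R_OFF as w moves with the charge flowed.
R_ON, R_OFF = 100.0, 16e3   # ohms
D = 10e-9                   # device thickness (m)
MU_V = 1e-14                # dopant mobility (m^2 s^-1 V^-1)

def step(w, v, dt):
    m = R_ON * (w / D) + R_OFF * (1.0 - w / D)  # instantaneous memristance
    i = v / m                                   # current through the device
    w = w + MU_V * (R_ON / D) * i * dt          # dw/dt = mu_v * (R_on/D) * i
    return float(np.clip(w, 0.0, D)), m

# A write pulse train drives the device toward a lower-resistance state;
# the pulse count (or width) selects the target level, so distinct pulse
# budgets map to distinct multi-bit resistance levels.
w = 0.1 * D
for _ in range(5000):            # ~0.5 s of 1 V drive in 100 us steps
    w, m = step(w, 1.0, 1e-4)
print(f"memristance after write: {m:.0f} ohms")
```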

Keywords: memristor, multi-bit, single-bit, circuits, systems

Procedia PDF Downloads 357
10764 Adapting Inclusive Residential Models to Match Universal Accessibility and Fire Protection

Authors: Patricia Huedo, Maria José Ruá, Raquel Agost-Felip

Abstract:

Ensuring the sustainable development of urban environments means guaranteeing adequate environmental conditions, being resilient, and meeting conditions of safety and inclusion for all people, regardless of their condition. All existing buildings should meet basic safety conditions and be equipped with safe and accessible routes, along with visual, acoustic, and tactile signals to protect their users or potential visitors, regardless of whether the buildings undergo rehabilitation or change-of-use processes. Moreover, from a social perspective, we consider the need to prioritize buildings occupied by the most vulnerable groups of people, who currently do not have specific regulations tailored to their needs. Some residential models in operation are not only outside the scope of application of the regulations in force; they also lack a project or technical data that would allow the fire behavior of the construction materials to be known. However, the difficulty and cost involved in adapting the entire building stock to current regulations can never justify a lack of safety for people. Hence, this work develops a simplified model to assess compliance with the basic safety conditions in case of fire and its compatibility with the specific accessibility needs of each user. The purpose is to support the designer in decision making, as well as to contribute to the development of a basic fire safety certification tool to be applied to inclusive residential models. The work develops a methodology to support designers in adapting Social Services Centers, which are usually intended for vulnerable people. It incorporates a checklist of 9 items and information from sources or standards that designers can use to justify compliance or propose solutions. For each item, the verification system is justified and possible sources of consultation are provided, considering the possibility that technical documentation of the construction systems or building materials is lacking. The procedure is based on diagnosing the degree of compliance with fire conditions of residential models used by vulnerable groups, considering the special accessibility conditions required by each user group. Based on visual inspection and site surveying, the verification model can serve as a support tool, significantly streamlining and simplifying the diagnostic phase and reducing the number of tests to be requested by over 75%. To illustrate the methodology, two different buildings in the Valencian Region (Spain) were selected: a mental health facility for residential purposes, located in a rural area on the outskirts of a small town, and a day care facility for individuals with intellectual disabilities, located in a medium-sized city. The comparison between the case studies allows the model to be validated under distinct conditions. Verifying compliance with a basic safety level can allow a quality seal to be granted and a public register of buildings adapted to fire regulations to be established, similar to what is being done with other types of attributes such as energy performance.
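
As a purely hypothetical illustration of how such a checklist-based verification could be encoded as a support tool (the item codes, fields, and summary logic below are assumptions for the sketch, not the paper's actual instrument), a minimal Python sketch:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ChecklistItem:
    code: str                        # e.g. a hypothetical "FS-03"
    description: str
    complies: Optional[bool] = None  # None = could not be verified on site
    evidence: str = ""               # standard or source used to justify it

def diagnose(items: List[ChecklistItem]) -> dict:
    """Summarize the degree of compliance after a visual inspection."""
    verified = [it for it in items if it.complies is not None]
    passed = [it for it in verified if it.complies]
    return {
        "compliance_rate": len(passed) / len(verified) if verified else 0.0,
        "failed": [it.code for it in verified if not it.complies],
        "pending_tests": [it.code for it in items if it.complies is None],
    }

report = diagnose([
    ChecklistItem("FS-01", "Protected evacuation route", complies=True,
                  evidence="site survey"),
    ChecklistItem("FS-02", "Fire behavior of lining materials"),  # pending
])
print(report)
```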

Keywords: fire safety, inclusive housing, universal accessibility, vulnerable people

Procedia PDF Downloads 12