Search results for: data comparison
26876 Comparison of Allowable Stress Method and Time History Response Analysis for Seismic Design of Buildings
Authors: Sayuri Inoue, Naohiro Nakamura, Tsubasa Hamada
Abstract:
The seismic design of buildings is classified into two types: static design and dynamic design. Static design applies static forces that represent the seismic action and is a relatively simple method developed from roughly a century of experience with earthquake ground motion. At present, static design is used for most Japanese buildings. Dynamic design mainly refers to time history response analysis. It is a comparatively demanding method in which an assumed earthquake motion is input into the building model and the response is examined. Currently, it is used only for skyscrapers and specific buildings. Under the present design standard in Japan, either static design or dynamic design may be used for medium- and high-rise buildings. However, when middle- and high-rise buildings are actually designed with both methods, the relatively simple static design often satisfies the criteria, whereas the more demanding dynamic design often does not. This is because the dynamic design method was established with super high-rise buildings in mind; in short, higher safety is required than for general buildings, and the criteria become stricter. The authors consider applying the dynamic design method to general buildings that have so far been designed by the static design method. The reason is that the dynamic design method is a reasonable choice for buildings that fall outside conventional structural forms, for example when architectural design is emphasized. For that purpose, it is important to compare the design results when the criteria of both methods are set side by side. In this study, we performed time history response analysis on medium-rise buildings that had actually been designed with the allowable stress method. A quantitative comparison between static design and dynamic design was conducted, and the characteristics of both design methods were examined.
Keywords: buildings, seismic design, allowable stress design, time history response analysis, Japanese seismic code
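For readers unfamiliar with what the time history response analysis referred to above actually computes, the sketch below steps a single-degree-of-freedom building model through a recorded ground motion with the Newmark-beta method; the mass, stiffness, damping and input file are illustrative assumptions, not the buildings analysed in the study.

```python
import numpy as np

def newmark_beta(m, c, k, ag, dt, beta=0.25, gamma=0.5):
    """Linear SDOF time-history response to ground acceleration ag (m/s^2)."""
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    p = -m * ag                                   # effective earthquake force
    a[0] = p[0] / m                               # structure assumed initially at rest
    k_eff = k + gamma / (beta * dt) * c + m / (beta * dt ** 2)
    for i in range(n - 1):
        dp = (p[i + 1] - p[i]
              + (m / (beta * dt) + gamma / beta * c) * v[i]
              + (m / (2 * beta) + dt * (gamma / (2 * beta) - 1) * c) * a[i])
        du = dp / k_eff
        dv = gamma / (beta * dt) * du - gamma / beta * v[i] + dt * (1 - gamma / (2 * beta)) * a[i]
        da = du / (beta * dt ** 2) - v[i] / (beta * dt) - a[i] / (2 * beta)
        u[i + 1], v[i + 1], a[i + 1] = u[i] + du, v[i] + dv, a[i] + da
    return u, v, a

# Assumed equivalent SDOF properties for a medium-rise building and a hypothetical record file.
m, k = 1.0e6, 4.0e8                               # kg, N/m
c = 2 * 0.05 * np.sqrt(k * m)                     # 5% viscous damping
ag = np.loadtxt("ground_motion.txt")              # placeholder file, m/s^2 sampled at dt
u, _, _ = newmark_beta(m, c, k, ag, dt=0.01)
print("peak displacement [m]:", np.abs(u).max())
```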
Procedia PDF Downloads 155
26875 Analyzing Data Protection in the Era of Big Data under the Framework of Virtual Property Layer Theory
Authors: Xiaochen Mu
Abstract:
Data rights confirmation, as a key legal issue in the development of the digital economy, is undergoing a transition from a traditional rights paradigm to a more complex private-economic paradigm. In this process, data rights confirmation has evolved from a simple claim of rights to a complex structure encompassing multiple dimensions of personality rights and property rights. Current data rights confirmation practices are primarily reflected in two models: holistic rights confirmation and process rights confirmation. The holistic rights confirmation model continues the traditional "one object, one right" theory, while the process rights confirmation model, through contractual relationships in the data processing process, recognizes rights that are more adaptable to the needs of data circulation and value release. In the design of the data property rights system, there is a hierarchical characteristic aimed at decoupling from raw data to data applications through horizontal stratification and vertical staging. This design not only respects the ownership rights of data originators but also, based on the usufructuary rights of enterprises, constructs a corresponding rights system for different stages of data processing activities. The subjects of data property rights include both data originators, such as users, and data producers, such as enterprises, who enjoy different rights at different stages of data processing. The intellectual property rights system, with the mission of incentivizing innovation and promoting the advancement of science, culture, and the arts, provides a complete set of mechanisms for protecting innovative results. However, unlike traditional private property rights, the granting of intellectual property rights is not an end in itself; the purpose of the intellectual property system is to balance the exclusive rights of the rights holders with the prosperity and long-term development of society's public learning and the entire field of science, culture, and the arts. Therefore, the intellectual property granting mechanism provides both protection and limitations for the rights holder. This perfectly aligns with the dual attributes of data. In terms of achieving the protection of data property rights, the granting of intellectual property rights is an important institutional choice that can enhance the effectiveness of the data property exchange mechanism. Although this is not the only path, the granting of data property rights within the framework of the intellectual property rights system helps to establish fundamental legal relationships and rights confirmation mechanisms and is more compatible with the classification and grading system of data. The modernity of the intellectual property rights system allows it to adapt to the needs of big data technology development through special clauses or industry guidelines, thus promoting the comprehensive advancement of data intellectual property rights legislation. This paper analyzes data protection under the virtual property layer theory and two-fold virtual property rights system. Based on the “bundle of right” theory, this paper establishes specific three-level data rights. This paper analyzes the cases: Google v. Vidal-Hall, Halliday v Creation Consumer Finance, Douglas v Hello Limited, Campbell v MGN and Imerman v Tchenquiz. 
This paper concludes that recognizing property rights over personal data and protecting data under the framework of intellectual property will help establish the tort of misuse of personal information.
Keywords: data protection, property rights, intellectual property, big data
Procedia PDF Downloads 40
26874 Comparative Analysis of Pet-parent Reported Pruritic Symptoms in Cats: Data from Social Media Listening and Surveys
Authors: Georgina Cherry, Taranpreet Rai, Luke Boyden, Sitira Williams, Andrea Wright, Richard Brown, Viva Chu, Alasdair Cook, Kevin Wells
Abstract:
Estimating population-level burden, the abilities of pet-parents to identify disease, and the demand for veterinary services worldwide is challenging. The purpose of this study is to compare a feline pruritus survey with social media listening (SML) data discussing this condition. Surveys are expensive and labour-intensive to analyse, while SML data is freeform and requires careful filtering for relevance. This study considers data from a survey of owner-observed symptoms of 156 pruritic cats conducted using Pet Parade® and SML posts collected through web-scraping to gain insights into the characterisation and management of feline pruritus. SML posts matching a feline body area, behaviour and symptom were captured and reviewed for relevance, representing 1,299 public posts collected from 2021 to 2023. The survey involved 1067 pet-parents who reported on pruritic symptoms in their cats. Among the observed cats, approximately 18.37% (n=196) exhibited at least one symptom. The most frequently reported symptoms were hair loss (9.2%), bald spots (7.3%) and infection, crusting, scaling, redness, scabbing, or bumpy skin (8.2%). Notably, bald spots were the primary symptom reported for short-haired cats, while other symptoms were more prevalent in medium- and long-haired cats. Affected body areas, according to pet-parents, were primarily the head, face, chin and neck (27%), and the top of the body, along the spine (22%). 35% of all cats displayed excessive behaviours consistent with pruritic skin disease. Interestingly, 27% of these cats were perceived as non-symptomatic by their owners, suggesting an under-identification of itch-related signs. Furthermore, a significant proportion of symptomatic cats did not receive any skin disease medication, whether prescribed or over-the-counter (n=41). These findings indicate a higher incidence of pruritic skin disease in cats than recognized by pet owners, potentially leading to a lack of medical intervention for clinically symptomatic cases. The comparison between the survey and social media listening data revealed that bald spots were reported in similar proportions in both datasets (25% in the survey and 28% in SML). Infection, crusting, scaling, redness, scabbing, or bumpy skin accounted for 31% of symptoms in the survey, whereas this category represented 53% of relevant SML posts (excluding bumpy skin). Abnormal licking or chewing behaviours were mentioned by pet-parents in 40% of SML posts compared to 38% in the survey. The consistency in the findings of these two disparate data sources, including a complete overlap in affected body areas for the top 80% of social media listening posts, indicates minimal biases in each method, as significant biases would likely yield divergent results. Therefore, the strong agreement across pruritic symptoms, affected body areas, and reported behaviours enhances our confidence in the reliability of the findings. Moreover, the small differences identified between the datasets underscore the valuable insights that arise from utilising multiple data sources. These variations provide additional depth in characterising and managing feline pruritus, allowing for a more comprehensive understanding of the condition. By combining survey data and social media listening, researchers can obtain a nuanced perspective and capture a wider range of experiences and perspectives, supporting informed decision-making in veterinary practice.
Keywords: social media listening, feline pruritus, surveys, felines, cats, pet owners
Procedia PDF Downloads 127
26873 The Influence of Housing Choice Vouchers on the Private Rental Market
Authors: Randy D. Colon
Abstract:
Through a freedom of information request, data pertaining to Housing Choice Voucher (HCV) households have been obtained from the Chicago Housing Authority, including rent price and number of bedrooms per HCV household, community area, and zip code from 2013 to the first quarter of 2018. Similar data pertaining to the private rental market will be obtained through public records available from the United States Department of Housing and Urban Development. The datasets will be analyzed with statistical and mapping software to investigate the potential link between HCV households and distorted rent prices. Quantitative data will be supplemented by qualitative data to investigate the lived experience of Chicago residents. Qualitative data will be collected in Chicago's Englewood neighborhood through participation in neighborhood meetings and informal interviews with residents and community leaders. The qualitative data will be used to gain insight into the lived experience of community leaders and residents of the Englewood neighborhood in relation to housing, the rental market, and HCV. While there is an abundance of quantitative data on this subject, this qualitative data is necessary to capture the lived experience of local residents affected by a changing rental market. This topic reflects concerns voiced by members of the Englewood community, and this study aims to keep the community relevant in its findings.
Keywords: Chicago, housing, housing choice voucher program, housing subsidies, rental market
Procedia PDF Downloads 118
26872 The Dynamic Metadata Schema in Neutron and Photon Communities: A Case Study of X-Ray Photon Correlation Spectroscopy
Authors: Amir Tosson, Mohammad Reza, Christian Gutt
Abstract:
Metadata stands at the forefront of advancing data management practices within research communities, with particular significance in the realms of neutron and photon scattering. This paper introduces a groundbreaking approach—dynamic metadata schema—within the context of X-ray Photon Correlation Spectroscopy (XPCS). XPCS, a potent technique unravelling nanoscale dynamic processes, serves as an illustrative use case to demonstrate how dynamic metadata can revolutionize data acquisition, sharing, and analysis workflows. This paper explores the challenges encountered by the neutron and photon communities in navigating intricate data landscapes and highlights the prowess of dynamic metadata in addressing these hurdles. Our proposed approach empowers researchers to tailor metadata definitions to the evolving demands of experiments, thereby facilitating streamlined data integration, traceability, and collaborative exploration. Through tangible examples from the XPCS domain, we showcase how embracing dynamic metadata standards bestows advantages, enhancing data reproducibility, interoperability, and the diffusion of knowledge. Ultimately, this paper underscores the transformative potential of dynamic metadata, heralding a paradigm shift in data management within the neutron and photon research communities.
Keywords: metadata, FAIR, data analysis, XPCS, IoT
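As a loose illustration of the idea of a dynamic metadata schema, the sketch below keeps a fixed baseline record and lets an XPCS experiment register additional fields while it runs; every field name here is an invented placeholder rather than an existing neutron/photon community standard.

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class DynamicMetadata:
    """Baseline record plus experiment-defined extensions (illustrative only)."""
    baseline: Dict[str, Any]                     # fields every data set must carry
    extensions: Dict[str, Any] = field(default_factory=dict)

    def extend(self, key: str, value: Any) -> None:
        # New fields can be registered while the experiment is running.
        self.extensions[key] = value

    def as_record(self) -> Dict[str, Any]:
        return {**self.baseline, **self.extensions}

# Hypothetical XPCS acquisition; the field names below are assumptions, not a standard.
meta = DynamicMetadata(baseline={
    "facility": "example-synchrotron",
    "technique": "XPCS",
    "detector": "area-detector-01",
    "exposure_time_s": 0.002,
})
meta.extend("q_range_nm^-1", [0.01, 0.1])        # added because this particular run needs it
meta.extend("sample_temperature_K", 295.0)
print(meta.as_record())
```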
Procedia PDF Downloads 62
26871 A Study on Green Building Certification Systems within the Context of Anticipatory Systems
Authors: Taner Izzet Acarer, Ece Ceylan Baba
Abstract:
This paper examines green building certification systems and their current processes in comparison with anticipatory systems. Rapid growth of the human population and depletion of natural resources are causing irreparable damage to the urban and natural environment. In this context, the concept of ‘sustainable architecture’ emerged in the 20th century so as to establish and maintain standards for livable urban spaces, to improve the quality of urban life, and to preserve natural resources for future generations. The construction industry is responsible for a large part of resource consumption, and it is believed that the ‘green building’ designs that emerge in the construction industry can reduce environmental problems and contribute to sustainable development around the world. A building must meet a specific set of criteria, set forth through various certification systems, in order to be eligible for designation as a green building. It is disputable whether the methods used by green building certification systems today truly serve the purpose of creating a sustainable world. Accordingly, this study investigates the rating systems used by the most popular green building certification programs, including LEED (Leadership in Energy and Environmental Design), BREEAM (Building Research Establishment Environmental Assessment Method) and DGNB (Deutsche Gesellschaft für Nachhaltiges Bauen System), in terms of ‘Anticipatory Systems’, considering the certification processes and their goals, while discussing their contribution to architecture. The basic methodology of the study is as follows. First, a brief historical and literature review of green buildings and certification systems is presented. Second, the processes of green building certification systems are discussed with the help of anticipatory systems. Anticipatory Systems are systems designed to generate action-oriented projections and to forecast potential side effects using the most current data. Anticipatory Systems pull the future into the present and take action based on future predictions. Although they do not claim to see into the future, they can provide foresight data. When shaping the foresight data, Anticipatory Systems use feedforward instead of feedback, enabling them to forecast the system’s behavior and potential side effects by establishing a correlation between the system’s present/past behavior and projected results. In this study, the goals and current status of the LEED, BREEAM and DGNB rating systems, which were created using the feedback technique, are examined and presented in a chart. In addition, by examining these rating systems with an anticipatory system that uses the feedforward method, the negative influence of potential side effects on the purpose and current status of the rating systems is shown in another chart. By comparing the two sets of data, the findings show that the rating systems serve goals different from the purposes they aim for. In conclusion, the side effects of green building certification systems are stated by using anticipatory system models.
Keywords: anticipatory systems, BREEAM, certificate systems, DGNB, green buildings, LEED
Procedia PDF Downloads 220
26870 MNH-886(Bt.): A Cotton Cultivar (G. Hirsutum L.) for Cultivation in Virus Infested Regions of Pakistan, Having High Seed Cotton Yield and Desirable Fibre Characteristics
Authors: Wajad Nazeer, Saghir Ahmad, Khalid Mahmood, Altaf Hussain, Abid Mahmood, Baoliang Zhou
Abstract:
MNH-886(Bt.) is an upland cotton cultivar (Gossypium hirsutum L.) developed through the hybridization of three parents [(FH-207×MNH-770)×Bollgard-1] at the Cotton Research Station Multan, Pakistan. It is resistant to CLCuVD with 16.25% disease incidence (60 DAS, March sowing), whereas it is moderately susceptible to CLCuVD when planted in June, with a disease incidence of 34% (60 DAS). This disease reaction was the lowest among 25 advanced cotton lines/varieties tested at hot spots of CLCuVD. Its performance was tested from 2009 to 2012 in various indigenous, provincial, and national varietal trials in comparison with the commercial varieties IR-3701, AA-802 and CIM-496. In PCCT trials during 2009-10 and 2011-12, MNH-886 surpassed all the existing Bt. strains along with the commercial varieties across the Punjab province, with seed cotton yields of 2658 kg ha-1 and 2848 kg ha-1, which were 81.31% and 13% higher than the checks, respectively. In the National Coordinated Bt. Trial, MNH-886(Bt.) produced 3347 kg ha-1 seed cotton at CCRI, Multan, the hot spot of CLCuVD, compared with 2556 kg ha-1 for IR-3701. It possesses a higher lint percentage (41.01%), along with highly desirable fibre traits (staple length 28.210 mm, micronaire value 4.95 µg inch-1, fibre strength 99.5 tppsi, and uniformity ratio 82.0%). Quantification of the crystal protein was positive for Cry1Ab/Ac protein with a toxicity level of 2.76 µg g-1, and the Mon 531 event was confirmed. Given its tremendous yield potential, good fibre traits, and strong tolerance to CLCuVD, this variety can be recommended for cultivation in the CLCuVD hotspots of Pakistan.
Keywords: cotton, cultivar, cotton leaf curl virus, CLCuVD hit districts
Procedia PDF Downloads 318
26869 Comparing Productivity of the Foreign versus Local Construction Workers Based on Their Level of Technical Training and Cultural Characteristics: Case Study of Kish Island, Iran
Authors: Mansour Rezvani, Mohammad Mahdi Mortaheb
Abstract:
This study considers the employment of a foreign workforce in the Kish Free Trade and Industrial Zone and aims to investigate the productivity of foreign construction labourers as compared to their local counterparts. Moreover, this study compares the work skills and experience of foreign and local Iranian construction workers in order to optimize construction working conditions. The results and findings have been applied to develop a training program to optimize and promote the productivity and effectiveness of the Iranian workforce in the construction industry in comparison with the foreign workforce. It is hoped that the accumulated findings will contribute to decreasing the demand for foreign workers and the skills shortages in the construction sector. As a result, job vacancies for local residents of Kish and for job seekers from the mainland will increase. Data were collected by distributing a questionnaire to, and interviewing, foreign construction workers, local Iranian construction workers and the project managers of five mega projects on Kish Island, including Mica Mall, Basak, Persian, Damoon and Sarina Mall. All data were analyzed with SPSS and Excel software. A topic-related survey was conducted through a structured questionnaire covering 54 employers, 20 contractors and 13 consultants. About 56 factors were identified. After implementing the content validity test, 52 factors were stated in 52 questions based on five major groups: (1) economic, (2) social and cultural, (3) individual, (4) technical, and (5) organizational, environmental and legal. Based on the quantified Relative Importance Index, the ten most important factors, the ten least important factors, and the three most important categories were identified. To date, there is no comprehensive study that explores the critical factors in mega construction projects on Kish Island in order to identify the major problems and decrease the demand for foreign workers.
Keywords: cultural characteristics, foreign worker, local construction workers, productivity, technical training
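The Relative Importance Index mentioned above is commonly computed as RII = ΣW / (A × N), where W is the rating given by each respondent, A is the highest possible rating and N is the number of respondents; the small sketch below applies that standard formula to made-up ratings, so the factor names and scores are purely illustrative.

```python
def relative_importance_index(ratings, highest=5):
    """RII = sum of ratings / (highest possible rating * number of respondents)."""
    return sum(ratings) / (highest * len(ratings))

# Hypothetical 5-point Likert ratings for two factors (invented numbers only).
factors = {
    "availability of technical training": [5, 4, 5, 3, 4],
    "language barriers on site":          [3, 2, 4, 3, 3],
}
ranked = sorted(factors, key=lambda f: relative_importance_index(factors[f]), reverse=True)
for name in ranked:
    print(f"{name}: RII = {relative_importance_index(factors[name]):.2f}")
```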
Procedia PDF Downloads 148
26868 An Integrated Lightweight Naïve Bayes Based Webpage Classification Service for Smartphone Browsers
Authors: Mayank Gupta, Siba Prasad Samal, Vasu Kakkirala
Abstract:
The internet world and its priorities have changed considerably in the last decade. Browsing on smartphones has increased manifold and is set to grow much more. Users spend considerable time browsing different websites, which gives a great deal of insight into users' preferences. Instead of presenting plain information, classifying the different aspects of browsing, such as Bookmarks, History, and Download Manager, into useful categories would improve and enhance the user's experience. Most classification solutions are server-side, which involves maintaining servers and other heavy resources, has security constraints, and may miss contextual data during classification. On-device classification solves many such problems, but the challenge is to achieve classification accuracy under resource constraints. On-device classification can be much more useful for personalization, reducing dependency on cloud connectivity, and providing better privacy/security. This approach provides more relevant results than current standalone solutions because it uses the content rendered by the browser, which is customized by the content provider based on the user's profile. This paper proposes a Naive Bayes based lightweight classification engine targeted at resource-constrained devices. Our solution integrates with the web browser, which in turn triggers the classification algorithm. Whenever a user browses a webpage, the solution extracts DOM tree data from the browser's rendering engine. This DOM data is dynamic, contextual and secure data that cannot be replicated. The proposal extracts different features of the webpage and runs an algorithm to classify it into multiple categories. A Naive Bayes based engine is chosen in this solution for its inherent advantages in using limited resources compared to other classification algorithms such as Support Vector Machines and Neural Networks. Naive Bayes classification requires a small memory footprint and little computation, making it suitable for a smartphone environment. The solution can partition the model into multiple chunks, which in turn facilitates lower memory usage than loading a complete model. Classification of webpages through the integrated engine is faster, more relevant and more energy-efficient than other standalone on-device solutions. This classification engine has been tested on Samsung Z3 Tizen hardware. The engine is integrated into the Tizen Browser, which uses the Chromium rendering engine. For this solution, an extensive dataset was sourced from dmoztools.net and cleaned. This cleaned dataset has 227.5K webpages, which are divided into 8 generic categories ('education', 'games', 'health', 'entertainment', 'news', 'shopping', 'sports', 'travel'). Our browser-integrated solution has resulted in 15% less memory usage (due to the partition method) and 24% less power consumption in comparison with the standalone solution. This solution used 70% of the dataset for training the data model and the remaining 30% for testing. An average accuracy of ~96.3% is achieved across the above-mentioned 8 categories. This engine can be further extended to suggest dynamic tags and to use the classification for different use cases to enhance the browsing experience.
Keywords: chromium, lightweight engine, mobile computing, Naive Bayes, Tizen, web browser, webpage classification
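A minimal sketch of the kind of pipeline described above, bag-of-words features from page text fed to a multinomial Naive Bayes classifier over the eight categories, is given below using scikit-learn; the DOM feature extraction and model partitioning of the actual engine are not reproduced, and the training texts are placeholders.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

CATEGORIES = ["education", "games", "health", "entertainment",
              "news", "shopping", "sports", "travel"]

# Placeholder training data; the real model is trained on ~227.5K labelled pages.
train_texts = ["exam timetable lecture notes", "match score league fixtures",
               "flight hotel itinerary booking"]
train_labels = ["education", "sports", "travel"]

clf = make_pipeline(CountVectorizer(), MultinomialNB())
clf.fit(train_texts, train_labels)

# At browse time, text extracted from the rendered page would be classified on device.
page_text = "discount checkout cart free shipping"   # assumed page-derived text
print(clf.predict([page_text])[0])
```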
Procedia PDF Downloads 163
26867 Exploring SSD Suitable Allocation Schemes Incompliance with Workload Patterns
Authors: Jae Young Park, Hwansu Jung, Jong Tae Kim
Abstract:
Whether data have been well parallelized is an important factor in Solid-State Drive (SSD) performance. SSD parallelization is affected by the allocation scheme, and it is directly connected to SSD performance. The representative allocation schemes are dynamic allocation and static allocation. Dynamic allocation is more adaptive in exploiting write-operation parallelism, while static allocation is better for read-operation parallelism. Therefore, it is hard to select the appropriate allocation scheme when the workload mixes read and write operations. We simulated conditions for a few mixed data patterns and analyzed the results to help make the right choice for better performance. The results show that if the data arrival interval is long enough for prior operations to finish, and in a continuously read-intensive data environment, static allocation is more suitable. Dynamic allocation performs best for write performance and random data patterns.
Keywords: dynamic allocation, NAND flash based SSD, SSD parallelism, static allocation
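To make the static/dynamic distinction concrete, the toy sketch below assigns incoming page writes to flash channels in the two ways described: statically by logical page number, and dynamically to whichever channel frees up first. It is only an illustrative model with assumed latencies, not the simulator used in the study.

```python
NUM_CHANNELS = 4

def static_channel(lpn: int) -> int:
    # Static allocation: the logical page number fixes the channel
    # (good for reads, since each page's location is predictable).
    return lpn % NUM_CHANNELS

def dynamic_channel(busy_until, now):
    # Dynamic allocation: pick the channel that becomes free earliest
    # (good for write parallelism under bursty traffic).
    return min(range(NUM_CHANNELS), key=lambda ch: max(busy_until[ch], now))

# Tiny illustrative trace of (arrival_time, logical_page_number) write requests.
trace = [(0, 17), (1, 18), (1, 21), (2, 25)]
busy_until = [0] * NUM_CHANNELS
for now, lpn in trace:
    ch = dynamic_channel(busy_until, now)
    busy_until[ch] = max(busy_until[ch], now) + 3     # assumed program latency
    print(f"write lpn {lpn}: static->ch{static_channel(lpn)}, dynamic->ch{ch}")
```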
Procedia PDF Downloads 340
26866 Employing Remotely Sensed Soil and Vegetation Indices and Predicting by Long Short-Term Memory to Irrigation Scheduling Analysis
Authors: Elham Koohikerade, Silvio Jose Gumiere
Abstract:
In this research, irrigation is highlighted as crucial for improving both the yield and quality of potatoes due to their high sensitivity to soil moisture changes. The study presents a hybrid Long Short-Term Memory (LSTM) model aimed at optimizing irrigation scheduling in potato fields in Quebec City, Canada. This model integrates model-based and satellite-derived datasets to simulate soil moisture content, addressing the limitations of field data. Developed under the guidance of the Food and Agriculture Organization (FAO), the simulation approach compensates for the lack of direct soil sensor data, enhancing the LSTM model's predictions. The model was calibrated using indices such as Surface Soil Moisture (SSM), the Normalized Difference Vegetation Index (NDVI), the Enhanced Vegetation Index (EVI), and the Normalized Multi-band Drought Index (NMDI) to effectively forecast soil moisture reductions. Understanding soil moisture and plant development is crucial for assessing drought conditions and determining irrigation needs. This study validated the spectral characteristics of vegetation and soil using ECMWF Reanalysis v5 (ERA5) and Moderate Resolution Imaging Spectroradiometer (MODIS) data from 2019 to 2023, collected from agricultural areas in Dolbeau and Peribonka, Quebec. Parameters such as surface volumetric soil moisture (0-7 cm), NDVI, EVI, and NMDI were extracted from these images. A regional four-year dataset of soil and vegetation moisture was developed using a machine learning approach combining model-based and satellite-based datasets. The LSTM model predicts soil moisture dynamics hourly across different locations and times, with its accuracy verified through cross-validation and comparison with existing soil moisture datasets. The model effectively captures temporal dynamics, making it valuable for applications requiring soil moisture monitoring over time, such as anomaly detection and memory analysis. By identifying typical peak soil moisture values and observing distribution shapes, irrigation can be scheduled to maintain soil moisture within Volumetric Soil Moisture (VSM) values of 0.25 to 0.30 m³/m³, avoiding under- and over-watering. The strong correlations between parcels suggest that a uniform irrigation strategy might be effective across multiple parcels, with adjustments based on specific parcel characteristics and historical data trends. The application of the LSTM model to predict soil moisture and vegetation indices yielded mixed results. While the model effectively captures the central tendency and temporal dynamics of soil moisture, it struggles to accurately predict EVI, NDVI, and NMDI.
Keywords: irrigation scheduling, LSTM neural network, remotely sensed indices, soil and vegetation monitoring
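A minimal Keras sketch of the kind of LSTM regressor described, sequences of remotely sensed indices (SSM, NDVI, EVI, NMDI) in and next-step volumetric soil moisture out, is shown below; the window length, layer sizes and placeholder arrays are assumptions rather than the study's actual configuration.

```python
import numpy as np
import tensorflow as tf

TIMESTEPS, FEATURES = 24, 4          # 24 hourly steps of [SSM, NDVI, EVI, NMDI] (assumed)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1)          # predicted volumetric soil moisture (m3/m3)
])
model.compile(optimizer="adam", loss="mse")

# Placeholder arrays standing in for the ERA5/MODIS-derived training windows.
X = np.random.rand(512, TIMESTEPS, FEATURES).astype("float32")
y = np.random.rand(512, 1).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, validation_split=0.2, verbose=0)

vsm_pred = model.predict(X[:1])[0, 0]
print("predicted VSM:", vsm_pred, "-> irrigate" if vsm_pred < 0.25 else "-> hold off")
```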
Procedia PDF Downloads 42
26865 Waters Colloidal Phase Extraction and Preconcentration: Method Comparison
Authors: Emmanuelle Maria, Pierre Crançon, Gaëtane Lespes
Abstract:
Colloids are ubiquitous in the environment and are known to play a major role in enhancing the transport of trace elements, thus being an important vector for contaminant dispersion. The study and characterization of colloids are necessary to improve our understanding of the fate of pollutants in the environment. However, in stream water and groundwater, colloids are often very poorly concentrated. It is therefore necessary to pre-concentrate colloids in order to obtain enough material for analysis, while preserving their initial structure. Many techniques are used to extract and/or pre-concentrate the colloidal phase from the bulk aqueous phase, but there is neither a reference method nor an estimation of the impact of these different techniques on colloid structure, nor of the bias introduced by the separation method. In the present work, we have tested and compared several methods of colloidal phase extraction/pre-concentration, and their impact on colloid properties, particularly their size distribution and elemental composition. Ultrafiltration methods (frontal, tangential and centrifugal) have been considered since they are widely used for the extraction of colloids in natural waters. To compare these methods, a 'synthetic groundwater' was used as a reference. The size distribution (obtained by Field-Flow Fractionation (FFF)) and the chemical composition of the colloidal phase (obtained by Inductively Coupled Plasma Mass Spectrometry (ICP-MS) and Total Organic Carbon (TOC) analysis) were chosen as comparison factors. In this way, it is possible to estimate the impact of pre-concentration on the preservation of the colloidal phase. It appears that some of these methods preserve the colloidal phase composition more efficiently, while others are easier/faster to use. The choice of extraction/pre-concentration method is therefore a compromise between efficiency (including speed and ease of use) and impact on the structural and chemical composition of the colloidal phase. In perspective, the use of these methods should enhance the consideration of the colloidal phase in the transport of pollutants in environmental assessment studies and forensics.
Keywords: chemical composition, colloids, extraction, preconcentration methods, size distribution
Procedia PDF Downloads 216
26864 The Challenges of Digital Crime Nowadays
Authors: Bendes Ákos
Abstract:
Digital evidence will be the most widely used type of evidence in the future. With the development of the modern world, more and more new types of crime have evolved and transformed. For this reason, it is extremely important to examine these types of crime in order to get a comprehensive picture of them, with which we can support the authorities' work. In 1865, with early technologies, people were already able to forge a picture of a quality that is not even recognized today. With the help of today's technology, authorities receive a lot of false evidence. Officials are not able to process such a large amount of data, nor do they have the necessary technical knowledge to get a real picture of the authenticity of given evidence. The digital world has many dangers. Unfortunately, we live in an age where we must protect everything digitally: our phones, our computers, our cars, and all the smart devices that are present in our personal lives, and this is not only a burden on us, since companies, state institutions and public utilities are also forced to do so. The training of specialists and experts is essential so that the authorities can manage incoming digital evidence at some level. When analyzing evidence, it is important to be able to examine it from the moment it is created. Establishing authenticity is a very important issue during official procedures. After the proper acquisition of the evidence, it is essential to store it safely and use it professionally. Otherwise, it will not have sufficient probative value and, in case of doubt, the court will always decide in favor of the defendant. One of the most common problems in the world of digital data and evidence is doubt, which is why it is extremely important to examine the above-mentioned problems. The most effective way to avoid digital crimes is to prevent them, for which proper education and knowledge are essential. The aim is to present the dangers inherent in the digital world and the new types of digital crime. After comparing Hungarian investigative techniques with international practice, modernizing proposals will be given. A sufficiently stable yet flexible legislation is needed that can follow the rapid changes in the world and provide an appropriate framework, rather than regulating after the fact. It is also important to be able to distinguish between digital and digitalized evidence, as their degrees of probative force differ greatly. The aim of the research is to promote effective international cooperation and uniform legal regulation in the world of digital crime.
Keywords: digital crime, digital law, cyber crime, international cooperation, new crimes, skepticism
Procedia PDF Downloads 63
26863 Social Data Aggregator and Locator of Knowledge (STALK)
Authors: Rashmi Raghunandan, Sanjana Shankar, Rakshitha K. Bhat
Abstract:
Social media contributes a vast amount of data and information about individuals to the internet. This project greatly reduces the need for unnecessary manual analysis of large and diverse social media profiles by filtering out and combining the useful information from various social media profiles and eliminating irrelevant data. It differs from existing social media aggregators in that it does not provide a consolidated view of the various profiles. Instead, it provides consolidated information derived from the subject's posts and other activities. It also allows analysis over multiple profiles and analytics based on several profiles. We strive to offer a query system that gives a natural-language answer to questions when a user does not wish to go through an entire profile. The information provided can be filtered according to the different use cases it is used for.
Keywords: social network, analysis, Facebook, LinkedIn, git, big data
Procedia PDF Downloads 444
26862 Comparison of Modulus from Repeated Plate Load Test and Resonant Column Test for Compaction Control of Trackbed Foundation
Authors: JinWoog Lee, SeongHyeok Lee, ChanYong Choi, Yujin Lim, Hojin Cho
Abstract:
The primary function of the trackbed in a conventional railway track system is to decrease the stresses in the subgrade to an acceptable level. A properly designed trackbed layer performs this task adequately. Many design procedures use assumed and/or critical stiffness values of the layers, obtained mostly in the field, to calculate an appropriate thickness for the sublayers of the trackbed foundation. However, those stiffness values do not clearly and precisely account for the strain levels in the layers. This study proposes a method of computing stiffness that can handle the strain level in the layers of the trackbed foundation, in order to provide properly selected design values of the layer stiffness. Since the shear modulus depends on the shear strain level, the strain levels generated in the subgrade of the trackbed under wheel loading and below the plate of the Repeated Plate Bearing Test (RPBT) are investigated with the finite element analysis programs ABAQUS and PLAXIS. The strain levels generated in the subgrade from the RPBT are compared to the values from the Resonant Column (RC) test, after consideration of strain levels and stresses. For comparison of the shear modulus G obtained from the RC test and the stiffness modulus Ev2 obtained from the RPBT in the field, a large number of mid-size RC tests in the laboratory and RPBTs in the field were performed. It was found in this study that there is a large difference in stiffness modulus when the converted Ev2 values are compared to the values from the RC test. It is verified in this study that precise and finer loading steps are necessary to construct nonlinear curves from the RPBT in order to obtain correct Ev2 values at the proper strain levels.
Keywords: modulus, plate load test, resonant column test, trackbed foundation
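For orientation, the sketch below shows the kind of conversion being compared: a strain modulus Ev taken from the reloading branch of a plate load test with the simplified secant expression Ev = 1.5·r·Δσ/Δs (as in DIN 18134), set against a Young's modulus derived from the resonant column shear modulus via E = 2G(1+ν). The numbers are invented, and the strain-level correction that the paper argues for is deliberately not included.

```python
def ev_from_plate_load(radius_m, d_sigma_kpa, d_settlement_mm):
    """Simplified secant strain modulus from a plate load test: Ev = 1.5*r*dSigma/ds."""
    return 1.5 * radius_m * d_sigma_kpa / (d_settlement_mm / 1000.0)   # result in kPa

def e_from_shear_modulus(g_mpa, poisson=0.3):
    """Elastic conversion E = 2G(1+nu); only meaningful at comparable strain levels."""
    return 2.0 * g_mpa * (1.0 + poisson)

# Invented example values for a 300 mm plate and an RC specimen of the same material.
ev2_kpa = ev_from_plate_load(radius_m=0.15, d_sigma_kpa=250.0, d_settlement_mm=0.9)
e_rc_mpa = e_from_shear_modulus(g_mpa=120.0)
print(f"Ev2 from RPBT  ~ {ev2_kpa / 1000:.1f} MPa")
print(f"E from RC test ~ {e_rc_mpa:.1f} MPa (small-strain, hence typically much higher)")
```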
Procedia PDF Downloads 496
26861 Data Integrity between Ministry of Education and Private Schools in the United Arab Emirates
Authors: Rima Shishakly, Mervyn Misajon
Abstract:
Education is similar to other businesses and industries. Achieving data integrity is essential in order to attain significant support for all the stakeholders in the educational sector. Efficient data collection, flow, processing, storage and retrieval are vital in order to deliver successful solutions to the different stakeholders. The Ministry of Education (MOE) in the United Arab Emirates (UAE) has adopted 'Education 2020', a series of five-year plans designed to introduce advanced education management information systems. As part of this program, in 2010 the MOE implemented Student Information Systems (SIS) to manage and monitor the students' data and the information flow between the MOE and international private schools in the UAE. This paper discusses data integrity concerns between the MOE and private schools. The paper clarifies the data integrity issues and indicates the challenges that face private schools in the UAE.
Keywords: education management information systems (EMIS), student information system (SIS), United Arab Emirates (UAE), Ministry of Education (MOE), the Knowledge and Human Development Authority (KHDA), Abu Dhabi Education Council (ADEC)
Procedia PDF Downloads 222
26860 The Cartometric-Geographical Analysis of Ivane Javakhishvili 1922: The Map of the Republic of Georgia
Authors: Manana Kvetenadze, Dali Nikolaishvili
Abstract:
The study revealed the territorial changes of Georgia before the Soviet and post-Soviet periods. This includes the estimation of the country's borders, the changes in its administrative-territorial arrangement, as well as the establishment of territorial losses. Georgia's old and new borders marked on the map are of great interest. The new boundary shows the situation in 1922, at the beginning of the Soviet period. Neither on this map nor in his other works does Ivane Javakhishvili explain what he means by the old borders, though it is evident that this is the pre-Soviet boundary before 1921, i.e., the period when historical Tao, Zaqatala, Lore and Karaia were still parts of Georgia. In cartometric-geographical terms, the work presents a detailed analysis of Georgia's borders; along with this, a comparison of the research results has been carried out: 1) with the boundary line on Soviet topographic maps at scales of 1:100,000, 1:50,000 and 1:25,000; 2) with Ivane Javakhishvili's work ('The borders of Georgia in terms of historical and contemporary issues'). During the research, we used a multi-disciplinary methodology and software. We used ArcGIS for georeferencing the maps, and after that we compared all post-Soviet maps in order to determine how the borders have changed. During this work, we also used a large amount of historical data. The features of the spatial distribution of the administrative-territorial units of Georgia, as well as the distribution of the objects depicted on the map across these units, have been established. The results obtained are presented in the form of thematic maps and diagrams.
Keywords: border, GIS, Georgia, historical cartography, old maps
Procedia PDF Downloads 243
26859 Is HR in a State of Transition? An International Comparative Study on the Development of HR Competencies
Authors: Barbara Covarrubias Venegas, Sabine Groblschegg, Bernhard Klaus, Julia Domnanovich
Abstract:
Research Objectives: The roles and activities of human resource management (HRM) have changed considerably in the past years. Driven by a changing environment and, therefore, new business requirements, the scope of human resource (HR) activities has widened. The extent to which these activities should focus on strategic issues to support the long-term success of a company has been discussed in science for many years. As many economies of Central and Eastern Europe (CEE) experienced a phase of transition after the socialist era and are now recovering from the 2008 global crisis, it is necessary to examine the current state of HR positioning. Furthermore, a trend can be noticed in which HR functions develop from rather administrative units into strategic partners of management. This leads to the question of how to better understand the underlying competencies which are necessary to support organisations. This topic was addressed by the international study 'HR Competencies in International Comparison'. The quantitative survey was conducted by the Institute for Human Resources & Organisation of the FHWien University of Applied Sciences of WKW (Austria) in cooperation with partner universities in Bosnia-Herzegovina, Croatia, Serbia and Slovenia. Methodology: Using the questionnaire developed by Dave Ulrich, we tested whether the HR Competency Model can be used for Austria, Bosnia and Herzegovina, Croatia, Serbia and Slovenia. After performing confirmatory and exploratory factor analysis for the whole data set containing all five countries, we could clearly distinguish four competencies. In a further step, our analysis focused on median and average comparisons between the HR competency dimensions. Conclusion: Our literature review, in alignment with other studies, shows a relatively rapid pace of development of HR roles and HR competencies in BCSS in the past decades. Comparing data from BCSS and Austria, we can still notice that, as regards strategic orientation, there is a gap in the BCSS countries; thus, competencies are not as developed as in Austria. This leads us to the tentative conclusion that HR has undergone a rapid change but is still in a state of transition from being a rather administrative unit to performing the role of a strategic partner.
Keywords: comparative study, HR competencies, HRM, HR roles
Procedia PDF Downloads 310
26858 Towards a Balancing Medical Database by Using the Least Mean Square Algorithm
Authors: Kamel Belammi, Houria Fatrim
Abstract:
Imbalanced data sets, a problem often found in real-world applications, can have a seriously negative effect on the classification performance of machine learning algorithms. There have been many attempts at dealing with the classification of imbalanced data sets. In medical diagnosis classification, we often face an imbalanced number of data samples between the classes, in which there are not enough samples in the rare classes. In this paper, we propose a learning method based on a cost-sensitive extension of the Least Mean Square (LMS) algorithm that penalizes the errors of different samples with different weights, together with some rules of thumb to determine those weights. After the balancing phase, we apply different classifiers (support vector machine (SVM), k-nearest neighbour (KNN) and multilayer neural networks (MNN)) to the balanced data set. We have also compared the results obtained before and after the balancing method.
Keywords: multilayer neural networks, k-nearest neighbor, support vector machine, imbalanced medical data, least mean square algorithm, diabetes
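A bare-bones version of the cost-sensitive LMS idea, the standard LMS weight update scaled by a per-sample cost so that errors on the rare class are penalised more heavily, is sketched below; the inverse-class-frequency weights are one possible rule of thumb assumed for illustration, not necessarily the authors' choice.

```python
import numpy as np

def cost_sensitive_lms(X, y, costs, mu=0.01, epochs=50):
    """LMS with per-sample cost: w <- w + mu * c_i * e_i * x_i."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x_i, y_i, c_i in zip(X, y, costs):
            e_i = y_i - x_i @ w              # prediction error for this sample
            w += mu * c_i * e_i * x_i        # rare-class samples get larger updates
    return w

# Toy imbalanced data: class +1 is rare; costs ~ inverse class frequency (assumed rule).
rng = np.random.default_rng(0)
X = np.hstack([rng.normal(size=(100, 2)), np.ones((100, 1))])   # two features + bias column
y = np.where(np.arange(100) < 90, -1.0, 1.0)
X[y > 0, :2] += 2.0                                              # shift the rare class
costs = np.where(y > 0, 100 / 10, 100 / 90)
w = cost_sensitive_lms(X, y, costs)
print("learned weights:", w)
```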
Procedia PDF Downloads 532
26857 A Gendered Perspective of the Influence of Public Transport Infrastructural Design on Accessibility
Authors: Ajeni Ari, Chiara Maria Leva, Lorraine D’Arcy, Mary Kinahan
Abstract:
In addressing gender and transport, consideration of mobility disparities amongst users is important. Public transport (PT) policy and design do not efficiently account for the varied mobility practices of men and women, with the literature only recently showing a movement towards gender inclusion in transport. Transport policy and design largely remain gender-blind to the variation in mobility needs. The global movement towards sustainability highlights the need for expeditious strategies that could mitigate biases within the existing system. At the forefront of such a plan of action, in part, may be mandated inclusive infrastructural designs that stimulate user engagement with the transport system. Fundamentally, access requires a means or an opportunity, which for PT is established by its physical environment and/or infrastructural design. Knowledge of shortcomings in the tangible or intangible aspects of the service offering can therefore be used to improve access to opportunities. To inform on existing biases in PT planning and design, this study analyses qualitative data to examine the opinions and lived experiences of transport users in Ireland. Findings show that infrastructural design plays a significant role in users' engagement with the service. Paramount to accessibility are service provisions that cater to both users' interactions and those of their dependents. Apprehension about using the service is more evident in women than in men, particularly while carrying out household duties and caring responsibilities at peak times or during dark hours. Furthermore, limitations are apparent in infrastructural service offerings that do not accommodate the physical (dis)ability of users, especially universal design. There are intersecting factors that impinge on accessibility, e.g., safety and security, yet the infrastructural design remains an important parameter influencing users' perceptions. Additionally, the data disclose the need for user intricacies to be factored into transport planning geared towards gender inclusivity, including mobility practices, travel purpose, transit time or location, and system integration.
Keywords: infrastructure design, public transport, accessibility, women, gender
Procedia PDF Downloads 75
26856 Performance of Shariah-Based Investment: Evidence from Pakistani Listed Firms
Authors: Mohsin Sadaqat, Hilal Anwar Butt
Abstract:
Following the stock selection guidelines provided by the Sharia Board (SB), we segregate the firms listed on the Pakistan Stock Exchange (PSX) into Sharia Compliant (SC) and Non-Sharia Compliant (NSC) stocks. Subsequently, we form portfolios within each group based on market capitalization and volatility. The purpose is to analyze and compare the performance of these two groups, as SC stocks have fewer diversification opportunities due to SB restrictions. Using data ranging from January 2004 until June 2016, our results indicate that in most cases the risk-adjusted returns (alphas) of the return differential between SC and NSC firms are positive. In addition, the SC firms, in comparison to their counterparts on the PSX, provide excess returns that are hedged against the market, size, and value-based systematic risk factors. Overall, these results reconcile with the prevailing notion that SC stocks, which have lower financial leverage and higher investment in real assets, are less exposed to market-based risks. Further, SC firms that are more capitalized and less volatile perform better than less capitalized and more volatile SC and NSC firms. To sum up our results, we do not find any substantial evidence of an opportunity loss due to limited diversification opportunities in the case of SC firms. To optimally utilize scarce resources, investors should consider SC firms as candidates in portfolio construction.
Keywords: diversification, performance, sharia compliant stocks, risk adjusted returns
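The risk-adjusted returns (alphas) hedged against market, size and value factors correspond to the intercept of a three-factor time-series regression; a generic sketch with statsmodels is given below, where the file name and the column names for the SC-minus-NSC return differential and the factor series are placeholders.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical monthly data: excess return differential (SC minus NSC) and three factors.
df = pd.read_csv("psx_factors.csv")          # assumed columns: ret_diff, MKT, SMB, HML

X = sm.add_constant(df[["MKT", "SMB", "HML"]])
model = sm.OLS(df["ret_diff"], X).fit(cov_type="HAC", cov_kwds={"maxlags": 6})

alpha = model.params["const"]                # risk-adjusted return of the differential
print(f"alpha = {alpha:.4f} (t = {model.tvalues['const']:.2f})")
print(model.summary())
```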
Procedia PDF Downloads 199
26855 Decarbonising Urban Building Heating: A Case Study on the Benefits and Challenges of Fifth-Generation District Heating Networks
Authors: Mazarine Roquet, Pierre Dewallef
Abstract:
The building sector, both residential and tertiary, accounts for a significant share of greenhouse gas emissions. In Belgium, partly due to the poor insulation of the building stock, but certainly because of the massive use of fossil fuels for heating buildings, this share reaches almost 30%. To reduce carbon emissions from urban building heating, district heating networks emerge as a promising solution, as they offer various assets such as improving the load factor, integrating combined heat and power systems, and enabling energy source diversification, including renewable sources and waste heat recovery. However, mainly for the sake of simple operation, most existing district heating networks still operate at high or medium temperatures ranging between 120°C and 60°C (the so-called second- and third-generation district heating networks). Although these district heating networks offer energy savings in comparison with individual boilers, such temperature levels generally require the use of fossil fuels (mainly natural gas) with combined heat and power. Fourth-generation district heating networks improve the transport and energy conversion efficiency by decreasing the operating temperature to between 50°C and 30°C. Yet, to decarbonise building heating, one must increase waste heat recovery and use mainly wind, solar or geothermal sources for the remaining heat supply. Fifth-generation networks operating between 35°C and 15°C offer the possibility to decrease the transport losses even further, to increase the share of waste heat recovery, and to use electricity from renewable resources through heat pumps that generate low-temperature heat. The main objective of this contribution is to exhibit, on a real-life test case, the benefits of replacing an existing third-generation network with a fifth-generation one in order to decarbonise the heat supply of the building stock. The second objective of the study is to highlight the difficulties resulting from the use of a fifth-generation, low-temperature district heating network. To do so, a simulation model of the district heating network, including its regulation, is implemented in the modelling language Modelica. This model is applied to the test case of the heating network on the University of Liège's Sart Tilman campus, consisting of around sixty buildings. The model is validated with monitoring data and then adapted for low-temperature networks. A comparison of primary energy consumption as well as CO2 emissions is made between the two cases to underline the benefits in terms of energy independence and GHG emissions. To highlight the complexity of operating a low-temperature network, the difficulty of adapting the mass flow rate to the heat demand is considered. This shows the difficult balance between thermal comfort and the electrical consumption of the circulation pumps. Several control strategies are considered and compared with respect to the global energy savings. The developed model can be used to assess the potential for energy and CO2 emissions savings when retrofitting an existing network or when designing a new one.
Keywords: building simulation, fifth-generation district heating network, low-temperature district heating network, urban building heating
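One reason lower supply temperatures cut transport losses can be illustrated with a back-of-the-envelope calculation: per metre of buried pre-insulated pipe, the steady heat loss is roughly proportional to the difference between the network temperature and the surrounding ground, so dropping the supply from around 75°C to around 25°C shrinks that driving temperature difference several-fold. The sketch below uses an assumed linear heat-loss coefficient and invented temperatures; it is not the Modelica campus model described in the abstract.

```python
def pipe_loss_kw(u_w_per_mk, length_m, t_network_c, t_ground_c):
    """Steady-state distribution loss ~ U * L * (T_network - T_ground)."""
    return u_w_per_mk * length_m * (t_network_c - t_ground_c) / 1000.0

U, L, T_GROUND = 0.35, 8000.0, 10.0          # assumed W/(m.K), network length in m, ground temp in C

for gen, t_supply in [("3rd generation", 75.0), ("5th generation", 25.0)]:
    print(f"{gen}: ~{pipe_loss_kw(U, L, t_supply, T_GROUND):.0f} kW lost in distribution")
```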
Procedia PDF Downloads 83
26854 Enhanced Calibration Map for a Four-Hole Probe for Measuring High Flow Angles
Authors: Jafar Mortadha, Imran Qureshi
Abstract:
This research explains modern techniques used for measuring the flow angles of a flowing fluid and compares them with the traditional technique of using multi-hole pressure probes. In particular, the focus of the study is on four-hole probes, which offer great reliability and benefits in several applications where the use of modern measurement techniques is either inconvenient or impractical. Thanks to modern advances in manufacturing, small multi-hole pressure probes can be made with high precision, which eliminates the need to calibrate every manufactured probe. This study aims to improve the range of calibration maps for a four-hole probe to allow high flow angles to be measured accurately. The research methodology comprises a literature review of the successful calibration definitions that have been implemented on five-hole probes. These definitions are then adapted and applied to a four-hole probe using a set of raw pressure data. A comparison of the different definitions is carried out in MATLAB, and the results are analyzed to determine the best calibration definition. Taking simplicity of implementation into account, as well as the reliability of the flow angle estimation, an adapted technique from a research paper written in 2002 offered the most promising outcome. Consequently, the method is seen as a good enhancement for four-hole probes, and it can substitute for the existing calibration definitions that offer less accuracy.
Keywords: calibration definitions, calibration maps, flow measurement techniques, four-hole probes, multi-hole pressure probes
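For orientation, the sketch below shows the generic shape of such a calibration: non-dimensional pitch and yaw pressure coefficients are formed from the four hole pressures and then inverted to flow angles by interpolating on a calibration map. The coefficient definitions used here are simplified placeholders in the spirit of five-hole practice, not the 2002 definitions adopted in the paper, and the calibration grid is synthetic.

```python
import numpy as np
from scipy.interpolate import griddata

def probe_coefficients(p1, p2, p3, p4):
    """Placeholder non-dimensional coefficients from the four hole pressures."""
    p_avg = (p2 + p3 + p4) / 3.0
    denom = p1 - p_avg                      # assumed dynamic-pressure surrogate
    c_pitch = (p2 - p3) / denom
    c_yaw = (p4 - p_avg) / denom
    return c_pitch, c_yaw

def angles_from_map(c_pitch, c_yaw, cal_cp, cal_cy, cal_alpha, cal_beta):
    # Invert measured coefficients to pitch/yaw angles via the calibration map.
    pts = np.column_stack([cal_cp, cal_cy])
    alpha = griddata(pts, cal_alpha, (c_pitch, c_yaw), method="cubic")
    beta = griddata(pts, cal_beta, (c_pitch, c_yaw), method="cubic")
    return alpha, beta

# Tiny synthetic calibration grid purely for illustration.
alpha_g, beta_g = np.meshgrid(np.linspace(-30, 30, 13), np.linspace(-30, 30, 13))
cal_alpha, cal_beta = alpha_g.ravel(), beta_g.ravel()
cal_cp = np.tan(np.radians(cal_alpha)) + 0.05 * cal_beta / 30
cal_cy = np.tan(np.radians(cal_beta))
print(angles_from_map(0.2, -0.1, cal_cp, cal_cy, cal_alpha, cal_beta))
```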
Procedia PDF Downloads 295
26853 Data Protection, Data Privacy, Research Ethics in Policy Process Towards Effective Urban Planning Practice for Smart Cities
Authors: Eugenio Ferrer Santiago
Abstract:
The growing complexities of the modern world, with high-end gadgets, software applications, scams, identity theft, and Artificial Intelligence (AI), make the 'uninformed' weak and vulnerable to becoming victims of cybercrime. Artificial Intelligence is not a new thing in our daily lives; the principles of database management, logical programming, and 'garbage in, garbage out' are all connected to AI. The Philippines has legal safeguards in place against the abuse of cyberspace, but self-regulation by key industry players and self-protection by individuals are primordial to the success of these initiatives. Data protection, data privacy, and research ethics must work hand in hand during the policy process in the course of urban planning practice in different environments. This paper focuses on the interconnection of data protection, data privacy, and research ethics in coming up with clear-cut policies against perpetrators in urban planning professional practice, relevant to sustainable communities and smart cities. The paper uses an expository methodology under qualitative research, drawing on secondary data from related literature, interviews/blogs, and World Wide Web resources. The claims and recommendations of this paper will help policymakers and implementers in the policy cycle. The paper contributes to the body of knowledge as a simple treatise and communication channel to the reading community and future researchers, to validate the claims and start an intellectual discourse for better knowledge generation for the good of all in the near future.
Keywords: data privacy, data protection, urban planning, research ethics
Procedia PDF Downloads 60
26852 The Effect of Hesperidin on Troponin's Serum Level Changes as a Heart Tissue Damage Biomarker Due to Gamma Irradiation of Rat's Mediastinum
Authors: G. H. Haddadi, S. Sajadi, R. Fardid, Z. Haddadi
Abstract:
The heart is a radiosensitive organ, and damage to it is a dose-limiting factor in radiotherapy. Different side effects, including vascular plaque and heart fibrosis, occur in patients with thorax irradiation. The present study aimed to evaluate the radioprotective efficacy of hesperidin (HES), a naturally occurring citrus flavanone glycoside, against γ-radiation-induced tissue damage in the heart of male rats. Sixty-eight rats were divided into four groups. The rats in group 1 received PBS, and those in group 2 received HES. The rats in group 3 received PBS and underwent γ-irradiation, and those in group 4 received HES and underwent γ-irradiation. They were exposed to 20 Gy γ-radiation using a single-fraction cobalt-60 unit, and the dose of hesperidin was 100 mg/kg/day, orally, for 7 days prior to irradiation. Each group was divided into two subgroups. Sampling of the rats in subgroup A was done 4-6 hours after irradiation. The samples were sent to the laboratory for determination of changes in the serum level of Troponin I (TnI) as a cardiac biomarker. The remaining animals (subgroups B) were sacrificed 8 weeks after radiotherapy for histopathological evaluation. In group 3, TnI increased markedly in comparison with group 1 (p < 0.05). The comparison of groups 1 and 4 showed no significant difference. Evaluation of the histopathological parameters in subgroup B showed significant differences between groups 1 and 3 in some of the cases. Inflammation (p=0.008), pericardial effusion (p=0.001) and vascular plaque (p=0.001) increased in the rats exposed to 20 Gy γ-irradiation. Oral administration of HES significantly decreased all the above factors in group 4 (P > 0.016). Administration of 100 mg/kg/day hesperidin for 7 days resulted in decreased Troponin I and reduced radiation heart injury. This agent may have protective effects against radiation-induced heart damage.
Keywords: hesperidin, radioprotector, troponin I, cardiac inflammation, vascular plaque
Procedia PDF Downloads 254
26851 Relationship-Centred Care in Cross-Linguistic Medical Encounters
Authors: Nami Matsumoto
Abstract:
This study explores patients' experiences of cross-linguistic medical encounters and their views of the language support they receive in them, with a particular focus on Japanese-English cases. The aim is to investigate, from a Japanese perspective and through comparison with English speakers, why a spouse is so frequently used as a communication mediator. The study conducts an empirical qualitative analysis of informants' accounts. A total of 31 informants who have experienced Japanese-English cross-linguistic medical encounters were recruited in Australia and Japan for semi-structured in-depth interviews; they comprised 15 English speakers and 16 Japanese speakers. To gain further insight into the collected data, additional interviews were held with four Australian doctors who are familiar with working through interpreters. The study was approved by the Australian National University Human Research Ethics Committee, and written consent to participate was obtained from all participants. The interviews lasted up to an hour or more; they were audio-recorded and subsequently transcribed by the author, and Japanese transcriptions were translated into English by the author. Analysis of the interview data found that patients value relationships in communication. In particular, Japanese informants with an English-speaking spouse value trust-based communication interventions by that spouse, regardless of the spouse's language proficiency. In Australia, healthcare interpreters are required to abide by the national code of ethics for interpreters, which defines the interpreter's role exclusively as language rendition and enshrines the tenets of accuracy, confidentiality and professional role boundaries. However, the analysis found that an interpreter who complies strictly with the code sometimes fails to convey the real intentions of the patient and the doctor. The findings suggest that interpreters should not be detached from the context and should engage more with patients' needs, which are not always communicated when interpreters simply follow a professional code of ethics. The concept of relationship-centred care should be incorporated into the professional practice of healthcare interpreters.
Keywords: health care, Japanese-English medical encounters, language barriers, trust
Procedia PDF Downloads 264
26850 Synopsis of Izmir Regional Plan and Interpretations about Tourism in Izmir
Authors: Yakin Ekin, Onur Akbulut
Abstract:
This study aims to create a summarized background for the effective and efficient use of Izmir's potential by aligning the strategic planning work and the institutional and sectoral strategy documents produced for different purposes by the relevant institutions and organizations in Izmir and the Aegean Region, so that they steer towards the same priorities and aims. It also offers a critical, comparative viewpoint on the tourism sector in Izmir.
Keywords: regional plan, Izmir, tourism, sectoral strategy
Procedia PDF Downloads 451
26849 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method
Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem, at a study site, by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). The data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform, and it represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method, the WICKED method, was also developed from the lessons learned during this process. WICKED is an anagram of the initial letters of the words eliciting and confirming data, information, knowledge, wisdom; it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and its benefits and limitations were noted. Benefits include a systematic way to manage data, information, knowledge and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations include the time required and the fact that the data set produced represents only the DIKW known during the research period. Future work is underway to address these limitations.
Keywords: healthcare, knowledge acquisition, maximal data sets, action design science
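As a purely illustrative sketch of how DIKW items gathered with a method like WICKED might be recorded (the field names, categories and example entry below are assumptions for illustration, not the structure or content of the study's maximal data set):

```python
# Illustrative only: a minimal record type for DIKW items elicited from
# clinicians or existing data sets. Field names and the example entry are
# assumptions, not the study's actual maximal data set.
from dataclasses import dataclass
from enum import Enum

class DIKWLevel(Enum):
    DATA = "data"
    INFORMATION = "information"
    KNOWLEDGE = "knowledge"
    WISDOM = "wisdom"

@dataclass
class DIKWItem:
    name: str                # e.g. a clinical variable or rule of thumb
    level: DIKWLevel         # where the item sits in the DIKW hierarchy
    source: str              # clinician interview, existing data set, ...
    confirmed: bool = False  # has the item been confirmed with the expert?
    notes: str = ""

# Example usage with invented content:
item = DIKWItem(
    name="ALSFRS-R score",
    level=DIKWLevel.DATA,
    source="existing clinic data set",
    confirmed=True,
)
print(item)
```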
Procedia PDF Downloads 362
26848 Biogas from Cover Crops and Field Residues: Effects on Soil, Water, Climate and Ecological Footprint
Authors: Manfred Szerencsits, Christine Weinberger, Maximilian Kuderna, Franz Feichtinger, Eva Erhart, Stephan Maier
Abstract:
Cover or catch crops have beneficial effects on soil, water, erosion control, etc. If harvested, they also provide feedstock for biogas without competing for arable land in regions where only one main crop can be produced per year. On average, gross energy yields of approximately 1300 m³ of methane (CH4) per hectare can be expected from 4.5 tonnes (t) of cover crop dry matter (DM) in Austria. Considering the total energy invested from cultivation to compression for biofuel use, a net energy yield of about 1000 m³ CH4 per hectare remains. Similar energy yields can be achieved with the straw of grain maize or corn cob mix (CCM). Compared with catch crops left on the field as green manure, or with complete fallow between main crops, the effects on soil, water and climate can be improved if cover crops are harvested without soil compaction and digestate is returned to the field in an amount equivalent to the cover crop removed. In this way, the risk of nitrate leaching can be reduced by approximately 25% compared with full fallow. The risk of nitrous oxide emissions may be reduced by up to 50% compared with cover crops used as green manure. The effects on humus content and erosion are similar to or better than those of cover crops used as green manure when the same amount of biomass is produced. With higher biomass production, the positive effects increase even if the cover crops are harvested and only the digestate is returned to the fields. The ecological footprint of arable farming can be reduced by approximately 50% when the substitution of natural gas with CH4 produced from cover crops is taken into account.
Keywords: biogas, cover crops, catch crops, land use competition, sustainable agriculture
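A back-of-envelope check of the per-hectare figures quoted above is sketched below; the methane heating value of roughly 10 kWh per m³ is an assumption, not a number from the study.

```python
# Back-of-envelope check of the per-hectare energy figures quoted in the abstract.
# Assumption (not from the paper): methane lower heating value ~9.97 kWh/m3.
cover_crop_dm_t_per_ha = 4.5     # t dry matter per ha (from the abstract)
gross_ch4_m3_per_ha = 1300.0     # m3 CH4 per ha (from the abstract)
net_ch4_m3_per_ha = 1000.0       # m3 CH4 per ha after energy inputs (from the abstract)

specific_yield = gross_ch4_m3_per_ha / cover_crop_dm_t_per_ha  # ~289 m3 CH4 per t DM
lhv_kwh_per_m3 = 9.97                                          # assumed LHV of methane
net_energy_kwh = net_ch4_m3_per_ha * lhv_kwh_per_m3            # ~10,000 kWh per ha

print(f"Specific methane yield: {specific_yield:.0f} m3 CH4 per t DM")
print(f"Net energy yield:       {net_energy_kwh:.0f} kWh per ha (~{net_energy_kwh * 3.6 / 1000:.0f} GJ)")
```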
Procedia PDF Downloads 542
26847 Effects of Various Wavelet Transforms in Dynamic Analysis of Structures
Authors: Seyed Sadegh Naseralavi, Sadegh Balaghi, Ehsan Khojastehfar
Abstract:
Time history dynamic analysis of structures is considered an exact method, but it is computationally intensive. Filtering earthquake strong ground motions with the wavelet transform is one approach to reducing the computational effort, particularly in the optimization of structures against seismic effects. Wavelet transforms are categorized into continuous and discrete transforms. Since earthquake strong ground motion is a discrete function, the discrete wavelet transform is applied in the present paper. The wavelet transform reduces analysis time by filtering out the non-effective frequencies of the strong ground motion. The filtration process may be repeated several times, although each repetition introduces additional approximation error. In this paper, the strong ground motion is filtered once with each wavelet. The strong ground motion of the Northridge earthquake is filtered with various wavelets, and dynamic analyses of sample shear and moment frames are carried out. The error associated with each wavelet is computed by comparing the dynamic response of the sample structures with the exact responses, which are obtained from dynamic analyses using the unfiltered strong ground motion.
Keywords: wavelet transform, computational error, computational duration, strong ground motion data
Procedia PDF Downloads 378
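The filter-then-compare procedure described in the abstract above can be illustrated with a minimal sketch, assuming NumPy and PyWavelets (pywt): one level of discrete wavelet filtering of a ground-motion record, followed by a linear SDOF time-history analysis so the peak-response error of the filtered record can be compared with the exact one. The record file name, time step, structural parameters and the db4 wavelet are placeholders, and the SDOF Newmark-beta integrator stands in for the frame analyses in the paper; this is not the authors' code.

```python
# Illustrative sketch: single-level DWT filtering of a ground-motion record,
# then a linear SDOF time-history analysis of the original and filtered records
# to estimate the approximation error introduced by the filtering.
import numpy as np
import pywt

def dwt_filter(acc, wavelet="db4"):
    """Remove the highest-frequency band by zeroing the level-1 detail coefficients."""
    cA, cD = pywt.dwt(acc, wavelet)
    filtered = pywt.idwt(cA, np.zeros_like(cD), wavelet)
    return filtered[: len(acc)]  # trim a possible extra sample

def sdof_response(acc, dt, period=0.5, damping=0.05):
    """Newmark-beta (average acceleration) displacement response of a linear SDOF oscillator."""
    m = 1.0
    k = m * (2.0 * np.pi / period) ** 2
    c = 2.0 * damping * np.sqrt(k * m)
    gamma, beta = 0.5, 0.25
    n = len(acc)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = -acc[0]  # equation of motion: m*u'' + c*u' + k*u = -m*ag
    k_eff = k + gamma * c / (beta * dt) + m / (beta * dt ** 2)
    for i in range(n - 1):
        p_eff = (-m * acc[i + 1]
                 + m * (u[i] / (beta * dt ** 2) + v[i] / (beta * dt) + (1 / (2 * beta) - 1) * a[i])
                 + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1) * v[i]
                        + dt * (gamma / (2 * beta) - 1) * a[i]))
        u[i + 1] = p_eff / k_eff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt ** 2)
                    - v[i] / (beta * dt) - (1 / (2 * beta) - 1) * a[i])
        v[i + 1] = v[i] + dt * ((1 - gamma) * a[i] + gamma * a[i + 1])
    return u

dt = 0.02                               # placeholder time step [s]
acc = np.loadtxt("northridge_acc.txt")  # placeholder ground-motion file [m/s^2]

u_exact = sdof_response(acc, dt)
u_filt = sdof_response(dwt_filter(acc), dt)
peak_exact = np.abs(u_exact).max()
peak_filt = np.abs(u_filt).max()
error = abs(peak_filt - peak_exact) / peak_exact
print(f"Relative peak-displacement error after one DWT filtering pass: {error:.2%}")
```

Repeating the `dwt_filter` step corresponds to the repeated filtration mentioned in the abstract, with the error growing at each pass.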