Search results for: algorithms and data structure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 32319

30009 Crystal Structures and High-Temperature Phase Transitions of the New Ordered Double Perovskites SrCaCoTeO6 and SrCaNiTeO6

Authors: Asmaa Zaraq

Abstract:

In the present work, we report X-ray powder diffraction measurements of SrCaCoTeO6 and SrCaNiTeO6 at different temperatures. The crystal structures at room temperature of both compounds are determined, and results showing the existence of high-temperature phase transitions in them are presented. Both compounds have a double perovskite structure with a 1:1 ordered arrangement of the B-site cations. At room temperature, their symmetries are described by the P21/n space group, which corresponds to the (a+b-b-) tilt system. The evolution with temperature of the structure of both compounds shows the presence of three phase transitions: a continuous one at 450 and 500 K, a discontinuous one at 700 and 775 K, and a continuous one at 900 and 950 K for SrCaCoTeO6 and SrCaNiTeO6, respectively, with the following phase-transition sequence: P21/n → I2/m → I4/m → Fm-3m.

Keywords: double perovskites, XRD characterization, phase transition

Procedia PDF Downloads 525
30008 Structural and Electronic Properties of the Rock-salt BaxSr1−xS Alloys

Authors: B. Bahloul, K. Babesse, A. Dkhira, Y. Bahloul, L. Amirouche

Abstract:

Structural and electronic properties of the rock-salt BaxSr1−xS alloys are calculated using first-principles calculations based on density functional theory (DFT) within the generalized gradient approximation (GGA), the local density approximation (LDA), and the virtual-crystal approximation (VCA). The calculated lattice parameters at equilibrium volume for x=0 and x=1 are in good agreement with the literature data. The BaxSr1−xS alloys are found to be indirect band gap semiconductors. Moreover, for compositions (x) ranging over [0-1], we believe our results are well discussed and well predicted.

Keywords: semiconductor, Ab initio calculations, rocksalt, band structure, BaxSr1−xS

Procedia PDF Downloads 400
30007 Seismic Behavior of Self-Balancing Post-Tensioned Reinforced Concrete Spatial Structure

Authors: Mircea Pastrav, Horia Constantinescu

Abstract:

The construction industry is currently trying to develop sustainable reinforced concrete structures. To aid in this effort, the research presented in this paper aims to prove the efficiency of modified special hybrid moment frames composed of discretely jointed precast and post-tensioned concrete members. This aim stems from the fact that current design standards do not cover the spatial design of moment frame structures assembled by post-tensioning with special hybrid joints. This lack of standardization is coupled with the fact that previous experimental programs available in the scientific literature deal mainly with plane structures and offer little information regarding spatial behavior. A spatial model of a modified hybrid moment frame is experimentally analyzed. The experimental results of a natural-scale model test of a corner column-beam sub-structure, cut from an actual multilevel building and tested under seismic-type loading, are presented in order to highlight the behavior of this type of structure. The test is performed under alternating cycles of imposed lateral displacements, up to a storey drift ratio of 0.035. The seismic response of the spatial model is discussed considering the acceptance criteria for reinforced concrete frame structures designed based on experimental tests, as well as some of its major sustainability features. The results obtained show an overall excellent behavior of the system. The joint detailing allows for quick and cheap repairs after an accidental event and a self-balancing behavior of the system that ensures it can be used almost immediately after such an event.

Keywords: modified hybrid joint, seismic type loading response, self-balancing structure, acceptance criteria

Procedia PDF Downloads 242
30006 Study of the Best Algorithm to Estimate Sunshine Duration from Global Radiation on Horizontal Surface for Tropical Region

Authors: Tovondahiniriko Fanjirindratovo, Olga Ramiarinjanahary, Paulisimone Rasoavonjy

Abstract:

The sunshine duration, which is the sum of all the moments when the solar beam radiation is above a minimal value, is an important parameter for climatology, tourism, agriculture, and solar energy. It is usually measured by a pyrheliometer installed on a two-axis solar tracker. Due to the high cost of this device, and given the availability of global radiation measurements on a horizontal surface, several studies have been done to correlate global radiation with sunshine duration. Most of these studies are fitted to the northern hemisphere using a pyrheliometric database. The aim of the present work is to list and assess all the existing methods and apply them to Reunion Island, a tropical region in the southern hemisphere. Using a ten-year database, global, diffuse, and beam radiation on a horizontal surface are employed in order to evaluate the uncertainty of existing algorithms for a tropical region. The methodology is based on indirect comparison, because the solar beam radiation is not measured but calculated from the beam radiation on a horizontal surface and the sun elevation angle.
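
As an illustration of the indirect-comparison step, here is a minimal sketch (not the authors' code) of deriving sunshine duration from horizontal irradiance components and the sun elevation angle; the WMO threshold of 120 W/m² on the direct beam and the array names are assumptions.

```python
import numpy as np

WMO_THRESHOLD = 120.0  # W/m^2: WMO definition of "sunshine"

def sunshine_hours(global_h, diffuse_h, elevation_deg, step_minutes=1.0):
    """Estimate sunshine duration from horizontal global/diffuse irradiance.

    The beam component on the horizontal is global minus diffuse; dividing
    by sin(elevation) converts it to direct normal irradiance (DNI).
    """
    elev = np.radians(np.asarray(elevation_deg))
    beam_h = np.asarray(global_h) - np.asarray(diffuse_h)
    dni = np.where(elev > 0.01, beam_h / np.sin(np.maximum(elev, 0.01)), 0.0)
    return np.sum(dni >= WMO_THRESHOLD) * step_minutes / 60.0
```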

Keywords: Carpentras method, data fitting, global radiation, sunshine duration, Slob and Monna algorithm, step algorithm

Procedia PDF Downloads 130
30005 Evaluation of the Families' Psychological Nature and the Relationship between the Academic Success According to the Students' Opinion

Authors: Sebnem Erismen, Ahmet Guneyli, Azize Ummanel

Abstract:

The purpose of this study is to explore the relationship between the students' academic success and the families' psychological nature. The study is based upon quantitative research, and a descriptive model is used. A relational descriptive model is used to evaluate the relation between the families' psychological nature and the academic success level of the students. A total of 523 secondary school students participated in the study. A Personal Information Form, the Family Structure Evaluation Form (FSEF), and school reports were employed as the primary methods of data gathering. ANOVA and the LSD Scheffe test were used for analysing the data. Results of the study indicate that there are differences between the FSEF scores according to the students' and teachers' gender; however, no differences according to class level or teacher seniority were seen. Regarding the academic success of the students, it was seen that the majority of them have high scores. It was also seen that the academic success level of the students differs according to the classroom teachers' gender and seniority. In conclusion, it was seen that there is a relation between the families' psychological nature and the students' academic success.

Keywords: families' perceived psychological nature, academic success, families' effect on academic success, education

Procedia PDF Downloads 297
30004 Vertical Structure and Frequencies of Deep Convection during Active Periods of the West African Monsoon Season

Authors: Balogun R. Ayodeji, Adefisan E. Adesanya, Adeyewa Z. Debo, E. C. Okogbue

Abstract:

Deep convective systems during active periods of the West African monsoon season have not been properly investigated at high temporal and spatial resolution in West Africa. Deep convective systems are investigated over seven climatic zones of the West African sub-region, which are: the west-coast rainforest, dry rainforest, Nigeria-Cameroon rainforest, Nigeria savannah, Central African and South Sudan (CASS) savannah, Sudano-Sahel, and Sahel, using data from the Tropical Rainfall Measuring Mission (TRMM) Precipitation Feature (PF) database. The vertical structure of the convective systems, indicated by the presence of at least one 40 dBZ echo reaching at least 1 km in the atmosphere, showed a strong core (highest frequency) of reflectivity values around 2 km, which is below the freezing level (4-5 km), for all the zones. Echoes are detected above the 15 km altitude much more frequently in the rainforest and savannah zones than in the Sudano and Sahel zones during active periods in March-May (MAM), whereas during active periods in June-September (JJAS), convection in the savannah, Sudano, and Sahel zones tends to reach higher altitudes more frequently than in the rainforest zones. The percentage frequencies of deep convection indicated that the occurrences of the systems are within the range of 2.3-2.8% during both the MAM and JJAS active periods in the rainforest and savannah zones. On the contrary, the percentage frequencies were found to be less than 2% in the Sudano and Sahel zones, except during the active JJAS period in the Sudano zone.
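
The deep-convection criterion above reduces to a simple filter over the PF database. A sketch of that bookkeeping follows; the file and column names (max_height_40dbz_km, max_echo_top_km) are hypothetical stand-ins for the TRMM PF fields.

```python
import pandas as pd

# Hypothetical extract of the TRMM PF database for one climatic zone
pf = pd.read_csv("trmm_pf_zone.csv")  # one row per precipitation feature

# Criterion from the text: at least one 40 dBZ echo reaching at least 1 km
deep = pf[pf["max_height_40dbz_km"] >= 1.0]

pct_deep = 100.0 * len(deep) / len(pf)                      # frequency (%)
pct_above_15km = 100.0 * (deep["max_echo_top_km"] > 15.0).mean()
print(f"deep convection: {pct_deep:.1f}%, tops above 15 km: {pct_above_15km:.1f}%")
```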

Keywords: active periods, convective system, frequency, reflectivity

Procedia PDF Downloads 156
30003 Multiscale Hub: An Open-Source Framework for Practical Atomistic-To-Continuum Coupling

Authors: Masoud Safdari, Jacob Fish

Abstract:

Despite the vast amount of existing theoretical knowledge, the implementation of a universal multiscale modeling, analysis, and simulation software framework remains challenging. Existing multiscale software and solutions are often domain-specific and closed-source, and they demand a high level of experience and skill in both multiscale analysis and programming. Furthermore, tools currently existing for Atomistic-to-Continuum (AtC) multiscaling are developed under assumptions such as the accessibility of high-performance computing facilities to the users. These issues, along with many other challenges, have reduced the adoption of multiscale methods in academia and especially in industry. In the current work, we introduce Multiscale Hub (MsHub), an effort towards making AtC more accessible through cloud services. As a joint effort between academia and industry, MsHub provides a universal web-enabled framework for practical multiscaling. Developed on top of the universally acclaimed scientific programming language Python, the package currently provides an open-source, comprehensive, easy-to-use framework for AtC coupling. MsHub offers an easy-to-use interface to prominent molecular dynamics and multiphysics continuum mechanics packages such as LAMMPS and MFEM (a free, lightweight, scalable C++ library for finite element methods). In this work, we first report on the design philosophy of MsHub and the challenges and issues faced during its implementation. MsHub takes advantage of a comprehensive set of tools and algorithms developed for AtC that can be used for a variety of governing physics. We then briefly report key AtC algorithms implemented in MsHub. Finally, we conclude with a few examples illustrating the capabilities of the package and its future directions.

Keywords: atomistic, continuum, coupling, multiscale

Procedia PDF Downloads 179
30002 A Data Driven Methodological Approach to Economic Pre-Evaluation of Reuse Projects of Ancient Urban Centers

Authors: Pietro D'Ambrosio, Roberta D'Ambrosio

Abstract:

The upgrading of the architectural and urban heritage of historic urban centers almost always involves planning for the reuse and refunctionalization of structures. Such interventions have complexities linked to the need to take into account the urban and social context in which the structure is inserted, as well as its intrinsic characteristics, such as historical and artistic value. To these, of course, must be added the need to make a preliminary estimate of recovery costs and, more generally, to assess the economic and financial sustainability of the whole reuse project. Particular difficulties are encountered during the pre-assessment of costs, since it is often impossible to perform analytical surveys and structural tests, both because of the structural conditions and because of obvious cost and time constraints. The methodology proposed in this work, based on a multidisciplinary and data-driven approach, is aimed at obtaining, at very low cost, reasonably accurate economic evaluations of the interventions to be carried out. In addition, the specific features of the approach, derived from the predictive analysis techniques typically applied in complex IT domains (big data analytics), allow the evaluation process to yield, as an indirect result, a shared database that can be used on a generalized basis to estimate other such projects. This makes the methodology particularly suitable in those cases where massive interventions across entire areas of historic city centers are expected. The methodology has been partially tested during a study aimed at assessing the feasibility of a project for the reuse of the monumental complex of San Massimo, located in the historic center of Salerno, and is being further investigated.

Keywords: evaluation, methodology, restoration, reuse

Procedia PDF Downloads 191
30001 Artificial Law: Legal AI Systems and the Need to Satisfy Principles of Justice, Equality and the Protection of Human Rights

Authors: Begum Koru, Isik Aybay, Demet Celik Ulusoy

Abstract:

The discipline of law is quite complex and has its own terminology. Apart from written legal rules, there is also living law, which refers to legal practice. Basic legal rules aim at the happiness of individuals in social life and have different characteristics in different branches, such as public or private law. On the other hand, law is a national phenomenon. The law of one nation and the legal system applied on the territory of another nation may be completely different. People who are experts in a particular field of law in one country may have insufficient expertise in the law of another country. Today, in addition to the local nature of law, international and even supranational legal rules are applied in order to protect basic human values and ensure the protection of human rights around the world. Systems that offer algorithmic solutions to legal problems using artificial intelligence (AI) tools may well serve to produce very meaningful results in terms of human rights. However, the algorithms to be used should not be developed by computer experts alone; they also need the contribution of people who are familiar with the law, values, judicial decisions, and even the social and political culture of the society to which they will provide solutions. Otherwise, even if the algorithm works perfectly, it may not be compatible with the values of the society in which it is applied. The latest developments involving the use of AI techniques in legal systems indicate that artificial law will emerge as a new field in the discipline of law. More AI systems are already being applied in the field of law, with examples such as predicting judicial decisions, text summarization, decision support systems, and classification of documents. Algorithms for legal systems employing AI tools, especially in the fields of judicial decision prediction and decision support, have the capacity to produce automatic decisions instead of judges. When the judge is removed from this equation, artificial-intelligence-made law, created by an intelligent algorithm on its own, emerges, whether the domain is national or international law. In this work, the aim is to make a general analysis of this new topic. Such an analysis needs both a literature survey and a perspective from computer experts' and lawyers' points of view. In some societies, the use of prediction or decision support systems may be useful for integrating international human rights safeguards. In this case, artificial law can serve to produce more comprehensive and more human-rights-protective results than written or living law. In non-democratic countries, it may even be thought that direct decisions and artificial-intelligence-made law would be more protective than a decision 'support' system. Since the values of law are directed towards 'human happiness or well-being', the AI algorithms should always be capable of serving this purpose and be based on the rule of law, the principles of justice and equality, and the protection of human rights.

Keywords: AI and law, artificial law, protection of human rights, AI tools for legal systems

Procedia PDF Downloads 79
30000 A Compact Via-less Ultra-Wideband Microstrip Filter by Utilizing Open-Circuit Quarter Wavelength Stubs

Authors: Muhammad Yasir Wadood, Fatemeh Babaeian

Abstract:

With the development of ultra-wideband (UWB) systems, there is a high demand for UWB filters with low insertion loss and wide bandwidth that have a planar structure compatible with the other components of the UWB system. A microstrip interdigital filter is a great option for designing UWB filters. However, the presence of via holes in this structure creates difficulties in the fabrication procedure of the filter. Especially in the higher frequency band, any misalignment of the drilled via hole with the microstrip stubs causes large discrepancies between the measurement results and the desired results. Moreover, in high-frequency designs the line width of the stubs is very narrow, so highly precise small via holes must be implemented, which increases the cost of fabrication significantly and carries a risk of fabrication errors. To combat this issue, in this paper, a via-less UWB microstrip filter is proposed, designed as a modification of a conventional interdigital bandpass filter. The novel approaches in this filter design are 1) the replacement of each via hole with a quarter-wavelength open-circuit stub to avoid manufacturing complexity, 2) the use of a bend structure to reduce unwanted coupling effects, and 3) the minimisation of size. Using the proposed structure, a UWB filter operating in the frequency band of 3.9-6.6 GHz (1-dB bandwidth) is designed and fabricated. The promising results of the simulation and measurement are presented in this paper. The selected substrate for these designs was Rogers RO4003 with a thickness of 20 mils, a common substrate in most industrial projects. The compact size of the proposed filter is highly beneficial for applications that require very miniature hardware.
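
To make the via-replacement idea concrete, the sketch below sizes an open-circuit quarter-wavelength stub on the stated RO4003 substrate, using the standard Hammerstad closed form for microstrip effective permittivity; the 1.1 mm trace width and the mid-band frequency are assumed values, not taken from the paper.

```python
import math

C0 = 299_792_458.0  # speed of light, m/s

def eps_eff(eps_r, h, w):
    """Hammerstad closed-form effective permittivity of a microstrip line."""
    return (eps_r + 1) / 2 + (eps_r - 1) / 2 / math.sqrt(1 + 12 * h / w)

def quarter_wave_stub_len(f_hz, eps_r, h, w):
    """Physical length of an open-circuit quarter-wavelength stub at f_hz."""
    return C0 / (4 * f_hz * math.sqrt(eps_eff(eps_r, h, w)))

# Rogers RO4003 (eps_r = 3.38), 20 mil (0.508 mm) substrate, band centre ~5.25 GHz
length = quarter_wave_stub_len(5.25e9, 3.38, 0.508e-3, 1.1e-3)
print(f"stub length = {length * 1e3:.2f} mm")
```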

Keywords: band-pass filters, inter-digital filter, microstrip, via-less

Procedia PDF Downloads 161
29999 An Integrated Framework for Wind-Wave Study in Lakes

Authors: Moien Mojabi, Aurelien Hospital, Daniel Potts, Chris Young, Albert Leung

Abstract:

Wave analysis is an integral part of the hydrotechnical assessment carried out during the permitting and design phases for coastal structures such as marinas. This analysis aims at quantifying: i) the suitability of the coastal structure design against the Small Craft Harbour wave-tranquility safety criterion; ii) potential environmental impacts of the structure (e.g., effects on waves, flow, and sediment transport); iii) mooring and dock design; and iv) requirements set by regulatory agencies (e.g., a WSA section 11 application). While a complex three-dimensional hydrodynamic modelling approach can be applied to large-scale projects, the need for an efficient and reliable wave analysis method suitable for smaller-scale marina projects was identified. As a result, Tetra Tech has developed and applied an integrated analysis framework (hereafter the TT approach), which takes advantage of state-of-the-art numerical models while preserving a level of simplicity that fits smaller-scale projects. The present paper aims to describe the TT approach and highlight the key advantages of using this integrated framework in lake marina projects. The core of this methodology is built by integrating wind, water level, bathymetry, and structure geometry data. To respond to the needs of specific projects, several add-on modules have been added to the core of the TT approach. The main advantages of this method over simplified analytical approaches are: i) accounting for the proper physics of the lake by modelling the entire lake (capturing the real lake geometry) instead of using a simplified fetch approach; ii) providing a more realistic representation of the waves by modelling random waves instead of monochromatic waves; iii) modelling wave-structure interaction (e.g., wave transmission/reflection for floating structures and piles, amongst others); iv) accounting for wave interaction with the lakebed (e.g., bottom friction, refraction, and breaking); v) providing the inputs for flow and sediment transport assessment at the project site; vi) taking into consideration historical and geographical variations of the wind field; and vii) independence from the scale of the reservoir under study. Overall, in comparison with simplified analytical approaches, this integrated framework provides a more realistic and reliable estimation of wave parameters (and their spatial distribution) in lake marinas, leading to a realistic hydrotechnical assessment accessible to any project size, from the development of a new marina to marina expansion and pile replacement. Tetra Tech has successfully utilized this approach for many years in the Okanagan area.

Keywords: wave modelling, wind-wave, extreme value analysis, marina

Procedia PDF Downloads 87
29998 Data Quality as a Pillar of Data-Driven Organizations: Exploring the Benefits of Data Mesh

Authors: Marc Bachelet, Abhijit Kumar Chatterjee, José Manuel Avila

Abstract:

Data quality is a key component of any data-driven organization. Without data quality, organizations cannot effectively make data-driven decisions, which often leads to poor business performance. Therefore, it is important for an organization to ensure that the data they use is of high quality. This is where the concept of data mesh comes in. Data mesh is a decentralized organizational and architectural approach to data management that can help organizations improve the quality of their data. The concept of data mesh was first introduced in 2020. Its purpose is to decentralize data ownership, making it easier for domain experts to manage the data. This can help organizations improve data quality by reducing reliance on centralized data teams and allowing domain experts to take charge of their data. This paper discusses how a set of elements, including data mesh, are tools capable of increasing data quality. One of the key benefits of data mesh is improved metadata management. In a traditional data architecture, metadata management is typically centralized, which can lead to data silos and poor data quality. With data mesh, metadata is managed in a decentralized manner, ensuring accurate and up-to-date metadata and thereby improving data quality. Another benefit of data mesh is the clarification of roles and responsibilities. In a traditional data architecture, data teams are responsible for managing all aspects of data, which can lead to confusion and ambiguity in responsibilities. With data mesh, domain experts are responsible for managing their own data, which can help provide clarity in roles and responsibilities and improve data quality. Additionally, data mesh can also contribute to a new form of organization that is more agile and adaptable. By decentralizing data ownership, organizations can respond more quickly to changes in their business environment, which in turn can help improve overall performance by allowing better insights into the business through better reports and visualization tools. Monitoring and analytics are also important aspects of data quality. With data mesh, monitoring and analytics are decentralized, allowing domain experts to monitor and analyze their own data. This helps identify and address data quality problems quickly, leading to improved data quality. Data culture is another major aspect of data quality. With data mesh, domain experts are encouraged to take ownership of their data, which can help create a data-driven culture within the organization. This can lead to improved data quality and better business outcomes. Finally, the paper explores the contribution of AI in the coming years. AI can help enhance data quality by automating many data-related tasks, such as data cleaning and data validation. By integrating AI into data mesh, organizations can further enhance the quality of their data. The concepts mentioned above are illustrated by AEKIDEN's experience feedback. AEKIDEN is an international data-driven consultancy that has successfully implemented a data mesh approach. By sharing its experience, AEKIDEN can help other organizations understand the benefits and challenges of implementing data mesh and improving data quality.

Keywords: data culture, data-driven organization, data mesh, data quality for business success

Procedia PDF Downloads 140
29997 Using Corpora in Semantic Studies of English Adjectives

Authors: Oxana Lukoshus

Abstract:

The methods of corpus linguistics, a well-established field of research, are being increasingly applied in cognitive linguistics. Corpus data are especially useful for various quantitative studies of grammatical and other aspects of language. The main objective of this paper is to demonstrate how present-day corpora can be applied in semantic studies in general and in semantic studies of adjectives in particular. Polysemantic adjectives have been the subject of numerous studies, but most of these have been based on dictionaries. Undoubtedly, dictionaries are viewed as one of the basic data sources, but only at the initial steps of a research project: the author usually starts with the analysis of the lexicographic data, after which s/he comes up with a hypothesis. In the research conducted, the three polysemantic synonyms true, loyal, and faithful have been analyzed in terms of the differences and similarities in their semantic structure. A corpus-based approach to the study of these adjectives involves the following. After the analysis of the dictionary data, the following corpora were consulted to study the distributional patterns of the words under study: the British National Corpus (BNC) and the Corpus of Contemporary American English (COCA). These corpora are continually updated and contain thousands of examples of the words under research, which makes them a useful and convenient data source. For the purpose of this study, there were no special requirements regarding the genre, mode, or time of the texts included in the corpora. Out of the range of possibilities offered by corpus-analysis software (e.g., word lists, statistics of word frequencies, etc.), the most useful tool for the semantic analysis was extracting a list of co-occurrences for the given search words. Searching by lemmas (e.g., true, true to) and grouping the results by lemmas proved to be the most efficient corpus features for the adjectives under study. Following the search process, the corpora provided a list of co-occurrences, which were then analyzed and classified. Not every co-occurrence was relevant for the analysis. For example, phrases like ‘An enormous sense of responsibility to protect the minds and hearts of the faithful from incursions by the state was perceived to be the basic duty of the church leaders’ or ‘True,’ said Phoebe, ‘but I'd probably get to be a Union Official immediately’ were left out, as in the first example the faithful is a substantivized adjective and in the second example true is used alone, with no other parts of speech. The subsequent analysis of the corpus data provided the grounds for the distribution groups of the adjectives under study, which were then investigated with the help of a semantic experiment. To sum up, the corpus-based approach has proved to be a powerful, reliable, and convenient tool for obtaining data for further semantic study.
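
The co-occurrence extraction described above can be mimicked in a few lines. The sketch below uses a toy text, since BNC and COCA require registered access; it only illustrates the step of collecting right-hand collocates per adjective.

```python
import re
from collections import Counter

# Toy stand-in for a corpus query result
corpus = """He remained true to his word. She was loyal to the crown.
A faithful servant stayed faithful to the cause. The map was true to scale."""

def right_collocates(adjective, text):
    """Collect the word immediately following the adjective, the same
    'co-occurrence list' step the corpus interfaces return."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return Counter(tokens[i + 1] for i, t in enumerate(tokens)
                   if t == adjective and i + 1 < len(tokens))

for adj in ("true", "loyal", "faithful"):
    print(adj, dict(right_collocates(adj, corpus)))
```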

Keywords: corpora, corpus-based approach, polysemantic adjectives, semantic studies

Procedia PDF Downloads 320
29996 Architectures and Implementations of Data Spaces: A Comparative Study of Gaia-X and Eclipse Data Space Components Frameworks

Authors: Ryan Kelvin Ford

Abstract:

For individuals and organizations, sharing data in a secure, trusted, and standardized environment promises significant benefits. Technical trust and standards help each participant use a data space securely to share and access data. Sharing data in a safe environment helps in acquiring new business opportunities. Data sovereignty, interoperability, and trust are considered key factors in evaluating data spaces. Businesses and policymakers can help assure a fair data economy by integrating data spaces into organizations. A collaborative environment is needed to facilitate data sharing among organizations, a need satisfied by the implementation of different data space architectures such as Eclipse Data Space Components (EDC), the International Data Space Association (IDSA) architecture, Gaia-X, and Gaia-X Federation Services (GXFS). Applications from the last 15 years were reviewed and compared on the basis of the architectures and implementations of these data spaces, covering the EDC framework, data connectors and the characteristics of data space connectors, data space architecture, federated data space initiatives, overviews of data spaces, the Eclipse data space connector, the design and building of data spaces from a technical overview, and the European future digital ecosystem based on the Gaia-X vision and architecture strategy. Empirical research based on an organized view was conducted. The current discussion elaborates a systematic review of the impact of data space technology from various perspectives. The systematic review uses multiple databases, such as IEEE Xplore, Taylor & Francis, Science Direct, and Google Scholar, to pursue publications on the impact of data spaces from January 2019 to December 2024. The search yielded a comparative review of 150 articles, out of which 20 were related to the IDSA, Gaia-X, and EDC architectures and implementations.

Keywords: IDSA, Gaia-X, Gaia-X architecture, EDC, EDC architecture, GXFS architecture, data space connector

Procedia PDF Downloads 9
29995 Development of Work Breakdown Structure for EVMS in South Korea

Authors: Dong-Ho Kim, Su-Sang Lim, Sang-Won Han, Chang-Taek Hyun

Abstract:

On a construction site, cost and schedule are the most important management elements. Despite efforts to manage cost and schedule in an integrated way, their WBS classifications still differ from each other. In the case of Korea, the cost and schedule can be integrated and managed owing to the characteristics of the detailed breakdown system built around the official standard estimating system. In this research, a Work Breakdown Structure (WBS) integrating cost and schedule for government office construction is presented, a WBS that can be used in common, in order to analyze and improve the detailed breakdown system of public institution construction. With this method, efficient administration not only of the linked application of cost and schedule but also of the construction project as a whole is expected.

Keywords: WBS, EVMS, integrated cost and schedule, Korea case

Procedia PDF Downloads 389
29994 Thermal Reduction of Perfect Well Identified Hexagonal Graphene Oxide Nano-Sheets for Super-Capacitor Applications

Authors: A. N. Fouda

Abstract:

Novel, well-defined hexagonal graphene oxide (GO) nano-sheets were synthesized using a modified Hummers method. Low-temperature thermal reduction at 350°C in air ambient was performed. After thermal reduction, typical few-layer thermally reduced GO (TRGO) sheets with dimensions of a few hundred nanometers were observed using high-resolution transmission electron microscopy (HRTEM). GO has many structural models due to variations in the preparation process. Determining the atomic structure of GO is essential for a better understanding of its fundamental properties and for the realization of future technological applications. Structural characterization was performed by X-ray diffraction (XRD) and Fourier transform infrared spectroscopy (FTIR) measurements. A comparison between the experimental and theoretical IR spectra was made to confirm the match between the experimentally and theoretically proposed GO structures. Partial overlap of the experimental IR spectrum with the theoretical one was confirmed. The electrochemical properties of TRGO nano-sheets as electrode materials for supercapacitors were investigated by cyclic voltammetry and electrochemical impedance spectroscopy (EIS) measurements. An enhancement in supercapacitance after reduction was confirmed, and the area of the CV curve for the TRGO electrode is larger than that for the GO electrode, indicating a higher specific capacitance, which is promising for supercapacitor applications.
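
The CV-area comparison rests on the usual relation C_sp = ∮|i||dV| / (2 m ν ΔV). A minimal sketch of that computation is shown below (generic arrays, not the authors' data).

```python
import numpy as np

def specific_capacitance(voltage, current, scan_rate, mass):
    """Specific capacitance (F/g) from one full CV loop.

    voltage, current: arrays over the loop; scan_rate in V/s; mass in g.
    C_sp = (loop area in A*V) / (2 * mass * scan_rate * voltage window).
    """
    v, i = np.asarray(voltage), np.asarray(current)
    dv_window = v.max() - v.min()
    # trapezoidal loop area with |dV| so forward and return sweeps both count
    area = np.sum(0.5 * np.abs(i[:-1] + i[1:]) * np.abs(np.diff(v)))
    return area / (2.0 * mass * scan_rate * dv_window)
```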

Keywords: hexagonal graphene oxide, thermal reduction, cyclic voltammetry

Procedia PDF Downloads 497
29993 Regret-Regression for Multi-Armed Bandit Problem

Authors: Deyadeen Ali Alshibani

Abstract:

In the literature, the multi-armed bandit problem is treated as a statistical decision model of an agent trying to optimize his decisions while improving his information at the same time. There are several different algorithmic models and applications of this problem. In this paper, we evaluate regret-regression by comparing it with the Q-learning method. A simulation on the determination of an optimal treatment regime is presented in detail.
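
Since the regret-regression procedure is not spelled out in the abstract, the sketch below only fixes ideas: a simulated three-armed bandit run under an epsilon-greedy policy, tracking the cumulative expected regret on which such methods are evaluated. The arm means and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
means = np.array([0.2, 0.5, 0.7])   # invented true arm means
T, eps = 5000, 0.1
counts, values = np.zeros(3), np.zeros(3)
best, cum, regret = means.max(), 0.0, np.zeros(T)

for t in range(T):
    # explore with probability eps (and while any arm is untried)
    if rng.random() < eps or counts.min() == 0:
        a = int(rng.integers(3))
    else:
        a = int(values.argmax())
    r = rng.normal(means[a], 1.0)
    counts[a] += 1
    values[a] += (r - values[a]) / counts[a]  # incremental sample mean
    cum += best - means[a]                    # expected (pseudo-)regret
    regret[t] = cum

print(f"final cumulative regret after {T} pulls: {regret[-1]:.1f}")
```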

Keywords: optimal, bandit problem, optimization, dynamic programming

Procedia PDF Downloads 456
29992 Model-Based Fault Diagnosis in Carbon Fiber Reinforced Composites Using Particle Filtering

Authors: Hong Yu, Ion Matei

Abstract:

Carbon fiber reinforced composites (CFRP) used in aircraft structures are subject to lightning strikes, putting structural integrity at risk. Indirect damage may occur after a lightning strike, where the internal structure can be damaged by the excessive heat induced by the lightning current while the surface of the structure remains intact. Three damage modes may be observed after a lightning strike: fiber breakage, inter-ply delamination, and intra-ply cracks. The assessment of internal damage states in composites is challenging due to the complicated microstructure, inherent uncertainties, and the existence of multiple damage modes. In this work, a model-based approach is adopted to diagnose faults in carbon composites after lightning strikes. A resistor network model is implemented to relate the overall electrical and thermal conduction behavior under a simulated lightning current waveform to the intrinsic temperature-dependent material properties, microstructure, and degradation of the materials. A fault detection and identification (FDI) module utilizes the physics-based model and a particle filtering algorithm to identify the damage mode as well as calculate the probability of structural failure. Extensive simulation results are provided to substantiate the proposed fault diagnosis methodology for both single-fault and multiple-fault cases. The approach is also demonstrated on transient resistance data collected from an IM7/Epoxy laminate under a simulated lightning strike.
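
The abstract does not give the filter equations, so the following is a generic bootstrap particle filter sketch for a one-dimensional latent damage state observed through a noisy signal such as transient resistance; the random-walk process model and noise levels are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_pf(observations, n_particles=1000, q=0.05, r=0.1):
    """Track a scalar damage state: x_t = x_{t-1} + w,  z_t = x_t + v."""
    particles = rng.normal(0.0, 1.0, n_particles)
    estimates = []
    for z in observations:
        particles = particles + rng.normal(0.0, q, n_particles)   # propagate
        w = np.exp(-0.5 * ((z - particles) / r) ** 2)             # likelihood weights
        w /= w.sum()
        particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
        estimates.append(particles.mean())                        # posterior mean
    return np.array(estimates)
```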

Keywords: carbon composite, fault detection, fault identification, particle filter

Procedia PDF Downloads 198
29991 Multichannel Scheme under Fairness Environment for Cognitive Radio Networks

Authors: Hans Marquez Ramos, Cesar Hernandez, Ingrid Páez

Abstract:

This paper develops a multiple channel assignment model that allows spectrum opportunities in cognitive radio networks to be exploited in the most efficient way. The developed scheme makes several assignments of available, frequency-adjacent channels, which require a wider band, under a fairness environment. The hybrid assignment model is made up of two algorithms: one ranks and selects the available frequency channels, and the other establishes a fairness criterion so as not to restrict spectrum opportunities for all the other secondary users who wish to transmit. Measurements were made of average bandwidth and average delay, as well as a fairness computation for several channel assignments. The results were evaluated with experimental spectrum occupancy data captured from the GSM frequency band. The developed model shows evidence of improvement in the use of spectrum opportunities and a wider average transmit bandwidth for each secondary user, while maintaining the fairness criterion in channel assignment.
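
For intuition, here is a small sketch of the two-stage idea: rank available channels by an availability score, distribute them round-robin among secondary users, and score the outcome with Jain's fairness index. The scores and user count are invented; this is not the authors' algorithm.

```python
import numpy as np

def jain_index(x):
    """Jain's fairness index: 1/n (most unfair) up to 1 (perfectly fair)."""
    x = np.asarray(x, dtype=float)
    return x.sum() ** 2 / (len(x) * (x ** 2).sum())

availability = np.array([0.9, 0.2, 0.8, 0.7, 0.4, 0.85])   # invented scores
ranked = np.argsort(-availability)                          # stage 1: rank channels
n_users = 3
assigned = [ranked[u::n_users] for u in range(n_users)]     # stage 2: round-robin
bandwidth = [availability[a].sum() for a in assigned]       # proxy for bandwidth
print(assigned, f"fairness = {jain_index(bandwidth):.3f}")
```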

Keywords: bandwidth, fairness, multichannel, secondary users

Procedia PDF Downloads 508
29990 The Role of the Elastic Foundation Having Nonlinear Stiffness Properties in the Vibration of Structures

Authors: E. Feulefack Songong, A. Zingoni

Abstract:

A vibration is a mechanical phenomenon whereby oscillations occur about an equilibrium point. Although vibrations can be linear or nonlinear depending on the basic components of the system, interest is mostly directed towards nonlinear vibrations. This is because most structures around us are to some extent nonlinear, and also because we need more accurate values in an analysis. The goal of this research is the integration of nonlinearities into the development and validation of structural models, and the improvement of the resistance of structures subjected to loads. Although there exist many types of nonlinearities, this thesis will mostly focus on the vibration of free and undamped systems incorporating nonlinearity due to stiffness. Nonlinear stiffness has been a concern to many engineers in general, and civil engineers in particular, because it is an important factor that can bring about a substantial modification and amelioration of the response of structures subjected to loads. The analysis of systems will be done analytically and then numerically to validate the analytical results. We will first show the benefit and importance of stiffness nonlinearity when it is implemented in the structure. Secondly, we will show how its integration in the structure can improve not only the structure's performance but also its response when subjected to loads. The results of this study will be valuable to practicing engineers as well as industry practitioners in developing better designs and tools for their structures and mechanical devices. They will also help engineers to design lighter and stronger structures and to give good predictions of the behavior of structures subjected to external loads.
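
A canonical example of a free, undamped system with stiffness nonlinearity is the Duffing oscillator x'' + ω₀²x + αx³ = 0. The sketch below integrates it numerically; with a hardening term (α > 0) the effective stiffness, and hence the oscillation frequency, grows with amplitude, which is the amplitude dependence such analyses quantify. Parameter values are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

def duffing_free(t, y, w0=1.0, alpha=0.5):
    """x'' + w0^2 x + alpha x^3 = 0, written as a first-order system."""
    x, v = y
    return [v, -(w0 ** 2) * x - alpha * x ** 3]

# Two amplitudes: the larger one oscillates faster because the cubic
# (hardening) term stiffens the restoring force at large displacement.
for x0 in (0.1, 1.5):
    sol = solve_ivp(duffing_free, (0.0, 40.0), [x0, 0.0], max_step=0.01)
    zero_ups = np.sum((sol.y[0][:-1] < 0) & (sol.y[0][1:] >= 0))
    print(f"x0 = {x0}: ~{zero_ups} upward zero crossings in 40 s")
```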

Keywords: elastic foundation, nonlinear, plates, stiffness, structures, vibration

Procedia PDF Downloads 139
29989 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze big data, which increases exponentially, with traditional technologies. Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop technology. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis with different sizes of actual data. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of our RHadoop with the lm function and the biglm package based on bigmemory. The results showed that our RHadoop was faster than the other packages, owing to parallel processing in which the number of map tasks increases with the size of the data.
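
The speed-up mechanism can be seen in a language-neutral sketch: each map task computes the sufficient statistics XᵀX and Xᵀy on its own data split, and the reduce step sums them before one small solve. (Python here for uniformity with the other examples; RHadoop expresses the same pattern in R.)

```python
import numpy as np

def parallel_ols(chunks):
    """OLS via partitioned normal equations.

    Each (X, y) chunk plays the role of one map task's split; summing
    X'X and X'y is the reduce step, after which a k x k solve suffices.
    """
    k = chunks[0][0].shape[1]
    xtx, xty = np.zeros((k, k)), np.zeros(k)
    for X, y in chunks:            # on Hadoop these run on separate nodes
        xtx += X.T @ X
        xty += X.T @ y
    return np.linalg.solve(xtx, xty)

rng = np.random.default_rng(0)
beta_true = np.array([2.0, -1.0, 0.5])
chunks = []
for _ in range(4):                 # four simulated data nodes
    X = rng.normal(size=(10_000, 3))
    chunks.append((X, X @ beta_true + rng.normal(size=10_000)))
print(parallel_ols(chunks))        # recovers ~ beta_true
```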

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 439
29988 A Mutually Exclusive Task Generation Method Based on Data Augmentation

Authors: Haojie Wang, Xun Li, Rui Yin

Abstract:

In order to solve memorization overfitting in the meta-learning MAML algorithm, a method of generating mutually exclusive tasks based on data augmentation is proposed. This method generates a mutually exclusive (mutex) task by mapping one feature of the data to multiple labels, so that the generated mutex task is inconsistent with the data distribution of the initial dataset. Because generating mutex tasks for all data would produce a large amount of invalid data and, in the worst case, lead to exponential growth of computation, this paper also proposes a key-data extraction method that extracts only part of the data to generate the mutex task. The experiments show that this method of generating mutually exclusive tasks can effectively solve memorization overfitting in the meta-learning MAML algorithm.
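
One plausible reading of the mutex-task construction is sketched below under stated assumptions: a key subset of the data is extracted and its labels are redrawn so that the same feature pattern now maps to several labels, deliberately contradicting the source distribution. The sampling fraction and relabeling rule are inventions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_mutex_task(X, y, n_classes, frac=0.25):
    """Build one mutually exclusive task from a labeled dataset.

    Only a fraction of the data ("key data") is extracted, and its labels
    are redrawn uniformly, so one feature vector can correspond to labels
    that disagree with the original dataset's distribution.
    """
    idx = rng.choice(len(X), size=max(1, int(frac * len(X))), replace=False)
    y_new = rng.integers(0, n_classes, size=len(idx))  # feature -> new labels
    return X[idx], y_new

X = rng.normal(size=(200, 8))
y = rng.integers(0, 5, size=200)
X_task, y_task = make_mutex_task(X, y, n_classes=5)
```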

Keywords: data augmentation, mutex task generation, meta-learning, text classification

Procedia PDF Downloads 99
29987 Evaluating Seismic Earth Pressure Effects on Building Lateral Stability: Sensitivity to Retention Height Differences and Sloped Site Conditions

Authors: Rod Davis, Sara Saminfar

Abstract:

Earthquakes can induce dynamic earth pressures on retaining walls in addition to the static earth pressures. This raises questions about how to effectively combine the seismic lateral earth pressure with other loads on buildings, including the static lateral earth pressure. When basement walls retain soil with differing exterior grades on opposite sides, the seismic increment of active earth pressure should be considered. Additionally, buildings situated on sloped sites with stepped retention may experience unique dynamic effects due to soil-structure interaction, potentially amplifying the lateral pressures exerted on the retaining walls and influencing the building's response during seismic events. To account for the dynamic effects of the retained soil on the building's response, it is essential to interconnect the building structure with the surrounding soil to facilitate their interaction, as the embedded structure and the surrounding soil move together during an earthquake. Consequently, a finite element model of the building is developed, with rigid retaining walls restrained to the floor diaphragms. This paper aims to explore the dynamic effects of retained soil on the lateral stability of buildings and the sensitivity of the building's response to differences in the retained heights on opposite sides of the building basement. Furthermore, the results are compared with those from a sloped site to evaluate the impact of stepped retention on dynamic soil pressure. These findings will help establish a minimum threshold for differences in retained heights on opposite sides of a building that necessitates the inclusion of dynamic soil pressure in the building's lateral stability analysis.
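
The seismic increment of active earth pressure mentioned above is classically estimated with the Mononobe-Okabe coefficient; a sketch follows. This is the textbook formula, not the paper's finite element procedure, and the soil parameters are illustrative.

```python
import numpy as np

def k_ae(phi, delta, kh, kv=0.0, beta=0.0, slope=0.0):
    """Mononobe-Okabe seismic active earth pressure coefficient.

    phi: soil friction angle, delta: wall friction angle,
    beta: wall inclination from vertical, slope: backfill slope
    (all in radians); kh, kv: horizontal/vertical seismic coefficients.
    """
    theta = np.arctan(kh / (1.0 - kv))
    root = np.sqrt(np.sin(phi + delta) * np.sin(phi - theta - slope)
                   / (np.cos(delta + beta + theta) * np.cos(slope - beta)))
    num = np.cos(phi - theta - beta) ** 2
    den = (np.cos(theta) * np.cos(beta) ** 2
           * np.cos(delta + beta + theta) * (1.0 + root) ** 2)
    return num / den

phi, delta = np.radians(34.0), np.radians(17.0)   # illustrative soil/wall values
seismic = k_ae(phi, delta, kh=0.2)
static = k_ae(phi, delta, kh=0.0)
print(f"seismic increment dK_AE = {seismic - static:.3f}")
```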

Keywords: dynamic earth pressures, soil-structure interaction, stepped retention, building retention

Procedia PDF Downloads 16
29986 Through Additive Manufacturing: A New Perspective for the Mass Production of Made in Italy Products

Authors: Elisabetta Cianfanelli, Paolo Pupparo, Maria Claudia Coppola

Abstract:

The recent evolution of innovation processes and the intrinsic tendencies of the product development process lead to new considerations on the design flow. The instability and complexity that describe contemporary life define new problems in the production of products, stimulating at the same time the adoption of new solutions across the entire design process. The advent of Additive Manufacturing, but also of IoT and AI technologies, continuously puts us in front of new paradigms regarding design as a social activity. The totality of these technologies, from the point of view of application, raises a whole series of problems and considerations immanent to design thinking. Addressing these problems may require some initial intuition and the use of some provisional set of rules or plausible strategies, i.e., heuristic reasoning. At the same time, however, the evolution of digital technology and the computational speed of new design tools describe a new and contrary design framework in which to operate. It is therefore interesting to understand the opportunities and boundaries of the new man-algorithm relationship. The contribution investigates the man-algorithm relationship starting from the state of the art of the Made in Italy model: the best-known fields of application are described, and the focus then moves to specific cases in which the mutual relationship between man and AI becomes a new driving force of innovation for entire production chains. On the other hand, the use of algorithms could absorb many design phases, such as the definition of shape, dimensions, proportions, materials, static verifications, and simulations. Operating in this context therefore becomes a strategic action, capable of defining fundamental choices for the design of product systems in the near future. If there is a human-algorithm combination within a new integrated system, quantitative values can be controlled in relation to qualitative and material values. The trajectory that is described therefore becomes a new design horizon in which to operate, where it is interesting to highlight the good practices that already exist. In this context, the designer developing new forms can experiment with ways still unexpressed in the project and can define a new synthesis and simplification of algorithms, so that each artifact has a signature that defines all its parts, emotional and structural. This signature of the designer, a combination of values and design culture, will be internal to the algorithms and able to relate to digital technologies, creating a generative dialogue for design purposes. The result that is envisaged indicates a new vision of digital technologies, no longer understood only as custodians of vast quantities of information, but also as a valid integrated tool in close relationship with design culture.

Keywords: decision making, design heuristics, product design, product design process, design paradigms

Procedia PDF Downloads 121
29985 An Artificial Neural Network Model Based Study of Seismic Wave

Authors: Hemant Kumar, Nilendu Das

Abstract:

A study based on an ANN structure gives us the information needed to predict the size of a future event from the realization of past events. An ANN, IMD (India Meteorological Department) data, and remote sensing were used to obtain a number of parameters for calculating the magnitude that may occur in the future. A threshold was selected specifically above the high-frequency content reached in the area during the selected seismic activity. In the field of human and local biodiversity, it remains to obtain the right parameter relative to the frequency of impact. During the study, however, the assumption is that predicting seismic activity is a difficult process, not because of the parameters involved here, which can be analyzed and found in research activity.

Keywords: ANN, Bayesian class, earthquakes, IMD

Procedia PDF Downloads 129
29984 Efficient Positioning of Data Aggregation Point for Wireless Sensor Network

Authors: Sifat Rahman Ahona, Rifat Tasnim, Naima Hassan

Abstract:

Data aggregation is a helpful technique for reducing the data communication overhead in a wireless sensor network. One of the important tasks in data aggregation is the positioning of the aggregator points. A lot of work has been done on data aggregation, but the efficient positioning of the aggregator points has not received much focus. In this paper, the authors focus on the positioning, or placement, of the aggregation points in a wireless sensor network. The authors propose an algorithm to select the aggregator positions for a scenario in which aggregator nodes are more powerful than sensor nodes.
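
The abstract does not reproduce the algorithm itself, so the sketch below shows a common baseline for the same problem: k-means-style placement, where each aggregator moves to the centroid of the sensors it serves, shrinking total communication distance. The node layout and k are invented.

```python
import numpy as np

def place_aggregators(sensor_xy, k=3, iters=50, seed=0):
    """Centroid-based placement of k aggregation points for a WSN."""
    rng = np.random.default_rng(seed)
    agg = sensor_xy[rng.choice(len(sensor_xy), k, replace=False)].copy()
    for _ in range(iters):
        dist = np.linalg.norm(sensor_xy[:, None] - agg[None], axis=2)
        nearest = dist.argmin(axis=1)          # each sensor -> closest aggregator
        for j in range(k):
            members = sensor_xy[nearest == j]
            if len(members):
                agg[j] = members.mean(axis=0)  # move to centroid of its cluster
    return agg, nearest

sensors = np.random.default_rng(1).uniform(0, 100, size=(60, 2))
points, assignment = place_aggregators(sensors)
```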

Keywords: aggregation point, data communication, data aggregation, wireless sensor network

Procedia PDF Downloads 165
29983 Application of Ontologies to Contract for Difference Documents

Authors: Renato Figueira Franco

Abstract:

This paper aims to create a representational information system applied to the securities market, in particular through the development of an ontology applied to the analysis of the Key Information Documents of Contracts for Difference. The process of obtaining knowledge and representing it properly and formally has attracted attention both in the scientific literature and among capital markets supervisory authorities. Formal knowledge representation is embodied in the construction of ontologies, which define the knowledge base structure of a given scientific domain, facilitating its understanding and allowing it to be shared among the scientific community. The scope of this study is restricted to the analysis of capital markets ontologies in order to capture their structure, semantics, and the knowledge shared between people and systems.
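
As a flavour of what such a knowledge base structure looks like, here is a minimal sketch using rdflib with a hypothetical namespace; the class and property names are illustrative, not the ontology developed in the paper.

```python
from rdflib import Graph, Namespace
from rdflib.namespace import OWL, RDF, RDFS

FIN = Namespace("http://example.org/kid#")   # hypothetical namespace
g = Graph()
g.bind("fin", FIN)

# A tiny class hierarchy for Key Information Documents of CFDs
g.add((FIN.PRIIP, RDF.type, OWL.Class))
g.add((FIN.CFD, RDFS.subClassOf, FIN.PRIIP))
g.add((FIN.KeyInformationDocument, RDF.type, OWL.Class))
g.add((FIN.describes, RDF.type, OWL.ObjectProperty))
g.add((FIN.describes, RDFS.domain, FIN.KeyInformationDocument))
g.add((FIN.describes, RDFS.range, FIN.CFD))

print(g.serialize(format="turtle"))
```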

Keywords: ontology, financial markets, CFD, PRIIPs, key information documents

Procedia PDF Downloads 71
29982 Spatial Econometric Approaches for Count Data: An Overview and New Directions

Authors: Paula Simões, Isabel Natário

Abstract:

This paper reviews a number of theoretical aspects of implementing an explicit spatial perspective in econometrics for modelling non-continuous data in general, and count data in particular. It provides an overview of the several spatial econometric approaches that are available to model data collected with reference to location in space, from the classical spatial econometrics approaches to the recent developments in spatial econometrics for modelling count data in a Bayesian hierarchical setting. Considerable attention is paid to the inferential framework necessary for structurally consistent spatial econometric count models incorporating spatial lag autocorrelation, to the corresponding estimation and testing procedures under different assumptions, and to the constraints and implications embedded in the various specifications in the literature. This review combines insights from the classical spatial econometrics literature as well as from the hierarchical modelling and analysis of spatial data, in order to look for new possible directions in the processing of count data in a spatial hierarchical Bayesian econometric context.
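
One of the simpler specifications in that family is a Poisson count model with spatially lagged covariates (the SLX form). A maximum-likelihood sketch is shown below; the weight matrix W is assumed row-standardised, and the setup is illustrative rather than the Bayesian hierarchical machinery the paper reviews.

```python
import numpy as np
from scipy.optimize import minimize

def fit_slx_poisson(y, X, W):
    """Poisson SLX model: log mu = X b + (W X) g.

    W: row-standardised spatial weight matrix, so W @ X holds each
    observation's neighbourhood averages of the covariates.
    """
    Z = np.hstack([X, W @ X])      # augment covariates with spatial lags
    def nll(beta):
        eta = Z @ beta
        return -(np.sum(y * eta) - np.sum(np.exp(eta)))
    res = minimize(nll, np.zeros(Z.shape[1]), method="BFGS")
    return res.x   # first half: direct effects, second half: spatial spillovers
```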

Keywords: spatial data analysis, spatial econometrics, Bayesian hierarchical models, count data

Procedia PDF Downloads 598
29981 Increasing the Speed of the Apriori Algorithm by Dimension Reduction

Authors: A. Abyar, R. Khavarzadeh

Abstract:

The most basic and important decision-making tool for industrial and service managers is an understanding of the market and of customer behavior. In this regard, the Apriori algorithm, one of the well-known machine learning methods, is used to identify customer preferences. On the other hand, with the increasing diversity of goods and services and the speed at which customer behavior changes, we are faced with big data, and due to the large number of competitors and changing customer behavior, there is an urgent need for continuous analysis of it, while the speed of the Apriori algorithm decreases as data volume increases. In this paper, the big data PCA method is used to reduce the dimension of the data in order to increase the speed of the Apriori algorithm. In the simulation section, the results are then examined by generating data with different volumes and different diversity. The results show that when this method is used, the speed of the Apriori algorithm increases significantly.
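
For reference, the level-wise pass whose cost grows with data volume and item diversity is the plain Apriori below; the PCA preprocessing the paper proposes would be applied before this step. The toy baskets are invented.

```python
from itertools import combinations

def apriori(transactions, min_support=0.5):
    """Level-wise Apriori: keep itemsets whose support >= min_support,
    pruning candidates that contain an infrequent subset."""
    n = len(transactions)
    support = lambda s: sum(s <= t for t in transactions) / n
    level = {s for s in {frozenset([i]) for t in transactions for i in t}
             if support(s) >= min_support}
    freq, k = {}, 1
    while level:
        freq.update({s: support(s) for s in level})
        k += 1
        cands = {a | b for a in level for b in level if len(a | b) == k}
        level = {c for c in cands
                 if all(frozenset(sub) in freq for sub in combinations(c, k - 1))
                 and support(c) >= min_support}
    return freq

baskets = [frozenset(b) for b in
           ({"milk", "bread"}, {"milk", "eggs"}, {"milk", "bread", "eggs"})]
print(apriori(baskets, min_support=0.6))
```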

Keywords: association rules, Apriori algorithm, big data, big data PCA, market basket analysis

Procedia PDF Downloads 12
29980 A Novel RLS Based Adaptive Filtering Method for Speech Enhancement

Authors: Pogula Rakesh, T. Kishore Kumar

Abstract:

Speech enhancement is a long-standing problem with numerous applications such as teleconferencing, VoIP, hearing aids, and speech recognition. The motivation behind this research work is to obtain a clean speech signal of higher quality by applying the optimal noise cancellation technique. Real-time adaptive filtering algorithms seem to be the best candidates among all categories of speech enhancement methods. In this paper, we propose a speech enhancement method based on a Recursive Least Squares (RLS) adaptive filtering of speech signals. Experiments were performed on noisy data prepared by adding AWGN, babble, and pink noise to clean speech samples at -5 dB, 0 dB, 5 dB, and 10 dB SNR levels. We then compare the noise cancellation performance of the proposed RLS algorithm with the existing NLMS algorithm in terms of Mean Squared Error (MSE), Signal-to-Noise Ratio (SNR), and SNR loss. Based on the performance evaluation, the proposed RLS algorithm was found to be the better optimal noise cancellation technique for speech signals.
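
The RLS update used in such a canceller is standard; below is a minimal sketch of an adaptive noise canceller built on it. The filter order, forgetting factor λ, and initialisation δ are typical choices, not values from the paper.

```python
import numpy as np

def rls_noise_canceller(primary, reference, order=8, lam=0.99, delta=100.0):
    """RLS adaptive noise canceller.

    primary:   speech + noise (desired signal d[n])
    reference: noise correlated with the corrupting noise
    Returns the error signal e[n], i.e. the enhanced speech estimate.
    """
    w = np.zeros(order)
    P = delta * np.eye(order)            # inverse correlation matrix estimate
    out = np.zeros(len(primary))
    for n in range(order, len(primary)):
        u = reference[n - order:n][::-1] # regressor, most recent sample first
        k = P @ u / (lam + u @ P @ u)    # gain vector
        e = primary[n] - w @ u           # a priori error = enhanced sample
        w = w + k * e                    # weight update
        P = (P - np.outer(k, u @ P)) / lam
        out[n] = e
    return out
```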

Keywords: adaptive filter, adaptive noise canceller, mean squared error, noise reduction, NLMS, RLS, SNR, SNR loss

Procedia PDF Downloads 486