Search results for: raw complex data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28319

27179 Resveratrol-Phospholipid Complex for Sustained Delivery of Resveratrol via the Skin for the Treatment of Inflammatory Diseases

Authors: Malay K. Das, Bhupen Kalita

Abstract:

The poor oral bioavailability of resveratrol (RSV) due to presystemic metabolism can be avoided via the dermal route of administration. The hydrophilic-lipophilic nature of the resveratrol-phospholipid complex (RSVPs) favors the delivery of resveratrol via the skin. An RSVPs-embedded polymeric patch with moderate adhesiveness was developed for dermal application for sustained anti-inflammatory effect. The prepared patches were evaluated for various physicochemical properties, for surface morphology by SEM and TEM, and for compatibility of patch components by FT-IR and DSC studies. The dermal flux of the optimized patch formulation was found to be 4.28 ± 0.48 mg/cm2/24 h. The analysis of skin extract after the permeation study revealed the presence of resveratrol, which confirmed the localization of RSVPs in the skin. The stability of RSVPs in the polymeric patch and in the physiologic environment was confirmed by FE-SEM studies on the patches after drug release and skin permeation studies. The RSVPs particles released from the polymer matrix maintain their structural integrity and permeate the keratinized horny layer of the skin. The optimized patch formulation showed a sustained anti-inflammatory effect (84.10% inhibition of inflammation at 24 h) in the carrageenan-induced rat paw edema model compared to a marketed diclofenac sodium gel (39.58% inhibition at 24 h). The CLSM study confirmed the localization of RSVPs for a longer period, thus enabling drug targeting to the dermis for a sustained anti-inflammatory effect. Histological studies with a phase contrast trinocular microscope suggested no alteration of skin integrity and no evidence of inflammatory cells after exposure to the permeants. The patch was found to be safe for skin application as evaluated by the Draize method for skin irritation scoring in a rabbit model. These results suggest the therapeutic efficacy of the developed patch in both acute and chronic inflammatory diseases.

Keywords: resveratrol-phospholipid complex, skin delivery, sustained anti-inflammatory effect, inflammatory diseases, dermal patch

Procedia PDF Downloads 228
27178 The Role of Land Consolidation to Reduce Soil Degradation in the Czech Republic

Authors: Miroslav Dumbrovsky

Abstract:

The paper deals with the positive impacts of land consolidation on decreasing soil degradation, with the main emphasis on soil and water conservation in the landscape. Land degradation matters greatly because of its impact on crop productivity and its many other adverse effects. Soil degradation through erosion causes losses in crop productivity and in the quality of the environment by decreasing the quality of soil and water (especially water resources). Negative effects of conventional farming practices include increased water erosion as well as crusting and compaction of the topsoil and subsoil. Soil erosion caused by water destroys the soil’s structure and reduces crop productivity through deterioration in soil physical and chemical properties such as infiltration rate, water holding capacity, nutrients needed for crop production, and soil carbon. Recently, a new process of complex land consolidation in the Czech Republic has provided a unique opportunity for improving the quality of the environment and the sustainability of crop production by means of better soil and water conservation. The present process of complex land consolidation is not merely a reallocation of plots; it produces a new layout of plots within a given territory, aimed at establishing integrated land-use economic units based on the needs of individual landowners and land users. At the same time, the interests of the general public and of environmental protection have to be addressed, too. From a general point of view, a large part of the Czech landscape will be reconstructed in the course of complex land consolidation projects. These projects will be based on new integrated soil-economic units, spatially arranged in a designed multifunctional system of soil and water conservation measures, such as a path network and a territorial system of ecological stability, according to structural changes in agriculture. This new approach will be the basis of a rational economic utilization of the region that complies with present ecological and aesthetic demands.

Keywords: soil degradation, land consolidation, soil erosion, soil conservation

Procedia PDF Downloads 350
27177 An Improved Approach for Hybrid Rocket Injection System Design

Authors: M. Invigorito, G. Elia, M. Panelli

Abstract:

Hybrid propulsion combines beneficial properties of both solid and liquid rockets, such as multiple restarts and throttleability, as well as simplicity and reduced costs. A nitrous oxide (N2O)/paraffin-based hybrid rocket engine demonstrator is currently under development at the Italian Aerospace Research Center (CIRA) within the national research program HYPROB, funded by the Italian Ministry of Research. Nitrous oxide belongs to the class of self-pressurizing propellants that exhibit a high vapor pressure at standard ambient temperature. This peculiar feature makes such fluids very attractive for space rocket applications because it avoids the use of complex pressurization systems, leading to great benefits in terms of weight savings and reliability. To avoid feed-system-coupled instabilities, the phase change is required to occur through the injectors. In this regard, the oxidizer is stored in liquid condition while target chamber pressures are designed to lie below its vapor pressure. The consequent cavitation and flash vaporization constitute a remarkably complex phenomenology that poses great modelling challenges. It is therefore clear that the design of the injection system is fundamental for the full exploitation of hybrid rocket engine throttleability. The Analytical Hierarchy Process has been used to select the injection architecture as the best compromise among different design criteria such as functionality, technological innovation, and cost. The impossibility of using simplified engineering relations for the dimensioning of the injectors led to the need for a numerical approach based on OpenFOAM®. The numerical tool has been validated against selected experimental data from the literature. Quantitative as well as qualitative comparisons are performed in terms of mass flow rate and pressure drop across the injector for several operating conditions, and the results show satisfactory agreement with the experimental data. Modeling assumptions, together with their impact on numerical predictions, are discussed in the paper. Once the reliability of the numerical tool had been assessed, the injection plate was designed and sized to guarantee the required amount of oxidizer in the combustion chamber and therefore to assure high combustion efficiency. To this purpose, the plate was designed with multiple injectors whose number and diameter were selected in order to reach the requested mass flow rate for the two operating conditions of maximum and minimum thrust. The overall design was finally verified through three-dimensional computations in cavitating, non-reacting conditions, which confirmed that the proposed design solution is able to guarantee the requested values of mass flow rate.

Keywords: hybrid rocket, injection system design, OpenFOAM®, cavitation

Procedia PDF Downloads 211
27176 Chemical and Biological Examination of De-Oiled Indian Propolis

Authors: Harshada Vaidya-Kannur, Dattatraya Naik

Abstract:

Propolis, one of the beehive products, also referred to as bee glue, is a sticky, dark-coloured complex mixture of compounds. The volatile oil can be isolated from propolis by hydrodistillation. The marc that is left behind after the removal of the volatile oil is referred to as de-oiled propolis. The antioxidant as well as anti-inflammatory properties of the total ethanolic extract of de-oiled propolis (TEEDP) were investigated. Another lot of de-oiled propolis was successively extracted with hexane, ethyl acetate, and ethanol, and the activities of these fractions were also determined. Antioxidant activity was determined by studying ABTS, DPPH, and NO radical scavenging. Anti-inflammatory activity was determined in the topical TPA-induced mouse ear oedema model. It is noteworthy that the ethyl acetate fraction of de-oiled propolis (EAFDP) exhibited 49.45% TEAC activity at a concentration of 0.2 mg/ml, which is equivalent to the activity of trolox at the same concentration. Its DPPH scavenging activity (72.56%) was closely comparable to that of trolox (75%); however, its NO scavenging activity was comparatively low. From the IC50 values it could be concluded that the scavenging of ABTS radicals by de-oiled propolis was more pronounced than the scavenging of the other radicals. Studies with the TPA-induced mouse ear inflammation model indicated that de-oiled propolis of Indian origin has significant topical anti-inflammatory activity, and the EAFDP was found to be the most active fraction for this activity as well. The purification of EAFDP yielded six pure crystalline compounds, which were identified by their physical and spectral data.

Keywords: anti-inflammatory activity, anti-oxidant activity, column chromatography, de-oiled propolis

Procedia PDF Downloads 285
27175 Steps towards the Development of National Health Data Standards in Developing Countries

Authors: Abdullah I. Alkraiji, Thomas W. Jackson, Ian Murray

Abstract:

The proliferation of health data standards today is somewhat overlapping and conflicting, resulting in market confusion and increasing proprietary interests. The government's role and support in standardization for health data are thought to be crucial in order to establish credible standards for the next decade, to maximize interoperability across the health sector, and to decrease the risks associated with the implementation of non-standard systems. The normative literature has not explored the different steps a government must undertake towards the development of national health data standards. Based on the lessons learned from a qualitative study investigating the issues affecting the adoption of health data standards in the major tertiary hospitals in Saudi Arabia, and on the opinions and feedback of experts in data exchange, standards, and medical informatics in Saudi Arabia and the UK, a list of steps required for the development of national health data standards was constructed. The main steps are the existence of: a national formal reference for health data standards, an agreed national strategic direction for medical data exchange, a national medical information management plan, and a national accreditation body; most important of all is change management at the national and organizational levels. The outcome of this study can be used by academics and practitioners in planning health data standards, in particular in developing countries.

Keywords: interoperability, medical data exchange, health data standards, case study, Saudi Arabia

Procedia PDF Downloads 333
27174 A Proposal for U-City (Smart City) Service Method Using Real-Time Digital Map

Authors: SangWon Han, MuWook Pyeon, Sujung Moon, DaeKyo Seo

Abstract:

Recently, technologies based on three-dimensional (3D) space information are being developed and quality of life is improving as a result. Research on real-time digital map (RDM) is being conducted now to provide 3D space information. RDM is a service that creates and supplies 3D space information in real time based on location/shape detection. Research subjects on RDM include the construction of 3D space information with matching image data, complementing the weaknesses of image acquisition using multi-source data, and data collection methods using big data. Using RDM will be effective for space analysis using 3D space information in a U-City and for other space information utilization technologies.

Keywords: RDM, multi-source data, big data, U-City

Procedia PDF Downloads 428
27173 Agile Methodology for Modeling and Design of Data Warehouses -AM4DW-

Authors: Nieto Bernal Wilson, Carmona Suarez Edgar

Abstract:

Organizations hold structured and unstructured information in different formats, sources, and systems. Part of this comes from ERP systems under OLTP processing that support the information system; at the OLAP processing level, however, these organizations present some deficiencies. Part of the problem lies in the lack of interest in extracting knowledge from their data sources, as well as in the absence of the operational capabilities needed to tackle this kind of project. Data warehouses and their applications are considered non-proprietary tools of great interest to business intelligence, since they are the repository basis for creating models or patterns (behavior of customers, suppliers, products, social networks, and genomics) and facilitate corporate decision making and research. The following paper presents a structured, simple methodology inspired by agile development models such as Scrum, XP, and AUP, together with object-relational models, spatial data models, and a baseline of data modeling under UML and Big Data. In this way, it seeks to deliver an agile methodology for the development of data warehouses that is simple and easy to apply. The methodology naturally takes into account processes for information analysis, visualization, and data mining, particularly for the generation of patterns and models derived from the structured fact objects.

Keywords: data warehouse, model data, big data, object fact, object relational fact, process developed data warehouse

Procedia PDF Downloads 400
27172 Identifying Model to Predict Deterioration of Water Mains Using Robust Analysis

Authors: Go Bong Choi, Shin Je Lee, Sung Jin Yoo, Gibaek Lee, Jong Min Lee

Abstract:

In South Korea, it is difficult to obtain data for statistical pipe assessment. In this paper, to address this issue, we examine how the various statistical models presented in previous work behave when their data are mixed with noise, and whether they are applicable in South Korea. Three major types of model are studied; where the data are presented in the original paper, we add noise to those data and observe how the model response changes. Moreover, we generate data from each model as described in its paper and analyse the effect of noise. From this, we can determine the robustness of each model and its applicability in Korea.
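The noise-injection procedure described above can be sketched as follows. The deterioration data, noise level, and linear rate model below are illustrative assumptions, not the models or data from the papers under study; the point is only to show how a fitted parameter drifts when noise is added.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical "published" deterioration data: break rate grows with pipe age.
age = np.arange(1.0, 41.0)            # pipe age in years
true_rate = 0.05 * age + 0.3          # assumed linear deterioration model

def fit_rate_model(ages, rates):
    """Least-squares fit of rate = a*age + b; returns (a, b)."""
    A = np.vstack([ages, np.ones_like(ages)]).T
    coeffs, *_ = np.linalg.lstsq(A, rates, rcond=None)
    return coeffs

# Fit on clean data, then on data mixed with Gaussian noise, and compare
# how far the fitted slope drifts (a simple robustness proxy).
a_clean, b_clean = fit_rate_model(age, true_rate)
noisy = true_rate + rng.normal(0.0, 0.2, size=age.size)
a_noisy, b_noisy = fit_rate_model(age, noisy)

slope_drift = abs(a_noisy - a_clean) / a_clean
```

A model whose fitted parameters drift little under such perturbation would, in the spirit of the abstract, be a better candidate for reuse on noisy Korean data.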

Keywords: proportional hazard model, survival model, water main deterioration, ecological sciences

Procedia PDF Downloads 738
27171 Human Absorbed Dose Assessment of 68Ga-Dotatoc Based on Biodistribution Data in Syrian Rats

Authors: S. Zolghadri, M. Naderi, H. Yousefnia, A. Ramazani, A. R. Jalilian

Abstract:

The aim of this work was to evaluate the absorbed dose of 68Ga-DOTATOC in numerous human organs. 68Ga-DOTATOC was prepared with a radiochemical purity of higher than 98% and a specific activity of 39.6 MBq/nmol. The complex demonstrated great stability at room temperature and in human serum at 37 °C for at least 2 h after preparation. Significant uptake was observed in somatostatin receptor-positive tissues such as the pancreas and adrenals. The absorbed dose received by human organs was evaluated based on biodistribution studies in Syrian rats using the radiation absorbed dose assessment resource (RADAR) method. The maximum absorbed doses were obtained in the pancreas, kidneys, and adrenals, with 0.105, 0.074, and 0.010 mGy/MBq, respectively. The effective absorbed dose was 0.026 mSv/MBq for 68Ga-DOTATOC. The results show that 68Ga-DOTATOC can be considered a safe and effective agent for clinical PET imaging applications.
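The organ-dose bookkeeping behind the MIRD/RADAR approach reduces to multiplying time-integrated (cumulated) activity by dose factors and summing. The numbers below are invented placeholders, not the residence times, S-values, or tissue weights from this study; the sketch also keeps only self-dose terms for brevity.

```python
# MIRD/RADAR-style dose arithmetic: D(target) = sum over sources of Ã(source) * S(source -> target).
# Ã is cumulated activity per MBq injected (MBq·h/MBq); S-values are mGy per MBq·h.
cumulated_activity = {"pancreas": 1.2, "kidneys": 0.9, "adrenals": 0.1}       # hypothetical Ã
s_factor_to_self = {"pancreas": 0.0875, "kidneys": 0.0822, "adrenals": 0.10}  # hypothetical S-values

def absorbed_dose(organ):
    """Self-dose only; a full calculation would add cross-organ S terms."""
    return cumulated_activity[organ] * s_factor_to_self[organ]

doses = {organ: absorbed_dose(organ) for organ in cumulated_activity}  # mGy/MBq

# Effective dose: tissue-weighted sum of organ doses (weights again placeholders).
tissue_weights = {"pancreas": 0.12, "kidneys": 0.12, "adrenals": 0.12}
effective_dose = sum(tissue_weights[o] * doses[o] for o in doses)      # mSv/MBq
```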

Keywords: effective absorbed dose, Ga-68, octreotide, MIRD

Procedia PDF Downloads 517
27170 Automated Testing to Detect Instance Data Loss in Android Applications

Authors: Anusha Konduru, Zhiyong Shan, Preethi Santhanam, Vinod Namboodiri, Rajiv Bagai

Abstract:

The number of mobile applications is increasing significantly, each addressing the requirements of many users. However, quick development and enhancement cycles result in many underlying defects. Android apps create and handle a large variety of 'instance' data that has to persist across runs, such as the current navigation route, workout results, antivirus settings, or game state. Due to the nature of Android, an app can be paused, sent into the background, or killed at any time. If the instance data is not saved and restored between runs, then in addition to data loss, partially saved or corrupted data can crash the app upon resume or restart. It is difficult for the programmer to manually test for this issue across all activities. The resulting data loss, where data entered by the user is not saved across an interruption, degrades the user experience because the user must re-enter the information after every interruption. Automated testing to detect such data loss is therefore important. This research proposes DroidDL, a data loss detector for Android, which detects instance data loss in a given Android application. We tested 395 applications and found 12 with the data loss issue. The approach proved highly accurate and reliable in finding apps with this defect, and can be used by Android developers to avoid such errors.

Keywords: Android, automated testing, activity, data loss

Procedia PDF Downloads 230
27169 Big Data: Appearance and Disappearance

Authors: James Moir

Abstract:

The mainstay of Big Data is prediction, in that it allows practitioners, researchers, and policy analysts to predict trends based upon the analysis of large and varied sources of data. These can range from changing social and political opinions to patterns in crime and consumer behaviour. Big Data has therefore shifted the criterion of success in science from causal explanation to predictive modelling and simulation. Nineteenth-century science sought to capture phenomena and to explain their appearance through causal mechanisms, while twentieth-century science attempted to save the appearances and relinquish causal explanation. Now twenty-first-century science, in the form of Big Data, is concerned with the prediction of appearances and nothing more. However, this pulls social science back towards a more rule- or law-governed model of science and away from a consideration of the internal nature of rules in relation to various practices. In effect, Big Data offers us no more than a world of surface appearance, and in doing so it makes disappear any context-specific conceptual sensitivity.

Keywords: big data, appearance, disappearance, surface, epistemology

Procedia PDF Downloads 414
27168 From Data Processing to Experimental Design and Back Again: A Parameter Identification Problem Based on FRAP Images

Authors: Stepan Papacek, Jiri Jablonsky, Radek Kana, Ctirad Matonoha, Stefan Kindermann

Abstract:

FRAP (Fluorescence Recovery After Photobleaching) is a widely used measurement technique to determine the mobility of fluorescent molecules within living cells. While the experimental setup and protocol for FRAP experiments are usually fixed, the data processing part is still under development. In this paper, we formulate and solve the problem of data selection, which enhances the processing of FRAP images. We introduce the concept of the irrelevant data set, i.e., data which contribute almost nothing to reducing the confidence interval of the estimated parameters and can thus be neglected. Based on sensitivity analysis, we both solve the problem of optimal data space selection and find specific conditions for optimizing an important experimental design factor, e.g., the radius of the bleach spot. Finally, a theorem establishing the lower precision of the integrated data approach compared to the full data case is proven; i.e., we claim that the data set represented by the FRAP recovery curve leads to a larger confidence interval than the spatio-temporal (full) data.
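The claim that the spatially integrated recovery curve yields a wider confidence interval than the full spatio-temporal data can be illustrated with a toy sensitivity calculation. The exponential recovery model, bleach profile, and noise level below are assumptions for this sketch only, not the authors' model; the inequality itself follows from Jensen's inequality applied to the averaged sensitivities.

```python
import numpy as np

# Toy FRAP model: recovery at radius r and time t, F(r,t) = 1 - exp(-k*t) * exp(-r**2),
# with one unknown rate parameter k. Sensitivity: dF/dk = t * exp(-k*t) * exp(-r**2).
k = 0.5
sigma = 0.05                           # assumed per-pixel measurement noise
r = np.linspace(0.0, 2.0, 20)          # spatial samples across the bleach spot
t = np.linspace(0.1, 10.0, 50)         # time samples
R, T = np.meshgrid(r, t)
sens = T * np.exp(-k * T) * np.exp(-R**2)

# Full spatio-temporal data: Fisher information sums squared sensitivities per pixel.
info_full = np.sum(sens**2) / sigma**2

# Integrated data (recovery curve): spatial averaging reduces noise by sqrt(N)
# but also averages the sensitivities, losing spatial contrast.
n_space = r.size
sens_avg = sens.mean(axis=1)                        # sensitivity of the averaged signal
info_integrated = np.sum(n_space * sens_avg**2) / sigma**2

ci_ratio = np.sqrt(info_full / info_integrated)     # CI(integrated) / CI(full), always >= 1
```

Because the confidence interval scales as one over the square root of the Fisher information, `ci_ratio >= 1` reproduces the direction of the theorem in this toy setting.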

Keywords: FRAP, inverse problem, parameter identification, sensitivity analysis, optimal experimental design

Procedia PDF Downloads 273
27167 Exploring the Feasibility of Utilizing Blockchain in Cloud Computing and AI-Enabled BIM for Enhancing Data Exchange in Construction Supply Chain Management

Authors: Tran Duong Nguyen, Marwan Shagar, Qinghao Zeng, Aras Maqsoodi, Pardis Pishdad, Eunhwa Yang

Abstract:

Construction supply chain management (CSCM) involves the collaboration of many disciplines and actors, which generates vast amounts of data. However, inefficient, fragmented, and non-standardized data storage often hinders this data exchange. The industry has adopted building information modeling (BIM), a digital representation of a facility's physical and functional characteristics, to improve collaboration, enhance transmission security, and provide a common data exchange platform. Still, the volume and complexity of the data require tailored information categorization, aligned with stakeholders' preferences and demands. To address this, artificial intelligence (AI) can be integrated to handle the data's magnitude and complexity. This research aims to develop an integrated and efficient approach for data exchange in CSCM by utilizing AI. The paper covers five main objectives: (1) investigate existing frameworks and BIM adoption; (2) identify challenges in data exchange; (3) propose an integrated framework; (4) enhance data transmission security; and (5) develop data exchange in CSCM. The proposed framework demonstrates how integrating BIM with other technologies, such as cloud computing, blockchain, and AI applications, can significantly improve the efficiency and accuracy of data exchange in CSCM.

Keywords: construction supply chain management, BIM, data exchange, artificial intelligence

Procedia PDF Downloads 13
27166 Representation Data without Lost Compression Properties in Time Series: A Review

Authors: Nabilah Filzah Mohd Radzuan, Zalinda Othman, Azuraliza Abu Bakar, Abdul Razak Hamdan

Abstract:

Uncertain data is an important issue in building a prediction model. The main objective of time series uncertainty analysis is to formulate uncertain data in order to gain knowledge and to fit a low-dimensional model prior to the prediction task. This paper discusses the performance of a number of techniques for dealing with uncertain data, specifically those that handle the uncertain data condition while minimizing the loss of compression properties.
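One widely used family of techniques that compresses a time series into a low-dimensional representation while approximately preserving its properties is piecewise aggregate approximation (PAA). The sketch below is a generic illustration of that idea, not one of the specific techniques surveyed in the review, and the noisy "weather-like" series is synthetic.

```python
import numpy as np

def paa(series, n_segments):
    """Piecewise Aggregate Approximation: represent the series by its segment means."""
    segments = np.array_split(np.asarray(series, dtype=float), n_segments)
    return np.array([seg.mean() for seg in segments])

# A noisy synthetic series compressed 10x; a global property we care about
# (the mean) survives the compression almost exactly.
rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 6 * np.pi, 300)) + rng.normal(0, 0.1, 300)
compressed = paa(series, 30)

mean_error = abs(series.mean() - compressed.mean())
```

Because every segment here has equal length, the mean of the compressed representation equals the mean of the original up to floating-point error, which is the sense in which the compression "preserves" that property.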

Keywords: compression properties, uncertainty, uncertain time series, mining technique, weather prediction

Procedia PDF Downloads 422
27165 Macrocycles Enable Tuning of Uranyl Electrochemistry by Lewis Acids

Authors: Amit Kumar, Davide Lionetti, Victor Day, James Blakemore

Abstract:

Capture and activation of the water-soluble uranyl dication (UO22+) remains a challenging problem, as few rational approaches are available for modulating the reactivity of this species. Here, we report the divergent synthesis of heterobimetallic complexes in which UO22+ is held in close proximity to a range of redox-inactive metals by tailored macrocyclic ligands. Crystallographic and spectroscopic studies confirm assembly of homologous UVI(μ-OAr)2Mn+ cores with a range of mono-, di-, and trivalent Lewis acids (Mn+). X-ray diffraction (XRD) and cyclic voltammetry (CV) data suggest preferential binding of K+ in an 18-crown-6-like cavity and Na+ in a 15-crown-5-like cavity, both appended to Schiff-base type sites that selectively bind UO22+. CV data demonstrate that the UVI/UV reduction potential in these complexes shifts positive and the rate of electron transfer decreases with increasing Lewis acidity of the incorporated redox-inactive metals. Moreover, spectroelectrochemical studies confirm the formation of [UV] species in the case of monometallic UO22+ complex, consistent with results from prior studies. However, unique features were observed during spectroelectrochemical studies in the presence of the K+ ion, suggesting new insights into electronic structure may be accessible with the heterobimetallic complexes. Overall, these findings suggest that interactions with Lewis acids could be effectively leveraged for rational tuning of the electronic and thermochemical properties of the 5f elements, reminiscent of strategies more commonly employed with 3d transition metals.

Keywords: electrochemistry, Lewis acid, macrocycle, uranyl

Procedia PDF Downloads 135
27164 Data Mining As A Tool For Knowledge Management: A Review

Authors: Maram Saleh

Abstract:

Knowledge has become an essential resource in today’s economy and the most important asset for maintaining competitive advantage in organizations. The importance of knowledge has driven organizations to manage their knowledge assets and resources through all the stages of knowledge management: knowledge creation, knowledge storage, knowledge sharing, and knowledge use. Research on data mining has continued to grow in recent years in both business and educational fields. Data mining is one of the most important steps of the knowledge discovery in databases process, aiming to extract implicit, unknown but useful knowledge, and it is considered a significant subfield of knowledge management. Data mining has great potential to help organizations focus on extracting the most important information in their data warehouses. Data mining tools and techniques can predict future trends and behaviors, allowing businesses to make proactive, knowledge-driven decisions. This review paper explores the applications of data mining techniques in supporting the knowledge management process as an effective knowledge discovery approach. In this paper, we identify the relationship between data mining and knowledge management, and then focus on introducing some applications of data mining techniques to knowledge management in real-life domains.

Keywords: data mining, knowledge management, knowledge discovery, knowledge creation

Procedia PDF Downloads 202
27163 Long-Term Resilience Performance Assessment of Dual and Singular Water Distribution Infrastructures Using a Complex Systems Approach

Authors: Kambiz Rasoulkhani, Jeanne Cole, Sybil Sharvelle, Ali Mostafavi

Abstract:

Dual water distribution systems have been proposed as solutions to enhance the sustainability and resilience of urban water systems by improving performance and decreasing energy consumption. The objective of this study was to evaluate the long-term resilience and robustness of dual versus singular water distribution systems under various stressors such as demand fluctuation, aging infrastructure, and funding constraints. To this end, the long-term dynamics of these infrastructure systems were captured using a simulation model that integrates institutional agency decision-making processes with physical infrastructure degradation to evaluate the long-term transformation of water infrastructure. A set of model parameters that vary between dual and singular distribution infrastructure based on system attributes, such as pipe length and material, energy intensity, water demand, water price, average pressure and flow rate, as well as operational expenditures, was considered as input to the simulation model. Accordingly, the model was used to simulate various scenarios of demand changes, funding levels, water price growth, and renewal strategies. The long-term resilience and robustness of each distribution infrastructure were evaluated based on various performance measures, including network average condition, break frequency, network leakage, and energy use. An ecologically-based resilience approach was used to examine regime shifts and tipping points in the long-term performance of the systems under different stressors. Classification and Regression Tree analysis was also adopted to assess the robustness of each system under various scenarios. Using data from the City of Fort Collins, the long-term resilience and robustness of the dual and singular water distribution systems were evaluated over a 100-year analysis horizon for various scenarios.
The results of the analysis enabled: (i) comparison between dual and singular water distribution systems in terms of long-term performance, resilience, and robustness; (ii) identification of renewal strategies and decision factors that enhance the long-term resiliency and robustness of dual and singular water distribution systems under different stressors.
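A minimal version of the degradation-plus-renewal dynamic at the heart of such a simulation model can be sketched as follows. The decay rate, renewal threshold, fleet size, and worst-first funding rule are invented parameters for illustration, not values calibrated to the Fort Collins data or the authors' agent model.

```python
import numpy as np

def simulate(years, n_pipes, decay, renew_threshold, renewals_per_year, seed=1):
    """Track average network condition (1.0 = new) under a worst-first renewal policy."""
    rng = np.random.default_rng(seed)
    condition = np.ones(n_pipes)
    avg_condition = []
    for _ in range(years):
        condition -= rng.uniform(0, decay, size=n_pipes)    # stochastic aging
        worst = np.argsort(condition)[:renewals_per_year]   # worst-first funding rule
        renew = worst[condition[worst] < renew_threshold]   # renew only if below threshold
        condition[renew] = 1.0
        avg_condition.append(condition.mean())
    return np.array(avg_condition)

# Compare a funded renewal strategy against a do-nothing baseline over a 100-year horizon.
funded = simulate(100, 500, decay=0.04, renew_threshold=0.5, renewals_per_year=20)
no_funding = simulate(100, 500, decay=0.04, renew_threshold=0.5, renewals_per_year=0)
```

Plotting the two trajectories would show the kind of long-term divergence, and under harsher stressors the regime shifts, that the full model is designed to expose.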

Keywords: complex systems, dual water distribution systems, long-term resilience performance, multi-agent modeling, sustainable and resilient water systems

Procedia PDF Downloads 284
27162 A Long Short-Term Memory Based Deep Learning Model for Corporate Bond Price Predictions

Authors: Vikrant Gupta, Amrit Goswami

Abstract:

The fixed income market forms the basis of the modern financial market: all other assets in financial markets derive their value from the bond market. Owing to its over-the-counter nature, the corporate bond market has relatively little data publicly available and is thus researched far less than equities. Bond price prediction is a complex financial time series forecasting problem and is considered crucial in the domain of finance. Bond prices are highly volatile and full of noise, which makes it very difficult for traditional statistical time-series models to capture the complexity of the series patterns, leading to inefficient forecasts. To overcome the inefficiencies of statistical models, various machine learning techniques were initially used in the literature for more accurate forecasting of time series. However, simple machine learning methods such as linear regression, support vector machines, and random forests fail to provide efficient results when tested on highly complex sequences such as stock prices and bond prices. Hence, to capture these intricate sequence patterns, various deep learning-based methodologies have been discussed in the literature. In this study, a recurrent neural network-based deep learning model using long short-term memory (LSTM) networks for the prediction of corporate bond prices is discussed. LSTMs have been widely used in the literature for various sequence learning tasks in domains such as machine translation and speech recognition. In recent years, various studies have discussed the effectiveness of LSTMs in forecasting complex time-series sequences and have shown promising results compared to other methodologies. LSTMs are a special kind of recurrent neural network capable of learning long-term dependencies thanks to their memory cells, which traditional neural networks lack. In this study, a simple LSTM, a stacked LSTM, and a masked LSTM-based model are discussed with respect to varying input sequences (three days, seven days, and 14 days). In order to facilitate faster learning and to gradually decompose the complexity of the bond price sequence, Empirical Mode Decomposition (EMD) has been used, which resulted in an accuracy improvement over the standalone LSTM model. With a variety of technical indicators and the EMD-decomposed time series, the masked LSTM outperformed the other two counterparts in terms of prediction accuracy. To benchmark the proposed model, the results have been compared with a traditional time series model (ARIMA), shallow neural networks, and the three different LSTM models discussed above. In summary, our results show that LSTM models provide more accurate results and should be explored further within the asset management industry.
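The varying input sequences (three-, seven-, and 14-day windows) correspond to a standard sliding-window framing of the price series into supervised learning samples. The sketch below prepares such windows in plain NumPy; the linear price series is a synthetic stand-in, and the LSTM models themselves are not reproduced here.

```python
import numpy as np

def make_windows(series, lookback):
    """Frame a 1-D series as (samples, lookback) LSTM inputs with next-step targets."""
    series = np.asarray(series, dtype=float)
    X = np.stack([series[i:i + lookback] for i in range(series.size - lookback)])
    y = series[lookback:]
    return X, y

prices = np.linspace(100.0, 110.0, 50)   # synthetic stand-in for a bond price series

# The three input-sequence lengths compared in the study.
windows = {lb: make_windows(prices, lb) for lb in (3, 7, 14)}
```

Each `X[i]` row holds `lookback` consecutive prices and `y[i]` is the price on the following day, which is exactly the shape (after adding a feature axis) that an LSTM layer consumes.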

Keywords: bond prices, long short-term memory, time series forecasting, empirical mode decomposition

Procedia PDF Downloads 134
27161 Dynamic Modelling and Assessment for Urban Growth and Transport in Riyadh City, Saudi Arabia

Authors: Majid Aldalbahi

Abstract:

In 2009, over 3.4 billion people in the world resided in urban areas as a result of rapid urban growth, a figure estimated to increase to 6.5 billion by 2050. This urban growth phenomenon has raised challenges for many countries in both the developing and developed worlds. Urban growth is a complicated process involving the spatiotemporal changes of all socio-economic and physical components at different scales. The socio-economic components of urban growth relate to urban population growth and economic growth, while the physical components relate to spatial expansion, land cover change, and land use change, which are the focus of this research. The interactions between these components are complex and non-linear. Several factors and forces cause these complex interactions, including transportation and communication, internal and international migration, public policies, and high natural growth rates of urban populations. Urban growth has positive and negative consequences. The positive effects relate to planned and orderly urban growth, while the negative effects relate to unplanned and scattered growth, which is called sprawl. Although urban growth is considered necessary for sustainable urbanization, uncontrolled and rapid growth causes various problems, including the consumption of precious rural land resources at the urban fringe, landscape alteration, traffic congestion, infrastructure pressure, and neighborhood conflicts. Traditional urban planning approaches in fast-growing cities cannot accommodate the negative consequences of rapid urban growth. Microsimulation programmes and modelling techniques are effective means to provide new urban development, management, and planning methods and approaches. This paper aims to use these techniques to understand and analyse the complex interactions for the case study of Riyadh, a fast-growing city in Saudi Arabia.

Keywords: policy implications, urban planning, traffic congestion, urban growth, Saudi Arabia, Riyadh

Procedia PDF Downloads 479
27160 Two-Dimensional Analysis and Numerical Simulation of the Navier-Stokes Equations for Principles of Turbulence around Isothermal Bodies Immersed in Incompressible Newtonian Fluids

Authors: Romulo D. C. Santos, Silvio M. A. Gama, Ramiro G. R. Camacho

Abstract:

In this paper, the thermo-fluid dynamics of mixed convection (natural and forced) and the principles of turbulent flow around complex geometries are studied. In these applications, it is necessary to analyze the interaction between the flow field and a heated immersed body held at constant surface temperature. The paper presents a study of two-dimensional incompressible Newtonian flow around an isothermal geometry using the immersed boundary method (IBM) with the virtual physical model (VPM). The numerical code proposed for all simulations satisfies the calculation of temperature under Dirichlet boundary conditions. Important dimensionless quantities are calculated: the Strouhal number, obtained using the fast Fourier transform (FFT), the Nusselt number, drag and lift coefficients, velocity, and pressure. Streamlines and isothermal lines are presented for each simulation, showing the flow dynamics and patterns. The Navier-Stokes and energy equations for mixed convection were discretized using the finite difference method in space and second-order Adams-Bashforth and fourth-order Runge-Kutta methods in time, with the fractional step method coupling the calculation of pressure, velocity, and temperature. For the simulation of turbulence, this work used the Smagorinsky and Spalart-Allmaras models. The first is based on the local equilibrium hypothesis for small scales and the Boussinesq hypothesis, such that the energy injected into the turbulence spectrum equals the energy dissipated by convective effects. The Spalart-Allmaras model uses only one transport equation, for the turbulent viscosity. The results were compared with numerical data, validating the treatment of heat transfer together with the turbulence models. The IBM/VPM is a powerful tool for simulating flow around complex geometries, and the results showed good numerical convergence with respect to the references adopted.
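The Strouhal-number step the abstract describes, extracting the vortex-shedding frequency from a time signal and forming St = fD/U, can be sketched as follows. The plain DFT, the synthetic lift-coefficient signal, and the body diameter and free-stream velocity are all illustrative assumptions, not the paper's data:

```python
import cmath
import math

def dominant_frequency(signal, dt):
    """Dominant frequency of a real signal via a plain (slow) DFT."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]  # drop the zero-frequency component
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        coeff = sum(centred[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))
        if abs(coeff) > best_mag:
            best_k, best_mag = k, abs(coeff)
    return best_k / (n * dt)            # bin index -> frequency in Hz

# Synthetic lift-coefficient history oscillating at 2 Hz (toy values).
dt, f_shed = 0.01, 2.0
cl = [math.sin(2 * math.pi * f_shed * i * dt) for i in range(200)]
f = dominant_frequency(cl, dt)
D, U = 0.1, 1.0                         # assumed diameter and free-stream speed
strouhal = f * D / U                    # St = f D / U
```

In practice an FFT routine replaces the O(n²) loop, but the pipeline is the same: record the lift (or transverse velocity) history, find its spectral peak, and nondimensionalize.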

Keywords: immersed boundary method, mixed convection, turbulence methods, virtual physical model

Procedia PDF Downloads 110
27159 Screening of Wheat Wild Relatives as a Gene Pool for Improved Photosynthesis in Wheat Breeding

Authors: Amanda J. Burridge, Keith J. Edwards, Paul A. Wilkinson, Tom Batstone, Erik H. Murchie, Lorna McAusland, Ana Elizabete Carmo-Silva, Ivan Jauregui, Tracy Lawson, Silvere R. M. Vialet-Chabrand

Abstract:

The rate of genetic progress in wheat production must be improved to meet global food security targets. However, past selection for domestication traits has reduced the genetic variation in modern wheat cultivars, a fact that could severely limit the future rate of genetic gain. The genetic variation in agronomically important traits of the wild relatives and progenitors of wheat is far greater than that of current domesticated cultivars, but transferring these traits into modern cultivars is not straightforward. Among elite wheat cultivars, photosynthetic capacity is a key trait for which there is limited variation. Early screening of wheat wild relatives and progenitors has shown differences in photosynthetic capacity and efficiency not only between wild relative species but also marked differences between accessions of each species. By identifying wild relative accessions with improved photosynthetic traits and characterising the genetic variation responsible, it is possible to incorporate these traits into advanced breeding programmes through wide crossing and introgression. To identify the variation in photosynthetic capacity and efficiency available in the secondary and tertiary gene pools, a wide-scale survey was carried out of over 600 accessions from 80 species, including those from the genera Aegilops, Triticum, Thinopyrum, Elymus, and Secale. Genotype data were generated for each accession using a ‘Wheat Wild Relative’ single nucleotide polymorphism (SNP) genotyping array composed of 35,000 SNP markers polymorphic between wild relatives and elite hexaploid wheat. These genotype data were combined with phenotypic measurements such as gas exchange (CO₂, H₂O), chlorophyll fluorescence, growth, morphology, and RuBisCO activity to identify potential breeding material with enhanced photosynthetic capacity and efficiency.
The data and associated analysis tools presented here will prove useful to anyone interested in increasing the genetic diversity in hexaploid wheat or the application of complex genotyping data to plant breeding.

Keywords: wheat, wild relatives, pre-breeding, genomics, photosynthesis

Procedia PDF Downloads 215
27158 Anomaly Detection Based Fuzzy K-Mode Clustering for Categorical Data

Authors: Murat Yazici

Abstract:

Anomalies are irregularities in data that do not adhere to a well-defined standard of normal behavior. The identification of outliers or anomalies in data has been a subject of study within statistics since the 1800s, and over time a variety of anomaly detection techniques have been developed in several research communities. Cluster analysis can be used to detect anomalies: it is the process of grouping data into clusters such that the points within a cluster are as similar as possible while the clusters themselves are dissimilar from one another. Many traditional clustering algorithms have limitations in dealing with data sets containing categorical attributes. To detect anomalies in categorical data, a fuzzy clustering approach can be used to advantage. The fuzzy k-modes (FKM) clustering algorithm, one of the fuzzy clustering approaches and an extension of the k-means algorithm, has been reported for clustering datasets with categorical values. It is a form of soft clustering in which each point can be associated with more than one cluster. In this paper, anomaly detection is performed on two simulated datasets using the FKM clustering algorithm. A significant feature of the study is that, in contrast to numerous anomaly detection algorithms, the FKM clustering algorithm determines anomalies together with their degree of abnormality. According to the results, the FKM clustering algorithm performed well in detecting anomalies in data containing both a single anomaly and multiple anomalies.
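A minimal sketch of the idea: fuzzy k-modes with simple-matching dissimilarity produces a membership degree for each point in each cluster, and a point whose maximum membership is low can be scored as anomalous. The toy categorical data, the initialization, and the "abnormality = 1 − max membership" scoring are assumptions for illustration, not the paper's datasets or exact procedure:

```python
def dissimilarity(x, mode):
    """Simple-matching distance: number of attributes that differ."""
    return sum(a != b for a, b in zip(x, mode))

def memberships(data, modes, m=2.0):
    """Fuzzy membership of each point in each cluster mode."""
    u = []
    for x in data:
        d = [dissimilarity(x, mode) for mode in modes]
        if 0 in d:                      # point coincides with a mode
            row = [1.0 if di == 0 else 0.0 for di in d]
        else:
            row = [1.0 / sum((d[j] / d[k]) ** (1.0 / (m - 1.0))
                             for k in range(len(modes)))
                   for j in range(len(modes))]
        u.append(row)
    return u

def update_modes(data, u, k, m=2.0):
    """New mode: per attribute, the category with the largest weighted support."""
    n_attr = len(data[0])
    modes = []
    for j in range(k):
        mode = []
        for a in range(n_attr):
            weight = {}
            for x, row in zip(data, u):
                weight[x[a]] = weight.get(x[a], 0.0) + row[j] ** m
            mode.append(max(weight, key=weight.get))
        modes.append(tuple(mode))
    return modes

def fuzzy_k_modes(data, modes, m=2.0, iters=10):
    for _ in range(iters):
        u = memberships(data, modes, m)
        modes = update_modes(data, u, len(modes), m)
    return memberships(data, modes, m), modes

# Toy categorical data: two tight groups plus one odd record (hypothetical).
data = [("red", "small", "round"), ("red", "small", "oval"),
        ("blue", "large", "round"), ("blue", "large", "oval"),
        ("green", "tiny", "flat")]            # the anomaly candidate
u, modes = fuzzy_k_modes(data, [data[0], data[2]])
abnormality = [1.0 - max(row) for row in u]   # low max membership = anomalous
```

The odd record ends up with equal, weak membership in both clusters, so its abnormality degree is the highest, which is exactly the kind of graded output the abstract contrasts with hard anomaly detectors.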

Keywords: fuzzy k-mode clustering, anomaly detection, noise, categorical data

Procedia PDF Downloads 48
27157 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper addresses the problem of building secure computational services for encrypted information in the Cloud, that is, computing on data without decrypting it. The approach meets the aspiration for a computational encryption model that enhances the security of big data with respect to the privacy (confidentiality), availability, and integrity of the data and the user's security. The cryptographic model applied for computation on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, with detailed theoretical mathematical concepts relating to fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographically secure algorithm.
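The core idea of computing on ciphertexts can be demonstrated with a toy *additively* homomorphic scheme (Paillier-style): multiplying two ciphertexts yields an encryption of the sum of the plaintexts. This is deliberately much weaker than the fully homomorphic scheme the paper concerns, and the tiny primes are insecure, for illustration only:

```python
import math
import random

# Toy Paillier cryptosystem (additively homomorphic), with g = n + 1.
p, q = 17, 19                      # tiny primes: insecure, illustrative only
n, n2 = p * q, (p * q) ** 2
lam = (p - 1) * (q - 1) // math.gcd(p - 1, q - 1)   # lcm(p-1, q-1)
mu = pow(lam, -1, n)               # valid decryption constant when g = n + 1

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:     # r must be invertible modulo n
        r = random.randrange(1, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    l = (pow(c, lam, n2) - 1) // n  # L(x) = (x - 1) / n
    return (l * mu) % n

# Homomorphic addition: multiply ciphertexts, decrypt to the plaintext sum.
a, b = 42, 100
c_sum = (encrypt(a) * encrypt(b)) % n2
```

A fully homomorphic scheme additionally supports multiplication of plaintexts under encryption (via bootstrapping, as in the keywords below), which is what enables arbitrary computation on encrypted big data.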

Keywords: big data analytics, security, privacy, bootstrapping, fully homomorphic encryption scheme

Procedia PDF Downloads 475
27156 Consumer Protection Law for Users of Mobile Commerce as a Global Effort to Improve Business in Indonesia

Authors: Rina Arum Prastyanti

Abstract:

Information technology has changed the way we transact and has enabled new opportunities in business transactions. Consumers of m-commerce face several problems: difficulty accessing full information about the products on offer and the forms of transaction, given small screens and limited storage capacity; the need to protect children from various forms of excessive supply and usage; errors in accessing and disseminating personal data; and more complex problems concerning agreements and dispute resolution mechanisms that can protect consumers and assure the security of personal data. No less important are the risks around payment and the personal information involved in payment, an issue that also demands a solution. The purposes of this study are (1) to describe the phenomenon of the use of mobile commerce in Indonesia, (2) to determine the form of legal protection for consumers using mobile commerce, and (3) to identify the type of law that can provide legal protection for consumers using mobile commerce. This is descriptive qualitative, normative legal research drawing on primary and secondary data sources, with data collected through library research and analysed using deductive techniques. Growing mobile technology, increasingly affordable prices, and low-rate competition among providers have increased the number of mobile users; Indonesia ranks fourth in the world in mobile phone users, with an estimated 250.1 million phones for a population of 237,556,363. Indonesian legal protection for the use of mobile commerce is still covered only as part of Law No. 11 of 2008 on Information and Electronic Transactions, and to date there is no rule of law that specifically regulates mobile commerce.
A legal protection model applicable to consumers of mobile commerce should ensure that consumers receive information about the potential security and privacy challenges they may face in m-commerce and the measures that can be used to limit the risk; encourage the development of security measures and built-in security features; encourage mobile operators to implement data security policies and measures to prevent unauthorized transactions; and provide methods of redress that are appropriate in both timing and effectiveness when consumers suffer financial loss.

Keywords: mobile commerce, legal protection, consumer, effectiveness

Procedia PDF Downloads 360
27155 Investigating Breakdowns in Human Robot Interaction: A Conversation Analysis Guided Single Case Study of a Human-Robot Communication in a Museum Environment

Authors: B. Arend, P. Sunnen, P. Caire

Abstract:

In a single case study, we show how a conversation analysis (CA) approach can shed light on the sequential unfolding of human-robot interaction. Relying on video data, we show that CA allows us to investigate the respective turn-taking systems of humans and a NAO robot in their dialogical dynamics, thus pointing out relevant differences. Our fine-grained video analysis identifies breakdowns, and how they are overcome, when humans and a NAO robot engage in multimodally uttered multi-party communication during a sports guessing game. Our findings suggest that interdisciplinary work opens up the opportunity to gain new insights into the challenging issues of human-robot communication, providing resources for developing mechanisms that enable complex human-robot interaction (HRI).

Keywords: human robot interaction, conversation analysis, dialogism, breakdown, museum

Procedia PDF Downloads 299
27154 A Non-Linear Eddy Viscosity Model for Turbulent Natural Convection in Geophysical Flows

Authors: J. P. Panda, K. Sasmal, H. V. Warrior

Abstract:

Eddy viscosity models in turbulence modeling can be classified as linear or nonlinear. Linear formulations are simple and require fewer computational resources, but they cannot predict the actual flow pattern in complex geophysical flows where streamline curvature and swirling motion are predominant. A constitutive equation for the Reynolds stress anisotropy is adopted for the formulation of the eddy viscosity, including all possible higher-order terms quadratic in the mean velocity gradients, and a simplified model is developed for actual oceanic flows in which only the vertical velocity gradients are important. The new model is incorporated into the one-dimensional General Ocean Turbulence Model (GOTM). Two realistic oceanic test cases (OWS Papa and FLEX '76) have been investigated. The new model's predictions match the observational data well and improve on the predictions of the two-equation k-epsilon model. The proposed model can easily be incorporated into the three-dimensional Princeton Ocean Model (POM) to simulate a wide range of oceanic processes. Practically, this model can be implemented in coastal regions, where transverse shear induces higher vorticity, and for the prediction of flow in estuaries and lakes, where the depth is comparatively small. The model's predictions of marine turbulence and related quantities (e.g., sea surface temperature, surface heat flux, and vertical temperature profile) can be utilized in short-term ocean and climate forecasting and warning systems.
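The structure of such a closure can be sketched for a purely vertically sheared flow: a linear (Boussinesq) Reynolds shear stress built from the eddy viscosity ν_t = c_μ k²/ε, plus a generic quadratic correction in the shear. The coefficients and the form of the quadratic term below are illustrative assumptions, not the paper's calibrated model:

```python
def reynolds_shear_stress(k, eps, dudz, c_mu=0.09, c_nl=0.01):
    """Linear eddy-viscosity stress plus an assumed second-order correction.

    k    : turbulent kinetic energy [m^2/s^2]
    eps  : dissipation rate [m^2/s^3]
    dudz : vertical shear of the mean horizontal velocity [1/s]
    """
    nu_t = c_mu * k ** 2 / eps                           # nu_t = c_mu k^2 / eps
    linear = -nu_t * dudz                                # Boussinesq part
    quadratic = c_nl * (k ** 3 / eps ** 2) * dudz ** 2   # example quadratic term
    return linear + quadratic

# Typical upper-ocean magnitudes (order-of-magnitude toy values).
stress = reynolds_shear_stress(k=1e-4, eps=1e-6, dudz=0.01)
```

The point of the nonlinear term is that the modeled stress is no longer strictly proportional to the shear, which is what lets such closures respond to curvature and swirl where linear models fail.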

Keywords: eddy viscosity, turbulence modeling, GOTM, CFD

Procedia PDF Downloads 196
27153 Simulation of Concrete Wall Subjected to Airblast by Developing an Elastoplastic Spring Model in Modelica Modelling Language

Authors: Leo Laine, Morgan Johansson

Abstract:

To meet civilization's future needs for safe living and a low environmental footprint, the engineers designing the complex systems of tomorrow will need efficient ways to model and optimize these systems for their intended purpose. For example, a civil defence shelter and its subsystem components need to withstand, e.g., airblast and ground shock from a design-level explosion detonating at a specified distance from the structure. In addition, the complex civil defence shelter needs functioning air filter systems to protect from toxic gases, and clean air, clean water, heat, and electricity need to be available through shock- and vibration-safe fixtures and connections. Similarly complex building systems can be found in any concentrated residential or office area. In this paper, the authors use a multidomain modelling language called Modelica to model a concrete wall as a single degree of freedom (SDOF) system with elastoplastic properties and an optional plastic hardening implementation. The elastoplastic model was developed and implemented in the open source tool OpenModelica. The simulation model was tested on a transient equivalent reflected pressure time history representing an airblast from 100 kg TNT detonating 15 meters from the wall; the concrete wall is idealized as a concrete strip of 1.0 m width. This load represents a realistic threat to any building in a city-like area. The OpenModelica results were compared with an Excel implementation of an SDOF model with an elastic-plastic spring using a simple fixed-timestep central difference solver. The structural displacement results agreed very well with each other in terms of plastic displacement magnitude, elastic oscillation displacement, and response times.
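The comparison model the abstract mentions, an elastic-plastic SDOF spring integrated with a fixed-timestep central difference scheme, can be sketched as follows. The mass, stiffness, yield force, and triangular pressure pulse are invented illustrative values, not the paper's 100 kg TNT load case, and the spring is taken as elastic-perfectly-plastic (no hardening):

```python
def sdof_elastoplastic(mass, k, f_yield, load, dt, steps):
    """Central-difference integration of an SDOF elastic-perfectly-plastic system."""
    u_prev, u = 0.0, 0.0          # displacements at t - dt and t
    u_plastic = 0.0               # accumulated plastic displacement
    history = []
    for i in range(steps):
        elastic = u - u_plastic
        spring = k * elastic
        if spring > f_yield:      # yielding in tension
            u_plastic += elastic - f_yield / k
            spring = f_yield
        elif spring < -f_yield:   # yielding in compression
            u_plastic += elastic + f_yield / k
            spring = -f_yield
        acc = (load(i * dt) - spring) / mass
        u_next = 2.0 * u - u_prev + acc * dt * dt   # central difference step
        u_prev, u = u, u_next
        history.append(u)
    return history, u_plastic

# Idealized triangular reflected-pressure pulse (assumed values).
peak, t_dur = 2.0e5, 0.01         # N and s

def pulse(t):
    return peak * (1.0 - t / t_dur) if t < t_dur else 0.0

hist, u_p = sdof_elastoplastic(mass=500.0, k=2.0e6, f_yield=2.0e4,
                               load=pulse, dt=1.0e-4, steps=2000)
```

With these toy numbers the pulse drives the spring past yield, so the response shows a permanent plastic offset with elastic oscillation about it, the two quantities the paper uses to compare its OpenModelica and Excel implementations.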

Keywords: airblast from explosives, elastoplastic spring model, Modelica modelling language, SDOF, structural response of concrete structure

Procedia PDF Downloads 126
27152 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with RMSE = 3.343.
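The pixel-value idea and the RMSE score can be sketched as follows: map each colour level on a rainfall map to a rainfall amount through a calibration table, then score the approximation against gauge observations. The lookup values and observations below are invented for illustration and are not the Thailand Meteorological Department's scale or the study's data:

```python
import math

# Hypothetical calibration: pixel colour level -> rainfall in mm/day.
calibration = {0: 0.0, 1: 5.0, 2: 15.0, 3: 35.0}

def approximate_rainfall(pixels):
    """Convert map pixel levels into rainfall amounts via the lookup table."""
    return [calibration[p] for p in pixels]

def rmse(predicted, observed):
    """Root mean square error between two equal-length series."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed))
                     / len(observed))

observed = [0.0, 6.0, 14.0, 33.0]      # toy gauge measurements
predicted = approximate_rainfall([0, 1, 2, 3])
error = rmse(predicted, observed)
```

The reported RMSE of 3.343 would be obtained the same way, with the real calibration and station data in place of the toy values.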

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 383
27151 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks built with the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions such as IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and zero-knowledge proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
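The "cryptographic hashing for robust data lineage" feature can be sketched as a hash chain: each record commits to the hash of its predecessor, so any tampering with an earlier record invalidates every hash that follows. The record fields and event names below are illustrative assumptions, not the platform's actual schema:

```python
import hashlib
import json

def add_record(chain, payload):
    """Append a record that commits to the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash; any mismatch means the lineage was tampered with."""
    prev = "0" * 64
    for rec in chain:
        body = {"payload": rec["payload"], "prev": rec["prev"]}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev"] != prev or rec["hash"] != expected:
            return False
        prev = rec["hash"]
    return True

chain = []
for event in ("ingest", "transform", "publish"):
    add_record(chain, event)
ok_before = verify(chain)
chain[1]["payload"] = "tampered"       # simulate tampering with the lineage
ok_after = verify(chain)
```

On the platform described, the chain heads would be anchored on the ledger, so verification is trustless; the in-memory list here stands in for that ledger.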

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 5
27150 Revolutionizing Project Management: A Comprehensive Review of Artificial Intelligence and Machine Learning Applications for Smarter Project Execution

Authors: Wenzheng Fu, Yue Fu, Zhijiang Dong, Yujian Fu

Abstract:

The integration of artificial intelligence (AI) and machine learning (ML) into project management is transforming how engineering projects are executed, monitored, and controlled. This paper provides a comprehensive survey of AI and ML applications in project management, systematically categorizing their use in key areas such as project data analytics, monitoring, tracking, scheduling, and reporting. As project management becomes increasingly data-driven, AI and ML offer powerful tools for improving decision-making, optimizing resource allocation, and predicting risks, leading to enhanced project outcomes. The review highlights recent research that demonstrates the ability of AI and ML to automate routine tasks, provide predictive insights, and support dynamic decision-making, which in turn increases project efficiency and reduces the likelihood of costly delays. This paper also examines the emerging trends and future opportunities in AI-driven project management, such as the growing emphasis on transparency, ethical governance, and data privacy concerns. The research suggests that AI and ML will continue to shape the future of project management by driving further automation and offering intelligent solutions for real-time project control. Additionally, the review underscores the need for ongoing innovation and the development of governance frameworks to ensure responsible AI deployment in project management. The significance of this review lies in its comprehensive analysis of AI and ML’s current contributions to project management, providing valuable insights for both researchers and practitioners. By offering a structured overview of AI applications across various project phases, this paper serves as a guide for the adoption of intelligent systems, helping organizations achieve greater efficiency, adaptability, and resilience in an increasingly complex project management landscape.

Keywords: artificial intelligence, decision support systems, machine learning, project management, resource optimization, risk prediction

Procedia PDF Downloads 10