Search results for: rules extraction
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3070

1240 BIM Data and Digital Twin Framework: Preserving the Past and Predicting the Future

Authors: Mazharuddin Syed Ahmed

Abstract:

This research presents the framework used to develop “Kahukura”, the Green Building certified Ara Polytechnic College of Architecture Studies building. This framework integrates the development of a smart building digital twin by utilizing Building Information Modelling (BIM) and its BIM maturity levels, including Levels of Development (LOD), the eight dimensions of BIM, Heritage-BIM (H-BIM) and Facility Management BIM (FM BIM). The research also outlines a structured approach to building performance analysis and integration with the circular economy, encapsulated within a five-level digital twin framework. Starting with Level 1, the Descriptive Twin provides a live, editable visual replica of the built asset, allowing for specific data inclusion and extraction. Advancing to Level 2, the Informative Twin integrates operational and sensory data, enhancing data verification and system integration. At Level 3, the Predictive Twin utilizes operational data to generate insights and proactive management suggestions. Progressing to Level 4, the Comprehensive Twin simulates future scenarios, enabling robust “what-if” analyses. Finally, Level 5, the Autonomous Twin, represents the pinnacle of digital twin evolution, capable of learning and autonomously acting on behalf of users.
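
The five maturity levels form a strictly ordered hierarchy, so they lend themselves to a typed representation. A minimal Python sketch of that hierarchy follows; the level names come from the abstract, while the class and function names are illustrative assumptions:

```python
from enum import IntEnum

class DigitalTwinLevel(IntEnum):
    DESCRIPTIVE = 1     # live, editable visual replica; data inclusion/extraction
    INFORMATIVE = 2     # adds operational and sensory data; verification, integration
    PREDICTIVE = 3      # operational data drives insights and proactive suggestions
    COMPREHENSIVE = 4   # simulates future scenarios for "what-if" analyses
    AUTONOMOUS = 5      # learns and acts autonomously on behalf of users

def supports_what_if(level: DigitalTwinLevel) -> bool:
    """What-if simulation becomes available from the Comprehensive Twin upward."""
    return level >= DigitalTwinLevel.COMPREHENSIVE

print([lvl.name for lvl in DigitalTwinLevel if supports_what_if(lvl)])
```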

Keywords: building information modelling, circular economy integration, digital twin, predictive analytics

Procedia PDF Downloads 41
1239 Characterization of Porosity and Flow in Solid Oxide Fuel Cell with 3D Focused Ion Beam Serial Slicing

Authors: Daniel Phifer, Anna Prokhodtseva

Abstract:

DualBeam (FIB-SEM) has long been the technology of choice to sub-sample and characterize materials at site-specific locations which are difficult or impossible to extract by conventional embedding/polishing methods. Whereas Ga-based FIB provides excellent resolution and enables precise material removal, its current is usually limited and only allows the extraction of small material biopsies, typically 5–70 µm wide. Xe Plasma FIB, by contrast, has around 38x more current and can remove more material in the same time, extracting significantly sized chunks (100–1000 µm) of material for further analysis. This increased volume has enabled previously time-prohibitive investigations such as large-grain 3D serial sectioning, EBSD, and micro-machining for micro-mechanical testing. Investigation of the pore spaces with 3D modeling can determine the relative characteristics of the materials to help design or select properties for best function. Pore spaces can be described by a tortuosity number, which is calculated by modules in the 3D analysis software. Xe Plasma FIB technology provides a workflow with sufficient volume to characterize porosity when both large-volume 3D materials characterization and nanometer resolution are required to understand the system.
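
Geometric tortuosity is the ratio of the along-path (geodesic) length of a pore channel to the straight-line distance between its ends; dedicated 3D analysis software computes this from the segmented volume, but the underlying calculation is simple. A minimal sketch, assuming the pore centreline has already been extracted as voxel coordinates:

```python
import numpy as np

def tortuosity(path: np.ndarray) -> float:
    """Geometric tortuosity of a pore pathway.

    path: (N, 3) array of voxel-centre coordinates along a pore channel,
    e.g. traced from a segmented FIB-SEM image stack.
    Tortuosity = geodesic (along-path) length / straight-line distance.
    """
    steps = np.diff(path, axis=0)
    geodesic = np.linalg.norm(steps, axis=1).sum()
    euclidean = np.linalg.norm(path[-1] - path[0])
    return geodesic / euclidean

# Example: a gently zig-zagging channel through a 3D volume
path = np.array([[0, 0, 0], [1, 1, 0], [2, 0, 1], [3, 1, 1], [4, 0, 2]], float)
print(f"tortuosity = {tortuosity(path):.3f}")  # > 1 for any non-straight path
```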

Keywords: dual-beam, FIB-SEM, porosity, SOFC, solid oxide fuel cell

Procedia PDF Downloads 205
1238 d-Block Metal Nanoparticles Confined in Triphenylphosphine Oxide Functionalized Core-Crosslinked Micelles for the Application in Biphasic Hydrogenation

Authors: C. Joseph Abou-Fayssal, K. Philippot, R. Poli, E. Manoury, A. Riisager

Abstract:

The use of soluble polymer-supported metal nanoparticles (MNPs) has received significant attention for the ease of catalyst recovery and recycling. Of particular interest are MNPs supported on polymers that are either soluble or form stable colloidal dispersions in water, as this allows combining the advantages of the aqueous biphasic protocol with the catalytic performance of MNPs. The objective is to achieve good confinement of the catalyst in the nanoreactor cores and thus better catalyst recovery, in order to overcome the previously witnessed MNP extraction. Inspired by previous results, we are interested in the design of polymeric nanoreactors functionalized with ligands able to solidly anchor metallic nanoparticles in order to control the activity and selectivity of the developed nanocatalysts. The nanoreactors are core-crosslinked micelles (CCM) synthesized by reversible addition-fragmentation chain transfer (RAFT) polymerization. Varying the nature of the core-linked functionalities allows us to obtain differently stabilized metal nanoparticles and thus compare their performance in the catalyzed aqueous biphasic hydrogenation of model substrates. Particular attention is given to catalyst recyclability.

Keywords: biphasic catalysis, metal nanoparticles, polymeric nanoreactors, catalyst recovery, RAFT polymerization

Procedia PDF Downloads 99
1237 The Mechanism of Design and Analysis Modeling of Performance of Variable Speed Wind Turbine and Dynamical Control of Wind Turbine Power

Authors: Mohammadreza Heydariazad

Abstract:

Growing the productivity of wind energy as a clean source requires improved strategies for the production, transmission, and management of wind resources, in order to increase power quality and reduce costs. New technologies based on power converters, which adjust turbine speed to suit the wind speed striking the turbine, improve the efficiency of power extraction from the wind. This article introduces variable-speed wind turbines and power optimization, presents methods for using a superconducting inductor within the power converter, proposes DC measurement for the wind farm, and considers the techniques available for these systems. It reviews the mechanisms and operation of turbine speed changes under the speed-control strategies of various wind turbine types, and examines possible AC power transmission from the production site to a location suitable for a strong grid connection integrating the wind farm generators, without additional cost or equipment. It also covers the main objectives of dynamic wind turbine control and the associated methods of operation. An effective power-control algorithm is presented that extracts maximum active power while maintaining the power factor at the desired value.
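
The physics behind variable-speed operation is the cubic wind power law: the turbine captures P = ½·ρ·A·Cp·v³, and the power coefficient Cp peaks only at a particular tip-speed ratio, which is why converter-based speed control improves extraction. A minimal sketch; the rotor size, air density, and Cp value are illustrative assumptions, not parameters from the paper:

```python
import math

RHO = 1.225          # air density, kg/m^3 (assumed sea-level value)
ROTOR_RADIUS = 40.0  # m (illustrative turbine, not from the paper)
CP_MAX = 0.45        # assumed peak power coefficient (Betz limit ~ 0.593)

def extracted_power(v_wind: float, cp: float = CP_MAX) -> float:
    """Active power captured from the wind, P = 0.5 * rho * A * Cp * v^3 (W)."""
    area = math.pi * ROTOR_RADIUS ** 2
    return 0.5 * RHO * area * cp * v_wind ** 3

# A variable-speed turbine adjusts rotor speed to hold the tip-speed ratio
# where Cp stays near CP_MAX; a fixed-speed machine falls off this optimum.
for v in (5, 8, 12):
    print(f"v = {v:2d} m/s -> P = {extracted_power(v) / 1e6:.2f} MW")
```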

Keywords: wind energy, generator, superconducting inductor, wind turbine power

Procedia PDF Downloads 325
1236 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions

Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib

Abstract:

Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Encrypting large volumes of messages in full results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions using a set of classification algorithms; the classified XML documents are then processed using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to determine the importance level value within each XML document. Classified content is processed using element-wise encryption for selected parts with “High”, “Medium” or “Low” importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modification of AES to overcome the problem of computational overhead, in which the SubBytes and ShiftRows steps remain as in the original AES, while the MixColumns operation is replaced by a 128-permutation operation followed by the AddRoundKey operation. An implementation has been conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in the processing time for encrypting XML documents.
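
The selective, element-wise step can be illustrated independently of the authors' modified cipher. The sketch below uses standard AES-GCM from the `cryptography` package rather than the paper's modified AES, and the transaction tags and importance map are invented for the example:

```python
# pip install cryptography
import os
import xml.etree.ElementTree as ET
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=128)

def encrypt_by_importance(xml_text: str, importance: dict[str, str]) -> str:
    """Encrypt only the classified elements of an XML transaction.

    importance: maps element tag -> "High" | "Medium" | "Low"; tags absent
    from the map stay in plaintext, mirroring element-wise encryption.
    """
    aesgcm = AESGCM(KEY)
    root = ET.fromstring(xml_text)
    for elem in root.iter():
        if importance.get(elem.tag) in {"High", "Medium", "Low"} and elem.text:
            nonce = os.urandom(12)
            ct = aesgcm.encrypt(nonce, elem.text.encode(), None)
            elem.text = (nonce + ct).hex()        # store nonce || ciphertext
            elem.set("enc", importance[elem.tag])
    return ET.tostring(root, encoding="unicode")

doc = "<txn><account>1234-5678</account><amount>950</amount><memo>rent</memo></txn>"
print(encrypt_by_importance(doc, {"account": "High", "amount": "Medium"}))
```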

Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption

Procedia PDF Downloads 408
1235 The Effect of Four Local Plant Extracts on the Control of Rice Weevil, Sitophilus oryzae L.

Authors: Banaz Sdiq Abdulla

Abstract:

Four local plant species (Allium sativum, Capsicum annuum, Anethum graveolens, and Ocimum basilicum) were evaluated in the laboratory of the Biology Department, College of Education, for their ability to protect stored rice from infestation by the weevil Sitophilus oryzae. Aqueous extracts of the plant species were applied as a direct admixture at three concentration levels of 1%, 2.5%, and 5% (w/v) to assess mortality, adult emergence, repellency, and weight losses. The results showed that the Allium sativum extract was the most effective, giving the highest mortality (90%) at the 5% concentration, followed by Capsicum annuum (80%), on the 4th day post-treatment. The plant extracts at different concentrations exhibited different levels of reduction in adult emergence and different repellency of Sitophilus oryzae adults. Allium sativum recorded the lowest mean number of emerging adults (8), followed by Capsicum annuum (10), at the 5% concentration, while Capsicum annuum proved to be a complete repellent (100% repellency) at the 6th hour against Sitophilus oryzae, followed by Allium sativum and Anethum graveolens (81.8%). There was a significant (P<0.05) reduction in the weight lost to the weevils, with less damage recorded on grain treated with Allium sativum (1.6%) and Capsicum annuum (2.3%), respectively.

Keywords: plant extraction, rice, protectant, pest

Procedia PDF Downloads 429
1234 Hybrid Structure Learning Approach for Assessing the Phosphate Laundries Impact

Authors: Emna Benmohamed, Hela Ltifi, Mounir Ben Ayed

Abstract:

The Bayesian Network (BN) is one of the most efficient classification methods. It is widely used in several fields (e.g., medical diagnostics, risk analysis, bioinformatics research). A BN is a probabilistic graphical model that provides a formalism for reasoning under uncertainty. This classification method performs well in extracting new knowledge from data. The construction of the model consists of two phases: structure learning and parameter learning. For structure learning, the K2 algorithm is one of the representative data-driven algorithms, based on a score-and-search approach. In addition, integrating expert knowledge into the structure learning process allows higher accuracy to be obtained. In this paper, we propose a hybrid approach combining an improvement of the K2 algorithm, called K2 for Parents and Children search (K2PC), with an expert-driven method for learning the structure of the BN. The evaluation of the experimental results, using well-known benchmarks, shows that our K2PC algorithm performs better in terms of correct structure detection. The real application of our model shows its efficiency in analyzing the impact of phosphate laundry effluents on the watershed in the Gafsa area (southwestern Tunisia).
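
K2-style structure learning scores a candidate parent set for each node and greedily keeps additions that raise the score. A minimal sketch of the Cooper-Herskovits (K2) log-score that such a search maximizes; this is the generic formulation, not the K2PC variant proposed in the paper:

```python
import numpy as np
from math import lgamma

def k2_log_score(child: np.ndarray, parents: np.ndarray, r: int) -> float:
    """Log Cooper-Herskovits (K2) score of one node given its parent columns.

    child:   length-N integer array of the node's states in {0..r-1}
    parents: (N, p) integer array of parent states (p may be 0)
    """
    score = 0.0
    for cfg in {tuple(row) for row in parents}:   # each observed parent config
        mask = np.all(parents == cfg, axis=1)
        counts = np.bincount(child[mask], minlength=r)
        score += lgamma(r) - lgamma(counts.sum() + r)  # log (r-1)!/(N_ij+r-1)!
        score += sum(lgamma(c + 1) for c in counts)    # + sum_k log N_ijk!
    return score

rng = np.random.default_rng(0)
a = rng.integers(0, 2, 500)
b = np.where(rng.random(500) < 0.9, a, 1 - a)          # b copies a 90% of the time
no_parent = np.zeros((500, 0), dtype=int)
# adding the true parent raises the score, so K2 would keep it
print(k2_log_score(b, no_parent, 2) < k2_log_score(b, a.reshape(-1, 1), 2))
```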

Keywords: Bayesian network, classification, expert knowledge, structure learning, surface water analysis

Procedia PDF Downloads 127
1233 High-Accuracy Satellite Image Analysis and Rapid DSM Extraction for Urban Environment Evaluations (Tripoli-Libya)

Authors: Abdunaser Abduelmula, Maria Luisa M. Bastos, José A. Gonçalves

Abstract:

The modeling of the Earth's surface and the evaluation of urban environments with 3D models are important research topics. New stereo capabilities of high-resolution optical satellite images, such as the tri-stereo mode of Pleiades, combined with new image matching algorithms, are now available and can be applied in urban area analysis. In addition, photogrammetry software packages have gained more efficient matching algorithms, such as semi-global matching (SGM), as well as improved filters to deal with shadow areas, which achieve denser and more precise results. This paper describes a comparison between 3D data extracted from tri-stereo and dual-stereo satellite images, combined with pixel-based matching and the Wallis filter. The aim was to improve the accuracy of 3D models, especially in urban areas, in order to assess whether satellite images are appropriate for a rapid evaluation of urban environments. The results showed that the 3D models achieved from Pleiades tri-stereo outperformed, in both accuracy and detail, the results obtained from a GeoEye pair. The assessment was made with reference digital surface models derived from high-resolution aerial photography. This could mean that tri-stereo images can be successfully used for the proposed urban change analyses.
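
Accuracy assessment against a reference surface of this kind usually reduces to robust statistics on the per-pixel height differences. A minimal sketch, assuming the two DSMs are already co-registered on the same grid (the data below are synthetic):

```python
import numpy as np

def dsm_accuracy(dsm: np.ndarray, reference: np.ndarray) -> dict[str, float]:
    """Vertical accuracy of an extracted DSM against a reference surface.

    Both inputs are co-registered 2D height grids (metres); NaNs mark voids
    (e.g. shadow or occluded areas) and are excluded from the statistics.
    """
    diff = dsm - reference
    valid = diff[np.isfinite(diff)]
    return {
        "mean_error": float(valid.mean()),            # systematic bias
        "rmse": float(np.sqrt((valid ** 2).mean())),  # overall accuracy
        "nmad": float(1.4826 * np.median(np.abs(valid - np.median(valid)))),
    }

# toy example: tri-stereo DSM vs aerial-photogrammetry reference
ref = np.random.default_rng(1).uniform(10, 40, (100, 100))
dsm = ref + np.random.default_rng(2).normal(0.1, 0.5, ref.shape)
print(dsm_accuracy(dsm, ref))
```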

Keywords: 3D models, environment, matching, pleiades

Procedia PDF Downloads 328
1232 Effective Solvents for Proteins Recovery from Microalgae

Authors: Win Nee Phong, Tau Chuan Ling, Pau Loke Show

Abstract:

From an industrial perspective, the exploitation of microalgae as a protein source is of great economic and commercial interest due to numerous attractive characteristics. Nonetheless, the release of protein from microalgae is limited by the multiple layers of the rigid, thick cell wall, which generally contains a large proportion of cellulose. Thus, an efficient cell disruption process is required to rupture the cell wall. The conventional downstream processing methods, which typically involve several unit operations such as disruption, isolation, extraction, concentration and purification, are energy-intensive and costly. To reduce the overall cost and establish a feasible technology for successful large-scale production, the microalgal industry today demands a more cost-effective and eco-friendly technique in downstream processing. One of the main challenges in extracting proteins from microalgae is the presence of this rigid cell wall. This study aims to provide some guidance on selecting an efficient solvent to facilitate protein release during the cell disruption process. The effects of solvent type (methanol, ethanol, 1-propanol and water) on rupturing the microalgal cell wall were studied. Interestingly, water was the most effective solvent for recovering proteins from microalgae, and it is also the cheapest of all the solvents tested.

Keywords: green, microalgae, protein, solvents

Procedia PDF Downloads 257
1231 Creativity in Industrial Design as an Instrument for the Achievement of the Proper and Necessary Balance between Intuition and Reason, Design and Science

Authors: Juan Carlos Quiñones

Abstract:

Time has passed since Victor Papanek charged that industrial design had “put murder on a mass-production basis.” Industrial design applies methods from different disciplines with a strategic approach, to place humans at the center of the design process and to deliver solutions that are meaningful and desirable for users and for the market. This analysis summarizes some of the discussions that occurred at the 6th International Forum of Design as a Process (Valencia, June 2016). The aims of this conference were to find new linkages between systems and design interactions in order to define their social consequences. Through knowledge management, we are able to transform the intangible by using design as a transforming function capable of converting intangible knowledge into tangible solutions (i.e., products and services demanded by society). Industrial designers use knowledge consciously as a starting point for the ideation of the product. The handling of the intangible becomes more and more relevant over time as different methods emerge for knowledge extraction and subsequent organization. The different methodologies applied to industrial design, and the evolution of the discipline's methods, underpin cultural and scientific background knowledge as the starting point of thought in response to needs, all channelled through the instrument of creativity for the achievement of the proper and necessary balance between intuition and reason, design and science.

Keywords: creative process, creativity, industrial design, intangible

Procedia PDF Downloads 286
1230 Unlocking Green Hydrogen Potential: A Machine Learning-Based Assessment

Authors: Said Alshukri, Mazhar Hussain Malik

Abstract:

Green hydrogen is hydrogen produced using renewable energy sources. In the last few years, Oman has aimed to reduce its dependency on fossil fuels. Recently, the hydrogen economy has become a global trend, and many countries have started to investigate the feasibility of implementing this sector. Oman created an alliance to establish the policy and rules for this sector. With motivation coming from both global and local interest in green hydrogen, this paper investigates the potential of producing hydrogen from wind and solar energy at three different locations in Oman, namely Duqm, Salalah, and Sohar. Using the machine learning-based software “WEKA” and local meteorological data, the project was designed to determine which location has the highest wind and solar energy potential. First, various supervised models were tested for prediction accuracy, and the Random Forest (RF) model was found to have the best prediction performance. The RF model was applied to 2021 meteorological data for each location, and the results indicated that Duqm has the highest wind and solar energy potential. A system of one wind turbine in Duqm can produce 8335 MWh/year, which could be utilized in the water electrolysis process to produce 88847 kg of hydrogen, while a solar system consisting of 2820 solar cells is estimated to produce 1666.223 MWh/year, capable of producing 177591 kg of hydrogen.
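
The energy-to-hydrogen conversion behind figures like these is a single division by the electrolyser's specific energy consumption. The abstract does not state the conversion factor used, so the value below is an assumed, typical one (note that the abstract's own wind and solar figures imply different factors):

```python
def hydrogen_yield_kg(annual_energy_mwh: float, sec_kwh_per_kg: float = 55.0) -> float:
    """Hydrogen mass produced by electrolysis from a given annual energy budget.

    sec_kwh_per_kg is the electrolyser's specific energy consumption; ~55 kWh/kg
    is an assumed, typical value for alkaline/PEM systems including losses.
    The paper's own conversion factor is not stated in the abstract.
    """
    return annual_energy_mwh * 1000.0 / sec_kwh_per_kg

# Duqm figures from the abstract: 8335 MWh/year wind, 1666.223 MWh/year solar
for source, energy in (("wind", 8335.0), ("solar", 1666.223)):
    print(f"{source}: {hydrogen_yield_kg(energy):,.0f} kg H2/year")
```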

Keywords: green hydrogen, machine learning, wind and solar energies, WEKA, supervised models, random forest

Procedia PDF Downloads 78
1229 The Posthuman Condition and a Translational Ethics of Entanglement

Authors: Shabnam Naderi

Abstract:

Traditional understandings of ethics considered translators, translations, technologies and other agents as separate, and prioritized human agents. In fact, ethics was equated with morality. This disengaged understanding of ethics is superseded by an ethics of relation/entanglement in posthuman philosophy. According to this ethics of entanglement, human and nonhuman agents are in constant ‘intra-action’. The human is not separate from nature, from technology, or from other nonhuman entities, and an ethics of translation in this regard cannot be separated from technology and ecology or be defined merely within the realm of the human-human encounter. As such, a posthuman ethics offers opportunities for change and responds to the changing nature of reality; it is negotiable and reveals itself as a moment-by-moment practice (i.e., as temporally emergent and beyond determinacy and permanence). Far from linguistic, cultural, or individual concerns, a posthuman translational ethics discusses how formerly rigid norms and laws are challenged in a process ontology which puts emphasis on activity and activation and considers ethics as surfacing in activity, not as a predefined set of rules and values. In this sense, traditional ethical principles like faithfulness, accuracy and representation are superseded by principles of privacy, sustainability, multiplicity and decentralization. The present conceptual study, drawing on Ferrando’s philosophical posthumanism (as a post-humanism, a post-dualism and a post-anthropocentrism), Deleuze-Guattarian philosophy of immanence, and Barad’s physics-philosophy, strives to destabilize traditional understandings of translation ethics and bring into the picture an ethics that has loose ends and revolves around multiplicity and decentralization.

Keywords: ethics of entanglement, post-anthropocentrism, post-dualism, post-humanism, translation

Procedia PDF Downloads 74
1228 Portfolio Optimization with Reward-Risk Ratio Measure Based on the Mean Absolute Deviation

Authors: Wlodzimierz Ogryczak, Michal Przyluski, Tomasz Sliwinski

Abstract:

In problems of portfolio selection, the reward-risk ratio criterion is optimized to search for a risky portfolio offering the maximum increase of mean return per unit of risk-measure increase, relative to risk-free investments. In the classical model, following Markowitz, risk is measured by the variance, thus representing Sharpe ratio optimization and leading to quadratic optimization problems. Several Linear Programming (LP) computable risk measures have been introduced and applied in portfolio optimization. In particular, the Mean Absolute Deviation (MAD) measure has been widely recognized. The reward-risk ratio optimization with the MAD measure can be transformed into an LP formulation with the number of constraints proportional to the number of scenarios and the number of variables proportional to the total of the number of scenarios and the number of instruments. This may lead to LP models with a huge number of variables and constraints for real-life financial decisions based on several thousand scenarios, decreasing their computational efficiency and making them hardly solvable by general LP tools. We show that the computational efficiency can then be dramatically improved by an alternative model based on the inverse risk-reward ratio minimization and by taking advantage of LP duality. In the introduced LP model, the number of structural constraints is proportional to the number of instruments, so the simplex method's efficiency is not seriously affected by the number of scenarios, thereby guaranteeing easy solvability. Moreover, we show that under a natural restriction on the target value, the MAD risk-reward ratio optimization is consistent with second-order stochastic dominance rules.
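
The scenario LP that the paper starts from can be written down directly. The sketch below solves the basic MAD minimization at a target mean return with scipy; it is the straightforward formulation whose constraint count grows with the number of scenarios, not the paper's improved inverse-ratio dual model:

```python
import numpy as np
from scipy.optimize import linprog

def mad_portfolio(returns: np.ndarray, target: float) -> np.ndarray:
    """Minimise Mean Absolute Deviation subject to a target mean return.

    returns: (T, n) scenario return matrix. Variables are the n weights x
    followed by T deviation variables d_t >= |(r_t - mu) @ x|.
    """
    T, n = returns.shape
    mu = returns.mean(axis=0)
    centred = returns - mu
    c = np.concatenate([np.zeros(n), np.full(T, 1.0 / T)])   # objective: mean d
    # d_t >= +(r_t - mu) @ x  and  d_t >= -(r_t - mu) @ x, written as <= 0 rows
    A_ub = np.block([[ centred, -np.eye(T)],
                     [-centred, -np.eye(T)]])
    b_ub = np.zeros(2 * T)
    # mean return at least target:  -mu @ x <= -target
    A_ub = np.vstack([A_ub, np.concatenate([-mu, np.zeros(T)])])
    b_ub = np.append(b_ub, -target)
    A_eq = np.concatenate([np.ones(n), np.zeros(T)]).reshape(1, -1)  # fully invested
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + T))
    return res.x[:n]

rng = np.random.default_rng(7)
scenarios = rng.normal(0.005, 0.02, (500, 4))    # 500 scenarios, 4 instruments
print(mad_portfolio(scenarios, target=0.003).round(3))
```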

Keywords: portfolio optimization, reward-risk ratio, mean absolute deviation, linear programming

Procedia PDF Downloads 405
1227 Seismic Interpretation and Petrophysical Evaluation of SM Field, Libya

Authors: Abdalla Abdelnabi, Yousf Abushalah

Abstract:

The G Formation is a major gas-producing reservoir in the SM Field, eastern Libya. It is called the G limestone because it consists of shallow marine limestone. Well data and 3D seismic, in conjunction with the results of a previous study, were used to delineate the hydrocarbon reservoir of the Middle Eocene G Formation in the SM Field area. The data include three-dimensional seismic data acquired in 2009, covering an area of approximately 75 mi², with more than 9 wells penetrating the reservoir. Seismic data are used to identify stratigraphic and structural features, such as channels and faults, which may play a significant role in hydrocarbon trapping. The well data were used for the petrophysical analysis of the SM Field. The average porosity of the Middle Eocene G Formation is very good, reaching 24%, especially around well W6. Average water saturation was calculated for each well from porosity and resistivity logs using Archie’s formula; the average water saturation for the whole field is 25%. Structural mapping of the top and bottom of the Middle Eocene G Formation revealed that the highest area in the SM Field is at 4800 ft subsea, around wells W4, W5, W6, and W7, and the deepest point is at 4950 ft subsea. Correlation between wells using well data and structural maps created from seismic data revealed that the net thickness of the G Formation ranges from 0 ft in the northern part of the field to 235 ft in the southwestern and southern parts. The gas-water contact was found at 4860 ft using the resistivity log. The net isopach map, using both the trapezoidal and pyramidal rules, was used to calculate the total bulk volume. The original gas in place and the recoverable gas were calculated volumetrically to be 890 and 630 Billion Standard Cubic Feet (BSCF), respectively.
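
Archie's formula, used here for the water saturation estimates, is a one-line calculation once the log readings and rock exponents are known. A minimal sketch; the default exponents and the example log values are generic assumptions, not values reported in the paper:

```python
def archie_sw(rw: float, rt: float, phi: float,
              a: float = 1.0, m: float = 2.0, n: float = 2.0) -> float:
    """Water saturation from Archie's equation: Sw = (a*Rw / (phi^m * Rt))^(1/n).

    rw: formation-water resistivity (ohm-m); rt: true formation resistivity
    (ohm-m); phi: porosity (fraction). a, m, n are the usual tortuosity,
    cementation and saturation exponents (defaults are generic assumptions).
    """
    return (a * rw / (phi ** m * rt)) ** (1.0 / n)

# illustrative log reading at 24% porosity (hypothetical resistivities)
print(f"Sw = {archie_sw(rw=0.03, rt=2.0, phi=0.24):.2f}")
```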

Keywords: 3D seismic data, well logging, petrel, kingdom suite

Procedia PDF Downloads 148
1226 Effect of Organic Fertilizers on the Improvement of Soil Microbiological Functioning under Saline Conditions of Arid Regions: Impact on Carbon and Nitrogen Mineralization

Authors: Oustani Mabrouka, Halilat Md Tahar, Hannachi Slimane

Abstract:

This study was conducted on representative and contrasting soils of arid regions. It focuses on the comparative influence of two organic fertilizers, poultry manure (PM) and bovine manure (BM), on improving the microbial functioning of non-saline (SS) and saline (SSS) soils, in particular the mineralization of nitrogen and carbon. The microbiological activity was estimated by a respirometric test (CO2–C emissions) and the extraction of two forms of mineral nitrogen (NH4+-N and NO3--N). After 56 days of incubation under controlled conditions (28 °C and 80% of field capacity), the two types of manure showed that mineralization activity varies according to the soil type and the organic substrate itself. The highest cumulative quantities of CO2–C, NH4+–N and NO3-–N obtained at the end of incubation were recorded in the non-saline (SS) soil treated with poultry manure, with 1173.4, 4.26 and 8.40 mg/100 g of dry soil, respectively. The reductions in the rates of CO2–C release and of nitrification under saline conditions were 21% and 36.78%, respectively. The organic substrates showed a stimulating effect on the density of all microbial groups studied. Overall, the results show the usefulness of both types of manure for improving the microbiological functioning of arid soils.

Keywords: salinity, organic matter, microorganisms, mineralization, nitrogen, carbon, arid regions

Procedia PDF Downloads 278
1225 Statistical Discrimination of Blue Ballpoint Pen Inks by Diamond Attenuated Total Reflectance (ATR) FTIR

Authors: Mohamed Izzharif Abdul Halim, Niamh Nic Daeid

Abstract:

Determining the source of pen inks used on a variety of documents is an important task for forensic document examiners. The examination of inks is often performed to differentiate between inks in order to evaluate the authenticity of a document. A ballpoint pen ink consists of synthetic dyes (acidic and/or basic), pigments (organic and/or inorganic) and a range of additives. Inks of similar color may differ in composition and are frequently the subject of forensic examination. This study emphasizes blue ballpoint pen inks available on the market, because approximately 80% of questioned-document analyses reportedly involve ballpoint pen ink. Analytical techniques such as thin layer chromatography, high-performance liquid chromatography, UV-vis spectroscopy, luminescence spectroscopy and infrared spectroscopy have been used in the analysis of ink samples. In this study, the application of Diamond Attenuated Total Reflectance (ATR) FTIR is straightforward and preferable in forensic science, as it requires no sample preparation and minimal analysis time. The data obtained from these techniques were further analyzed using multivariate chemometric methods, which enable the extraction of more information based on the similarities and differences among samples in a dataset. The results indicated that some pens from the same manufacturer can be similar in composition; however, distinct types can differ significantly.
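
Principal component analysis is the workhorse of such multivariate chemometric comparisons: spectra are projected onto a few components, and inks of the same formulation cluster together. A minimal sketch on synthetic "spectra" (real inputs would be the ATR-FTIR absorbance vectors):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
ink_a, ink_b = rng.random(600), rng.random(600)      # two ink "signatures"
# five replicate measurements of each formulation, with instrumental noise
spectra = np.vstack([ink_a + rng.normal(0, 0.02, 600) for _ in range(5)] +
                    [ink_b + rng.normal(0, 0.02, 600) for _ in range(5)])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
print(scores.round(2))   # replicates of each formulation cluster along PC1
```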

Keywords: ATR FTIR, ballpoint, multivariate chemometric, PCA

Procedia PDF Downloads 455
1224 Symmetric Key Encryption Algorithm Using Indian Traditional Musical Scale for Information Security

Authors: Aishwarya Talapuru, Sri Silpa Padmanabhuni, B. Jyoshna

Abstract:

Cryptography helps prevent threats to information security by providing various algorithms. This study introduces a new symmetric key encryption algorithm for information security which is linked with “raagas”, the scales and note patterns of Indian traditional music. The algorithm takes the plain text as input and starts its encryption process. It then randomly selects a raaga from a list of raagas assumed to be held by both the sender and the receiver. The plain text is associated with the selected raaga, and an intermediate cipher text is formed as the algorithm converts the plain text characters into other characters, according to the rules of the algorithm. This intermediate cipher text is arranged in various patterns across three different rounds of encryption. The total number of rounds in the algorithm is always a multiple of 3. To be more specific, the output of the first sequence of three rounds is passed again as input to the same sequence of rounds, recursively, until the total number of rounds of encryption has been performed. The raaga selected by the algorithm and the number of rounds performed are specified at an arbitrary location in the key, in addition to important information regarding the rounds of encryption embedded in the key, which is known to the sender and interpreted only by the receiver, thereby making the algorithm hack-proof. The key can be constructed of any number of bits, without any restriction on size. A software application has also been developed to demonstrate this process of encryption, which dynamically takes the plain text as input and readily generates the cipher text as output.
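
The abstract does not publish the actual conversion rules, so the sketch below is only a toy illustration of the general idea: deriving a per-character substitution from a raaga's note pattern and repeating rounds in multiples of three. Every mapping in it (the raaga list, the swara offsets, the round rule) is invented for the example:

```python
# Toy illustration only -- NOT the authors' algorithm.
RAAGAS = {"mohanam": "S R2 G3 P D2",
          "hamsadhwani": "S R2 G3 P N3"}
SWARA_OFFSET = {"S": 0, "R2": 2, "G3": 4, "P": 7, "D2": 9, "N3": 11}

def round_encrypt(text: str, raaga: str) -> str:
    """Shift each printable character by the raaga's cycling note offsets."""
    shifts = [SWARA_OFFSET[s] for s in RAAGAS[raaga].split()]
    return "".join(chr((ord(c) - 32 + shifts[i % len(shifts)]) % 95 + 32)
                   for i, c in enumerate(text))

def encrypt(text: str, raaga: str, rounds: int = 3) -> str:
    assert rounds % 3 == 0, "total rounds must be a multiple of 3"
    for r in range(rounds):
        text = round_encrypt(text, raaga)
        text = text[::-1] if r % 3 == 1 else text   # toy 'pattern arrangement'
    return text

print(encrypt("transfer 1000", "mohanam", rounds=6))
```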

Keywords: cipher text, cryptography, plaintext, raaga

Procedia PDF Downloads 289
1223 Towards Human-Interpretable, Automated Learning of Feedback Control for the Mixing Layer

Authors: Hao Li, Guy Y. Cornejo Maceda, Yiqing Li, Jianguo Tan, Marek Morzynski, Bernd R. Noack

Abstract:

We propose an automated analysis of flow control behaviour from an ensemble of control laws and associated time-resolved flow snapshots. The input may be the rich database of machine learning control (MLC) optimizing a feedback law for a cost function in the plant. The proposed methodology provides (1) insights into the control landscape, which maps control laws to performance, including extrema and ridge-lines, (2) a catalogue of representative flow states and their contribution to the cost function for the investigated control laws, and (3) visualization of the dynamics. Key enablers are classification and feature extraction methods from machine learning. The analysis is successfully applied to the stabilization of a mixing layer with sensor-based feedback driving an upstream actuator. The fluctuation energy is reduced by 26%. The control replaces unforced Kelvin-Helmholtz vortices and their subsequent vortex pairing with higher-frequency Kelvin-Helmholtz structures of lower energy. These efforts target a human-interpretable, fully automated analysis of MLC, identifying qualitatively different actuation regimes, distilling corresponding coherent structures, and developing a digital twin of the plant.
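
One common enabler for distilling "representative flow states" from snapshot ensembles is clustering: each cluster centroid stands for a recurring coherent state, and label counts show how often the controlled flow visits it. A minimal sketch on synthetic snapshot vectors; the paper's specific classification pipeline is not reproduced here:

```python
import numpy as np
from sklearn.cluster import KMeans

# snapshots: rows = time-resolved flow snapshots (flattened velocity fields);
# three synthetic "regimes" stand in for qualitatively different flow states
rng = np.random.default_rng(5)
snapshots = np.vstack([rng.normal(m, 0.3, (200, 50)) for m in (-1.0, 0.0, 1.0)])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(snapshots)
centroids = kmeans.cluster_centers_        # representative flow states
population = np.bincount(kmeans.labels_)   # how often each state is visited
print(population)
```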

Keywords: machine learning control, mixing layer, feedback control, model-free control

Procedia PDF Downloads 222
1222 A Ground Structure Method to Minimize the Total Installed Cost of Steel Frame Structures

Authors: Filippo Ranalli, Forest Flager, Martin Fischer

Abstract:

This paper presents a ground structure method to optimize the topology and discrete member sizing of steel frame structures in order to minimize total installed cost, including material, fabrication and erection components. The proposed method improves upon existing cost-based ground structure methods by incorporating constructability considerations as well as satisfying both strength and serviceability constraints. The architecture for the method is a bi-level Multidisciplinary Feasible (MDF) architecture in which the discrete member sizing optimization is nested within the topology optimization process. For each structural topology generated, the sizing optimization process seeks to find a set of discrete member sizes that result in the lowest total installed cost while satisfying strength (member utilization) and serviceability (node deflection and story drift) criteria. To assess cost accurately, the connection details for the structure are generated automatically using site-specific cost information obtained directly from fabricators and erectors. Member continuity rules are also applied to each node in the structure to improve constructability. The proposed optimization method is benchmarked against conventional weight-based ground structure optimization methods, resulting in cost savings of up to 30% with comparable computational efficiency.

Keywords: cost-based structural optimization, cost-based topology and sizing, optimization, steel frame ground structure optimization, multidisciplinary optimization of steel structures

Procedia PDF Downloads 341
1221 Network Word Discovery Framework Based on Sentence Semantic Vector Similarity

Authors: Ganfeng Yu, Yuefeng Ma, Shanliang Yang

Abstract:

Word discovery is a key problem in text information retrieval technology. New-word discovery methods tend to operate at the word level, since they generally obtain new-word results by analyzing words. With the popularity of social networks, individual netizens and online self-media have generated various network texts for the convenience of online life, including network words that are far from standard Chinese expression. How to detect network words is one of the important goals in the field of text information retrieval today. In this paper, we integrate a word embedding model and clustering methods to propose a network word discovery framework based on sentence semantic similarity (S³-NWD) to detect network words effectively from a corpus. The framework constructs sentence semantic vectors through a distributed representation model, uses the similarity of sentence semantic vectors to determine the semantic relationship between sentences, and finally realizes network word discovery by means of semantic replacement between sentences. Experiments verify that the framework not only discovers network words rapidly but also recovers their standard-word meanings, reflecting the effectiveness of our work.
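
The semantic-replacement idea can be demonstrated with any sentence vectorizer: if a sentence containing a candidate network word is far more similar to a sentence using a standard expression than to unrelated text, the standard expression is a plausible meaning. The sketch below uses TF-IDF vectors as a simple stand-in for the paper's distributed sentence embeddings, and the example sentences are invented:

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Candidate network word "yyds" vs. the standard phrase it may replace.
sentences = ["this singer is yyds",
             "this singer is the greatest of all time",
             "the weather is terrible today"]

vectors = TfidfVectorizer().fit_transform(sentences)
sim = cosine_similarity(vectors)
# High similarity between sentences 0 and 1 (relative to 2) supports mapping
# the network word onto the standard expression via semantic replacement.
print(np.round(sim, 2))
```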

Keywords: text information retrieval, natural language processing, new word discovery, information extraction

Procedia PDF Downloads 91
1220 Analysis of Computer Science Papers Conducted by Board of Intermediate and Secondary Education at Secondary Level

Authors: Ameema Mahroof, Muhammad Saeed

Abstract:

The purpose of this study was to analyze the computer science papers conducted by the Board of Intermediate and Secondary Education with reference to Bloom’s taxonomy. The study has two parts. First, the papers were analyzed against basic rules of item construction, especially Bloom’s (1956) taxonomy. Second, item analysis was performed to improve the psychometric properties of the test. The sample included the computer science question papers of higher secondary classes (XI-XII) for the years 2011 and 2012. For the item analysis, data were collected from 60 students through convenience sampling. Findings revealed that in the papers by the Board of Intermediate and Secondary Education, the maximum focus was on the knowledge and understanding levels, with very little focus on application, analysis, and synthesis. Furthermore, item analysis revealed an unbalanced paper in terms of item difficulty: some items were very difficult, while most were too easy (measuring knowledge and understanding abilities). Likewise, most of the items did not truly discriminate between high and low achievers; four items even discriminated negatively. The researchers also analyzed the items with the software ConQuest. These results show that the papers conducted by the Board of Intermediate and Secondary Education were not well constructed. It was recommended that paper setters be trained in developing question papers that measure various cognitive abilities, so that a good computer science paper assesses all cognitive abilities of students.
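
The difficulty and discrimination indices used in this kind of item analysis are simple to compute from a 0/1 score matrix. A minimal sketch with the classical upper/lower 27% rule; the data here are random, not the study's responses:

```python
import numpy as np

def item_stats(scores: np.ndarray, top_frac: float = 0.27):
    """Classical item analysis: difficulty p and discrimination index D.

    scores: (students, items) matrix of 0/1 item scores. D compares the
    top and bottom 27% of students ranked by total score; a negative D flags
    items that low achievers answer better than high achievers.
    """
    totals = scores.sum(axis=1)
    order = np.argsort(totals)
    k = max(1, int(top_frac * len(scores)))
    low, high = scores[order[:k]], scores[order[-k:]]
    p = scores.mean(axis=0)                       # proportion correct
    d = high.mean(axis=0) - low.mean(axis=0)      # discrimination index
    return p, d

rng = np.random.default_rng(11)
responses = (rng.random((60, 10)) < 0.7).astype(int)   # 60 students, 10 items
difficulty, discrimination = item_stats(responses)
print(difficulty.round(2), discrimination.round(2), sep="\n")
```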

Keywords: Bloom’s taxonomy, question paper, item analysis, cognitive domain, computer science

Procedia PDF Downloads 148
1219 An Ontology-Based Framework to Support Asset Integrity Modeling: Case Study of Offshore Riser Integrity

Authors: Mohammad Sheikhalishahi, Vahid Ebrahimipour, Amir Hossein Radman-Kian

Abstract:

This paper proposes an ontology framework for knowledge modeling and representation of the equipment integrity process in a typical oil and gas production plant. Our aim is to construct a knowledge model that facilitates the translation, interpretation, and conversion of human-readable integrity interpretations into computer-readable representations. The framework provides a function structure related to fault propagation using ISO 14224 and ISO 15926 OWL-Lite/Resource Description Framework (RDF) to obtain a generic system-level model of asset integrity that can be utilized in the integrity engineering process during the equipment life cycle. It employs standard terminology developed by ISO 15926 and ISO 14224 to map textual descriptions of equipment failure and then converts them into causality-driven logic by semantic interpretation and computer-based representation using OWL-Lite/RDF. The framework was applied to an offshore gas riser. The results show that the approach can cross-link failure-related integrity terms and domain-specific logic to obtain a representation structure of equipment integrity, with causality inference based on semantic extraction of inspection report context.
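
Encoding a failure-causality statement as RDF triples is the core representational move. A minimal sketch with Python's rdflib; the namespace and class names below are invented for illustration, whereas a real implementation would use the ISO 15926/ISO 14224 reference-data identifiers:

```python
# pip install rdflib
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/asset-integrity#")  # illustrative namespace
g = Graph()
g.bind("ex", EX)

# "External corrosion causes loss of containment on the riser" as RDF triples
g.add((EX.Riser01, RDF.type, EX.OffshoreRiser))
g.add((EX.ExternalCorrosion, RDF.type, EX.DegradationMechanism))
g.add((EX.ExternalCorrosion, EX.causes, EX.LossOfContainment))
g.add((EX.Riser01, EX.hasObservedMechanism, EX.ExternalCorrosion))
g.add((EX.Riser01, EX.inspectionNote, Literal("wall loss 12% at splash zone")))

print(g.serialize(format="turtle"))
```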

Keywords: asset integrity modeling, interoperability, OWL, RDF/XML

Procedia PDF Downloads 186
1218 A Validated UPLC-MS/MS Assay Using Negative Ionization Mode for High-Throughput Determination of Pomalidomide in Rat Plasma

Authors: Muzaffar Iqbal, Essam Ezzeldin, Khalid A. Al-Rashood

Abstract:

Pomalidomide is a second-generation oral immunomodulatory agent used for the treatment of multiple myeloma in patients with disease refractory to lenalidomide and bortezomib. In this study, a sensitive UPLC-MS/MS assay was developed and validated for the high-throughput determination of pomalidomide in rat plasma using celecoxib as an internal standard (IS). Liquid-liquid extraction with dichloromethane as the extracting agent was employed to extract pomalidomide and the IS from 200 µL of plasma. Chromatographic separation was carried out on an Acquity BEH™ C18 column (50 × 2.1 mm, 1.7 µm) using an isocratic mobile phase of acetonitrile:10 mM ammonium acetate (80:20, v/v), at a flow rate of 0.250 mL/min. Pomalidomide and the IS eluted at 0.66 ± 0.03 and 0.80 ± 0.03 min, respectively, with a total run time of only 1.5 min. Detection was performed on a triple quadrupole tandem mass spectrometer using electrospray ionization in negative mode. The precursor-to-product ion transitions of m/z 272.01 → 160.89 for pomalidomide and m/z 380.08 → 316.01 for the IS were used for quantification in multiple reaction monitoring mode. The developed method was validated according to regulatory guidelines for bioanalytical method validation. Linearity in plasma was achieved over the concentration range of 0.47–400 ng/mL (r² ≥ 0.997). The intra- and inter-day precision values were ≤ 11.1% (RSD, %), whereas accuracy values ranged from −6.8 to 8.5% (RE, %). In addition, the other validation results were within the acceptance criteria, and the method was successfully applied in a pharmacokinetic study of pomalidomide in rats.

Keywords: pomalidomide, pharmacokinetics, LC-MS/MS, celecoxib

Procedia PDF Downloads 388
1217 Total-Reflection X-Ray Spectroscopy as a Tool for Element Screening in Food Samples

Authors: Hagen Stosnach

Abstract:

The analytical demands on modern instruments for element analysis in food samples include the analysis of major, trace and ultra-trace essential elements as well as potentially toxic trace elements. In this study, total-reflection X-ray fluorescence analysis (TXRF) is presented as an analytical technique which meets the requirements defined by the Association of Official Agricultural Chemists (AOAC) regarding the limit of quantification, repeatability, reproducibility and recovery for most of the target elements. The advantages of TXRF are the small sample mass required, the broad linear range from µg/kg up to wt.-% values, no consumption of gases or cooling water, and flexible, easy sample preparation. Liquid samples like alcoholic or non-alcoholic beverages can be analyzed without any preparation. For solid food samples, the most common sample pre-treatment methods are mineralization and direct deposition of the sample onto the reflector with minimal or no treatment, mainly as solid suspensions or after extraction. The main disadvantages are possible peak overlaps, which may lower the accuracy of quantitative analysis and limit element identification. The technique is presented through several application examples covering a broad range of liquid and solid food types.

Keywords: essential elements, toxic metals, XRF, spectroscopy

Procedia PDF Downloads 132
1216 Using Artificial Intelligence Technology to Build the User-Oriented Platform for Integrated Archival Service

Authors: Lai Wenfang

Abstract:

This study describes how artificial intelligence (AI) technology can be used to build a user-oriented platform for integrated archival service. The platform will be launched in 2020 by the National Archives Administration (NAA) in Taiwan. With the progression of information communication technology (ICT), the NAA has built many systems to provide archival services. In order to cope with new challenges, such as new ICT, artificial intelligence, and blockchain, the NAA will use natural language processing (NLP) and machine learning (ML) to build a training model and propose suggestions based on the data sent to the platform. The NAA expects that the platform will not only automatically inform the sending agencies’ staff which records catalogues are against the transfer or destruction rules, but also use the model to find details hidden in the catalogues and advise NAA staff on whether the records should be retained, shortening the auditing time. The platform keeps all users’ browsing trails, so that it can predict which kinds of archives a user might be interested in, recommend search terms through visualization, and inform them of newly arrived archives. In addition, according to the Archives Act, NAA staff must spend a lot of time marking or removing personal data, classified data, etc. before archives are provided. To upgrade the archival access service process, the platform will use text-recognition patterns to black out such data automatically; staff only need to correct any errors and upload the corrected version, and as the platform learns, its accuracy will keep improving. In short, the purpose of the platform is to advance government digital transformation and implement the vision of a service-oriented smart government.
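
The black-out step can be pictured as pattern-driven redaction. The platform's actual recognizers are learned models, so the regular expressions below are only illustrative assumptions standing in for them:

```python
import re

# Illustrative patterns only: the platform's real recognition is ML-based;
# these regexes merely show the black-out step on two common data types.
PATTERNS = {
    "national_id": re.compile(r"\b[A-Z][12]\d{8}\b"),   # Taiwan ID card format
    "phone":       re.compile(r"\b09\d{2}-?\d{6}\b"),   # Taiwan mobile number
}

def redact(text: str) -> str:
    """Black out matched personal data before an archive is provided."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED:{label}]", text)
    return text

record = "Applicant A123456789, contact 0912-345678, requested file 1952/07."
print(redact(record))
```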

Keywords: artificial intelligence, natural language processing, machine learning, visualization

Procedia PDF Downloads 172
1215 Antioxidant Activity of Aristolochia longa L. Extracts

Authors: Merouani Nawel, Belhattab Rachid

Abstract:

Aristolochia longa L. (Aristolochiaceae) is a plant native to Algeria and used in traditional medicine. This study was devoted to determining the polyphenol, flavonoid, and condensed tannin contents of Aristolochia longa L. after extraction using solvents of different polarities (methanol, acetone and distilled water). The extracts were prepared from the stem, leaves, fruits and rhizome. The antioxidant activity was determined using three in vitro assay methods: the scavenging effect on DPPH, the reducing power assay and β-carotene bleaching inhibition (CBI). The results indicate that the acetone extracts from the aerial parts presented the highest polyphenol contents. The antioxidant activity results showed that all extracts of Aristolochia longa L., prepared using the different solvents, have diverse antioxidant capacities. The aerial-parts methanol extract exhibited the highest antioxidant capacity in the DPPH and reducing power assays (55.04 ± 1.29 µg/mL and 0.2 ± 0.019 mg/mL, respectively), whereas the aerial-parts acetone extract showed the highest capacity in the β-carotene bleaching inhibition test, at 57%. These preliminary results could be used to justify the traditional use of this plant, and its bioactive substances could be exploited for therapeutic purposes such as antioxidant and antimicrobial applications.

Keywords: aristolochia longa l., polyphenols, flavonoids, condensed tannins, antioxidant activity

Procedia PDF Downloads 250
1214 Morphology Operation and Discrete Wavelet Transform for Blood Vessels Segmentation in Retina Fundus

Authors: Rita Magdalena, N. K. Caecar Pratiwi, Yunendah Nur Fuadah, Sofia Saidah, Bima Sakti

Abstract:

Blood vessel segmentation in retinal fundus images is important in the biomedical sciences for diagnosing ailments related to the eye. Segmentation can simplify the work of medical experts in assessing the state of the retinal fundus. Therefore, in this study, we designed software in MATLAB that segments the retinal blood vessels in fundus images. There are two main steps in the segmentation process. The first step is image preprocessing, which aims to improve the quality of the image so that it can be optimally segmented. The second step is the image segmentation itself, which extracts the retinal blood vessels from the eye fundus image. The segmentation methods analyzed in this study are morphology operations, the discrete wavelet transform, and a combination of both. The dataset comprises 40 retinal images and 40 corresponding manually segmented images. Across the testing scenarios, the average accuracy of the morphology operation method is 88.46%, while that of the discrete wavelet transform is 89.28%. Combining the two methods increased the average accuracy to 89.53%. The result of this study is an image processing system that can segment the blood vessels in the retinal fundus with high accuracy and low computation time.
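
The study implements the pipeline in MATLAB; an equivalent morphology-based step can be sketched in Python with scikit-image. Vessels appear dark against the fundus background, so a black top-hat transform followed by thresholding enhances and binarises them. The structuring-element size and cleanup threshold below are illustrative choices, not the paper's settings:

```python
# pip install scikit-image
from skimage import data, morphology, filters
from skimage.color import rgb2gray

fundus = rgb2gray(data.retina())                   # sample fundus image in skimage
tophat = morphology.black_tophat(fundus, morphology.disk(8))  # enhance dark vessels
mask = tophat > filters.threshold_otsu(tophat)                # binarise
mask = morphology.remove_small_objects(mask, min_size=100)    # clean speckle
print(f"vessel pixels: {mask.sum()} / {mask.size}")
```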

Keywords: discrete wavelet transform, fundus retina, morphology operation, segmentation, vessel

Procedia PDF Downloads 192
1213 Major Constraints to Adoption of Improved Post-harvest Technologies among Smallholder Farmers in Developing Countries: A Systematic Review

Authors: Muganyizi Jonas Bisheko, G. Rejikumar

Abstract:

Reducing post-harvest losses could be a sustainable solution to enhance the food and income security of smallholder farmers in developing countries. While various research institutions have come up with a number of innovative post-harvest technologies for reducing post-harvest losses, most of these have not been extensively adopted by smallholder farmers. Despite this gap, synthesized information about the major constraints on post-harvest technology adoption is scarce. This study was conducted to fill that gap and show the implications of the findings for future post-harvest research. The developed search strategy retrieved 2201 studies; after removing duplicates and screening titles, abstracts, and full articles, a total of 41 documents were identified. The major findings are: (i) there is an outstanding deficiency of systematic evidence on the effect of climate change, off-farm income and sources of post-harvest information on the adoption of improved post-harvest technologies; (ii) there is very limited information on adoption constraints pertaining to matters of policy, rules and regulations; (iii) the literature on behavioral constraints associated with limited adoption of improved post-harvest technologies is very thin; (iv) most of the studies focused on post-harvest storage technologies (47%), followed by overall post-harvest management practices (25%), processing technologies (19%) and packaging technologies (3%), and much of the information concerned cereals (58%), especially maize (44%); (v) geographically, Sub-Saharan Africa accounted for 79% of the reviewed interventions, while South Asia accounted for only 21%. The findings of this review are intended to guide post-harvest technologists and decision-makers in addressing the challenge of huge post-harvest losses.

Keywords: constraints, post-harvest loss, post-harvest technology, smallholder farmer

Procedia PDF Downloads 233
1212 A Tuning Method for Microwave Filter via Complex Neural Network and Improved Space Mapping

Authors: Shengbiao Wu, Weihua Cao, Min Wu, Can Liu

Abstract:

This paper presents an intelligent tuning method for microwave filters based on a complex neural network and improved space mapping. The tuning process consists of two stages: initial tuning and fine tuning. At the beginning of the tuning, the return loss of the filter is transferred to the passband via the phase error. During the fine tuning, the phase shift caused by the transmission line and the higher-order modes is removed by curve fitting. Then, a Cauchy method based on the admittance parameter (Y-parameter) is used to extract the coupling matrix. The influence of resonant cavity loss is eliminated during the parameter extraction process. Using processed data pairs (the amount of screw variation and the variation of the coupling matrix), a tuning model is established by the complex neural network. With the improved space mapping algorithm, the mapping relationship between the actual model and the ideal model is established, and the amplitude and direction of the tuning are constantly updated. Finally, a tuning experiment on an eighth-order coaxial cavity filter shows that the proposed method performs well in both tuning time and tuning precision.

Keywords: microwave filter, scattering parameter, coupling matrix, intelligent tuning

Procedia PDF Downloads 309
1211 Pharmacogenetic Analysis of Inter-Ethnic Variability in the Uptake Transporter SLCO1B1 Gene in Colombian, Mozambican, and Portuguese Populations

Authors: Mulata Haile Nega, Derebew Fikadu Berhe, Vera Ribeiro Marques

Abstract:

There are no epidemiologic data on this gene polymorphism for several countries. Therefore, this study aimed to assess the genotype and allele frequencies of the gene variant in three countries. The study involved healthy individuals from Colombia, Mozambique, and Portugal. Genomic DNA was isolated from blood samples using the QIAamp DNA Extraction Kit (Qiagen). The isolated DNA was genotyped using Polymerase Chain Reaction–Restriction Fragment Length Polymorphism (PCR-RFLP). Microstat and GraphPad QuickCalcs software were used for the chi-square test and the evaluation of Hardy-Weinberg equilibrium, respectively. A total of 181 individuals’ blood samples were analyzed. Overall, the TT genotype was the most frequent (74.0%) and CC the least frequent (7.8%). Country-wise, the genotypic frequencies were: Colombia 47 (70.2%) TT, 12 (17.9%) TC and 8 (11.9%) CC; Mozambique 47 (88.7%) TT, 5 (9.4%) TC, and 1 (1.9%) CC; and Portugal 40 (65.6%) TT, 16 (26.2%) TC, and 5 (8.2%) CC. The reference (T) allele was most frequent among Mozambicans (93.4%), compared to Colombians (79.1%) and Portuguese (78.7%). Mozambicans showed statistically significant genotypic and allelic frequency differences compared to Colombians (p<0.01) and Portuguese (p<0.01). Overall and in each country, the CC genotype was the least frequent, though relatively higher among the Colombian and Portuguese populations. This finding may imply variability in the risk-benefit profile of statins associated with the CC genotype among these populations, which needs further study.
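
Allele frequencies and a Hardy-Weinberg equilibrium check follow directly from the genotype counts reported above. A minimal sketch, reproducing the Colombian T-allele frequency of 79.1% from the abstract's counts:

```python
import numpy as np
from scipy.stats import chi2

def hwe_test(n_tt: int, n_tc: int, n_cc: int):
    """Allele frequencies and a Hardy-Weinberg equilibrium chi-square test."""
    n = n_tt + n_tc + n_cc
    p = (2 * n_tt + n_tc) / (2 * n)           # frequency of the T allele
    q = 1 - p
    expected = np.array([p * p, 2 * p * q, q * q]) * n
    observed = np.array([n_tt, n_tc, n_cc])
    stat = ((observed - expected) ** 2 / expected).sum()
    return p, stat, chi2.sf(stat, df=1)       # df = 3 genotypes - 1 - 1 allele

# Colombian counts from the abstract: 47 TT, 12 TC, 8 CC
p, stat, pval = hwe_test(47, 12, 8)
print(f"T allele freq = {p:.3f}, chi2 = {stat:.2f}, p = {pval:.4f}")
```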

Keywords: c.521T>C, polymorphism, SLCO1B1, SNP, statins

Procedia PDF Downloads 131