Search results for: hybrid techniques
3793 Bio-Inspired Information Complexity Management: From Ant Colony to Construction Firm
Authors: Hamza Saeed, Khurram Iqbal Ahmad Khan
Abstract:
Effective information management is crucial for any construction project and its success. The primary areas of information generation are either the construction site or the design office. Different types of information are required at different stages of construction, involving various stakeholders and creating complexity. There is a need for effective management of information flows to reduce the uncertainty that creates this complexity. Nature provides a unique perspective on dealing with complexity, in particular information complexity. The system dynamics methodology provides modeling and simulation tools and techniques that help address complexity. Nature has been dealing with complex systems since its creation 4.5 billion years ago. It has perfected its systems through evolution, resilience towards sudden changes, and the extinction of unadaptable and outdated species that are no longer fit for the environment. Nature has been accommodating changing factors and handling complexity forever. Humans have started to look to their natural counterparts for inspiration and solutions to their problems. This brings forth the possibility of using a biomimetics approach to improve the management practices used in the construction sector. Ants inhabit different habitats: Cataglyphis and Pogonomyrmex live in deserts, leafcutter ants reside in rainforests, and pharaoh ants are native to urban developments of tropical areas. Detailed studies have been done on fifty species out of the fourteen thousand discovered. They provide the opportunity to study the interactions that generate collective behavior in diverse environments. Animals evolve to better adapt to their environment. The collective behavior of ants emerges from feedback through interactions among individuals, based on a combination of three basic factors: the patchiness of resources in time and space; operating cost; and environmental stability together with the threat of rupture. If resources appear in patches through time and space, the response is accelerating and non-linear; if resources are scattered, the response follows a linear pattern. If the acquisition of energy through food is faster than the energy spent to get it, the default is to continue with an activity unless it is halted for some reason. If the energy spent is higher than the energy gained, the default changes to staying put unless activated. Finally, if the environment is stable and the threat of rupture is low, the activation and amplification rate is slow but steady; otherwise, it is fast and sporadic. To further study these effects and to eliminate environmental bias, the behavior of four different ant species was studied, namely red harvester ants (Pogonomyrmex barbatus), Argentine ants (Linepithema humile), turtle ants (Cephalotes goniodontus), and leafcutter ants (genus Atta). This study aims to improve the information system in the construction sector by providing a guideline inspired by nature with a systems-thinking approach, using system dynamics as a tool. Identified factors and their interdependencies were analyzed in the form of a causal loop diagram (CLD), and construction industry professionals were interviewed based on the developed CLD, which was validated with a significant response.
These factors and interdependencies in the natural system correspond with those in man-made systems, providing a guideline for the effective use and flow of information.
Keywords: biomimetics, complex systems, construction management, information management, system dynamics
Procedia PDF Downloads 139
3792 General Purpose Graphic Processing Units Based Real Time Video Tracking System
Authors: Mallikarjuna Rao Gundavarapu, Ch. Mallikarjuna Rao, K. Anuradha Bai
Abstract:
Real-time video tracking is a challenging task for computing professionals. The performance of video tracking techniques is greatly affected by the background detection and elimination process. Local regions of the image frame contain vital information about background and foreground. However, pixel-level processing of local regions consumes a considerable amount of computational time and memory space in traditional approaches. In our approach, we have explored the concurrent computational ability of General Purpose Graphic Processing Units (GPGPU) to address this problem. The Gaussian Mixture Model (GMM) with adaptive weighted kernels is used for detecting the background. The weights of the kernel are influenced by local regions and are updated by inter-frame variations of these corresponding regions. The proposed system has been tested with GPU devices such as the GeForce GTX 280 and Quadro K2000. The results are encouraging, with a maximum speedup of 10X compared to the sequential approach.
Keywords: connected components, embrace threads, local weighted kernel, structuring elements
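For orientation, a minimal CPU-side sketch of GMM-based background subtraction using OpenCV's MOG2 subtractor is given below; it illustrates the per-pixel mixture-model idea but not the authors' GPGPU kernels or local-region weighting, and the video filename is a placeholder.

```python
import cv2

# Minimal sketch of GMM background subtraction (OpenCV MOG2), not the GPGPU version.
cap = cv2.VideoCapture("traffic.avi")          # placeholder video file
subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                varThreshold=16,
                                                detectShadows=False)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg_mask = subtractor.apply(frame)          # per-pixel GMM update + classification
    fg_mask = cv2.morphologyEx(fg_mask, cv2.MORPH_OPEN,
                               cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3)))
    n, labels = cv2.connectedComponents(fg_mask)   # connected components = foreground blobs
    cv2.imshow("foreground", fg_mask)
    if cv2.waitKey(1) & 0xFF == 27:
        break
cap.release()
cv2.destroyAllWindows()
```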
Procedia PDF Downloads 443
3791 Investigations of Protein Aggregation Using Sequence and Structure Based Features
Authors: M. Michael Gromiha, A. Mary Thangakani, Sandeep Kumar, D. Velmurugan
Abstract:
The main cause of several neurodegenerative diseases such as Alzheimer's, Parkinson's, and the spongiform encephalopathies is the formation of amyloid fibrils and plaques from proteins. We have analyzed different sets of proteins and peptides to understand the influence of sequence-based features on the protein aggregation process. The comparison of 373 pairs of homologous mesophilic and thermophilic proteins showed that aggregation-prone regions (APRs) are present in both. However, the thermophilic protein monomers show a greater ability to ‘stow away’ the APRs in their hydrophobic cores and protect them from solvent exposure. The comparison of amyloid-forming and amorphous β-aggregating hexapeptides suggested distinct preferences for specific residues at the six positions as well as for all possible combinations of nine residue pairs. The compositions of residues at different positions and of residue pairs were converted into energy potentials and utilized for distinguishing between amyloid-forming and amorphous β-aggregating peptides. Our method could correctly identify the amyloid-forming peptides with an accuracy of 95-100% in different datasets of peptides.
Keywords: aggregation, amyloids, thermophilic proteins, amino acid residues, machine learning techniques
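A toy sketch of the supervised classification idea follows; it one-hot encodes residue identity per position of a hexapeptide and trains a generic classifier. The peptide labels and the choice of random forest are illustrative stand-ins, not the authors' energy-potential method or datasets.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

AA = "ACDEFGHIKLMNPQRSTVWY"

def encode(hexapeptide):
    """One-hot encode residue identity at each of the six positions."""
    vec = []
    for res in hexapeptide:
        vec.extend([1.0 if res == a else 0.0 for a in AA])
    return vec

# Toy data: 1 = amyloid-forming, 0 = amorphous beta-aggregating (labels are illustrative only)
peptides = ["STVIIE", "GNNQQN", "KLVFFA", "NNQQNY", "VEALYL", "AGAAAA"]
labels   = [1,        0,        1,        0,        1,        0]

X = [encode(p) for p in peptides]
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, labels, cv=3)
print("toy cross-validation accuracy:", scores.mean())
```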
Procedia PDF Downloads 618
3790 Age–Related Changes of the Sella Turcica Morphometry in Adults Older Than 20-25 Years
Authors: Yu. I. Pigolkin, M. A. Garcia Corro
Abstract:
Age determination of unknown dead bodies in forensic personal identification is a complicated process which involves the application of numerous methods and techniques. Skeletal remains are less exposed to the influences of environmental factors. In order to enhance the accuracy of forensic age estimation, additional properties of bones correlating with age need to be revealed. Material and Methods: Dimensional examination of the sella turcica was carried out on cadavers with the cranium opened by a circular vibrating saw. The sample consisted of a total of 90 Russian subjects, ranging in age from two months to 87 years. Results: A tendency of dimensional variation throughout life was detected. There were no observed gender differences in the morphometry of the sella turcica. The combined use of the sella turcica depth and length values revealed the possibility of assigning an examined sample to a certain age period. Conclusions: Alongside the results of existing methods of age determination, the morphometry of the sella turcica can serve as an additional characteristic, reinforcing the obtained values and, accordingly, increasing the accuracy of forensic biological age diagnosis.
Keywords: age–related changes in bone structures, forensic personal identification, sella turcica morphometry, body identification
Procedia PDF Downloads 277
3789 Cross Project Software Fault Prediction at Design Phase
Authors: Pradeep Singh, Shrish Verma
Abstract:
Software fault prediction models are created using the source code, processed metrics from the same or a previous version of the code, and related fault data. Some companies do not store and keep track of all the artifacts required for software fault prediction. To construct fault prediction models for such companies, training data from other projects can be one potential solution. The earlier a fault is predicted, the less it costs to correct. The training data consist of metrics data and related fault data at the function/module level. This paper investigates fault prediction at an early stage using cross-project data, focusing on the design metrics. In this study, an empirical analysis is carried out to validate design metrics for cross-project fault prediction. The machine learning technique used for evaluation is Naïve Bayes. The design-phase metrics of other projects can be used as an initial guideline for projects where no previous fault data are available. We analyze seven data sets from the NASA Metrics Data Program which offer design as well as code metrics. Overall, the cross-project results are comparable to within-company data learning.
Keywords: software metrics, fault prediction, cross project, within project
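A minimal sketch of the cross-project setup follows: a Naïve Bayes model is trained on design-phase metrics from one project and applied to a project with no fault history. The metric names and values are hypothetical placeholders, not the NASA MDP data.

```python
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import precision_recall_fscore_support
import numpy as np

# Hypothetical design-phase metrics per module (e.g., fan-in, fan-out, branch count, call depth)
X_source = np.array([[2, 5, 12, 3], [8, 1, 30, 6], [1, 2, 4, 1], [9, 7, 45, 8]])
y_source = np.array([0, 1, 0, 1])            # fault labels from the source (other) project
X_target = np.array([[3, 4, 10, 2], [7, 6, 40, 7]])   # project with no fault history
y_target = np.array([0, 1])                  # held out only to score this example

model = GaussianNB().fit(X_source, y_source)  # train on cross-project data
pred = model.predict(X_target)
print(precision_recall_fscore_support(y_target, pred, average="binary", zero_division=0))
```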
Procedia PDF Downloads 346
3788 Image Encryption Using Eureqa to Generate an Automated Mathematical Key
Authors: Halima Adel Halim Shnishah, David Mulvaney
Abstract:
Applying traditional symmetric cryptography algorithms for encryption and decryption provides immunity for secret keys against different attacks. One popular technique for generating automated secret keys is evolutionary computing using the Eureqa API tool, which gained attention in 2013. In this paper, we generate automated secret keys for image encryption and decryption using the Eureqa API (a tool used in the evolutionary computing technique). The Eureqa API models pseudo-random input data obtained from a suitable source to generate secret keys. The validity of the generated secret keys is investigated by performing various statistical tests (histogram, chi-square, correlation of two adjacent pixels, correlation between original and encrypted images, entropy and key sensitivity). Experimental results obtained from methods including histogram analysis, correlation coefficient, entropy and key sensitivity show that the proposed image encryption algorithms are secure and reliable, with the potential to be adapted for secure image communication applications.
Keywords: image encryption algorithms, Eureqa, statistical measurements, automated key generation
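The statistical validation step can be illustrated with a short sketch that computes three of the listed measures (entropy, adjacent-pixel correlation, chi-square) on a cipher image; the random image below is only a placeholder for an actual encrypted image.

```python
import numpy as np

def entropy(img):
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))             # ideal cipher image -> close to 8 bits

def adjacent_correlation(img):
    x = img[:, :-1].ravel().astype(float)       # horizontal neighbour pairs
    y = img[:, 1:].ravel().astype(float)
    return np.corrcoef(x, y)[0, 1]              # ideal cipher image -> close to 0

def chi_square(img):
    hist, _ = np.histogram(img, bins=256, range=(0, 256))
    expected = img.size / 256.0
    return np.sum((hist - expected) ** 2 / expected)

cipher = np.random.randint(0, 256, (256, 256), dtype=np.uint8)  # placeholder cipher image
print(entropy(cipher), adjacent_correlation(cipher), chi_square(cipher))
```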
Procedia PDF Downloads 487
3787 Application of Knowledge Discovery in Database Techniques in Cost Overruns of Construction Projects
Authors: Mai Ghazal, Ahmed Hammad
Abstract:
Cost overruns in construction projects are considered a worldwide challenge, since cost performance is one of the main measures of success along with schedule performance. To overcome this problem, studies have been conducted to investigate the factors causing cost overruns, and projects' historical data have been analyzed to extract new and useful knowledge from them. This research studies and analyzes the effect of some factors causing cost overruns using historical data from completed construction projects. These factors are then used to estimate the probability of cost overrun occurrence and to predict its percentage for future projects. First, an intensive literature review was done to study all the factors that cause cost overruns in construction projects, followed by another review of previous research papers on mining processes for dealing with cost overruns. Second, a data warehouse was proposed and structured, which can be used by organizations to store their future data in a well-organized way so it can be easily analyzed later. Third, twelve quantitative factors whose data are frequently available in construction projects were selected as the analyzed factors and suggested predictors for the proposed model.
Keywords: construction management, construction projects, cost overrun, cost performance, data mining, data warehousing, knowledge discovery, knowledge management
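As an illustrative sketch only (the twelve factors, the data and the choice of logistic regression are assumptions, not the authors' model), the overrun-probability estimate could be obtained from historical projects like this:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Each row: twelve quantitative project factors (values and factor choice are synthetic)
X_hist = np.random.rand(200, 12)                 # historical completed projects
y_hist = (X_hist[:, 0] + X_hist[:, 3] > 1.1).astype(int)   # 1 = overrun occurred (synthetic)

model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_hist, y_hist)

new_project = np.random.rand(1, 12)              # factors captured in the data warehouse
print("probability of cost overrun:", model.predict_proba(new_project)[0, 1])
```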
Procedia PDF Downloads 377
3786 Optimizing Production Yield Through Process Parameter Tuning Using Deep Learning Models: A Case Study in Precision Manufacturing
Authors: Tolulope Aremu
Abstract:
This paper is based on the idea of using a deep learning methodology to optimize production yield by tuning a few key process parameters in a manufacturing environment. The study explicitly addressed how to maximize production yield and minimize operational costs by utilizing advanced neural network models, specifically Long Short-Term Memory networks and Convolutional Neural Networks. These models were implemented using Python-based frameworks, TensorFlow and Keras. The targets of the research are precision molding processes in which the temperature ranges between 150°C and 220°C, the pressure ranges between 5 and 15 bar, and the material flow rate ranges between 10 and 50 kg/h, critical parameters that have a great effect on yield. A dataset of 1 million production cycles collected over five continuous years was considered, with detailed logs showing the exact parameter settings and yield output. The LSTM model captured time-dependent trends in the production data, while the CNN analyzed the spatial correlations between parameters. The models were designed in a supervised learning manner, with an MSE loss function optimized through the Adam optimizer. After a total of 100 training epochs, the models achieved 95% accuracy in recommending optimal parameter configurations. Results indicated a 12% increase in production yield compared with the traditional RSM and DOE methods. In addition, the error margin was reduced by 8%, yielding consistently high-quality products from the deep learning models. The annual monetary benefit was around $2.5 million, saved from reduced material waste, energy consumption, and equipment wear after implementing the optimized process parameters. The system was deployed in an industrial production environment with the help of a hybrid cloud setup: Microsoft Azure for data storage, with model training and deployment performed on Google Cloud AI. Real-time monitoring of the process and automatic tuning of parameters depend on this cloud infrastructure. To put it into perspective, deep learning models, especially those employing LSTM and CNN architectures, optimize production yield by fine-tuning process parameters. Future research will consider reinforcement learning with a view to further enhancing system autonomy and scalability across various manufacturing sectors.
Keywords: production yield optimization, deep learning, tuning of process parameters, LSTM, CNN, precision manufacturing, TensorFlow, Keras, cloud infrastructure, cost saving
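A minimal Keras sketch of the LSTM part is shown below, assuming the production log is reshaped into sequences of (temperature, pressure, flow rate) with the next cycle's yield as target; the data, sequence length and layer sizes are placeholders, and the CNN branch and cloud deployment are omitted.

```python
import numpy as np
from tensorflow import keras

# Synthetic stand-in for the production log: 24-cycle sequences of three parameters,
# with the normalized yield of the following cycle as the regression target.
X = np.random.rand(1000, 24, 3)                  # (samples, timesteps, parameters)
y = np.random.rand(1000, 1)                      # normalized yield

model = keras.Sequential([
    keras.layers.Input(shape=(24, 3)),
    keras.layers.LSTM(64),                       # captures time-dependent trends
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1),                       # predicted yield
])
model.compile(optimizer=keras.optimizers.Adam(1e-3), loss="mse")  # MSE + Adam, as in the paper
model.fit(X, y, epochs=100, batch_size=64, validation_split=0.2, verbose=0)

# Scan candidate parameter settings and keep the one with the highest predicted yield.
candidates = np.random.rand(500, 24, 3)
best = candidates[np.argmax(model.predict(candidates, verbose=0))]
```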
Procedia PDF Downloads 42
3785 Enhancement of X-Rays Images Intensity Using Pixel Values Adjustments Technique
Authors: Yousif Mohamed Y. Abdallah, Razan Manofely, Rajab M. Ben Yousef
Abstract:
X-ray images are very popular as a first tool for diagnosis. Automating the process of analyzing such images is important in order to support physicians' procedures. In this practice, teeth segmentation from the radiographic images and feature extraction are essential steps. The main objective of this study was to investigate correction preprocessing of X-ray images using local adaptive filters, in order to evaluate the contrast enhancement pattern in different X-ray images (such as grey-scale images) and to evaluate the use of a new nonlinear approach for contrast enhancement of soft tissues in X-ray images. The data were analyzed using the MATLAB program to enhance the contrast within the soft tissues and to assess the gray levels in both enhanced and unenhanced images as well as the noise variance. The main enhancement techniques used in this study were contrast enhancement filtering and deblurring of images using the blind deconvolution algorithm. The prominent constraints are, firstly, preservation of the image's overall look; secondly, preservation of the diagnostic content in the image; and thirdly, detection of small low-contrast details in the diagnostic content of the image.
Keywords: enhancement, x-rays, pixel intensity values, MatLab
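The paper's pipeline was implemented in MATLAB; a rough Python analogue with scikit-image is sketched below for illustration only. It uses CLAHE for local adaptive contrast and Richardson–Lucy deconvolution with an assumed Gaussian PSF (i.e., non-blind, unlike the blind deconvolution used in the study), and the test image is a placeholder.

```python
import numpy as np
from skimage import data, exposure, restoration

xray = data.camera() / 255.0                     # placeholder grayscale image

# Local adaptive contrast enhancement (CLAHE) as a stand-in for the adaptive filtering step.
enhanced = exposure.equalize_adapthist(xray, clip_limit=0.02)

# Non-blind Richardson-Lucy deconvolution with an assumed 9x9 Gaussian PSF.
psf = np.outer(*(np.exp(-np.linspace(-2, 2, 9) ** 2),) * 2)
psf /= psf.sum()
deblurred = restoration.richardson_lucy(enhanced, psf, num_iter=10)

print("gray-level variance before/after:", xray.var(), deblurred.var())
```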
Procedia PDF Downloads 489
3784 Improved Acoustic Source Sensing and Localization Based On Robot Locomotion
Authors: V. Ramu Reddy, Parijat Deshpande, Ranjan Dasgupta
Abstract:
This paper presents a methodology for acoustic source sensing and localization in an unknown environment. The developed methodology includes an acoustic-based sensing and localization system, a converging target localization based on recursive direction of arrival (DOA) error minimization, and a regressive obstacle avoidance function. Our method is able to augment existing proven localization techniques and improve results incrementally by utilizing robot locomotion, and it is capable of converging to a position estimate with greater accuracy using fewer measurements. The results also evinced the DOA error minimization at each iteration, the improvement in time for reaching the destination, and the efficiency of this target localization method in gradually converging to the real target position. Initially, the system is tested using a Kinect mounted on a turntable with DOA markings which serve as a ground truth, and then our approach is validated using a FireBird VI (FBVI) mobile robot on which the Kinect is used to obtain bearing information.
Keywords: acoustic source localization, acoustic sensing, recursive direction of arrival, robot locomotion
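To illustrate the bearing-only localization idea (not the authors' exact recursive estimator), the sketch below intersects DOA lines taken from successive robot poses by least squares; poses, noise level and geometry are invented for the example.

```python
import numpy as np

def estimate_source(poses, bearings):
    """Least-squares intersection of bearing lines from known 2-D poses."""
    A, b = [], []
    for (x, y), theta in zip(poses, bearings):
        # Line through (x, y) with direction theta: sin(t)*X - cos(t)*Y = sin(t)*x - cos(t)*y
        A.append([np.sin(theta), -np.cos(theta)])
        b.append(np.sin(theta) * x - np.cos(theta) * y)
    sol, *_ = np.linalg.lstsq(np.asarray(A), np.asarray(b), rcond=None)
    return sol

true_source = np.array([4.0, 3.0])
poses = [(0.0, 0.0), (1.0, 0.0), (1.5, 1.0)]     # robot moves between measurements
bearings = [np.arctan2(true_source[1] - y, true_source[0] - x) + np.random.normal(0, 0.02)
            for x, y in poses]
print("estimated source:", estimate_source(poses, bearings))
```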
Procedia PDF Downloads 495
3783 Synthesis and Characterisation of Different Blends of Virgin Polyethylene Modified by Naturel Fibres Alfa
Authors: Benalia Kouini
Abstract:
The basic idea of this study is to promote recycled polyethylene and a local vegetable fiber (alfa) in the development and characterization of a new composite material. In this work, different sizes of alfa fiber (<63 microns, between 63 and 125 microns, and between 125 and 250 microns) were incorporated into the blends (HDPE / recycled HDPE) using different elaboration methods (twin-screw extruder and twin-cylinder mixer). The fiber was modified with sodium hydroxide in order to evaluate the effect of the alkaline treatment on interfacial adhesion and therefore on the properties of the prepared composites. These were characterized by various techniques: mechanical (tensile and Charpy impact tests), rheological (melt flow), and morphological (SEM). The effect of the alkali treatment on the alfa fiber was examined by FTIR spectroscopy and morphological analysis. The introduction of the treated alfa fiber into the (HDPE/recycled HDPE) blend increased the stress, impact strength and Young's modulus; on the contrary, the elongation at break decreased. The results of the mechanical properties showed that the improvement is better with the twin-screw extruder than with the twin-cylinder mixer.
Keywords: natural fiber, alfa, recycling, blends, polyethylene
Procedia PDF Downloads 143
3782 Self-Weight Reduction of Tall Structures by Taper Cladding System
Authors: Divya Dharshini Omprakash, Anjali Subramani
Abstract:
Most tall structures have been constructed using shear walls and tube systems in recent decades. This makes a structure heavy and less resistant to lateral effects as its height increases. This paper aims at the reduction of self-weight in tall structures by the use of the Taper Cladding System (TCS) and also enumerates the construction techniques used in TCS. TCS has a tapering clad fixed at either the top or the bottom of the structural core at the tapered end. This system eliminates the use of RC structural elements on the exterior of the structure and uses fewer columns, only on the interior, to take up the gravity loads in order to reduce the self-weight of the structure. The self-weight reduction achieved by TCS is 50% greater than that of present structural systems. The lateral loads on the hull are taken care of by the tapered steel frame. Analyses were done to study the structural behaviour of taper-cladded buildings subjected to lateral loads. TCS has a great impact on the construction of tall structures in seismic and dense urban areas. Effective construction management can be achieved by the use of the Taper Cladding System. In this paper, the sustainability, design considerations and implications of the system have also been discussed.
Keywords: lateral loads resistance, reduction of self-weight, sustainable, taper clads
Procedia PDF Downloads 293
3781 Analysis and Modeling of Graphene-Based Percolative Strain Sensor
Authors: Heming Yao
Abstract:
Graphene-based percolative strain gauges could find applications in many places such as touch panels, artificial skins or human motion detection because of their advantages over conventional strain gauges, such as flexibility and transparency. These strain gauges rely on a novel sensing mechanism that depends on strain-induced morphology changes. Once a compressive or tensile strain is applied to graphene-based percolative strain gauges, the overlap area between neighboring flakes becomes smaller or larger, which is reflected in a considerable change of resistance. A tiny strain change in a graphene-based percolative strain sensor can thus act as an important lever to tremendously increase the resistance of the sensor, which equips graphene-based percolative strain gauges with a higher gauge factor. Despite ongoing research into the underlying sensing mechanism and the limits of sensitivity, no suitable understanding has been obtained of which intrinsic factors play the key role in adjusting the gauge factor, nor of how the strain gauge sensitivity can be enhanced; such understanding would be highly meaningful and would provide a guideline for designing novel, easily produced strain sensors with a high gauge factor. We here simulated the straining process by modeling graphene flakes and their percolative networks. We constructed the 3D resistance network by simulating the overlapping process of graphene flakes and interconnecting the tremendous number of resistance elements obtained by fractionizing each piece of graphene. With increasing strain, the overlapping graphene flakes were displaced on the newly stretched simulated film, and a new simulated resistance network was formed with a smaller flake number density. By solving the resistance network, we can obtain the resistance of the simulated film under different strains. Furthermore, by simulating possible variable parameters, such as out-of-plane resistance, in-plane resistance and flake size, we obtained the changing tendency of the gauge factor with all these variable parameters. Compared with the experimental data, we verified the feasibility of our model and analysis. An increase of the out-of-plane resistance of the graphene flakes and of the initial resistance of the sensor, based on the flake network, both improved the gauge factor of the sensor, while a smaller graphene flake size gave a greater gauge factor. This work can not only serve as a guideline to improve the sensitivity and applicability of graphene-based strain sensors in the future, but also provides a method to find the limitation of the gauge factor for strain sensors based on graphene flakes. Besides, our method can be easily transferred to predict the gauge factor of strain sensors based on other nano-structured transparent optical conductors, such as nanowires and carbon nanotubes, or of their hybrids with graphene flakes.
Keywords: graphene, gauge factor, percolative transport, strain sensor
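A heavily simplified 2-D sketch of the flake-network idea follows (the paper uses a 3-D network with fractionized flakes and separate in-plane/out-of-plane resistances, none of which are reproduced here): random flake centres, edges where flakes overlap, conductance proportional to overlap, and a nodal-analysis solve for the film resistance before and after stretching.

```python
import numpy as np

def film_resistance(strain, n_flakes=200, size=10.0, radius=1.0, g_unit=1.0):
    """Two-terminal resistance of a toy 2-D flake-overlap network under uniaxial strain."""
    rng = np.random.default_rng(0)                   # same flake layout for every strain value
    pts = rng.uniform(0, size, (n_flakes, 2))
    pts[:, 0] *= (1.0 + strain)                      # stretch the film along x
    G = np.eye(n_flakes) * 1e-9                      # tiny leakage keeps the solve well-posed
    for i in range(n_flakes):
        for j in range(i + 1, n_flakes):
            d = np.linalg.norm(pts[i] - pts[j])
            if d < radius:                           # flakes overlap
                g = g_unit * (radius - d)            # conductance ~ overlap length
                G[i, i] += g; G[j, j] += g
                G[i, j] -= g; G[j, i] -= g
    left, right = np.argmin(pts[:, 0]), np.argmax(pts[:, 0])   # electrode nodes
    I = np.zeros(n_flakes); I[left], I[right] = 1.0, -1.0      # inject 1 A
    keep = [k for k in range(n_flakes) if k != right]          # ground the right electrode
    V = np.zeros(n_flakes)
    V[keep] = np.linalg.solve(G[np.ix_(keep, keep)], I[keep])
    return V[left] - V[right]

R0, R1 = film_resistance(0.0), film_resistance(0.05)
print("toy gauge factor ~", (R1 - R0) / (R0 * 0.05))
```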
Procedia PDF Downloads 421
3780 Cooperative Diversity Scheme Based on MIMO-OFDM in Small Cell Network
Authors: Dong-Hyun Ha, Young-Min Ko, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
A heterogeneous network (HetNet) can provide high quality of service in a wireless communication system through the composition of small cell networks. The composition of small cell networks improves cell coverage and capacity for mobile users. Recently, various techniques using small cell networks have been researched for wireless communication systems. In this paper, a cooperative scheme achieving high reliability is proposed for small cell networks. The proposed scheme suggests a cooperative small cell system and a new signal transmission technique for the proposed system model. The new signal transmission technique applies a cyclic delay diversity (CDD) scheme based on the multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) system to obtain improved performance. The improved performance of the proposed scheme is confirmed by simulation results.
Keywords: adaptive transmission, cooperative communication, diversity gain, OFDM
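A bare-bones sketch of the CDD transmitter idea (not the authors' full cooperative small-cell scheme) is given below: each antenna sends the same OFDM symbol with a different cyclic shift, which appears at the receiver as a channel with richer frequency selectivity. FFT size, delays and modulation are illustrative choices.

```python
import numpy as np

N_FFT, CP, N_TX = 64, 16, 2
cyclic_delays = [0, 4]                              # samples of cyclic shift per antenna

bits = np.random.randint(0, 2, 2 * N_FFT)
symbols = (2 * bits[0::2] - 1) + 1j * (2 * bits[1::2] - 1)   # QPSK on each subcarrier
time_signal = np.fft.ifft(symbols, N_FFT)

tx_signals = []
for d in cyclic_delays:
    shifted = np.roll(time_signal, d)               # cyclic shift = per-antenna delay
    with_cp = np.concatenate([shifted[-CP:], shifted])   # add cyclic prefix
    tx_signals.append(with_cp / np.sqrt(N_TX))      # power normalisation across antennas

# With flat unit channels, the shifts combine into an equivalent frequency-selective channel.
equiv_channel = sum(np.exp(-2j * np.pi * d * np.arange(N_FFT) / N_FFT) for d in cyclic_delays)
print("effective per-subcarrier gain:", np.round(np.abs(equiv_channel)[:8], 2))
```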
Procedia PDF Downloads 509
3779 Keypoints Extraction for Markerless Tracking in Augmented Reality Applications: A Case Study in Dar As-Saraya Museum
Authors: Jafar W. Al-Badarneh, Abdalkareem R. Al-Hawary, Abdulmalik M. Morghem, Mostafa Z. Ali, Rami S. Al-Gharaibeh
Abstract:
Archeological heritage is at the heart of each country's national glory. Moreover, it can develop into a source of national income. Heritage management requires socially responsible marketing that achieves high visitor satisfaction while maintaining high site conservation. We have developed an Augmented Reality (AR) experience for heritage and cultural preservation at the Dar As-Saraya museum in Jordan. Our application of this notion relied on a markerless tracking approach. This approach uses a keypoint extraction technique in which features of the environment are identified and defined in the system as keypoints. A set of these keypoints forms a tracker for an augmented object to be displayed and overlaid on a real scene at the Dar As-Saraya museum. We tested and compared several techniques for markerless tracking and then applied the best technique to complete a mosaic artifact with AR content. The successful results from our application open the door for applications in open archeological sites where markerless tracking is most needed.
Keywords: augmented reality, cultural heritage, keypoints extraction, virtual recreation
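A minimal sketch of the keypoint-extraction-and-matching idea behind markerless tracking follows; it uses ORB features (the paper does not state which detector was used), and the image filenames are placeholders.

```python
import numpy as np
import cv2

# Detect/describe features of a reference photo of an artifact, match them against a live
# frame, and recover a homography for overlaying AR content. Paths are placeholders.
reference = cv2.imread("mosaic_reference.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=1000)
kp_ref, des_ref = orb.detectAndCompute(reference, None)   # keypoints define the "tracker"
kp_frm, des_frm = orb.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frm), key=lambda m: m.distance)

if len(matches) > 20:
    src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_frm[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    print("homography found, inliers:", int(mask.sum()))
```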
Procedia PDF Downloads 340
3778 A Comparative Study on the Impact of Global Warming of Applying Low Carbon Factor Concrete Products
Authors: Su-Hyun Cho, Chang-U Chae
Abstract:
Environmental impact assessment techniques have been developed as a result of worldwide efforts to reduce the environmental impact of global warming. By using a quantification method in the construction industry, it is now possible to manage greenhouse gases and to systematically evaluate the impact on the environment over the entire construction process. In particular, the proportion of greenhouse gas emissions attributable to the production stage of construction materials is high, and efforts are especially needed in the construction field. In this study, focusing on concrete products as construction materials and using the LCA evaluation method, we compared the environmental impact assessment results and carbon emissions of newly developed products to which low-carbon technologies have been applied against existing products. As a result, introducing industrial waste as a raw material showed a carbon reduction. Through a comparison of the carbon emission reduction effects of low-carbon technologies, this study is intended to provide academic data for the evaluation of greenhouse gases in the construction sector and for the development of future low-carbon technologies.
Keywords: CO₂ emissions, CO₂ reduction, ready-mixed concrete, environmental impact assessment
Procedia PDF Downloads 481
3777 Simscape Library for Large-Signal Physical Network Modeling of Inertial Microelectromechanical Devices
Authors: S. Srinivasan, E. Cretu
Abstract:
The information flow (e.g. block-diagram or signal flow graph) paradigm for the design and simulation of Microelectromechanical (MEMS)-based systems allows MEMS devices to be modeled easily using causal transfer functions and interfaced with electronic subsystems for fast system-level explorations of design alternatives and optimization. Nevertheless, the physical bi-directional coupling between different energy domains is not easily captured in causal signal flow modeling. Moreover, models of fundamental components acting as building blocks (e.g. gap-varying MEMS capacitor structures) depend not only on the component, but also on the specific excitation mode (e.g. voltage or charge actuation). In contrast, the energy flow modeling paradigm in terms of generalized across-through variables offers an acausal perspective, clearly separating the physical model from the boundary conditions. This promotes reusability and the use of primitive physical models for assembling MEMS devices from primitive structures, based on the interconnection topology in generalized circuits. The physical modeling capabilities of Simscape have been used in the present work in order to develop a MEMS library containing parameterized fundamental building blocks (area- and gap-varying MEMS capacitors, nonlinear springs, displacement stoppers, etc.) for the design, simulation and optimization of MEMS inertial sensors. The models capture both the nonlinear electromechanical interactions and the geometrical nonlinearities and can be used for both small- and large-signal analyses, including the numerical computation of pull-in voltages (stability loss). The Simscape behavioral modeling language was used for the implementation of reduced-order macro models, which present the advantage of a seamless interface with Simulink blocks for creating hybrid information/energy flow system models. Test bench simulations of the library models compare favorably with both analytical results and more in-depth finite element simulations performed in ANSYS. Separate MEMS-electronic integration tests were done on closed-loop MEMS accelerometers, where Simscape was used for modeling the MEMS device and Simulink for the electronic subsystem.
Keywords: across-through variables, electromechanical coupling, energy flow, information flow, Matlab/Simulink, MEMS, nonlinear, pull-in instability, reduced order macro models, Simscape
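As a worked example of the pull-in instability captured by the gap-varying capacitor model: for an ideal parallel-plate electrostatic actuator restrained by a linear spring, instability occurs at a deflection of one third of the gap, with V_pi = sqrt(8 k g0^3 / (27 eps0 A)). The parameter values below are illustrative only, not taken from the library.

```python
import numpy as np

eps0 = 8.854e-12        # F/m, vacuum permittivity
k    = 2.0              # N/m, suspension stiffness (illustrative)
g0   = 2e-6             # m, initial gap (illustrative)
A    = 100e-6 * 100e-6  # m^2, electrode area (illustrative)

V_pi = np.sqrt(8 * k * g0**3 / (27 * eps0 * A))
print(f"pull-in voltage ~ {V_pi:.2f} V, at a deflection of g0/3 = {g0/3:.2e} m")
```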
Procedia PDF Downloads 142
3776 Ceramic Membrane Filtration Technologies for Oilfield Produced Water Treatment
Authors: Mehrdad Ebrahimi, Oliver Schmitz, Axel Schmidt, Peter Czermak
Abstract:
“Produced water” (PW) is any fossil water that is brought to the surface along with crude oil or natural gas. By far, PW is the largest waste stream by volume associated with oil and gas production operations. Due to the increasing volume of this waste all over the world in the current decade, the outcome and effect of discharging PW into the environment have lately become a significant environmental concern. Therefore, there is a need for new PW treatment technologies, driven by the increased focus on water conservation and environmental regulation. The use of membrane processes for the treatment of PW has several advantages over many of the traditional separation techniques. In oilfield produced water treatment with ceramic membranes, process efficiency is characterized by the specific permeate flux and by the oil separation performance. Apart from the membrane properties, the permeate flux during filtration of oily wastewaters is known to be strongly dependent on the constituents of the feed solution, as well as on process conditions, e.g. trans-membrane pressure (TMP) and cross-flow velocity (CFV). The research project presented in this report describes the application of different ceramic membrane filtration technologies for the efficient treatment of oilfield produced water and different model oily solutions.
Keywords: ceramic membrane, membrane fouling, oil rejection, produced water treatment
Procedia PDF Downloads 191
3775 Static Priority Approach to Under-Frequency Based Load Shedding Scheme in Islanded Industrial Networks: Using the Case Study of Fatima Fertilizer Company Ltd - FFL
Authors: S. H. Kazmi, T. Ahmed, K. Javed, A. Ghani
Abstract:
In this paper, a static scheme of under-frequency-based load shedding is considered for chemical and petrochemical industries with islanded distribution networks that rely heavily on the primary commodity, to ensure minimum production loss, plant downtime or critical equipment shutdown. A simple methodology is proposed for in-house implementation of this scheme using under-frequency relays, and a step-by-step guide is provided, including the techniques to calculate maximum percentage overloads, frequency decay rates, the time-based frequency response and the frequency-based time response of the system. A case study of the FFL electrical system is utilized, presenting the actual system parameters and the employed load shedding settings following the same series of steps. The arbitrary settings are then verified for the worst overload condition (loss of a generation source in this case), and the comprehensive system response is investigated.
Keywords: islanding, under-frequency load shedding, frequency rate of change, static UFLS
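The frequency decay rate mentioned above follows from the swing equation: immediately after losing a generation source, df/dt = -ΔP·f_nominal/(2·H_system), where ΔP is the per-unit overload and H the aggregated inertia constant. The short calculation below uses illustrative numbers, not FFL plant data.

```python
# Back-of-the-envelope relations used when setting under-frequency load shedding stages.
f_nom = 50.0          # Hz, nominal frequency
H_sys = 4.0           # s, aggregated inertia constant on system MVA base (illustrative)
dP    = 0.25          # p.u. generation deficit, i.e. 25% maximum percentage overload

rocof = -dP * f_nom / (2 * H_sys)
print(f"initial frequency decay rate: {rocof:.2f} Hz/s")

# Time to reach the first load-shedding pick-up (e.g. 48.8 Hz), assuming a constant decay:
f_pickup = 48.8
print(f"time to reach {f_pickup} Hz: {(f_pickup - f_nom) / rocof:.2f} s")
```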
Procedia PDF Downloads 491
3774 Green Synthesis of Silver Nanoparticles from Citrus aurantium Aqueous Pollen Extract and Their Antibacterial Activity
Authors: Mohammad Ali Karimi, Hossein Tavallali, Abdolhamid Hatefi-Mehrjardi
Abstract:
Pollen extract of in vitro raised Citrus aurantium plants, acting as reducer and stabilizer, was assessed for the green synthesis of silver nanoparticles (AgNPs). The synthesis of AgNPs was performed at room temperature in solution, with the reduction taking place rapidly within 10 min. Surface plasmon resonance (SPR) peaks in the UV–Vis spectra indicated the formation of polydisperse AgNPs. The silver ion concentration, pH, temperature and reaction time were optimized for the synthesis of AgNPs. The nanoparticles obtained were characterized by UV-Vis spectrophotometry, transmission electron microscopy (TEM), X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy techniques. The synthesized AgNPs were mostly spherical in shape with an average size of 15 nm. The XRD study shows that the AgNPs are crystalline in nature with a face-centered cubic (fcc) geometry. They show significant antibacterial efficacy against Gram-positive (Staphylococcus aureus) and Gram-negative (Escherichia coli) bacteria by the disk diffusion method using Mueller-Hinton agar.
Keywords: green synthesis, Citrus aurantium, silver nanoparticles, antibacterial activity
Procedia PDF Downloads 290
3773 N₂O₂ Salphen-Like Ligand and Its Pd(II), Ag(I) and Cu(II) Complexes as Potentially Anticancer Agents: Design, Synthesis, Antimicrobial, CT-DNA Binding and Molecular Docking
Authors: Laila H. Abdel-Rahman, Mohamed Shaker S. Adam, Ahmed M. Abu-Dief, Hanan El-Sayed Ahmed
Abstract:
In this investigation, Cu(II), Pd(II) and Ag(I) complexes with the tetradentate DSPH Schiff base ligand were synthesized. The DSPH Schiff base and its complexes were characterized using different physicochemical and spectral analyses. The results revealed that the metal ions coordinate with the DSPH ligand through the azomethine nitrogen and the phenolic oxygen. The Cu(II), Pd(II) and Ag(I) complexes are present in a 1:1 molar ratio. The Pd(II) and Ag(I) complexes have square planar geometries, while Cu(II) has a distorted octahedral (Oh) geometry. All investigated complexes are non-electrolytes. The investigated compounds were tested against different strains of bacteria and fungi, and the prepared compounds showed good inhibition of the selected pathogenic microorganisms. Moreover, the interaction of the investigated complexes with CT-DNA was studied via various techniques, and the binding modes are mainly intercalation and groove binding. The Molecular Operating Environment (MOE) package was used for docking studies of the investigated complexes to explore the potential binding modes and energies. Furthermore, the growth inhibitory effect of the investigated compounds was examined on some cancer cell lines.
Keywords: tetradentate, antimicrobial, CT-DNA interaction, docking, anticancer
Procedia PDF Downloads 247
3772 Optimization of Biodiesel Production from Palm Oil over Mg-Al Modified K-10 Clay Catalyst
Authors: Muhammad Ayoub, Abrar Inayat, Bhajan Lal, Sintayehu Mekuria Hailegiorgis
Abstract:
Biodiesel, which comes from purely renewable resources, provides an alternative fuel option for the future because of limited fossil fuel resources as well as environmental concerns. The transesterification of vegetable oils for biodiesel production is a promising process to overcome this future energy crisis. The use of heterogeneous catalysts greatly simplifies the technological process by facilitating the separation of the post-reaction mixture. The purpose of the present work was to examine a heterogeneous catalyst, in particular Mg-Al modified K-10 clay, to produce methyl esters of palm oil. The prepared catalyst was well characterized by different up-to-date techniques. In this study, the transesterification of palm oil with methanol was studied in a heterogeneous system in the presence of Mg-Al modified K-10 clay as a solid base catalyst, and the results were then optimized with the help of Design of Experiments software. The results showed that methanol is the best alcohol for these reaction conditions, and the best results were achieved for the optimization of the biodiesel process: the maximum conversion of triglyceride (88%) was noted after 8 h of reaction at 60 °C, with a 6:1 molar ratio of methanol to palm oil and 3 wt% of the prepared catalyst.
Keywords: palm oil, transesterification, clay, biodiesel, mesoporous clay, K-10
Procedia PDF Downloads 399
3771 D6tions: A Serious Game to Learn Software Engineering Process and Design
Authors: Hector G. Perez-Gonzalez, Miriam Vazquez-Escalante, Sandra E. Nava-Muñoz, Francisco E. Martinez-Perez, Alberto S. Nunez-Varela
Abstract:
The software engineering teaching process has been the subject of many studies. To improve this process, researchers have proposed merely illustrative techniques in the classroom, such as topic presentations and dynamics between students, on the one hand, or attempts to involve students in real projects with companies and institutions to expose them to a real software development problem, on the other hand. Simulators and serious games have been used as auxiliary tools to introduce students to topics that are too abstract when presented in the traditional way. Most of these tools cover a limited area of the huge software engineering scope. To address this problem, we have developed D6tions, an educational serious game that simulates the software engineering process and is designed to let students experience the different stages a software engineer (playing roles such as project leader, developer or designer) goes through while participating in a software project. We describe previous approaches to this problem, how D6tions was designed, its rules and directions, and the results we obtained when undergraduate students played the game.
Keywords: serious games, software engineering, software engineering education, software engineering teaching process
Procedia PDF Downloads 498
3770 Perceived Effects of Nurses’ Work Environment on Quality of Nursing Outcomes in a Tertiary Hospital, Nigeria
Authors: Ifeoluwapo Oluwafunke Kolawole, Prisca Olabisi Adejumo
Abstract:
Background/Objectives: A healthy work environment increases the well-being of nurses, the quality of patient care, and that of the institution. This study assessed the perceived effects of the work environment on the quality of nursing outcomes. Methods: This descriptive cross-sectional study utilized a consecutive sampling technique to recruit 192 nurses. An online questionnaire was administered with the aid of the QuestionPro software, which allows only one response per participant. The link to the survey was sent to the participants via WhatsApp and Facebook. The data were analyzed with SPSS 23, and descriptive and inferential analyses were done. Results: Only about 58% of the nurses were satisfied with the work environment, and 10% perceived nursing care to be of high quality. Workload (93%) and communication practices and culture (90%) constitute the leading factors that affect nurses' work environment. The nurses' work environment affects the perception of care quality (p<0.05). Participants' perceived quality of nursing care was found to be influenced by their age, rank, and years spent in the hospital (p<0.05). Conclusion: Nurses' participation in decision-making, appropriate recognition, and adequate staffing and equipment will enhance satisfaction and retention.
Keywords: work environment, quality of care, nursing outcomes, satisfaction
Procedia PDF Downloads 8
3769 The Patterns Designation by the Inspiration from Flower at Suan Sunandha Palace
Authors: Nawaporn Srisarankullawong
Abstract:
This research is about creating designs inspired by the flowers which were once planted at Suan Sunandha Palace. The researcher studied the history of Suan Sunandha Palace and the flowers planted in the palace's garden, in order to use this research to create new designs in the future. The objectives are as follows: 1. To study the shapes and patterns of the flowers at Suan Sunandha Palace, in order to select a few of them as models for creating new designs. 2. To create flower designs from the flowers at Suan Sunandha Palace by using current photographs of the flowers once planted inside the palace, and by using Adobe Illustrator and Adobe Photoshop to create the patterns and models. Results of the research: the researcher selected three types of flowers to create the pattern models, namely Allamanda, Orchid and Flamingo Plant. The details of the flowers were simplified to create the pattern models; the three flowers thus produced three pattern models, which were developed into six patterns using universal artistic techniques, so the patterns created are modern and can be used for further decoration.
Keywords: patterns design, Suan Sunandha Palace, pattern of the flowers, visual arts and design
Procedia PDF Downloads 376
3768 Towards a Distributed Computation Platform Tailored for Educational Process Discovery and Analysis
Authors: Awatef Hicheur Cairns, Billel Gueni, Hind Hafdi, Christian Joubert, Nasser Khelifa
Abstract:
Given the ever-changing needs of the job markets, education and training centers are increasingly held accountable for student success. Therefore, education and training centers have to focus on ways to streamline their offerings and educational processes in order to achieve the highest level of quality in curriculum contents and managerial decisions. Educational process mining is an emerging field in the educational data mining (EDM) discipline, concerned with developing methods to discover, analyze and provide a visual representation of complete educational processes. In this paper, we present our distributed computation platform, which allows different education centers and institutions to load their data and access advanced data mining and process mining services. To achieve this, we also present a comparative study of the different clustering techniques developed in the context of process mining to efficiently partition educational traces. Our goal is to find the best strategy for distributing heavy analysis computations on the many processing nodes of our platform.
Keywords: educational process mining, distributed process mining, clustering, distributed platform, educational data mining, ProM
Procedia PDF Downloads 458
3767 Reimagining Landscapes: Psychological Responses and Behavioral Shifts in the Aftermath of the Lytton Creek Fire
Authors: Tugba Altin
Abstract:
In an era where the impacts of climate change resonate more pronouncedly than ever, communities globally grapple with events bearing both tangible and intangible ramifications. Situating this within the evolving landscapes of Psychological and Behavioral Sciences, this research probes the profound psychological and behavioral responses evoked by such events. The Lytton Creek Fire of 2021 epitomizes these challenges. While tangible destruction is immediate and evident, the intangible repercussions—emotional distress, disintegration of cultural landscapes, and disruptions in place attachment (PA)—require meticulous exploration. PA, emblematic of the emotional and cognitive affiliations individuals nurture with their environments, emerges as a cornerstone for comprehending how environmental cataclysms influence cultural identity and bonds to land. This study, harmonizing the core tenets of an interpretive phenomenological approach with a hermeneutic framework, underscores the pivotal nature of this attachment. It delves deep into the realm of individuals' experiences post the Lytton Creek Fire, unraveling the intricate dynamics of PA amidst such calamity. The study's methodology deviates from conventional paradigms. Instead of traditional interview techniques, it employs walking audio sessions and photo elicitation methods, granting participants the agency to immerse, re-experience, and vocalize their sentiments in real-time. Such techniques shed light on spatial narratives post-trauma and capture the otherwise elusive emotional nuances, offering a visually rich representation of place-based experiences. Central to this research is the voice of the affected populace, whose lived experiences and testimonies form the nucleus of the inquiry. As they renegotiate their bonds with transformed environments, their narratives reveal the indispensable role of cultural landscapes in forging place-based identities. Such revelations accentuate the necessity of integrating both tangible and intangible trauma facets into community recovery strategies, ensuring they resonate more profoundly with affected individuals. Bridging the domains of environmental psychology and behavioral sciences, this research accentuates the intertwined nature of tangible restoration with the imperative of emotional and cultural recuperation post-environmental disasters. It advocates for adaptation initiatives that are rooted in the lived realities of the affected, emphasizing a holistic approach that recognizes the profundity of human connections to landscapes. This research advocates the interdisciplinary exchange of ideas and strategies in addressing post-disaster community recovery strategies. It not only enriches the climate change discourse by emphasizing the human facets of disasters but also reiterates the significance of an interdisciplinary approach, encompassing psychological and behavioral nuances, for fostering a comprehensive understanding of climate-induced traumas. Such a perspective is indispensable for shaping more informed, empathetic, and effective adaptation strategies.
Keywords: place attachment, community recovery, disaster response, restorative landscapes, sensory response, visual methodologies
Procedia PDF Downloads 66
3766 Predication Model for Leukemia Diseases Based on Data Mining Classification Algorithms with Best Accuracy
Authors: Fahd Sabry Esmail, M. Badr Senousy, Mohamed Ragaie
Abstract:
In recent years, there has been an explosion in the use of technologies that help in discovering diseases. For example, DNA microarrays allow us for the first time to obtain a "global" view of the cell. They have great potential to provide accurate medical diagnosis and to help in finding the right treatment and cure for many diseases. Various classification algorithms can be applied to such microarray datasets to devise methods that can predict the occurrence of leukemia. In this study, we compared the classification accuracy and response time of eleven decision tree methods and six rule classifier methods using five performance criteria. The experimental results show that Random Tree produces better results and takes the lowest time to build a model among the tree classifiers. Among the classification rule algorithms, the nearest-neighbor-like algorithm (NNge) is the best due to its high accuracy and the lowest model-building time.
Keywords: data mining, classification techniques, decision tree, classification rule, leukemia diseases, microarray data
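Random Tree and NNge are WEKA classifiers; the sketch below only illustrates the accuracy-versus-build-time comparison using scikit-learn stand-ins (a randomized tree and a nearest-neighbour rule) on a synthetic microarray-like dataset, not the study's data or exact algorithms.

```python
import time
import numpy as np
from sklearn.tree import ExtraTreeClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

X = np.random.rand(72, 500)                     # 72 samples x 500 gene-expression features
y = np.random.randint(0, 2, 72)                 # leukemia subtype labels (synthetic)

for name, clf in [("randomized tree", ExtraTreeClassifier(random_state=0)),
                  ("nearest neighbour", KNeighborsClassifier(n_neighbors=3))]:
    t0 = time.time()
    acc = cross_val_score(clf, X, y, cv=5).mean()
    print(f"{name}: accuracy={acc:.2f}, time={time.time() - t0:.3f}s")
```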
Procedia PDF Downloads 327
3765 A Two Level Load Balancing Approach for Cloud Environment
Authors: Anurag Jain, Rajneesh Kumar
Abstract:
Cloud computing is the outcome of the rapid growth of the internet. Due to the elastic nature of cloud computing and the unpredictable behavior of users, load balancing is a major issue in the cloud computing paradigm. An efficient load balancing technique can improve performance in terms of efficient resource utilization and higher customer satisfaction. Load balancing can be implemented through task scheduling, resource allocation and task migration. Various parameters used to analyze the performance of a load balancing approach are response time, cost, data processing time and throughput. This paper demonstrates a two-level load balancer approach by combining the join-idle-queue and join-shortest-queue approaches. The authors have used the CloudAnalyst simulator to test the proposed two-level load balancer approach. The results are analyzed and compared with existing algorithms and, as observed, the proposed work is one step ahead of existing techniques.
Keywords: cloud analyst, cloud computing, join idle queue, join shortest queue, load balancing, task scheduling
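A toy sketch of the two-level idea follows (queue sizes, arrival pattern and class structure are invented; the paper's CloudAnalyst simulation is not reproduced): the dispatcher first tries the join-idle-queue list and falls back to join-shortest-queue when no VM is idle.

```python
import random

class TwoLevelBalancer:
    def __init__(self, n_vms):
        self.queues = [0] * n_vms                # outstanding tasks per VM
        self.idle = set(range(n_vms))            # level 1: idle-queue report

    def assign(self):
        if self.idle:                            # JIQ: pick any idle VM
            vm = self.idle.pop()
        else:                                    # JSQ: pick the least-loaded VM
            vm = min(range(len(self.queues)), key=lambda i: self.queues[i])
        self.queues[vm] += 1
        return vm

    def finish(self, vm):
        self.queues[vm] -= 1
        if self.queues[vm] == 0:
            self.idle.add(vm)

balancer = TwoLevelBalancer(n_vms=5)
for _ in range(20):
    vm = balancer.assign()
    if random.random() < 0.6:                    # some tasks complete before the next arrival
        balancer.finish(vm)
print("queue lengths:", balancer.queues)
```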
Procedia PDF Downloads 437
3764 Towards an Intelligent Ontology Construction Cost Estimation System: Using BIM and New Rules of Measurement Techniques
Authors: F. H. Abanda, B. Kamsu-Foguem, J. H. M. Tah
Abstract:
Construction cost estimation is one of the most important aspects of construction project design. For generations, the process of cost estimating has been manual, time-consuming and error-prone. This has partly led to most cost estimates being unclear and riddled with inaccuracies that at times lead to over- or under-estimation of construction costs. The development of standard sets of measurement rules that are understandable by all those involved in a construction project has not totally solved these challenges. Emerging Building Information Modelling (BIM) technologies can exploit standard measurement methods to automate the cost estimation process and improve accuracy. This requires standard measurement methods to be structured in an ontological and machine-readable format so that BIM software packages can easily read them. Most standard measurement methods are still text-based in textbooks and require manual editing into tables or spreadsheets during cost estimation. The aim of this study is to explore the development of an ontology based on the New Rules of Measurement (NRM) commonly used in the UK for cost estimation. The methodology adopted is Methontology, one of the most widely used ontology engineering methodologies. The challenges in this exploratory study are also reported and recommendations for future studies are proposed.
Keywords: BIM, construction projects, cost estimation, NRM, ontology
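Purely as an illustration of what "ontological and machine-readable" could mean in practice, the fragment below encodes a couple of NRM-style classes and a property as RDF triples with rdflib; the class names, labels and namespace URI are invented for the example and are not taken from the study's ontology.

```python
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, RDFS

NRM = Namespace("http://example.org/nrm#")       # hypothetical namespace
g = Graph()
g.bind("nrm", NRM)

# A tiny, invented fragment of a measurement-rule class hierarchy.
g.add((NRM.Element, RDF.type, RDFS.Class))
g.add((NRM.Substructure, RDF.type, RDFS.Class))
g.add((NRM.Substructure, RDFS.subClassOf, NRM.Element))
g.add((NRM.Substructure, RDFS.label, Literal("Substructure (group element)")))
g.add((NRM.measuredIn, RDF.type, RDF.Property))
g.add((NRM.measuredIn, RDFS.domain, NRM.Element))

print(g.serialize(format="turtle"))
```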
Procedia PDF Downloads 553