Nowadays, more and more people join social networks, such as Facebook, LinkedIn, and Livespace, to share information and to monitor or participate in different activities. This gives people a great opportunity to obtain useful information from social network data. Meanwhile, the information stored in these social networks is at high risk of attack by malicious users: a person's privacy can easily be breached with a small amount of domain knowledge. Thus, for a service provider such as Facebook or LinkedIn, how to publish a privacy-preserving graph becomes an important problem. It is essential to protect users' privacy while at the same time providing 'useful' data. Targeting the privacy-preserving graph publication problem, in this thesis we propose graph publishing models that cover different aspects of graph publication: 1) We propose a novel privacy-preserving graph construction technique based on adding noise nodes. This graph construction algorithm protects both the individuals in the graph and their attributes while maintaining good graph utility. 2) We further propose a personalized protection framework for social networks, which publishes graphs that take users' individual privacy settings into account. 3) Observing the lack of related work on the weighted graph model, which is common in online social network applications, we propose graph protection models that protect individuals when the weights on relationships are considered. 4) To address the privacy leakage that arises when the attacker knows the algorithm used to generate the published graph, we propose a new protection model that guards both the sensitive labels and the sensitive links in a graph. 5) We propose a general fine-grained adjusting framework to publish a privacy-protected and utility-preserving graph.
Using this framework, the data publisher can tune the trade-off between privacy and utility according to his or her own preferences, and both the privacy protected and the utility preserved can be quantified. We use the protection of a weighted graph as an example to demonstrate the implementation of this framework.