Convert UIImage to NSString (and vice-versa)
Views: 6693
Published: 2019-06-25


I need a method to convert a UIImage into an NSString, and then convert the NSString back to a UIImage.

Image name? Path? URL? Image data as Base64? What are you trying to do?

Convert it to a binary stream instead (NSData). This will depend on the format of your UIImage. If it's a JPEG/PNG for instance, you do:

NSData *data1 = UIImageJPEGRepresentation(image, 1.0);
NSData *data2 = UIImagePNGRepresentation(image);

UPDATE: Converting the binary data to NSString is a bad idea; that is why we have the NSData class. The OP wants to be able to send it as a data stream and then reconstruct it again; NSString is not needed for this.
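For reference, here is a minimal round-trip sketch along these lines, assuming iOS 7+ where NSData has built-in Base64 support (base64EncodedStringWithOptions: and initWithBase64EncodedString:options:); the variable names are placeholders:

// Encode: UIImage -> PNG NSData -> Base64 NSString
UIImage *image = ...; // your source image
NSData *pngData = UIImagePNGRepresentation(image);
NSString *encodedString = [pngData base64EncodedStringWithOptions:0];

// Decode: Base64 NSString -> NSData -> UIImage
NSData *decodedData = [[NSData alloc] initWithBase64EncodedString:encodedString options:0];
UIImage *decodedImage = [UIImage imageWithData:decodedData];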

I got a valid NSString using [UIImagePNGRepresentation(image) base64Encoding]. Now, all I need is a way to convert a Base64-encoded NSString into an NSData.

All you need is in NSString: -dataUsingEncoding: and -initWithData:encoding:

Convert to PNG or JPEG using UIImagePNGRepresentation or UIImageJPEGRepresentation, which will return an NSData, and then convert the NSData to a string (not sure how you want to do that mapping). How about just dealing with the NSData? You can read/write that to a file.
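If keeping the NSData and going through the file system is acceptable after all, here is a minimal sketch (the file name below is just an illustrative placeholder):

// Write the PNG data to a temporary file...
NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"avatar.png"];
[UIImagePNGRepresentation(image) writeToFile:path atomically:YES];

// ...and read it back later
UIImage *restored = [UIImage imageWithContentsOfFile:path];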

I don't want to write to the disk because the I/O latency is too high.

My problem is: I get an image from a web service, and it is stored in an NSString. Now I want to show it as a UIImage, but I don't know how to do that.

How can I convert the data from NSString to NSData?

You can use the methods below to convert the string to data:

[NSData dataFromBase64String:yourString];
[NSData dataFromBase64EncodedString:yourString];
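Note that dataFromBase64String: and dataFromBase64EncodedString: come from third-party NSData Base64 categories; on iOS 7 and later you can use the built-in initializer instead, e.g.:

// Decode a Base64 string into NSData, then build the image
NSData *imageData = [[NSData alloc] initWithBase64EncodedString:yourString
                                                         options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *image = [UIImage imageWithData:imageData];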

This question is related to the iPhone SDK, NSData, and UIImage.

I am trying to create an image from the avatar data returned from XMPP, like the following:

<presence>
  <status>Away due to idle.</status>
  <priority>0</priority>
  <show>away</show>
  <x xmlns="vcard-temp:x:update">
    <photo>a3f549fa9705e7ead2905de0b6a804227ecdd404</photo>
  </x>
  <x xmlns="jabber:x:avatar">
    <hash>a3f549fa9705e7ead2905de0b6a804227ecdd404</hash>
  </x>
</presence>

So in this case, I assume that a3f549fa9705e7ead2905de0b6a804227ecdd404 is the photo data. How can I convert this into NSData?

I think if I can get the NSData object, I can easily create the UIImage, right?


I think "a3f549fa9705e7ead2905de0b6a804227ecdd404" is the photo data this is my codes:

NSString *command = @"a3f549fa9705e7ead2905de0b6a804227ecdd404";
command = [command stringByReplacingOccurrencesOfString:@" " withString:@""];
NSMutableData *commandToSend = [[NSMutableData alloc] init];
unsigned char whole_byte;
char byte_chars[3] = {'\0', '\0', '\0'};
int i;
// Parse the string as hex, two characters per byte
for (i = 0; i < [command length] / 2; i++) {
    byte_chars[0] = [command characterAtIndex:i * 2];
    byte_chars[1] = [command characterAtIndex:i * 2 + 1];
    whole_byte = strtol(byte_chars, NULL, 16);
    [commandToSend appendBytes:&whole_byte length:1];
}
UIImage *image = [UIImage imageWithData:commandToSend];

However, it doesn't work. Does anyone know what's wrong with it?

In XMPPPresence.m, add this method:

- (NSString *)photo {
    NSXMLElement *xElement = [self elementForName:@"x" xmlns:@"vcard-temp:x:update"];
    NSString *photoHash = [[xElement elementForName:@"photo"] stringValue];
    return photoHash;
}

// In XMPPStream's delegate:

- (void)xmppStream:(XMPPStream *)stream didReceivePresence:(XMPPPresence *)presence {
    NSString *photoHash = [presence photo];
    if ([photoHash length] > 0) { // in case there's no photo hash
        XMPPJID *rosterJID = [presence from];
        BOOL requestPhoto = ...; // determine whether you need to request a new photo or not
        if (requestPhoto) {
            NSXMLElement *iqAvatar = [NSXMLElement elementWithName:@"iq"];
            NSXMLElement *queryAvatar = [NSXMLElement elementWithName:@"vCard" xmlns:@"vcard-temp"];
            [iqAvatar addAttributeWithName:@"type" stringValue:@"get"];
            [iqAvatar addAttributeWithName:@"to" stringValue:[rosterJID full]];
            [iqAvatar addChild:queryAvatar];
            XMPPIQ *avatarRequestIQ = [XMPPIQ iqFromElement:iqAvatar];
            [stream sendElement:avatarRequestIQ];
        }
    }
}

// And when the buddy sends the photo, it will be BASE64-encoded in the vCard.
// You will receive it as an IQ:

- (BOOL)xmppStream:(XMPPStream *)stream didReceiveIQ:(XMPPIQ *)iq {
    XMPPElement *vCardPhotoElement = (XMPPElement *)[[iq elementForName:@"vCard"] elementForName:@"PHOTO"];
    if (vCardPhotoElement != nil) {
        // avatar data
        NSString *base64DataString = [[vCardPhotoElement elementForName:@"BINVAL"] stringValue];
        NSData *imageData = [NSData dataFromBase64String:base64DataString]; // you need an NSData BASE64 category
        UIImage *avatarImage = [UIImage imageWithData:imageData];
        XMPPJID *senderJID = [iq from];
        [self xmppStream:stream didReceiveImage:avatarImage forBuddy:senderJID]; // this is my custom delegate method where I save the new avatar to the cache
    }
    return NO;
}

Hope this will help you.

That is the picture hash. You now have to send a vCard request, which will contain the same hash for verification and a BINVAL element containing the picture data in Base64.
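As a rough sketch of that verification step (assuming the hash is the SHA-1 of the avatar bytes, as in XEP-0153, and that CommonCrypto is available; SHA1HexString is just an illustrative helper):

#import <CommonCrypto/CommonDigest.h>

// Returns the lowercase hex SHA-1 of the decoded avatar data,
// which should match the photo hash advertised in the presence stanza.
static NSString *SHA1HexString(NSData *data) {
    unsigned char digest[CC_SHA1_DIGEST_LENGTH];
    CC_SHA1(data.bytes, (CC_LONG)data.length, digest);
    NSMutableString *hex = [NSMutableString stringWithCapacity:CC_SHA1_DIGEST_LENGTH * 2];
    for (int i = 0; i < CC_SHA1_DIGEST_LENGTH; i++) {
        [hex appendFormat:@"%02x", digest[i]];
    }
    return hex;
}

// Usage sketch:
// BOOL valid = [SHA1HexString(imageData) isEqualToString:photoHash];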

Reposted from: http://gicoo.baihongyu.com/
